GPT-3 (Generative Pre-trained Transformer 3) is a form of artificial intelligence known as machine learning. It is an incredibly large model, with 175 billion parameters, trained on roughly 500 billion words (tokens). To put this in perspective, all of Wikipedia resolves to about 3 billion tokens, and it makes up only a small part (less than 1%) of the training data. The rest comes from thousands of books and a huge amount of internet content. The resulting capabilities can be mesmerizing.
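The "less than 1%" claim above is easy to check with quick arithmetic, using the approximate token counts quoted in the text:

```python
# Rough scale comparison using the approximate figures quoted above.
WIKIPEDIA_TOKENS = 3e9    # ~3 billion tokens in all of Wikipedia
TRAINING_TOKENS = 500e9   # ~500 billion tokens in GPT-3's training data

share = WIKIPEDIA_TOKENS / TRAINING_TOKENS
print(f"Wikipedia's share of the training data: {share:.1%}")  # 0.6%
```

So Wikipedia accounts for only about 0.6% of the tokens, consistent with the "less than 1%" figure.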