Gemini Rising
Have you ever struggled to get ChatGPT to understand context?
In the world of artificial intelligence language models, that context is measured in ‘tokens’.
Tokens are the small chunks of text a model reads: individual words, parts of words, and punctuation marks.
GPT-3.5 (the model behind the free version of ChatGPT) has a maximum context length of 4,096 tokens. If your conversation or request runs longer than that, you’ll have to shorten it for ChatGPT to give you a useful response.
An AI model’s “context window” is the amount of information it can see at once. The larger the window, the more tokens the model can take in with a single prompt, and the more of your information it can actually use when answering, which leads to more relevant and reliable responses.
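If you’re curious what tokens actually look like, here’s a minimal sketch using OpenAI’s tiktoken library (the example sentence and the 4,096 figure are just for illustration; exact counts depend on the model’s tokenizer):

```python
# Minimal sketch: count the tokens in a prompt and check whether it
# fits inside a 4,096-token context window. Requires `pip install tiktoken`.
import tiktoken

# cl100k_base is the tokenizer used by the GPT-3.5 family of models.
encoding = tiktoken.get_encoding("cl100k_base")

prompt = "Have you ever struggled to get ChatGPT to understand context?"
tokens = encoding.encode(prompt)

print(f"Token count: {len(tokens)}")           # a short sentence is only a dozen or so tokens
print([encoding.decode([t]) for t in tokens])  # see how the text was split up

# A prompt only "fits" if it stays inside the model's context window.
CONTEXT_WINDOW = 4096
print(f"Fits in a {CONTEXT_WINDOW}-token window: {len(tokens) <= CONTEXT_WINDOW}")
```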
Currently, Google’s competing model Gemini 1.0 (formerly known as Bard) has a context window of roughly 32,000 tokens.
However, as with all things AI, its successor Gemini 1.5 dramatically improves on that with a whopping 1,000,000-token limit.
This means you can upload roughly an hour of video, hours of audio, entire novels as PDFs, or whole codebases, and Gemini 1.5 will take it all in before producing its response.
Gemini 1.5 isn’t broadly available yet, as Google is still working to improve latency and polish the user experience, but it will soon roll out to third-party developers and consumers. It’s also more efficient to train and can handle a broad spectrum of tasks.
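Once access does open up, feeding Gemini 1.5 a long document could look something like this sketch using Google’s google-generativeai Python SDK (the model name, placeholder API key, and novel.txt file are assumptions for illustration):

```python
# Rough sketch: send an entire novel to Gemini 1.5 in a single prompt.
# Requires `pip install google-generativeai`.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-pro-latest")  # assumed model name

# Load an entire novel as plain text (potentially hundreds of thousands of words).
with open("novel.txt", "r", encoding="utf-8") as f:
    novel = f.read()

# See how much of the 1,000,000-token window the book actually uses.
print(model.count_tokens(novel).total_tokens)

# Ask a question that needs the whole book as context.
response = model.generate_content(
    ["Summarize the main character's arc across this novel:", novel]
)
print(response.text)
```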
Google’s suite of products is about to get a whole lot more useful.
TLDR: Gemini 1.5 is enabling more useful and capable applications of artificial intelligence.