Alphabet’s AI research company DeepMind has released the next generation of its language model, which the company claims has close to the reading comprehension of a high schooler: a startling claim.
DeepMind says the model, called Gopher, was able to significantly improve its reading comprehension by ingesting massive repositories of online text.
DeepMind boasts that its algorithm, an “ultra-large language model,” has 280 billion parameters, a measure of a model’s size and complexity. That puts it between OpenAI’s GPT-3 (175 billion parameters) and Microsoft and NVIDIA’s Megatron-Turing NLG, which features 530 billion parameters, The Verge points out.