Google trained a trillion-parameter AI language model

For example, OpenAI’s GPT-3, one of the largest language models ever trained at 175 billion parameters, can make primitive analogies, generate recipes, and even complete basic code. The Google researchers say that their 1.6-trillion-parameter model, which appears to be the largest of its kind to date, achieved up to a 4x speedup over the previous largest Google-developed language model (T5-XXL). The Switch Transformer builds on mixture of experts, an AI model paradigm first proposed in the early ’90s. Beyond the pre-training speedup, the Switch Transformer led to gains on a number of downstream tasks. However, the researchers’ work didn’t take into account the impact of these large language models in the real world.
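The core idea behind the Switch Transformer is top-1 expert routing: a small router sends each token to a single "expert" sub-network, so only a fraction of the model's parameters are active per token. A rough sketch of that routing step is below (a minimal illustration in NumPy; the variable names, dimensions, and linear experts are assumptions for demonstration, not details from the paper):

```python
import numpy as np

def switch_layer(tokens, w_router, experts):
    """Route each token to its single top-scoring expert (top-1 routing).

    tokens:   (n_tokens, d_model) array of token representations
    w_router: (d_model, n_experts) router weight matrix
    experts:  list of callables, one per expert sub-network
    """
    logits = tokens @ w_router                       # (n_tokens, n_experts)
    choice = logits.argmax(axis=-1)                  # top-1 expert per token
    # Softmax gate value of the chosen expert scales its output.
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    out = np.empty_like(tokens)
    for e, expert in enumerate(experts):
        mask = choice == e                           # tokens routed to expert e
        if mask.any():
            out[mask] = probs[mask, e, None] * expert(tokens[mask])
    return out

rng = np.random.default_rng(0)
d_model, n_experts, n_tokens = 8, 4, 16
# Toy experts: independent linear maps standing in for feed-forward blocks.
experts = [
    (lambda w: (lambda x: x @ w))(rng.normal(size=(d_model, d_model)))
    for _ in range(n_experts)
]
y = switch_layer(rng.normal(size=(n_tokens, d_model)),
                 rng.normal(size=(d_model, n_experts)), experts)
print(y.shape)
```

Because each token activates only one expert, adding experts grows the parameter count without growing the per-token compute, which is how such models scale to a trillion-plus parameters.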
