It’s the Largest Open-Source GPT-Like Neural Network.
Yandex has open-sourced a new large-scale transformer-based language model, claiming it to be the largest GPT-like neural network ever made freely available to the developer community.
In its Medium article, authored by Mikhail Khrushchev, a senior developer on the YaLM team, Yandex said:
We’ve been using the YaLM family of language models in our Alice voice assistant and Yandex Search for more than a year now. Today, we have made our largest-ever YaLM model, which leverages 100 billion parameters, available for free.
The full article, which covers how to access YaLM 100B, how it was trained, and the experience behind it, can be read here.