What is a large language model?

Large Language Models (LLMs) are machine learning models capable of performing a variety of Natural Language Processing (NLP) tasks. They are trained on huge data sets, which enables them to answer questions, generate their own content, classify text, summarize it, or translate it into foreign languages.

The emergence of successive generations of these models is proof of the rapid progress in the development of artificial intelligence.

It is estimated that the size of large language models has increased roughly tenfold each year in recent years. As their size, and consequently their complexity, increases, so do their capabilities.

This is clearly visible in the example of ChatGPT, whose previous version was far less precise.


It struggled with longer written forms and was often repetitive, and as a result it did not deliver the value expected by the end user. These shortcomings have been largely eliminated in the currently available version of the model, though it is still far from perfect. Even so, its capabilities are impressive, as is the work done by the algorithms that allow it to surprise users and change our reality.

To build the model behind ChatGPT, OpenAI used a supercomputer provided by Microsoft, worth tens of millions of dollars, which at the time ranked among the five most powerful machines in the world.
