Transfer Learning: The Democratization of Transformers

The Hugging Face library makes Transformers affordable for everyone

Jordi TORRES.AI
3 min read · Dec 1, 2021
(Original version of this post in Catalan)

We have already seen that Transformer neural networks are currently the dominant technology in Natural Language Processing and can be applied to tasks such as classification, extraction, and generation. For example, they allow you to classify a text according to whether it was written with a positive or negative sentiment, extract a summary of it, or generate a translation into another language. Moreover, Transformers are not limited to written text; they can also be combined with speech or images, for example to generate a description of an image's content or a transcription of an audio note.
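As an illustration (not part of the original post), here is a minimal sketch of how the Hugging Face transformers library exposes such tasks through its pipeline API; the example text and the choice of tasks are my own.

```python
# Minimal sketch using the Hugging Face "transformers" library:
# the pipeline API runs a pretrained model for a given task.
from transformers import pipeline

# Sentiment classification of a short text; a default pretrained model
# is downloaded automatically the first time the pipeline is created.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers are becoming accessible to everyone."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# The same API covers other tasks mentioned above, such as summarization
# or translation (here English to French, as an example).
summarizer = pipeline("summarization")
translator = pipeline("translation_en_to_fr")
```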

But we have also seen the high computational cost of training Transformer neural networks, which makes the technology affordable only for big tech companies with the financial resources to access high-performance computing infrastructures. One of the main practical challenges, therefore, is making Transformers accessible for business applications at any company.

This “democratization” of Transformers is being achieved thanks to the collaborative spirit of the Artificial Intelligence research…

