

Transformers in Hugging Face

When we discussed Hugging Face earlier, we mentioned that Hugging Face is best known for developing and maintaining the Hugging Face Transformers library, an open-source NLP library built on top of PyTorch and TensorFlow. In the demos that follow, we'll be using models from this library for abstractive summarization. The Hugging Face Transformers library makes it very easy to download and use a state-of-the-art NLP model for inference. Hugging Face lets you access these models through a simple Python API that you can pip install on your machine. Models on Hugging Face are generally PyTorch or TensorFlow models, so you're working with frameworks that you're likely familiar with. And you will see in just a bit how simple and straightforward it is to use models hosted on Hugging Face. When you work with NLP models on Hugging Face, you'll be instantiating and using a Hugging Face pipeline. A pipeline is a high-level interface provided by the Hugging Face Transformers library.
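To give a sense of how little code this takes, here is a minimal sketch of a summarization pipeline. The pipeline function and the "summarization" task name come from the Transformers library itself; the sample text, the length parameters, and the choice to rely on the default summarization checkpoint are illustrative assumptions, not the exact setup used in the demos.

    # Assumes a Python environment with: pip install transformers torch
    from transformers import pipeline

    # Create a summarization pipeline. With no explicit model argument,
    # Transformers downloads a default summarization checkpoint on first use.
    summarizer = pipeline("summarization")

    text = (
        "Hugging Face Transformers is an open-source NLP library built on top of "
        "PyTorch and TensorFlow. Its pipeline API lets you download a "
        "state-of-the-art model and run it for inference in a few lines of code."
    )

    # Run abstractive summarization; max_length and min_length bound the
    # length of the generated summary (the values here are just examples).
    result = summarizer(text, max_length=60, min_length=10, do_sample=False)
    print(result[0]["summary_text"])

The pipeline handles tokenizing the input, running the model, and decoding the generated summary, which is why the calling code stays this short.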
