Hugging Face Transformers is a powerful library designed for natural language processing (NLP), computer vision, and audio tasks. It provides state-of-the-art pretrained models and tools for fine-tuning and deploying these models across various machine learning frameworks like PyTorch, TensorFlow, and JAX.

Key Features:

Pretrained Models: The library offers a vast collection of pretrained models for tasks such as text classification, summarization, translation, image classification, and speech recognition. These models save time and resources by eliminating the need to train from scratch.

  1. Pipeline API: This high-level API simplifies the use of pretrained models for inference. For example, you can perform sentiment analysis, text generation, or image classification with just a few lines of code.
  2. Multimodal Capabilities: Hugging Face Transformers supports tasks that combine multiple data types, such as visual question answering and document question answering.
  3. Customizability: Each model architecture is modular, allowing researchers and developers to experiment and customize models for specific needs.

Community and Resources: Hugging Face has an active community and extensive documentation, making it easier for beginners and experts alike to get started.

How to use pretrained models with Transformers:

Using pretrained models with Hugging Face Transformers is straightforward and versatile. Here’s a step-by-step guide:

Steps to Use Pretrained Models

  1. Install the library: First, ensure you have the Hugging Face Transformers library installed. Use the following command:

     pip install transformers

  2. Import the necessary modules: Import the required classes for the task you’re working on. For example, if you want to use a text classification model:

     from transformers import pipeline

  3. Choose and load a pretrained model: Use the pipeline API to load a model. For sentiment analysis, for instance:

     classifier = pipeline("sentiment-analysis")

     Hugging Face provides a wide range of pretrained models for tasks such as:

       • Sentiment analysis
       • Text summarization
       • Translation
       • Question answering
       • Text generation

  4. Perform inference: Once the model is loaded, you can use it to process your input data. For example:

     result = classifier("I love using Hugging Face Transformers!")
     print(result)

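The classifier returns a plain Python list with one dict per input, each holding a predicted label and a confidence score. Here is a minimal sketch of handling that structure; the result is hard-coded (the exact label strings and score depend on the model) so the snippet runs without downloading any weights:

```python
# Typical shape of a sentiment-analysis pipeline result: one dict per input,
# with a predicted label and a confidence score between 0 and 1.
# Hard-coded here in place of a live model call.
result = [{"label": "POSITIVE", "score": 0.9998}]

prediction = result[0]
print(prediction["label"])                      # the predicted class
print(f"confidence: {prediction['score']:.2%}")  # score formatted as a percentage
```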
This will give you predictions such as the sentiment of the text.

Fine-tune or customize (optional): If the pretrained model needs to be adapted to specific data, Hugging Face offers tools to fine-tune it. For this, you might need to use the PyTorch, TensorFlow, or JAX frameworks along with your datasets.

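As a hedged sketch of what that fine-tuning might look like with the Trainer API: the base checkpoint, dataset, and hyperparameters below are illustrative choices, it assumes the datasets library is installed (pip install datasets), and running it downloads model weights and data.

```python
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import load_dataset  # pip install datasets

model_name = "distilbert-base-uncased"  # illustrative base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a small slice of a labeled dataset (IMDB reviews as an example).
dataset = load_dataset("imdb", split="train[:1000]")
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

# Minimal training configuration; real runs would tune these values.
args = TrainingArguments(output_dir="out", num_train_epochs=1,
                         per_device_train_batch_size=8)
Trainer(model=model, args=args, train_dataset=dataset).train()
```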
Example for Text Generation:

Here’s another example using a text-generation model like GPT:

from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Once upon a time", max_length=50, num_return_sequences=1)
print(result)

The code above generates a continuation of the input text.
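The generator's return value is a list with one dict per returned sequence, whose "generated_text" field contains the prompt plus the continuation. A minimal sketch of extracting just the new text, with the result hard-coded so the snippet runs without downloading GPT-2:

```python
# Typical shape of a text-generation pipeline result: a list with one dict
# per returned sequence. Hard-coded here in place of a live model call.
prompt = "Once upon a time"
result = [{"generated_text": "Once upon a time there was a kingdom by the sea."}]

# "generated_text" includes the prompt, so slice it off to keep only the
# newly generated continuation.
continuation = result[0]["generated_text"][len(prompt):]
print(continuation.strip())
```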

A few more code examples with Transformers:

Important points:

Follow the steps below to configure a virtual environment first:

  1. Configure a virtual environment for Python.
  2. pip install transformers — installs the latest version of Transformers.
  3. pip install torch — installs PyTorch.
  4. pip install hf_xet — installs hf_xet for faster model downloads from the Hugging Face Hub.
  5. Now run the code above and you will see the output. Enjoy!
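The setup steps above can be sketched as shell commands; the environment name .venv is just a choice, and the activate line shown is for Linux/macOS:

```shell
# Create and activate a Python virtual environment
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate

# Install the libraries used by the examples in this post
pip install transformers
pip install torch
pip install hf_xet
```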

Program-1

Given the entered text, it generates a continuation up to a maximum length of 50 tokens. Just play with this code sample:

from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Once upon a time, a story", max_length=50, num_return_sequences=1)
print(result)

Program-2

The program below checks the sentiment of a given sentence. Please give it a try:

from transformers import pipeline

classifier = pipeline("sentiment-analysis", model="distilbert-base-uncased-finetuned-sst-2-english")
result = classifier("I love using Hugging Face Transformers!")
print(result)
