Objective
By the end of this session, you’ll gain hands-on experience with Hugging Face, understand its capabilities, and solve real-world problems using pre-trained models. Let’s explore how Hugging Face makes cutting-edge AI accessible and practical.
Part 1: What is Hugging Face?
Hugging Face is like a library of pre-trained AI models, ready to solve tasks such as text summarization, sentiment analysis, translation, and more.
Why Use Hugging Face?
- Ease of Use: No need to train models from scratch.
- Diverse Models: Includes models for text, audio, and image tasks.
- Open Source: Community-driven, allowing innovation and collaboration.
Real-World Applications
- Customer Sentiment Analysis:
- Example: A company analyzes customer reviews to improve its products.
- Text Summarization:
- Example: News platforms summarize lengthy articles for quick reads.
- Chatbot Enhancements:
- Example: Hugging Face models make chatbot responses more natural and conversational.
Part 2: Setting Up Hugging Face
Step 1: Install the Library
Run the following command in your terminal:
!pip install transformers
Requirement already satisfied: transformers in /usr/local/lib/python3.11/dist-packages (4.47.1)
Step 2: Importing Hugging Face Tools
from transformers import pipeline
Part 3: Hands-On Activities
Activity 1: Sentiment Analysis
Analyze customer reviews to determine sentiment.
Code Example
from transformers import pipeline
# Load a pre-trained sentiment analysis pipeline
sentiment_analyzer = pipeline("sentiment-analysis")
# Input text
text = "The product quality is amazing! Highly recommend it."
# Analyze sentiment
result = sentiment_analyzer(text)
print("Sentiment:", result[0]['label'])
print("Confidence Score:", result[0]['score'])
Note: because no model was supplied, the pipeline defaults to distilbert/distilbert-base-uncased-finetuned-sst-2-english and runs on CPU; relying on this default is not recommended in production.
Output:
Sentiment: POSITIVE
Confidence Score: 0.99988
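To avoid relying on the unpinned default model in production, you can name the checkpoint explicitly when building the pipeline. This minimal sketch pins the same checkpoint the pipeline defaults to:

```python
from transformers import pipeline

# Pin the checkpoint instead of relying on the pipeline default
sentiment_analyzer = pipeline(
    "sentiment-analysis",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)

result = sentiment_analyzer("The product quality is amazing! Highly recommend it.")
print(result[0]["label"], round(result[0]["score"], 4))
```

Pinning the model (and, in real deployments, a specific revision) keeps results reproducible even if the library’s default changes.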
Activity 2: Text Summarization
Summarize long articles or text passages.
Code Example
from transformers import pipeline
# Load a pre-trained summarization pipeline
summarizer = pipeline("summarization")
# Input text
article = """
The rapid advancements in AI have revolutionized multiple industries,
from healthcare to finance. By leveraging machine learning, organizations
are automating processes and uncovering insights like never before.
"""
# Generate summary
summary = summarizer(article, max_length=50, min_length=20, do_sample=False)
print("Summary:", summary[0]['summary_text'])
Note: because no model was supplied, the pipeline defaults to sshleifer/distilbart-cnn-12-6. Since the input here is only 41 tokens, the pipeline also warns that max_length=50 exceeds the input length; for short inputs, lower max_length accordingly.
Output:
Summary: The rapid advancements in AI have revolutionized multiple industries, from healthcare to finance. By leveraging machine learning, organizations are automating processes and uncovering insights like never before.
Activity 3: Question-Answering
Answer questions based on a given context.
Code Example
from transformers import pipeline
# Load a pre-trained question-answering pipeline
qa = pipeline("question-answering")
# Input context
context = """
The Eiffel Tower is a wrought-iron lattice tower in Paris, France.
It was named after the engineer Gustave Eiffel, whose company designed and built the tower.
"""
# Ask a question
question = "Who designed the Eiffel Tower?"
# Get the answer
answer = qa(question=question, context=context)
print("Answer:", answer['answer'])
Note: because no model was supplied, the pipeline defaults to distilbert/distilbert-base-cased-distilled-squad.
Output:
Answer: Gustave Eiffel
Part 4: Solving Real-World Problems
Problem: Analyzing Product Reviews
- Use Hugging Face to extract sentiment from hundreds of customer reviews in seconds.
- Example: run the sentiment-analysis pipeline over a list of reviews to process them in one pass.
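The batch workflow above can be sketched as follows; the review strings are made up for illustration:

```python
from transformers import pipeline

# Load the pipeline once, then reuse it for every review
sentiment_analyzer = pipeline("sentiment-analysis")

# Hypothetical customer reviews (illustrative only)
reviews = [
    "The product quality is amazing! Highly recommend it.",
    "Shipping took far too long and the box arrived damaged.",
    "Does exactly what it says. Good value for the price.",
]

# Passing the whole list lets the pipeline process the inputs together
results = sentiment_analyzer(reviews)

for review, result in zip(reviews, results):
    print(f"{result['label']} ({result['score']:.4f}): {review}")
```

Loading the pipeline once outside the loop matters: model initialization dominates the runtime, while each additional review is cheap.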
Part 5: Discussion
Q: What Makes Hugging Face Powerful?
- Pre-trained models reduce time and effort.
- Flexible pipelines for various AI tasks.
- Open-source community fosters innovation.
Q: What Are Its Challenges?
- Large models may require powerful hardware.
- Fine-tuning models for specific tasks can be complex.
Part 6: Wrap-Up & Reflection
What We Learned
- Overview of Hugging Face and its tools.
- Real-world applications like sentiment analysis and text summarization.
- Practical coding activities to solve real-world problems.
Quick Activity
- Which Hugging Face model intrigued you the most and why?
- Share ideas for how it can be applied to a specific domain like education or healthcare.