How to run HuggingFace models in Python


Hugging Face

Hugging Face is a machine learning community and platform where you can share your own pre-trained models and explore models published by others, covering a wide range of use cases.

Many of the models are already well trained and ready to use. Let's get started!

Python setup

Open your Jupyter notebook, or for an easy start without installing Python on your local machine, use Google Colab.

In this example we are using the mistralai/Mistral-7B-v0.1 model. Let's install the transformers library, which is the primary dependency for running Hugging Face models.

conda install -c conda-forge transformers
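If you are not using conda, pip works just as well. Note that transformers also needs a deep learning backend; installing PyTorch alongside it is a safe assumption here, since the script below returns PyTorch tensors:

pip install transformers torch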

The following script will work like a charm.

# Mistral model test
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"

# Download the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize the prompt and return PyTorch tensors
text = "Hello my name is"
inputs = tokenizer(text, return_tensors="pt")

# Generate up to 20 new tokens and decode them back into text
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
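A note on resources: loading a 7B-parameter model in full float32 precision needs roughly 28 GB of memory, more than a free Colab instance offers. A common workaround is loading the weights in half precision and letting transformers place layers automatically; a minimal sketch, assuming a recent transformers version and the accelerate package (pip install accelerate):

import torch
from transformers import AutoModelForCausalLM

# Half-precision weights take roughly half the memory of float32
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    torch_dtype=torch.float16,  # load weights as fp16 instead of fp32
    device_map="auto",          # spread layers across available GPU/CPU (needs accelerate)
)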

The Mistral model will complete the sentence for you.
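If you prefer fewer moving parts, the same tokenize-generate-decode steps can be wrapped with the transformers pipeline API; a minimal sketch of the equivalent call:

from transformers import pipeline

# The text-generation pipeline wires up the tokenizer and model internally
generator = pipeline("text-generation", model="mistralai/Mistral-7B-v0.1")
result = generator("Hello my name is", max_new_tokens=20)
print(result[0]["generated_text"])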

Author: Manoj

Developer and self-learner who loves working with React.js, Angular, Node, Python, and C#/.NET.
