Get embeddings
How to create embeddings
In this notebook, we will show you how to create embeddings for your own text data using an open source model from Hugging Face, hosted as an endpoint on the Hugging Face Inference API.
1. Import the easyllm library
In [ ]:
# if needed, install and/or upgrade to the latest version of the easyllm Python library
%pip install --upgrade easyllm
In [ ]:
# import the EasyLLM Python library for calling the EasyLLM API
import easyllm
2. An example embedding API call
An embedding API call has two required inputs:

model: the name of the model you want to use (e.g., sentence-transformers/all-MiniLM-L6-v2), or leave it empty to just call the API
input: a string or list of strings you want to embed

Let's look at an example API call to see how the embedding format works in practice.
In [2]:
# import os
# os.environ["HUGGINGFACE_TOKEN"] = "hf_xxx" # Use Environment Variable
from easyllm.clients import huggingface
# The module automatically loads the HuggingFace API key from the environment variable HUGGINGFACE_TOKEN or from the HuggingFace CLI configuration file.
# huggingface.api_key="hf_xxx"
embedding = huggingface.Embedding.create(
model="sentence-transformers/all-MiniLM-L6-v2",
input="That's a nice car.",
)
len(embedding["data"][0]["embedding"])
Out[2]:
384
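The call returns a single 384-dimensional vector for the input string. As a minimal sketch, reusing the embedding response from the cell above (which exposes the vector as a list of floats under data[0]["embedding"]), you can pull out the raw values like this:

# Reusing the `embedding` response from the cell above.
# The raw vector is a list of floats under data[0]["embedding"].
vector = embedding["data"][0]["embedding"]
print(vector[:5])  # first five of the 384 dimensions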
Batched Request
In [3]:
from easyllm.clients import huggingface
# The module automatically loads the HuggingFace API key from the environment variable HUGGINGFACE_TOKEN or from the HuggingFace CLI configuration file.
# huggingface.api_key="hf_xxx"
embedding = huggingface.Embedding.create(
model="sentence-transformers/all-MiniLM-L6-v2",
input=["What is the meaning of life?","test"],
)
len(embedding["data"])
Out[3]:
2
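A common follow-up is to compare the returned vectors. The sketch below is only an illustration under assumptions: it uses numpy (assumed installed, not required by easyllm) to compute the cosine similarity between the two embeddings from the batched response above.

import numpy as np

# Assumes numpy is installed; it is not part of easyllm itself.
# Compare the two embeddings from the batched response above.
a = np.array(embedding["data"][0]["embedding"])
b = np.array(embedding["data"][1]["embedding"])

cosine_similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"Cosine similarity: {cosine_similarity:.4f}")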