Embeddings are numerical representations of values or objects such as text, images, and audio, designed to be consumed by machine learning models and semantic search algorithms. Gleef uses embeddings to store your existing localization keys. If you want to add existing keys to your project, please refer to our tutorial.
If you’re using the Gleef CLI and Gleef Studio, translation memory is managed automatically; you don’t have to do anything!

Storing existing keys

How existing keys are stored

Existing keys are stored as embeddings in a vector database. When you translate a new key, this database is queried for the existing keys that are most similar to it, where similarity is the cosine similarity between the keys' embeddings.
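To make the similarity step concrete, here is a minimal sketch of cosine similarity between two embedding vectors. The key names, embedding values, and the in-memory list standing in for the vector database are illustrative assumptions, not Gleef's actual implementation.

```typescript
// Cosine similarity between two embedding vectors: dot product divided by
// the product of their magnitudes. Values close to 1 mean very similar meaning.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical example: rank stored keys by similarity to a new key's embedding.
const newKeyEmbedding = [0.12, 0.85, 0.33]; // illustrative values only
const storedKeys = [
  { key: "checkout.confirm_button", embedding: [0.1, 0.88, 0.3] },
  { key: "settings.delete_account", embedding: [0.9, 0.05, 0.2] },
];

const ranked = storedKeys
  .map((k) => ({ key: k.key, score: cosineSimilarity(newKeyEmbedding, k.embedding) }))
  .sort((a, b) => b.score - a.score);

console.log(ranked[0]); // the most semantically similar existing key
```

In practice a vector database performs this nearest-neighbor search at scale; the loop above only illustrates the underlying similarity measure.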

How the AI works with existing keys

The AI leverages embeddings to:
  1. Find semantically similar existing keys using cosine similarity
  2. Use these similar keys as context to generate accurate translations
  3. Prevent duplicate key creation by detecting similar existing keys (see the sketch after this list)
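
The sketch below shows one way the retrieved neighbors could be split into duplicate candidates and translation context. The thresholds, interface, and function names are assumptions made for illustration; they are not Gleef's API.

```typescript
// A neighbor returned by the similarity search over existing keys.
interface Neighbor {
  key: string;
  score: number; // cosine similarity, roughly in [-1, 1]
}

// Assumed thresholds: near-identical keys are flagged as duplicates,
// moderately similar keys are passed to the AI as translation context.
const DUPLICATE_THRESHOLD = 0.95;
const CONTEXT_THRESHOLD = 0.75;

function triageNeighbors(neighbors: Neighbor[]) {
  const duplicates = neighbors.filter((n) => n.score >= DUPLICATE_THRESHOLD);
  const context = neighbors.filter(
    (n) => n.score >= CONTEXT_THRESHOLD && n.score < DUPLICATE_THRESHOLD
  );
  return { duplicates, context };
}

// Usage: neighbors would come from the vector database query sketched earlier.
const { duplicates, context } = triageNeighbors([
  { key: "checkout.confirm_button", score: 0.97 },
  { key: "checkout.cancel_button", score: 0.81 },
]);

if (duplicates.length > 0) {
  console.warn("Possible duplicate keys:", duplicates.map((d) => d.key));
}
console.log("Context keys for translation:", context.map((c) => c.key));
```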