EmbeddingGemma - Open Model for On-Device Text Embeddings

Overview

[AI Summary]: Google has released EmbeddingGemma, a state-of-the-art 308 million parameter text embedding model designed to power generative AI experiences that run directly on users' devices. This open model is optimized for mobile-first AI applications, requiring roughly 300MB of RAM, and enables features such as semantic search, information retrieval, and on-device RAG (Retrieval-Augmented Generation). The model achieves best-in-class scores for its size on the MTEB (Massive Text Embedding Benchmark) while preserving privacy, since all processing happens on the device and works offline.

  • Developer: Google
  • License: Open model (Gemma Terms of Use)
  • Platform: Mobile devices, web browsers, edge devices
  • Model Size: 308 million parameters (~300MB RAM)
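
As a concrete illustration of the semantic-search use case described above, the sketch below loads the model through the sentence-transformers library and ranks a few documents against a query by cosine similarity. This is a minimal sketch, not an official quickstart: the Hugging Face model id google/embeddinggemma-300m and the sample texts are assumptions here; consult the official model card for the exact identifier and any access terms.

```python
# Minimal on-device-style semantic search with EmbeddingGemma (sketch).
from sentence_transformers import SentenceTransformer, util

# Assumed Hugging Face model id; check the official model card for the exact name.
model = SentenceTransformer("google/embeddinggemma-300m")

documents = [
    "EmbeddingGemma runs fully on-device, so queries never leave the phone.",
    "The model produces dense vector embeddings for text passages.",
    "RAG pipelines retrieve relevant passages before calling a generator model.",
]
query = "How does on-device retrieval protect user privacy?"

# Encode the query and the documents into dense embedding vectors.
doc_embeddings = model.encode(documents)
query_embedding = model.encode(query)

# Rank documents by cosine similarity to the query and print the best match.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best = scores.argmax().item()
print(f"Best match (score {scores[best].item():.3f}): {documents[best]}")
```

The same ranked-retrieval step is what an on-device RAG pipeline would perform before passing the top passages to a local generative model.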