
Embedder Setup


GenAI Studio supports many embedding model providers out of the box with little, if any, setup. Embedding models turn text into numeric vectors that can be stored and searched in a vector database, which is the foundation of Retrieval-Augmented Generation (RAG).
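For a concrete sense of how those stored vectors are used, the sketch below computes cosine similarity, the usual measure a vector database applies when ranking stored embeddings against a query embedding during retrieval. The vectors here are made up for illustration; real embedding models output vectors with hundreds or thousands of dimensions.

```python
# Illustrative only: cosine similarity is how a vector database typically
# ranks stored document embeddings against a query embedding during RAG.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Tiny made-up vectors standing in for real embedding output.
query_vec = [0.12, 0.87, 0.45]
doc_vec = [0.10, 0.80, 0.50]
print(cosine_similarity(query_vec, doc_vec))
```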

Supported Embedding Model Providers

Local Embedding Model Providers

  • Ollama with the bge-m3 embedding model (built in by default)
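
As a quick check that the built-in embedder is reachable, the sketch below requests an embedding from Ollama's REST API. It assumes the default Ollama endpoint (http://localhost:11434) and the standard /api/embeddings route; adjust the host and port to match your deployment.

```python
# Minimal sketch: request an embedding for one piece of text from the
# built-in Ollama instance running the bge-m3 model.
# Assumes Ollama is listening on its default port (11434).
import requests

response = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "bge-m3", "prompt": "GenAI Studio turns text into vectors for RAG."},
    timeout=30,
)
response.raise_for_status()

embedding = response.json()["embedding"]  # a list of floats (one vector)
print(f"Received an embedding with {len(embedding)} dimensions")
```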