# Repository Context
The repository context is used to connect Tabby with a source code repository from Git, GitHub, GitLab, etc. Tabby fetches the source code from the repository, parses it into an AST, and stores it in the index. During LLM inference, this context is utilized for code completion, as well as chat and search functionalities.
## Adding a Repository through Admin UI
- Navigate to the Integrations > Repository Providers page.
![Repository Providers](/assets/images/repository-providers-36c3bd97a76529f341a207067b6cfc84.png)
- Click Create to begin the process of adding a repository provider.
- For Git, you only need to fill in the name and the URL of the repository. Local repositories are supported via the `file://` protocol, but if Tabby runs inside a Docker container, you need to make the directory accessible with the `--volume` flag and use the internal Docker path.
- For GitHub / GitLab, a personal access token is required to access private repositories.
  - Check the instructions in the corresponding tab to create a token.
  - Once the token is set, you can add the repository by selecting it from the dropdown list.
## Adding a Repository through configuration file
`~/.tabby/config.toml` is Tabby's configuration file. You can also add repositories there, and it is the only way to add repositories in Tabby OSS.
```toml
[[repositories]]
name = "tabby"
git_url = "https://github.com/TabbyML/tabby.git"

# git through ssh protocol.
[[repositories]]
name = "CTranslate2"
git_url = "git@github.com:OpenNMT/CTranslate2.git"

# local directory is also supported!
[[repositories]]
name = "repository_a"
git_url = "file:///home/users/repository_a"
```
## Verifying the Repository Provider
Once connected, the indexing job will start automatically. You can check the status of the indexing job on the Information > Jobs page.
You can also visit the Code Browser page to view the connected repository.
## Internal: Vector Index
When adding a document, it is converted into vectors that help quickly find relevant context. During searches or chats, queries and messages are also converted into vectors to locate the most similar documents.
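As a rough illustration of this idea (not Tabby's actual implementation), documents and queries are embedded as vectors, and the index returns the documents whose vectors are most similar to the query vector, for example by cosine similarity. The file names and three-dimensional vectors below are hypothetical; real embedding models produce vectors with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|), in [-1, 1].
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical index mapping document names to embedding vectors.
index = {
    "parse_ast.rs": [0.9, 0.1, 0.0],
    "http_server.rs": [0.1, 0.8, 0.3],
}
query = [0.85, 0.15, 0.05]

# Rank indexed documents by similarity to the query vector.
best = max(index, key=lambda name: cosine_similarity(index[name], query))
print(best)  # the document whose vector is closest to the query
```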
### Use the default embedding model
The default embedding model is "Nomic-Embed-Text", which is a high-performing open embedding model with a large token context window.
Currently, "Nomic-Embed-Text" is the only supported local embedding model.
### Using a remote embedding model provider
You can also add a remote embedding model provider by adding a new section to the `~/.tabby/config.toml` file.
```toml
[model.embedding.http]
kind = "openai/embedding"
api_endpoint = "https://api.openai.com"
api_key = "sk-..."
model_name = "text-embedding-3-small"
```
The following embedding model providers are supported:

- `openai/embedding`
- `voyageai/embedding`
- `llama.cpp/embedding`
- `ollama/embedding`
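As a rough sketch of what a provider configured with `kind = "openai/embedding"` receives, the helper below builds an OpenAI-style embeddings request body. The function name is hypothetical, and the `model_name` and input text are just examples; the body would be POSTed to `{api_endpoint}/v1/embeddings` with an `Authorization: Bearer <api_key>` header.

```python
import json

def build_embedding_request(model_name, text):
    # OpenAI-style embeddings request body: the endpoint expects
    # the model name and the input text to embed.
    return json.dumps({"model": model_name, "input": text})

body = build_embedding_request("text-embedding-3-small", "fn main() {}")
# The response contains the embedding vector for the input text,
# which the index stores alongside the document.
```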