Archive
2024
- January 24 - Running Tabby Locally with AMD ROCm
- February 5 - Create Tabby extension with Language Server Protocol
- March 25 - Deploy Tabby in Air-Gapped Environment with Docker
- March 26 - Tabby with Replicas and a Reverse Proxy
- April 8 - Connect Private GitHub Repository to Tabby
- May 1 - Vulkan Support: LLMs for Everyone
- June 11 - Rank Fusion for improved Code Context in Tabby
- July 9 - Introducing the Codestral Integration in Tabby
2023
- August 31 - Introducing First Stable Release: v0.0.1
- September 5 - Deploying a Tabby Instance in Hugging Face Spaces
- September 18 - Tabby v0.1.1: Metal inference and StarCoder supports!
- September 30 - Stream laziness in Tabby
- October 14 - Announcing our $3.2M seed round, and the long-awaited RAG release in Tabby v0.3.0
- October 16 - Repository context for LLM assisted code completion
- October 21 - Decode the Decoding in Tabby
- November 13 - Cracking the Coding Evaluation
- November 23 - Introducing the Coding LLM Leaderboard