We’re thrilled to announce that Codestral, the newest high-performance model from Mistral, is now available on Tabnine. Starting today, you can use Codestral to power code generation, code explanations, documentation generation, AI-created tests, and much more. When you use Codestral as the LLM underpinning Tabnine, you get fast response times, and its large 32k context window gives Tabnine’s personalized AI coding recommendations more room for relevant project context.
Released on May 29, 2024, Codestral is Mistral’s first-ever code model, and it’s fluent in 80+ programming languages. Tabnine has observed excellent coding performance with popular languages including Python, Java, C, C++, and Bash, and the model also performs well on less common languages like Swift and Fortran.
Mistral’s announcement blog post shared some fascinating data on the performance of Codestral benchmarked against three much larger models: CodeLlama 70B, DeepSeek Coder 33B, and Llama 3 70B. They tested it using HumanEval pass@1, MBPP sanitized pass@1, CruxEval, RepoBench EM, and the Spider benchmark. The languages they compared performance on included Python, SQL, C++, Bash, Java, PHP, TypeScript, and C#.
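If you’re not familiar with the pass@k metric used by HumanEval and MBPP, it measures the probability that at least one of k generated solutions passes a problem’s unit tests. Here’s a minimal sketch of the standard unbiased estimator from the original HumanEval paper (Chen et al., 2021); with k = 1, it simply reduces to the fraction of problems whose single sampled solution passes.

```python
import numpy as np

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator from the HumanEval paper.

    n: total samples generated per problem
    c: number of those samples that pass the unit tests
    k: number of samples the metric "allows" per problem
    """
    if n - c < k:
        return 1.0
    # 1 - probability that all k drawn samples are failing ones
    return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))

# With k=1, pass@1 is the fraction of sampled solutions that pass:
print(pass_at_k(n=10, c=3, k=1))  # 0.3
```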
Based on Mistral’s performance benchmarking, you can expect Codestral to significantly outperform the other tested models in Python, Bash, Java, and PHP, with on-par performance on the other languages tested. What makes Codestral particularly interesting is its efficiency: at 22B parameters, it’s roughly a third the size of the 70B models it was benchmarked against, yet it delivers comparable or better results. That translates into a great cost-to-performance ratio.
We launched the switchable models capability for Tabnine in April 2024, originally offering our customers two Tabnine models plus the most popular models from OpenAI. Since then, we’ve released support for GPT-4o, and with the addition of Codestral, Tabnine users now have six available models to select from.
One of our goals is to give our users access to cutting-edge models as soon as they become available. The switchable models capability puts you in the driver’s seat and lets you choose the best model for each task, project, and team. You’re never locked into any one model and can switch instantly between them using the model selector in Tabnine.
During model selection, Tabnine provides transparency into the behaviors and characteristics of each of the available models to help you decide which is right for your situation. The underlying LLM can be changed with just a few clicks — and Tabnine Chat adapts instantly.
Check out this short video that shows the Codestral model in action:
Starting today, Codestral is available to all Tabnine Pro users at no additional cost. Make sure you’re running the latest version of the Tabnine plugin for your IDE to get access to the Codestral model. Codestral will be available to Enterprise users soon; contact your account representative for more details.