Releases: getcellm/cellm

v0.4.1

15 Nov 20:25
79cbe0f

This is a maintenance release with documentation improvements and a critical bug fix.

Bug Fixes

  • Fixed Cellm proxy base address: Resolved an issue with the Cellm-managed models proxy configuration that could prevent proper connection to the service.

What's Changed

Full Changelog: v0.4.0...v0.4.1

v0.4.0

27 Oct 18:43
28b1f01

This release is a big one! We're introducing Cellm-managed models, structured outputs that make organizing your spreadsheets easier, and many performance improvements and bug fixes.

We're beginning to invite users from our waitlist to try Cellm-managed models. Your first 30 days are on us, and we've built the system with privacy as our top priority: all data is processed exclusively within the EU using Mistral models. If you haven't signed up yet, head over to getcellm.com to join the waitlist—there's still time to get in!

What's New

  • Cellm-Managed Models: Once invited, you can now sign in to your Cellm account and access AI models without managing API keys yourself. We handle the provider connections, so you can focus on your work. Until you get an invite, you can still download Cellm and use your own API keys with popular providers like Anthropic, OpenAI, DeepSeek, and Mistral.

  • MCP Support: You can connect your spreadsheets to Model Context Protocol (MCP) servers, giving your Excel formulas access to external data sources and tools. This transforms Excel into a low-code automation platform where you can query databases, send emails, or pull data from APIs.

  • Structured Outputs: New functions (PROMPT.TOROW, PROMPT.TOCOLUMN, PROMPT.TORANGE) let you control exactly how AI responses flow into your spreadsheet. Need a single value? A row of data? A column? A table? Just pick the right function and Cellm handles the formatting. This makes it much easier to build dynamic spreadsheets that transform AI responses into usable data structures.

  • More Provider Options: Added support for Google Gemini, Azure OpenAI, and AWS Bedrock. Combined with our existing integrations, you now have access to nearly every major AI provider from within Excel. We've also updated model lists across providers to include the latest releases.

  • Installer: The new MSI installer automatically sets up Cellm on your machine and registers the plugin with Excel so that it automatically loads on startup.
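As a sketch of how the structured-output functions might be used (this assumes the same cells-then-instruction argument order as =PROMPT; check the documentation for your version's exact signatures):

```
=PROMPT.TOCOLUMN(A1:A10, "Extract the email address from each row")
=PROMPT.TOROW(A1, "List three keywords describing this text")
=PROMPT.TORANGE(A1:C20, "Summarize this table as a header row plus data rows")
```

Each function spills its result into the corresponding shape, so the choice of function determines whether the response lands in one cell, one row, one column, or a two-dimensional range.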

Performance Improvements

  • Better Excel Responsiveness: Moved more processing off Excel's main thread, so the application stays responsive even when you're running multiple prompts simultaneously. You should notice this especially when working with many cells at once.

  • Smarter Retry Logic: Tuned the resilience pipeline to handle API failures more gracefully, with clearer error messages when something goes wrong (like missing API keys).

  • Statistics: Added statistics for prompt counts, token counts, average tokens per second, and average requests per second.

Bug Fixes

  • Fixed an issue where concurrent calls with identical arguments would incorrectly reuse responses
  • Resolved temperature parsing issues in non-US cultures
  • Fixed several MCP-related concurrency and caching issues
  • Improved handling of structured outputs when tools are enabled
  • Corrected token counting and speed statistics

What's Changed

v0.3.0

12 Apr 14:40
c492df8

This release introduces new capabilities for connecting Cellm to external systems and improves underlying integrations with model providers.

What's New

  • Add support for Model Context Protocol (MCP): You can now connect Cellm to MCP-compatible servers, allowing your Excel sheets to interact with external data sources or trigger actions via Cellm functions. Effectively, it turns Excel into a low-code automation tool, enabling workflows orchestrated directly from your spreadsheet. We're still trying to wrap our heads around the possibilities this unlocks. They are freakin' endless.
  • Adopt Anthropic.SDK for Anthropic Calls: Previously, we rolled our own custom Anthropic client because of the lack of an official SDK. We've migrated our Anthropic integration to the community-driven Anthropic.SDK (github.com/tghamm/Anthropic.SDK). This decision was driven by Anthropic's adoption of this SDK for their official .NET MCP library. This move ensures better standardization, aligns Cellm with Microsoft.Extensions.AI patterns, and leverages ongoing community improvements.
  • Telemetry: To help us identify bugs and understand usage patterns, we now send anonymized crash reports and model interaction details to Sentry. We never capture any data from your spreadsheet. Still, you can opt out of this at any time by adding the following to your appsettings.Local.json:
{
    "SentryConfiguration": {
        "IsEnabled": false
    }
}

Bug Fixes

  • Fixed a regression that inadvertently broke the use of tools with AI models.
  • Adjusted the prompt caching mechanism to correctly invalidate the cache when tools are added or removed.
  • Tuned the default settings for the rate limiter (Retry and Circuit Breaker policies) to work together more effectively during periods of high activity. These defaults prioritize stability and avoiding upstream provider rate limits. You can always crank up the limits or remove them altogether, but aggressive settings may lead to more errors from the AI provider.
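If you do want to tune the limits, a hypothetical appsettings.Local.json override might look like the following. The section and key names here are purely illustrative, not Cellm's actual configuration schema; consult the appsettings.json shipped with your version for the real names:

```json
{
    "ResilienceConfiguration": {
        "RetryCount": 3,
        "CircuitBreakerFailureRatio": 0.5
    }
}
```

Raising these limits trades stability for throughput, so expect more provider-side rate-limit errors the more aggressive you go.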

What's Changed

New Contributors

Full Changelog: v0.2.0...v0.3.0

v0.2.0

19 Mar 20:17
5bd4e30

This release addresses several issues when working with multiple prompts simultaneously and introduces comprehensive documentation!

What's New

  • Official Documentation: Our new documentation is now available!

Bug Fixes

  • UI Responsiveness: Fixed an issue where the interface would become unresponsive when sending multiple prompts
  • UI Configuration: Configuration changes now apply immediately without requiring application restart
  • Smarter Rate Limiting: Implemented centralized rate limiting that applies individually to each provider, creating a smoother experience when sending multiple prompts to different models simultaneously

These improvements should make your workflow more efficient and reliable when working with multiple prompts across different models. Enjoy!

What's Changed

New Contributors

Full Changelog: v0.1.1...v0.2.0

v0.1.1

06 Feb 08:32
30a288c

This is a minor release that mainly fixes a bug where changed API keys, whether set via the UI or appsettings.Local.json, would not be picked up until Excel was restarted.

What's Changed

Full Changelog: v0.1.0...v0.1.1

v0.1.0

30 Jan 22:18
e7856b5

Cellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. It is designed for automation of repetitive text-based tasks and comes with:

  • Local and hosted models: Defaults to free local inference (Gemma 2 2B via Ollama) while supporting commercial APIs.
  • Formula-driven workflow: =PROMPT() and =PROMPTWITH() functions for drag-and-fill operations across cell ranges.
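For example (a sketch; the exact argument order and the model identifier below are illustrative, so check the README's Models section for the precise signatures):

```
=PROMPT(A1, "Classify this support ticket as Bug, Feature, or Question")
=PROMPTWITH("openai/gpt-4o-mini", A1, "Classify this support ticket")
```

Because these are ordinary formulas, you can drag-and-fill them down a column to run the same prompt against every row.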

Install

  1. Download Cellm-AddIn64-packed.xll and appsettings.json. Put them in the same folder.

  2. Double-click on Cellm-AddIn64-packed.xll. Excel will open and install Cellm.

  3. Download and install Ollama. Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call =PROMPT(). To call other models, see the Models section in the README.

Uninstall

  1. In Excel, go to File > Options > Add-Ins.
  2. In the Manage drop-down menu, select Excel Add-ins and click Go....
  3. Uncheck Cellm-AddIn64-packed.xll and click OK.

Known Limitations

  1. Windows-only: No macOS/Linux support planned for initial versions
  2. Input constraints:
    • Formula arguments limited to 8,192 characters (Excel string limit)
    • No native support for multi-turn conversations
  3. Model variability: Output quality depends on selected LLM (validate critically)

Contribution & Feedback

Report issues or suggest improvements via GitHub Issues.



License: Fair Core License
Full Documentation: README

What's Changed

Full Changelog: https://github.com/getcellm/cellm/commits/v0.1.0