Releases: getcellm/cellm
v0.4.1
This is a maintenance release with documentation improvements and a critical bug fix.
Bug Fixes
- Fixed Cellm proxy base address: Resolved an issue with the Cellm-managed models proxy configuration that could prevent proper connection to the service.
What's Changed
- docs: Improve README.md by @kaspermarstal in #300
- build: Bump version by @kaspermarstal in #301
- docs: v0.4.0 by @kaspermarstal in #303
- docs(v0.4.0): Fix overview, quickstart by @kaspermarstal in #304
- fix: Cellm proxy base address by @kaspermarstal in #305
- chore: Bump version to v0.4.1 by @kaspermarstal in #306
Full Changelog: v0.4.0...v0.4.1
v0.4.0
This release is a big one! We're introducing Cellm-managed models, structured outputs that make organizing your spreadsheets easier, and many performance improvements and bug fixes.
We're beginning to invite users from our waitlist to try Cellm-managed models. Your first 30 days are on us, and we've built the system with privacy as our top priority: all data is processed exclusively within the EU using Mistral models. If you haven't signed up yet, head over to getcellm.com to join the waitlist—there's still time to get in!
What's New
- Cellm-Managed Models: Once invited, you can now sign in to your Cellm account and access AI models without managing API keys yourself. We handle the provider connections, so you can focus on your work. Until you get an invite, you can still download Cellm and use your own API keys with popular providers like Anthropic, OpenAI, DeepSeek, and Mistral.
- MCP Support: You can connect your spreadsheets to Model Context Protocol (MCP) servers, giving your Excel formulas access to external data sources and tools. This transforms Excel into a low-code automation platform where you can query databases, send emails, or pull data from APIs.
- Structured Outputs: New functions (`PROMPT.TOROW`, `PROMPT.TOCOLUMN`, `PROMPT.TORANGE`) let you control exactly how AI responses flow into your spreadsheet. Need a single value? A row of data? A column? A table? Just pick the right function and Cellm handles the formatting. This makes it much easier to build dynamic spreadsheets that transform AI responses into usable data structures.
- More Provider Options: Added support for Google Gemini, Azure OpenAI, and AWS Bedrock. Combined with our existing integrations, you now have access to nearly every major AI provider from within Excel. We've also updated model lists across providers to include the latest releases.
- Installer: The new MSI installer automatically sets up Cellm on your machine and registers the plugin with Excel so that it automatically loads on startup.
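As a sketch of how the structured output functions might be used — the function names come from this release, but the cell references and prompts below are purely illustrative:

```
=PROMPT.TOROW(A1, "List the product names mentioned in this text")
=PROMPT.TOCOLUMN(A1, "Extract each email address, one per cell")
=PROMPT.TORANGE(A1, "Summarize this report as a table of item, quantity, and price")
```

Each function spills the model's response into the corresponding shape (row, column, or table), so downstream formulas can reference the output like any other range.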
Performance Improvements
- Better Excel Responsiveness: Moved more processing off Excel's main thread, so the application stays responsive even when you're running multiple prompts simultaneously. You should notice this especially when working with many cells at once.
- Smarter Retry Logic: Tuned the resilience pipeline to handle API failures more gracefully, with clearer error messages when something goes wrong (like missing API keys).
- Statistics: Added prompt counting, token counting, average tokens per second, and average requests per second statistics.
Bug Fixes
- Fixed an issue where concurrent calls with identical arguments would incorrectly reuse responses
- Resolved temperature parsing issues in non-US cultures
- Fixed several MCP-related concurrency and caching issues
- Improved handling of structured outputs when tools are enabled
- Corrected token counting and speed statistics
What's Changed
- fix: Sentry traces sample rate by @kaspermarstal in #144
- feat: Add entitlements and Cellm provider by @kaspermarstal in #145
- refactor: Remove Serde class by @kaspermarstal in #146
- refactor: Remove dedicated Llamafile provider, use OpenAiCompatible provider instead by @kaspermarstal in #147
- feat: Add max output tokens to PromptBuilder by @kaspermarstal in #148
- refactor: Services by @kaspermarstal in #149
- refactor: Remove appsettings.Local.{provider}.json, use UI instead by @kaspermarstal in #150
- feat: Do not auto-download Ollama models by @kaspermarstal in #151
- refactor: Use ChatClientFactory in a single RequestHandler by @kaspermarstal in #152
- refactor: Drop separate shared source project for models by @kaspermarstal in #154
- style: Adopt "-Async" postfix naming convention by @kaspermarstal in #155
- fix: Add basic auth to Cellm provider HttpClient by @kaspermarstal in #156
- feat: Add MCP entitlement by @kaspermarstal in #157
- feat: Add account UI by @kaspermarstal in #158
- fix: Concurrent access to MCP tool cache by @kaspermarstal in #159
- style: Use System.Net.Http auth header by @kaspermarstal in #160
- refactor: Model Group UI by @kaspermarstal in #161
- feat: Add MCP menu by @kaspermarstal in #163
- feat: Use Observer to show "#GETTING_DATA" while awaiting model response by @kaspermarstal in #166
- fix: Concurrent MCP client instantiation by @kaspermarstal in #164
- fix: Ollama and OpenAI model defaults by @kaspermarstal in #165
- fix: Models would sometimes output cell coordinates by @kaspermarstal in #167
- fix: Request cancellation by @kaspermarstal in #168
- deps: Update Microsoft.Extensions.AI version to 9.4.4-preview.1.25259.16 by @MackinnonBuck in #162
- feat: Move more work off the main Excel thread by @kaspermarstal in #169
- refactor: appsettings by @kaspermarstal in #172
- feat: Add .msi installer by @kaspermarstal in #173
- refactor: Further move work off the main thread by @kaspermarstal in #180
- docs: Improve README.md by @kaspermarstal in #181
- feat: Fully unblock Excel's main thread when starting many prompts by @kaspermarstal in #183
- fix: Async resilience pipeline by @kaspermarstal in #184
- feat: Add outer pipeline retry, remove circuit breaker which was difficult to reason about by @kaspermarstal in #185
- fix: Native tools configuration paths in UI by @kaspermarstal in #186
- fix: Temporarily disable MCP by @kaspermarstal in #187
- feat: Use ExcelAsyncUtil.RunTaskWithCancellation for better performance by @kaspermarstal in #188
- feat: Add provider/model to logging statements by @kaspermarstal in #189
- feat: Enable entitlements by @kaspermarstal in #190
- feat: Add Google Gemini provider by @kaspermarstal in #191
- feat: Installer scripts for node and playwright by @kaspermarstal in #192
- deps: Use OllamaSharp instead of deprecated Microsoft.Extensions.AI.Ollama by @kaspermarstal in #193
- feat: Add Azure provider by @kaspermarstal in #194
- feat: Add AWS provider by @kaspermarstal in #195
- feat: Adapt Gemini temperature via provider behaviors by @kaspermarstal in #196
- fix: Playwright mcp client tools caching by @kaspermarstal in #197
- feat: Use Semantic Kernel's IChatClient for Google Gemini by @kaspermarstal in #198
- fix: Display Azure, AWS, and Gemini base addresses in provider settings by @kaspermarstal in #199
- fix: Editable base addresses by @kaspermarstal in #200
- docs: Add license notice to account configuration by @kaspermarstal in #201
- feat: Add Node.js and Playwright to installer; fix: Excel/Options/OPEN registry key sequence by @kaspermarstal in #202
- docs: Update README.md by @kaspermarstal in #203
- feat: Add temperature keywords by @kaspermarstal in #205
- feat: Add row, column, and table structured outputs by @kaspermarstal in #206
- refactor: Structured output UI by @kaspermarstal in #207
- feat: Add Gemini Flash 2.5 Lite by @kaspermarstal in #209
- docs: Update readme by @kaspermarstal in #208
- feat: Add token consumption and speed stats to model UI by @kaspermarstal in #210
- fix: Calculate RPS as requests per total busy seconds by @kaspermarstal in #211
- refactor: UI statistics by @kaspermarstal in #212
- feat: Add prompt count to UI statistics by @kaspermarstal in #213
- feat: Add support for signing installer by @kaspermarstal in #214
- feat: Improve system prompt by @kaspermarstal in #215
- feat: Add structured output via {PROMPT, PROMPTMODEL}{ROW, COLUMN, SINGLE} functions by @kaspermarstal in #216
- fix: Parsing temperature with comma decimal separator by @kaspe...
v0.3.0
This release introduces new capabilities for connecting Cellm to external systems and improves underlying integrations with model providers.
What's New
- Add support for Model Context Protocol (MCP): You can now connect Cellm to MCP-compatible servers, allowing your Excel sheets to interact with external data sources or trigger actions via Cellm functions. Effectively, it turns Excel into a low-code automation tool, enabling workflows orchestrated directly from your spreadsheet. We're still trying to wrap our heads around the possibilities this unlocks. They are freakin' endless.
- Adopt `Anthropic.SDK` for Anthropic Calls: Previously, we rolled our own custom Anthropic client because of the lack of an official SDK. We've migrated our Anthropic integration to the community-driven `Anthropic.SDK` (github.com/tghamm/Anthropic.SDK). This decision was driven by Anthropic's adoption of this SDK for their official .NET MCP library. This move ensures better standardization, aligns Cellm with `Microsoft.Extensions.AI` patterns, and leverages ongoing community improvements.
- Telemetry: To help us identify bugs and understand usage patterns, we now send anonymized crash reports and model interaction details to Sentry. We never capture any data from your spreadsheet. Still, you can opt out of this at any time by adding the following to your `appsettings.Local.json`:
{
  "SentryConfiguration": {
    "IsEnabled": false
  }
}

Bug Fixes
- Fixed a regression that inadvertently broke the use of tools with AI models.
- Adjusted the prompt caching mechanism to correctly invalidate the cache when tools are added or removed.
- Tuned the default settings for the rate limiter (Retry and Circuit Breaker policies) to work together more effectively during periods of high activity. These defaults prioritize stability and avoiding upstream provider rate limits. You can always crank up the limits or remove them altogether, but aggressive settings may lead to more errors from the AI provider.
What's Changed
- Merge develop into main by @kaspermarstal in #129
- fix: Prompts involving tool calls would return the tool call message, not the assistant's reply to the tool call result by @kaspermarstal in #130
- build: Update Microsoft.Extensions.AI version to 9.4.0-preview.1.25207.5 by @MackinnonBuck in #133
- Merge dev by @kaspermarstal in #136
- build: Update dependencies by @kaspermarstal in #138
- fix: Retry on broken circuit and Anthropic's rate limits exceeded, guarantee first retry waits until opened circuit is closed, and reduce aggressiveness of circuit breaker by @kaspermarstal in #140
- fix: Adding/removing tools did not invalidate cache by @kaspermarstal in #139
- refactor: Sentry PipelineBehavior by @kaspermarstal in #137
- fix: Sentry transaction contexts by @kaspermarstal in #142
- build: Bump version by @kaspermarstal in #143
New Contributors
- @MackinnonBuck made their first contribution in #133
Full Changelog: v0.2.0...v0.3.0
v0.2.0
This release addresses several issues when working with multiple prompts simultaneously and introduces comprehensive documentation!
What's New:
- Official Documentation: Our new documentation is now available!
Bug Fixes:
- UI Responsiveness: Fixed an issue where the interface would become unresponsive when sending multiple prompts
- UI Configuration: Configuration changes now apply immediately without requiring application restart
- Smarter Rate Limiting: Implemented centralized rate limiting that applies individually to each provider, creating a smoother experience when sending multiple prompts to different models simultaneously
These improvements should make your workflow more efficient and reliable when working with multiple prompts across different models. Enjoy!
What's Changed
- refactor: Move Cellm.Models to shared project by @kaspermarstal in #111
- docs: Improve README by @kaspermarstal in #112
- feat: Add retry mechanism to Ollama and OpenAiCompatible providers by @kaspermarstal in #114
- fix: Prompt not cached because cache key length sometimes too long by @kaspermarstal in #115
- refactor: Add IChatClient with transient lifetime for all providers by @kaspermarstal in #116
- docs: Add documentation by @kaspermarstal in #117
- docs: Tighten up README, link to docs by @kaspermarstal in #118
- docs: Update R2 domain by @zachasme in #119
- fix: Unblock UI thread when sending many prompts by @kaspermarstal in #120
- refactor: Apply rate limit across providers by @kaspermarstal in #121
- fix: Increase default rate limit by @kaspermarstal in #125
New Contributors
- @zachasme made their first contribution in #119
Full Changelog: v0.1.1...v0.2.0
v0.1.1
This is a minor release that mainly fixes a bug where changing API keys, either via the UI or `appsettings.Local.json`, would not get picked up until Excel was restarted.
What's Changed
- docs: Add CLA by @kaspermarstal in #102
- docs: Fix README typos by @kaspermarstal in #103
- build: Add global.json to specify .NET SDK version 9.X.X by @johnnyoshika in #101
- fix: Changing API keys while app is running by @kaspermarstal in #109
Full Changelog: v0.1.0...v0.1.1
v0.1.0
Release v0.1.0
Cellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. It is designed for automation of repetitive text-based tasks and comes with:
- Local and hosted models: Defaults to free local inference (Gemma 2 2B via Ollama) while supporting commercial APIs
- Formula-driven workflow: `=PROMPT()` and `=PROMPTWITH()` functions for drag-and-fill operations across cell ranges.
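For instance, a typical drag-and-fill setup might look like the following — the cell reference and instruction are illustrative, not from the release:

```
=PROMPT(A1, "Classify the sentiment of this text as Positive or Negative")
```

Dragging the fill handle down applies the same instruction to A2, A3, and so on, letting one formula process an entire column of inputs.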
Install
- Download `Cellm-AddIn64-packed.xll` and `appsettings.json`. Put them in the same folder.
- Double-click on `Cellm-AddIn64-packed.xll`. Excel will open and install Cellm.
- Download and install Ollama. Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call `=PROMPT()`. To call other models, see the Models section in the README.
Uninstall
- In Excel, go to File > Options > Add-Ins.
- In the `Manage` drop-down menu, select `Excel Add-ins` and click `Go...`.
- Uncheck `Cellm-AddIn64-packed.xll` and click `OK`.
Known Limitations
- Windows-only: No macOS/Linux support planned for initial versions
- Input constraints:
  - Formula arguments limited to 8,192 characters (Excel string limit)
  - No native support for multi-turn conversations
- Model variability: Output quality depends on selected LLM (validate critically)
Contribution & Feedback
Report issues or suggest improvements via GitHub Issues.
License: Fair Core License
Full Documentation: README
What's Changed
- feat: Add LlamafileClient by @kaspermarstal in #1
- bug: Fix AddSystemMessage by @kaspermarstal in #4
- bug: Fix llamafile health uri by @kaspermarstal in #3
- models: Add qwen-0.5b by @kaspermarstal in #2
- docs: Tighten up README by @kaspermarstal in #6
- feat: Manually dispose of ServiceLocator by @kaspermarstal in #7
- bug: By default assign telemetry to default model of provider, not default model of Cellm; refactor: Rename GoogleClient to GoogleAiClient by @kaspermarstal in #8
- docs: Add support for Mistral by @kaspermarstal in #9
- bug: Disable sentry by default until fix for missing immutable arrays is identified by @kaspermarstal in #11
- feat: Add concurrency rate limiting by @kaspermarstal in #10
- feat: Add support for running multiple Llamafiles simultaneously by @kaspermarstal in #12
- build: Enforce code style in build by @kaspermarstal in #13
- git: Add Excel files to .gitignore by @kaspermarstal in #14
- docs: Improve README by @kaspermarstal in #15
- Prompt: Further optimize system prompt for small models with limited instruction-following capability. Larger models will understand anyway by @kaspermarstal in #16
- docs: Proof-read README.md by @kaspermarstal in #17
- feat: Add support for OpenAI tools by @kaspermarstal in #18
- feat: Upgrade default Anthropic model to claude-3-5-sonnet-20241022 by @kaspermarstal in #20
- refactor: Add provider enum by @kaspermarstal in #19
- refactor: Rename CellmFunctions to Functions and CellPrompts to SystemMessages by @kaspermarstal in #21
- ci: Add conventional commits lint by @kaspermarstal in #22
- build: Make internals visible to Cellm.Tests by @kaspermarstal in #23
- refactor: Pull out XllPath into settable property by @kaspermarstal in #24
- feat: Add SentryBehavior and CachingBehavior to model request pipeline by @kaspermarstal in #25
- refactor: Split Tools into ToolRunner and ToolFactory by @kaspermarstal in #26
- feat: Upgrade Claude 3.5 Sonnet by @kaspermarstal in #27
- refactor: Remove superfluous interfaces by @kaspermarstal in #28
- feat: Add FileReader tool by @kaspermarstal in #29
- ci: Add dependabot by @kaspermarstal in #36
- build(deps): bump Microsoft.Extensions.Caching.Memory from 8.0.0 to 8.0.1 in /src/Cellm by @dependabot in #38
- ci: Disable commitlint for dependabot by @kaspermarstal in #43
- build(deps): bump Sentry.Extensions.Logging from 4.10.2 to 4.12.1 by @dependabot in #39
- ci: Disable conventional commits lint by @kaspermarstal in #44
- build(deps): bump Sentry.Profiling from 4.10.2 to 4.12.1 by @dependabot in #41
- build(deps): bump Microsoft.Extensions.Configuration.Json from 8.0.0 to 8.0.1 by @dependabot in #42
- fix: Parse tool description attributes by @kaspermarstal in #45
- feat: Add support for prompt as single string argument by @kaspermarstal in #50
- fix: CacheBehavior caches value even when model request is mutated downstream by @kaspermarstal in #56
- ci: Run dotnet format with restore by @kaspermarstal in #58
- fix: Remove Sentry metrics which were deprecated by @kaspermarstal in #59
- build(deps): bump Microsoft.Extensions.Configuration, Microsoft.Extensions.DependencyInjection, Microsoft.Extensions.Logging.Console, Microsoft.Extensions.Options and Microsoft.Extensions.Options.ConfigurationExtensions by @dependabot in #51
- fix: Increase timeouts for local LLMs by @kaspermarstal in #57
- build(deps): bump Microsoft.Extensions.Caching.Memory and Microsoft.Extensions.Options by @dependabot in #54
- build: Target .NET 8.0 by @kaspermarstal in #62
- feat: Replace GoogleAI provider with Google's OpenAI compatible endpoint by @kaspermarstal in #63
- feat: Update Llamafile version, add larger Llamafile models by @kaspermarstal in #64
- refactor: Clean up src/Cellm/AddIn by @kaspermarstal in #65
- feat: Use Microsoft.Extensions.AI by @kaspermarstal in #66
- feat: Add Ollama provider by @kaspermarstal in #67
- fix: Ollama provider by @kaspermarstal in #69
- build: Remove json schema deps no longer needed by @kaspermarstal in #72
- feat: Add HybridCache, remove MemoryCache by @kaspermarstal in #73
- feat: Add OpenAiCompatible chat client by @kaspermarstal in #74
- refactor: Models by @kaspermarstal in #75
- build: Target .NET 9 and update deps by @kaspermarstal in #83
- refactor: Use ExcelAsyncUtil to run task by @kaspermarstal in #84
- feat: Remove support for embedded Ollama and Llamafile servers by @kaspermarstal in #85
- build: Copy appsettings.Local.json to bin dir only if exists by @kaspermarstal in #86
- docs: Fix appsettings.Local.*.json examples and update readme to match by @kaspermarstal in #87
- bug: Fix OpenAiCompatible API key by @kaspermarstal in #94
- feat: Add Mistral and DeepSeek providers by @kaspermarstal in #95
- feat: Add Ribbon UI by @kaspermarstal in #96
- feat: Auto-download Ollama models by @kaspermarstal in #98
- docs: Update README.md with installation instructions via Release page by @kaspermarstal in #100
Full Changelog: https://github.com/getcellm/cellm/commits/v0.1.0