A lightweight .NET client library for interacting with LM Studio's local LLM server API. Supports chat completions, model management, structured output, and tool use.
Clone and build the project:

```bash
git clone https://github.com/Ddemon26/Lmss.git
cd Lmss
dotnet build
```

Or install the package from NuGet:

```bash
dotnet add package LmsSharp
```

Or reference the project directly in your .csproj:

```xml
<ProjectReference Include="path/to/Lmss/Lmss.csproj" />
```

Requirements:

- .NET Standard 2.0 or .NET 8.0
- LM Studio running locally on port 1234
Quick start:

```csharp
using LmsSharp;

var client = new LmsClient();
var models = await client.GetModelsAsync();
Console.WriteLine($"Available models: {models.Count}");
```

Features:

- OpenAI-compatible API client
- Chat completions with streaming support
- Model management and health checks
- Structured JSON output
- Tool use and function calling
- Async/await support
- Cross-platform compatibility (.NET Standard 2.0 and .NET 8.0)
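Because the client speaks LM Studio's OpenAI-compatible protocol, you can see what it does on the wire by calling the server's `/v1/chat/completions` endpoint directly. The sketch below uses only `HttpClient` from the BCL; the model name `"local-model"` is a placeholder (LM Studio answers with whichever model is currently loaded), and it assumes the server is running on the default port 1234:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class RawChatExample
{
    static async Task Main()
    {
        using var http = new HttpClient { BaseAddress = new Uri("http://localhost:1234") };

        // Minimal OpenAI-style chat completion request body.
        // "local-model" is a placeholder; LM Studio routes the request
        // to whatever model you have loaded.
        var body =
            "{ \"model\": \"local-model\"," +
            "  \"messages\": [{ \"role\": \"user\", \"content\": \"Hello!\" }] }";

        var response = await http.PostAsync(
            "/v1/chat/completions",
            new StringContent(body, Encoding.UTF8, "application/json"));

        // The response follows the OpenAI chat completion schema
        // (choices[0].message.content holds the reply).
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

This is only a reference point for debugging or for environments where you can't take a dependency; in normal use, `LmsClient` wraps these calls for you.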
Contributions are welcome! Please feel free to submit issues, feature requests, or pull requests.
This project is licensed under the MIT License - see the LICENSE file for details.
