Aspose.LLM
.NET Library for AI and Chat

Add LLM and AI to your .NET apps with one library. Simple API for text and code generation, natural language understanding, chat sessions, and optional image input — no pipeline setup required.

Load models via presets, run stateful chats, and use multimodal input when your model supports it. Built for .NET developers who want local inference without managing native binaries.

Key features

Text and code generation, natural language understanding, chat sessions, and flexible configuration through presets.

  • Text generation and completion

    Generate and complete text for a wide range of use cases, from content creation to summarization and expansion.

  • Code generation and analysis

    Use LLMs for code generation, refactoring suggestions, and code analysis directly from your .NET applications.

  • Natural language understanding

    Integrate natural language understanding into your apps for classification, extraction, and question answering.

  • Chat sessions and multimodality

    Run stateful chat sessions and optionally combine text with images for multimodal inference.

Multiple languages and use cases

Use Aspose.LLM for text and code generation, summarization, and natural language understanding in many languages and scenarios. Combine with presets and parameters to adapt the model to your application.

Why Aspose.LLM?

  • On-premise: run models locally

    Run Large Language Models on your own infrastructure. Keep data in-house and control deployment, without depending on external API availability or usage caps.

  • Simple: .NET API, quick start

    Get started with a few lines of code. Install from NuGet, load a model, start a chat session, and send messages. Presets and optional customization let you tune behavior for your use case.
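The quick-start flow above might look roughly like the sketch below. Note that every type and member name here (`LlmEngine`, `ModelPreset`, `CreateChatSession`, `Send`) is a hypothetical placeholder, not the library's confirmed API; consult the Aspose.LLM documentation for the actual names.

```csharp
// Hypothetical quick-start sketch: load a model via a default preset,
// start a stateful chat session, and send a message.
// All type and member names below are illustrative placeholders.
using System;

class QuickStart
{
    static void Main()
    {
        // Create the engine and load a model with built-in defaults
        // (assumed API shape).
        using var engine = LlmEngine.Create();
        var model = engine.LoadModel(ModelPreset.Default);

        // Start a chat session; the session keeps conversation state
        // across turns, so follow-up messages see prior context.
        var chat = model.CreateChatSession();
        string reply = chat.Send("Summarize the benefits of local inference.");
        Console.WriteLine(reply);
    }
}
```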

  • Multimodal: text and image data supported

    Send text and images in the same request when your model has vision support. One API covers text-only chat and image-aware inference, so you can build assistants that understand both language and visuals.

Free evaluation

With a free evaluation license, a single evaluation model is available so you can try Aspose.LLM without purchase. A temporary license removes this limitation for 30 days, giving you access to all models and features.

Install the package from NuGet, get a temporary license, and build a fully functional LLM solution before deciding to purchase.
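Other Aspose products apply licenses by calling `SetLicense` on a `License` object before using the API; the sketch below assumes Aspose.LLM follows the same convention, but the exact class name, namespace, and license file name are assumptions, not confirmed details.

```csharp
// Hypothetical licensing sketch, assuming Aspose.LLM follows the
// License.SetLicense convention used across other Aspose products.
using Aspose.LLM;  // assumed namespace

class LicensingExample
{
    static void Main()
    {
        // Apply a temporary (30-day) license file to unlock all models.
        // Without a license, only the evaluation model is available.
        var license = new License();                    // assumed type
        license.SetLicense("Aspose.LLM.Temporary.lic"); // assumed file name
    }
}
```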

Simple .NET API for LLM

One .NET API for local LLM inference: set up the engine, load a model with default or custom presets, and run chat sessions. Aspose.LLM handles model loading, context, and sampling so you can focus on application logic — no native binaries or pipeline code to manage.

Use built-in defaults or tune settings as needed. The same API supports text-only chat and multimodal input (text plus images) when your model has vision support.

Chat and multimodality

Stateful chat and multimodal support in one API: run multi-turn conversations and, when your model supports vision, send text and images in the same request. Aspose.LLM keeps session state and handles inference, so no separate integration is needed for chat versus image-aware workflows.

Change models and behavior via presets without code changes. Add Aspose.LLM to your .NET stack for document automation, code assistants, content generation, or custom AI, with full control over model choice and on-premise deployment.
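A multi-turn session that mixes text-only and image-aware turns might look like this sketch. As above, the type and member names (`LlmEngine`, `ChatMessage`, `AttachImage`) are hypothetical placeholders for illustration, and the image turn requires a vision-capable model.

```csharp
// Hypothetical multimodal chat sketch: a stateful session in which
// one turn combines text with an image attachment.
// All API names below are illustrative assumptions.
using System;

class MultimodalChat
{
    static void Main()
    {
        using var engine = LlmEngine.Create();
        var model = engine.LoadModel(ModelPreset.Default);
        var chat = model.CreateChatSession();

        // Turn 1: plain text. The session records this exchange.
        Console.WriteLine(chat.Send("Hello! What can you do?"));

        // Turn 2: text plus an image (vision-capable models only).
        var message = new ChatMessage("What is shown in this chart?");
        message.AttachImage("sales-q3.png");
        Console.WriteLine(chat.Send(message));
    }
}
```

The same session object serves both turns, which is the point of the unified API: text-only and image-aware requests share one conversation state.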

Solutions

Aspose.LLM covers use cases from personal projects to enterprise: run LLMs on your own infrastructure and add LLM capabilities to .NET apps with one library. It supports document automation, chatbots, code assistants, content generation, and natural language understanding, with full control over model choice, presets, and deployment.

Enterprise

Document automation and summarization
Internal chatbots and knowledge assistants
Code generation and analysis tools
On-premise deployment and data control
Custom presets and model tuning
Multimodal workflows (text and images)

SMB

Content generation and editing
Text analysis and classification
Customer support and Q&A automation
Local development and prototyping
Integration with existing .NET apps
Chat sessions and multi-turn dialogue

Personal

Local AI assistants and experiments
Learning and trying LLM APIs
Side projects and demos
Text and code generation on your machine
No cloud dependency or usage caps
NuGet install and minimal setup

Ready to go?

Get the Aspose.LLM package for .NET and start building LLM-powered applications today.

Presets and parameters

Aspose.LLM lets you control model selection, context size, and sampling via presets. Use built-in defaults for a quick start, or supply your own model paths and parameters. The same API supports different backends and model formats so you can switch or evaluate models without rewriting your application.

Fine-tune inference behavior for your use case: adjust context and chat parameters per session, and combine text with optional image input when your model supports multimodality. Run everything on-premise with no dependency on external API quotas or availability.
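One way the preset-based configuration described above could look in code is sketched below. The property names (`ModelPath`, `ContextSize`, `Temperature`, `TopP`) are common LLM inference parameters used here as assumed placeholders; the actual Aspose.LLM preset type and its members may differ.

```csharp
// Hypothetical configuration sketch: a custom preset supplying a
// model path plus context-size and sampling parameters.
// Property and type names are assumptions, not the confirmed API.
using System;

class CustomPresetExample
{
    static void Main()
    {
        using var engine = LlmEngine.Create();

        // Custom preset: your own model file and inference settings.
        var preset = new ModelPreset
        {
            ModelPath   = "models/my-model.gguf", // assumed local path
            ContextSize = 4096,                   // tokens of context
            Temperature = 0.7,                    // sampling randomness
            TopP        = 0.9                     // nucleus sampling cutoff
        };

        var model = engine.LoadModel(preset);
        var chat = model.CreateChatSession();
        Console.WriteLine(chat.Send("Classify this ticket: 'App crashes on login.'"));
    }
}
```

Switching models then means swapping the preset, not rewriting application code, which matches the evaluate-and-switch workflow the section describes.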

One API for text and multimodal

Whether you need text-only chat or inference with images, Aspose.LLM provides a unified .NET API. Start a session, send messages, and optionally attach media — the library handles model loading, session state, and response streaming so you can focus on integration and business logic.

Install from NuGet, set a license, and load a model; then build assistants, code tools, or document workflows with minimal boilerplate. Documentation and examples are available to get you started quickly.