Tuesday, 08 April 2025

Build a local ChatGPT-like App with Blazor and MaIN.NET - Part 1: Getting Started with LLM Integration.

In this tutorial series, we’ll walk you through building a local ChatGPT-like application using Blazor and the MaIN.NET framework, which enables seamless integration with Large Language Models (LLMs).

Whether you're a .NET developer curious about LLMs or an AI enthusiast eager to create your own chatbot UI, this guide is for you.

Goal of Part 1: Set up a working Blazor app connected to a local LLM using MaIN.NET, with no external API needed.

ℹ️ MaIN.NET GitHub repository: https://github.com/wisedev-code/MaIN.NET - We’d appreciate any feedback, support, or a ⭐!

🧠 Step 1: Choose and Prepare Your Local LLM

The MaIN.NET framework comes with a powerful CLI tool that significantly simplifies model setup, configuration, and local inference. It’s a great way to get up and running quickly with local LLMs.

📌 I’ll cover the MaIN.NET CLI in a separate tutorial - stay tuned for that!

For this example, we’ll use Gemma 3 (4B parameters), a lightweight and efficient open-source model from Google. It offers an excellent balance between performance and resource usage, making it a great candidate for local inference.

💡 Hardware Tips:
Ensure that the model file size is smaller than your available GPU VRAM.

Examples:

  • NVIDIA RTX 4090 → 24GB VRAM
  • NVIDIA RTX 5090 → 32GB VRAM

On Apple Silicon Macs, the CPU and GPU share unified memory, so your usable memory is your system RAM.

For example, on a 32GB Mac Studio, you can typically run models up to ~30B parameters.

⚠️ Note:

Comparing model file size to available VRAM or RAM gives only a rough estimate.

In practice, performance and model compatibility can also depend on:

  • CPU or GPU architecture
  • Memory bandwidth
  • System load and background processes
  • How well the model is quantized and optimized
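To make the rule of thumb above concrete, here is a back-of-envelope sketch in plain Python. The 20% headroom figure is my own assumption (to leave room for the KV cache, context, and runtime overhead), not a MaIN.NET requirement:

```python
def approx_gguf_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough .gguf file size in GB: parameters x bytes per weight.

    Billions of parameters times bits/8 bytes each; the 'billions'
    and 'giga' cancel, so the formula stays simple.
    """
    return params_billions * bits_per_weight / 8


def fits_in_memory(params_billions: float, bits_per_weight: float,
                   memory_gb: float, headroom: float = 0.8) -> bool:
    """Check the model file fits with ~20% of memory left as headroom."""
    return approx_gguf_size_gb(params_billions, bits_per_weight) <= memory_gb * headroom


# Gemma 3 4B at 4-bit quantization: about 2 GB on disk.
print(approx_gguf_size_gb(4, 4))      # → 2.0

# A 30B model at 4-bit on a 32GB Mac Studio: 15 GB vs ~25.6 GB usable.
print(fits_in_memory(30, 4, 32))      # → True
```

Real quantized files vary a bit from this estimate (different layers may use different bit widths), so treat the result as a sanity check, not a guarantee.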

👉 Download the Gemma 3 model here:

https://huggingface.co/Inza124/Gemma3-4b

After downloading, move the .gguf file to a directory where you’ll store your models. I created a folder named MainModels inside my Documents.

🛠️ Step 2: Set Up a Blazor Project

Let’s create a new Blazor Web App project.

  1. Open your IDE or terminal.
  2. Create and navigate to a new project folder.
  3. Run the following command:
dotnet new blazor

📦 Step 3: Install MaIN.NET

With your Blazor project created, the next step is to install the MaIN.NET package.

MaIN.NET is available via NuGet, so you can easily add it using the command line.

Run the following command in your terminal:

dotnet add package MaIN.NET

Configure a path to the models directory in the appsettings.json file by adding:

"MaIN": {
    "ModelsPath": "./models"
  }

For example, on my Mac the path is: "/Users/paweljanda/Documents/".
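For reference, a complete appsettings.json with this section added might look like the sketch below. The Logging and AllowedHosts entries come from the default Blazor template; the ModelsPath value is a placeholder — point it at the folder where you saved your .gguf files:

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*",
  "MaIN": {
    "ModelsPath": "/Users/yourname/Documents/MainModels"
  }
}
```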

Add the using directive at the top of the Program.cs file:

using MaIN.Core;

In Program.cs replace the var app = builder.Build(); line with:

builder.Services.AddMaIN(builder.Configuration); // register MaIN.NET services
var app = builder.Build();
app.Services.UseMaIN(); // initialize MaIN.NET once the app is built

This will register MaIN.NET's services and make them available in your app, including the ability to connect to local LLMs and execute inference calls.

✅ You're now ready to connect a local model and build the chat interface!

💬 Step 4: Create the Chat Page

Now let’s build a simple interactive page where users can send messages to your local LLM and receive responses.

Create a New Razor Page

In your Blazor project, navigate to the Components/Pages folder (or just Pages, depending on your project template), and create a new file named ChatLLM.razor.

Paste the following code into the file:

@page "/chatllm"
@rendermode InteractiveServer
@using MaIN.Core.Hub

<h3>Chat with LLM</h3>

<p>@answerFromLLM</p>
<input type="text" class="form-control" @bind="messageToLLM" />

<button class="btn btn-primary" @onclick="SendMessage">Send Message</button>

@code {
    private string messageToLLM = "";
    private string answerFromLLM = "";

    private async Task SendMessage()
    {
        // Don't send empty prompts to the model
        if (string.IsNullOrWhiteSpace(messageToLLM))
            return;

        var result = await AIHub.Chat()
            .WithModel("gemma3:4b")
            .WithMessage(messageToLLM)
            .CompleteAsync();

        answerFromLLM = result.Message.Content;
    }
}

How it Works

  • The user can type a message into the input.
  • When the Send Message button is clicked, the SendMessage() method sends the input to the LLM via AIHub.Chat() using the MaIN.NET framework.
  • The model's response is displayed directly above the input field.

Test It

To launch your Blazor app with hot-reload enabled, open a terminal in your IDE and run:

dotnet watch

This will start the development server and immediately open the project in your default browser.

Type a message in the input field and click Send Message - the app will forward your message to the local LLM, and you should see the response rendered on the page.

✅ At this point, you’ve successfully integrated a local LLM into your Blazor frontend using MaIN.NET!

🧭 Step 5: Add a Link to the Sidebar Menu

To make it easy for users to access your new chat page, let’s add a link to the sidebar menu in your Blazor app.

Modify NavMenu.razor

Open the NavMenu.razor file, and add the following HTML inside the existing <nav> or navigation list:

<div class="nav-item px-3">
    <NavLink class="nav-link" href="chatllm">
        <span class="bi bi-chat-dots-fill-nav-menu" aria-hidden="true"></span> ChatLLM
    </NavLink>
</div>

Add the Icon Style

To display the icon correctly, open or create the file NavMenu.razor.css in the same folder, and add the following CSS class:

.bi-chat-dots-fill-nav-menu {
    background-image: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='16' height='16' fill='white' class='bi bi-chat-dots-fill' viewBox='0 0 16 16'%3E%3Cpath d='M16 8c0 3.866-3.582 7-8 7a9.06 9.06 0 0 1-2.347-.306c-.584.296-1.925.864-4.181 1.234-.2.032-.352-.176-.273-.362.354-.836.674-1.95.77-2.966C.744 11.37 0 9.76 0 8c0-3.866 3.582-7 8-7s8 3.134 8 7Zm-7.5-.5a.5.5 0 0 1 .5-.5h4a.5.5 0 0 1 0 1h-4a.5.5 0 0 1-.5-.5ZM5 6.5a.5.5 0 0 1 .5-.5h7a.5.5 0 0 1 0 1h-7a.5.5 0 0 1-.5-.5Z'/%3E%3C/svg%3E");
    background-repeat: no-repeat;
    padding-left: 1.5rem;
}

This embeds an SVG chat icon as a background image, giving the navigation item a polished, recognizable look.

✅ Result
You should now see a new ChatLLM item in your Blazor sidebar menu. Clicking it will navigate to your chatbot page at /chatllm.

🎉 That’s it! Your app now has a proper navigation entry to the LLM-powered chat interface.

Summary

✅ What We’ve Achieved
You now have a working Blazor application with:

➡️ A local LLM (Gemma 3 4B) running via MaIN.NET.

➡️ A basic UI to send messages and receive responses.

➡️ Sidebar navigation to your chat page.

🚀 What’s Next?

In Part 2, we’ll add support for chat history so the model can maintain context throughout the conversation.

Future parts will include:

  • 📎 File uploads for document-based interactions.
  • 🎨 Visual and UX enhancements for a cleaner, more user-friendly interface.
  • 🚀 A production-ready LLM UI with advanced features.

💬 Got questions or feedback? Drop a comment below — I’d love to hear your thoughts!

👉 Follow me to stay updated — Part 2 is coming soon!

