Building My Own AI Chat Interface with Laravel and GenAI
It all started when I wanted to spend some time exploring the world of AI and figuring out how I could use it in my projects. At first, I wasn’t sure what to build — the AI space is so wide and evolving so quickly. But as I dug deeper, the idea of creating my own AI chat interface started to take shape.
Instead of rushing through with a throwaway script, I wanted to create something solid that I’d actually enjoy using: something structured, scalable, and built with tools I already like working with. That’s how I ended up with this stack:
- Laravel — for back-end and overall application structure
- Inertia.js — to bridge Laravel with front-end frameworks
- PHP Neuron AI — to interact with GenAI models
- React.js — for building a dynamic front-end
- Shadcn UI — for clean and responsive components
This combination gave me the stability of Laravel and the flexibility of modern tools to experiment with AI in a way that felt practical and fun.
Setting Up a Fresh Laravel App
We’ll start from scratch by creating a new Laravel application. You can follow the official Laravel documentation to get started and choose any database or development environment you like.
For this blog, I’ll keep things focused purely on building the AI chat workflow, so I won’t dive into environment setup details.
Designing the Workflow
Before jumping into code, let’s map out how our AI chat interface will actually work:
- Start → The user lands on the dashboard.
- Take User Input → Capture the user’s first message on the dashboard.
- Create a New Thread → Generate a new chat thread for the conversation.
- Store User Message in Session → Temporarily store the first message in the session.
- Redirect to Chat Interface → Send the user to the main chat interface.
- Pull Message from Session → Retrieve the initial message from the session.
- Start Chat Session with Thread ID → Initialize the AI conversation using the thread ID for tracking.
💡 Tip: Defining a clear workflow like this makes it easier to organize your routes, controllers, and session handling, and ensures your AI chat behaves predictably.
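To make this concrete, here’s a preview of the route skeleton this workflow maps to. These are exactly the routes we’ll register in web.php over the course of this post:
// routes/web.php — the workflow above, expressed as routes (built step by step below)
Route::middleware(['auth', 'verified'])->group(function () {
    // Start → dashboard with the first-message input
    Route::get('dashboard', [DashboardController::class, 'index'])->name('dashboard');
    // Take user input → create a thread, stash the message in the session, redirect
    Route::post('dashboard', [DashboardController::class, 'post'])->name('dashboard.post');
    // Chat interface, keyed by thread ID
    Route::get('chat/{thread}', [ChatController::class, 'index'])->name('chat');
    // Streaming endpoint the chat page sends messages to
    Route::post('chat/{thread}', [ChatController::class, 'post'])->name('chat.post');
});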
Building the Dashboard
Since we’re using the Laravel 12 React + Inertia.js starter kit, our project already comes with Shadcn UI installed, which makes building beautiful, responsive components easy. We’ll also be using ai-elements, a library of AI chat components built on top of Shadcn UI, throughout the app.
For the first screen, we’ll build a simple dashboard with a single input box so the user can start by entering their first message.
We’ll use the PromptInput component from ai-elements. Install it with:
npx ai-elements@latest add prompt-input
⚠️ By default, ai-elements installs into /components/ai-elements. For our setup, we’ll copy these into /resources/js/components/ai-elements.
Here’s our basic Dashboard.tsx:
import DashboardController from '@/actions/App/Http/Controllers/DashboardController';
import {
PromptInput,
PromptInputBody,
PromptInputSubmit,
PromptInputTextarea,
PromptInputToolbar,
PromptInputTools,
} from '@/components/ai-elements/prompt-input';
import AppLayout from '@/layouts/app-layout';
import { type BreadcrumbItem } from '@/types';
import { Head, useForm } from '@inertiajs/react';
const breadcrumbs: BreadcrumbItem[] = [
{
title: 'Dashboard',
href: DashboardController.index().url,
},
];
export default function Dashboard() {
const { post, data, setData } = useForm({
message: '',
});
const handleSubmit = () => {
post(DashboardController.post().url);
};
return (
<AppLayout breadcrumbs={breadcrumbs}>
<Head title="Dashboard" />
<div className="mx-auto flex h-full w-1/2 items-center justify-center">
<PromptInput onSubmit={handleSubmit} className="relative mt-4">
<PromptInputBody>
<PromptInputTextarea
onChange={(e) => {
setData({ message: e.target.value });
}}
value={data.message}
/>
</PromptInputBody>
<PromptInputToolbar className="p-2">
<PromptInputTools>
<div className="text-gray-500">
Please enter your message to start
</div>
</PromptInputTools>
<PromptInputSubmit disabled={false} status={'ready'} />
</PromptInputToolbar>
</PromptInput>
</div>
</AppLayout>
);
}
Most of this should be self-explanatory: we’re using Inertia’s form helper to capture the message and post it to the controller.
Routes
Our web.php handles both GET and POST for the dashboard:
<?php
use App\Http\Controllers\DashboardController;
use Illuminate\Support\Facades\Route;
use Inertia\Inertia;
Route::get('/', function () {
return Inertia::render('welcome');
})->name('home');
Route::middleware(['auth', 'verified'])->group(function () {
Route::get('dashboard', [DashboardController::class, 'index'])->name('dashboard');
Route::post('dashboard', [DashboardController::class, 'post'])->name('dashboard.post');
});
require __DIR__.'/settings.php';
require __DIR__.'/auth.php';
Dashboard Controller
<?php
namespace App\Http\Controllers;
use Illuminate\Http\Request;
use Inertia\Inertia;
use Inertia\Response;
class DashboardController extends Controller
{
public function index(): Response
{
return Inertia::render('dashboard');
}
public function post(Request $request)
{
$request->validate([
'message' => 'required'
]);
}
}
Adding Threads
We’ll use a Thread model to store chat sessions.
php artisan make:model Thread -m
Update the migration:
<?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
return new class extends Migration
{
/**
* Run the migrations.
*/
public function up(): void
{
Schema::create('threads', function (Blueprint $table) {
$table->uuid('id')->primary();
$table->foreignId('user_id')->constrained('users')->cascadeOnDelete();
$table->string('title');
$table->timestamps();
});
}
/**
* Reverse the migrations.
*/
public function down(): void
{
Schema::dropIfExists('threads');
}
};
And the model:
<?php
namespace App\Models;
use Illuminate\Database\Eloquent\Concerns\HasUuids;
use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Relations\BelongsTo;
class Thread extends Model
{
    use HasUuids;

    // Each thread belongs to the user who started it
    public function user(): BelongsTo
    {
        return $this->belongsTo(User::class);
    }
}
Updating the Controller
public function post(Request $request)
{
$request->validate([
'message' => 'required'
]);
$thread = new Thread();
$thread->user_id = $request->user()->id;
$thread->title = "New thread";
$thread->save();
}
With this, our dashboard is ready, so let’s move on to building the chat interface.
Building The Chat Interface
On the dashboard, we captured the user’s first message and created a new thread. Now, it’s time to build the chat interface where the actual conversation happens.
For this, we’ll use the Conversation and Message components from ai-elements.
Install them with:
npx ai-elements@latest add conversation
npx ai-elements@latest add message
⚠️ Just like before, don’t forget to copy them into resources/js/components/ai-elements.
ChatController
Let’s create a new controller for handling chats:
php artisan make:controller ChatController
Update it like this:
<?php
namespace App\Http\Controllers;
use App\Models\Thread;
use Illuminate\Http\Request;
use Inertia\Inertia;
class ChatController extends Controller
{
public function index(Thread $thread)
{
return Inertia::render('chat', [
'thread' => $thread
]);
}
}
Routes
Add a route for the chat interface in web.php:
<?php
use App\Http\Controllers\ChatController;
use App\Http\Controllers\DashboardController;
use Illuminate\Support\Facades\Route;
use Inertia\Inertia;
Route::get('/', function () {
return Inertia::render('welcome');
})->name('home');
Route::middleware(['auth', 'verified'])->group(function () {
Route::get('dashboard', [DashboardController::class, 'index'])->name('dashboard');
Route::post('dashboard', [DashboardController::class, 'post'])->name('dashboard.post');
// New route for chat with thread id
Route::get('chat/{thread}', [ChatController::class, 'index'])->name('chat');
});
require __DIR__.'/settings.php';
require __DIR__.'/auth.php';
Chat Page (chat.tsx)
Now create chat.tsx in resources/js/pages:
import ChatController from '@/actions/App/Http/Controllers/ChatController';
import {
Conversation,
ConversationContent,
ConversationEmptyState,
ConversationScrollButton,
} from '@/components/ai-elements/conversation';
import { Message, MessageContent } from '@/components/ai-elements/message';
import {
PromptInput,
PromptInputBody,
PromptInputSubmit,
PromptInputTextarea,
PromptInputToolbar,
PromptInputTools,
} from '@/components/ai-elements/prompt-input';
import AppLayout from '@/layouts/app-layout';
import { type BreadcrumbItem } from '@/types';
import { Head, useForm } from '@inertiajs/react';
import { MessageSquare } from 'lucide-react';
import { useMemo, useState } from 'react';
// Named ChatMessage to avoid clashing with the Message component imported above
type ChatMessage = {
    id: number;
    role: 'user' | 'assistant';
    content: string;
};
export default function Chat({ thread }: { thread: any }) {
const { post, data, setData } = useForm({
message: '',
});
const [messages, setMessages] = useState<ChatMessage[]>([]);
const breadcrumbs = useMemo<BreadcrumbItem[]>(
() => [
{
title: 'Chat',
href: ChatController.index(thread.id).url,
},
],
[thread.id],
);
const handleSubmit = () => {
if (data.message) {
const newMessage: ChatMessage = {
id: Date.now(),
role: 'user',
content: data.message.trim(),
};
setMessages((prev) => [...prev, newMessage]);
setData({ message: '' });
}
};
return (
<AppLayout breadcrumbs={breadcrumbs}>
<Head title="Chat" />
<div className="mx-auto flex h-[calc(100vh-5rem)] w-full flex-col px-4">
<Conversation className="relative h-full w-full">
<ConversationContent>
{messages.length === 0 ? (
<ConversationEmptyState
icon={<MessageSquare className="size-12" />}
title="No messages yet"
description="Start a conversation to see messages here"
/>
) : (
messages.map((message) => (
<Message from={message.role} key={message.id}>
<MessageContent>
{message.content}
</MessageContent>
</Message>
))
)}
</ConversationContent>
<ConversationScrollButton />
</Conversation>
<PromptInput
onSubmit={handleSubmit}
className="relative mt-4 w-full"
>
<PromptInputBody>
<PromptInputTextarea
onChange={(e) => {
setData({ message: e.target.value });
}}
value={data.message}
/>
</PromptInputBody>
<PromptInputToolbar className="p-2">
<PromptInputTools>
<div className="text-gray-500">
Please enter your message to start
</div>
</PromptInputTools>
<PromptInputSubmit disabled={false} status={'ready'} />
</PromptInputToolbar>
</PromptInput>
</div>
</AppLayout>
);
}
This gives us a fully functional chat interface with scrolling, empty states, and message rendering.
Updating DashboardController
Finally, update the post method so users are redirected to the chat interface after creating a new thread:
public function post(Request $request)
{
$request->validate([
'message' => 'required'
]);
$thread = new Thread();
$thread->user_id = $request->user()->id;
$thread->title = "New thread";
$thread->save();
// Redirect to chat UI with initial message in session
return to_route('chat', ['thread' => $thread->id])
->with('initial-message', $request->message);
}
Handling the User Message and Response
Now comes the main part. In AI chat interfaces, the response isn’t sent as one large block. Instead, it’s streamed in chunks, creating the “typing effect.”
Laravel supports this type of response out of the box using response()->stream(). With this, we can stream partial data directly from the server to the front-end, which renders each chunk as it arrives, just like modern AI chat apps do.
Let’s create a method in our ChatController:
public function post(Thread $thread)
{
return response()->stream(function (): void {
foreach (['developer', 'admin'] as $string) {
echo $string;
if (ob_get_level() > 0) {
    ob_flush(); // only flush if an output buffer is actually active
}
flush();
sleep(2);
}
}, 200, ['X-Accel-Buffering' => 'no']);
}And register the route:
Route::post('chat/{thread}', [ChatController::class, 'post'])->name('chat.post');
You don’t need to worry too much about the exact implementation here. This is just a demo to illustrate how streaming works. The final version might look different depending on our needs.
Now, the next step is integrating this into React. Thankfully, the Laravel team is already working on a dedicated package to make streaming responses easier: https://github.com/laravel/stream
Since we’re using Inertia.js with React, install the React adapter:
npm install @laravel/stream-react
Now, in our chat page, we can use the useStream hook from the package.
⚠️ Important: Our endpoint is a POST method.
useStream requires passing a POST endpoint, so we need to provide the CSRF token for security.
We can make it easily accessible by sharing it through Inertia’s middleware:
<?php
namespace App\Http\Middleware;
use Illuminate\Foundation\Inspiring;
use Illuminate\Http\Request;
use Inertia\Middleware;
class HandleInertiaRequests extends Middleware
{
/**
* The root template that's loaded on the first page visit.
*
* @see https://inertiajs.com/server-side-setup#root-template
*
* @var string
*/
protected $rootView = 'app';
/**
* Determines the current asset version.
*
* @see https://inertiajs.com/asset-versioning
*/
public function version(Request $request): ?string
{
return parent::version($request);
}
/**
* Define the props that are shared by default.
*
* @see https://inertiajs.com/shared-data
*
* @return array<string, mixed>
*/
public function share(Request $request): array
{
[$message, $author] = str(Inspiring::quotes()->random())->explode('-');
return [
...parent::share($request),
'name' => config('app.name'),
'quote' => ['message' => trim($message), 'author' => trim($author)],
'auth' => [
'user' => $request->user(),
],
'sidebarOpen' => ! $request->hasCookie('sidebar_state') || $request->cookie('sidebar_state') === 'true',
'csrfToken' => csrf_token(),
];
}
}
Now that our backend is set up to stream messages, let’s update the React chat page so users can see messages appear in real time, just like in modern AI chat applications.
import ChatController from '@/actions/App/Http/Controllers/ChatController';
import {
Conversation,
ConversationContent,
ConversationEmptyState,
ConversationScrollButton,
} from '@/components/ai-elements/conversation';
import { Message, MessageContent } from '@/components/ai-elements/message';
import {
PromptInput,
PromptInputBody,
PromptInputSubmit,
PromptInputTextarea,
PromptInputToolbar,
PromptInputTools,
} from '@/components/ai-elements/prompt-input';
import AppLayout from '@/layouts/app-layout';
import { type BreadcrumbItem } from '@/types';
import { Head } from '@inertiajs/react';
import { useStream } from '@laravel/stream-react';
import { MessageSquare } from 'lucide-react';
import { useMemo, useState } from 'react';
type ChatMessage = {
id: number;
role: 'user' | 'assistant';
content: string;
};
interface ChatProps {
thread: any;
csrfToken: string;
}
export default function Chat({ thread, csrfToken }: ChatProps) {
const [message, setMessage] = useState('');
const [messages, setMessages] = useState<ChatMessage[]>([]);
// Setting up streaming
const { send } = useStream(ChatController.post(thread.id).url, {
csrfToken,
onData: (chunk: string) => {
setMessages((prev) => {
const updated = [...prev];
const last = updated[updated.length - 1];
if (last && last.role === 'assistant') {
last.content += chunk; // Append streaming chunk to last assistant message
} else {
updated.push({
id: Date.now(),
role: 'assistant',
content: chunk,
});
}
return updated;
});
},
});
// Breadcrumbs for layout
const breadcrumbs = useMemo<BreadcrumbItem[]>(
() => [
{
title: 'Chat',
href: ChatController.index(thread.id).url,
},
],
[thread.id],
);
// Handle user message submission
const handleSubmit = () => {
if (!message.trim()) return;
const newMessage: ChatMessage = {
id: Date.now(),
role: 'user',
content: message.trim(),
};
// Send message to server stream
send({ message });
setMessages((prev) => [...prev, newMessage]);
setMessage('');
};
return (
<AppLayout breadcrumbs={breadcrumbs}>
<Head title="Chat" />
<div className="mx-auto flex h-[calc(100vh-5rem)] w-full flex-col px-4">
<Conversation className="relative h-full w-full">
<ConversationContent>
{messages.length === 0 ? (
<ConversationEmptyState
icon={<MessageSquare className="size-12" />}
title="No messages yet"
description="Start a conversation to see messages here"
/>
) : (
messages.map((msg) => (
<Message from={msg.role} key={msg.id}>
<MessageContent>{msg.content}</MessageContent>
</Message>
))
)}
</ConversationContent>
<ConversationScrollButton />
</Conversation>
<PromptInput onSubmit={handleSubmit} className="relative mt-4 w-full">
<PromptInputBody>
<PromptInputTextarea
value={message}
onChange={(e) => setMessage(e.target.value)}
/>
</PromptInputBody>
<PromptInputToolbar className="p-2">
<PromptInputTools>
<div className="text-gray-500">
Please enter your message to start
</div>
</PromptInputTools>
<PromptInputSubmit disabled={!message.trim()} status="ready" />
</PromptInputToolbar>
</PromptInput>
</div>
</AppLayout>
);
}
Sending Initial Message from Session
Since we redirect from the dashboard to the chat interface after creating a new thread, and the initial message is stored in the session, we need to send that message to the back-end when initializing useStream. Let’s do this.
Update ChatController’s index method
public function index(Request $request, Thread $thread)
{
// Pull the initial message from session (and remove it)
$initialMessage = $request->session()->pull('initial-message');
return Inertia::render('chat', [
'thread' => $thread,
'initialMessage' => $initialMessage,
]);
}
Explanation:
- session()->pull('initial-message') retrieves the initial message and removes it from the session in one step.
- Pass initialMessage to Inertia so that React can access it immediately.
Update chat.tsx Page
interface ChatProps {
thread: any;
initialMessage: string;
csrfToken: string;
}
export default function Chat({ thread, initialMessage, csrfToken }: ChatProps) {
// Initialize messages state with the initial message (if any)
const initialMessages = useMemo(() => {
const historyMessages: ChatMessage[] = [];
if (initialMessage) {
historyMessages.push({
id: Date.now(),
role: 'user',
content: initialMessage,
});
}
return historyMessages;
}, [initialMessage]);
const [messages, setMessages] = useState<ChatMessage[]>(initialMessages);
// Initialize streaming
const { send } = useStream(ChatController.post(thread.id).url, {
csrfToken,
initialInput: initialMessage
? { message: initialMessage }
: undefined,
onData: (chunk: string) => {
setMessages((prev) => {
const updated = [...prev];
const last = updated[updated.length - 1];
if (last && last.role === 'assistant') {
last.content += chunk; // Append streaming chunk
} else {
updated.push({
id: Date.now(),
role: 'assistant',
content: chunk,
});
}
return updated;
});
},
});
}
The system detects any initial message from the dashboard and seamlessly streams it to the backend using useStream.
Sending Data to the AI Model
Our basic chat interface is now ready. The next step is to send messages to an AI model. For this example, I’m using the Gemini models, which offer a free tier. You can obtain your API key from https://aistudio.google.com/api-keys and start integrating it with your backend.
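Once you have a key, add it to your .env file; this is the variable our agent will read in a moment:
GEMINI_API_KEY=your-api-key-here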
To handle the AI integration, we’ll be using Neuron AI (https://www.neuron-ai.dev/). Let’s start by installing the package.
composer require inspector-apm/neuron-ai
Neuron AI also provides some command-line tools. For AI integration, we need to create an Agent. This can be done using the following command:
php vendor/bin/neuron make:agent App\\Neuron\\ChatAgent
Updated ChatAgent
<?php
declare(strict_types=1);
namespace App\Neuron;
use App\Models\Thread;
use NeuronAI\Agent;
use NeuronAI\SystemPrompt;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Gemini\Gemini;
class ChatAgent extends Agent
{
public function __construct(public Thread $thread) {}
protected function provider(): AIProviderInterface
{
return new Gemini(
key: env('GEMINI_API_KEY', null),
model: 'gemini-2.0-flash'
);
}
public function instructions(): string
{
return (string) new SystemPrompt(
background: ["You are a friendly AI Agent."],
);
}
}
While it’s recommended to use a config file for external package configurations, for this tutorial we’ll access the API key directly via env(). Keep in mind that env() returns null for values read outside config files once you cache your configuration with php artisan config:cache.
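For reference, the config-based approach could look like the sketch below. The services.gemini entry and the GEMINI_MODEL variable are hypothetical names I’m introducing here, not something the project above already defines:
// config/services.php — hypothetical 'gemini' entry alongside the existing services
'gemini' => [
    'key' => env('GEMINI_API_KEY'),
    'model' => env('GEMINI_MODEL', 'gemini-2.0-flash'),
],
// In ChatAgent::provider(), read from config instead of env()
return new Gemini(
    key: config('services.gemini.key'),
    model: config('services.gemini.model'),
);
With this, the values are resolved when the configuration is loaded and remain available even with a cached config.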
Neuron AI also supports multiple providers; you can learn more here: https://docs.neuron-ai.dev/getting-started/agent
Using ChatAgent in the Controller
Now that we have our ChatAgent set up, we can update the controller so that responses come dynamically from the AI instead of returning static content.
use App\Neuron\ChatAgent;
use Generator;
use NeuronAI\Chat\Messages\UserMessage;
public function post(Request $request, Thread $thread)
{
$message = $request->input('message');
return response()->stream(function () use ($thread, $message): Generator {
$stream = ChatAgent::make($thread)->stream(
new UserMessage($message),
);
foreach ($stream as $text) {
yield $text;
}
});
}
Because Laravel flushes the output buffer after each value the generator yields, returning a Generator lets us stream the AI response in real time without the manual flushing we did earlier.
Now, if you reload the app and send a message from the Dashboard screen, the app will create a thread, send the first message to the AI model, and show the response in the chat interface. However, if you refresh the page, all messages are lost, and the model currently has no memory of past messages.
Thanks to the Neuron SDK, we can add memory to our chat system, allowing it to remember previous conversations.
Message History
To add memory for past conversations, we’re going to use the ChatHistory feature from the Neuron AI SDK (https://docs.neuron-ai.dev/components/chat-history-and-memory). As mentioned in the documentation, multiple options are available for handling memory, but we’ll use SQLChatHistory. First, let’s create a migration and model:
php artisan make:model ChatHistory -m
Migration file:
<?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
return new class extends Migration
{
/**
* Run the migrations.
*/
public function up(): void
{
Schema::create('chat_histories', function (Blueprint $table) {
$table->id();
$table->foreignUuid('thread_id')->constrained()->cascadeOnDelete();
$table->jsonb('messages'); // jsonb is used here; json can also be used depending on your database
$table->timestamps();
});
}
/**
* Reverse the migrations.
*/
public function down(): void
{
Schema::dropIfExists('chat_histories');
}
};
Run the migration and then update the ChatAgent:
<?php
declare(strict_types=1);
namespace App\Neuron;
use App\Models\Thread;
use Illuminate\Support\Facades\DB;
use NeuronAI\Agent;
use NeuronAI\Chat\History\ChatHistoryInterface;
use NeuronAI\Chat\History\SQLChatHistory;
use NeuronAI\SystemPrompt;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Gemini\Gemini;
class ChatAgent extends Agent
{
public function __construct(public Thread $thread) {}
protected function provider(): AIProviderInterface
{
return new Gemini(
key: env('GEMINI_API_KEY', null),
model: 'gemini-2.0-flash'
);
}
public function instructions(): string
{
return (string) new SystemPrompt(
background: ["You are a friendly AI Agent."],
);
}
protected function chatHistory(): ChatHistoryInterface
{
return new SQLChatHistory(
thread_id: $this->thread->id,
pdo: DB::connection()->getPdo(),
table: 'chat_histories',
contextWindow: 10000,
);
}
}
Now, when you chat with the agent, it will remember your messages. However, if you refresh the page, previous messages will still be gone. To fix this, we need to load previous messages in the React component.
Update the ChatHistory model:
<?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class ChatHistory extends Model
{
protected $casts = [
'messages' => 'array',
];
}
Update the ChatController index method:
use App\Models\ChatHistory;
public function index(Request $request, Thread $thread)
{
$initialMessage = $request->session()->pull('initial-message');
$chatHistory = ChatHistory::where('thread_id', $thread->id)->first();
return Inertia::render('chat', [
'thread' => $thread,
'initialMessage' => $initialMessage,
'history' => $chatHistory?->messages,
]);
}
Update the chat.tsx page:
interface ChatProps {
thread: any;
initialMessage: string;
csrfToken: string;
history: any;
}
export default function Chat({
thread,
initialMessage,
csrfToken,
history,
}: ChatProps) {
// Initialize messages state with previous messages and the initial message
const initialMessages = useMemo(() => {
const historyMessages =
history?.map((m) => ({
id: Date.now() + Math.random(),
role: m.role,
content: m.content,
})) ?? [];
if (initialMessage) {
historyMessages.push({
id: Date.now(),
role: 'user',
content: initialMessage,
});
}
return historyMessages;
}, [history, initialMessage]);
}
The real power of this setup comes from just two things: ai-elements and the Neuron AI SDK. Together, they make building an elegant AI chat interface almost effortless.
For instance, the Response component can be paired with a Conversation to render Markdown effortlessly. Beyond that, the Neuron AI SDK’s tools make your AI far more powerful and productive. For example, adding features like Calendar Support is just a matter of configuration, as the sketch below suggests.
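As a taste of the tools side, here’s a rough sketch of how a simple tool might be attached to our ChatAgent. It loosely follows the Neuron AI tools documentation; the get_current_time tool is hypothetical, and the exact class names and signatures should be verified against the docs for your installed version:
use NeuronAI\Tools\Tool;
// Inside ChatAgent — a hypothetical tool the model can call when asked about the time
protected function tools(): array
{
    return [
        Tool::make(
            'get_current_time',
            'Returns the current server date and time.',
        )->setCallable(fn (): string => now()->toDateTimeString()),
    ];
}
Once a tool is registered, the agent can decide to call it during a conversation instead of guessing.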
Thank you very much for reading this! If you found it helpful, please consider sharing it with others.
🔗 Full project available here: https://github.com/f24aalam/ai-chat