Build an AI Chat With Ruby on Rails

Oct 1, 2024 - 3 min read

In this post, I'll walk you through how I built an AI chat application using Ruby on Rails. This will give you a behind-the-scenes look at how such an app works, from handling user input to fetching AI-generated responses. If you prefer, you can also watch the accompanying video where I demonstrate everything in action.

Overview of the AI Chat Application

The chat consists of a simple user interface (sketched below) with:

  • An input field for messages
  • A send button
  • A list displaying the conversation history
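
As a rough sketch, the view for this UI could look something like the following ERB. The file path, DOM id, and nested route are my own assumptions, not taken from the original app:

```erb
<!-- app/views/chats/show.html.erb (hypothetical path) -->
<div id="messages">
  <%# The conversation history: one partial per message %>
  <%= render @chat.messages %>
</div>

<%# Input field and send button, posting to a nested messages route %>
<%= form_with model: [@chat, Message.new] do |form| %>
  <%= form.text_field :content, placeholder: "Type a message..." %>
  <%= form.submit "Send" %>
<% end %>
```

This assumes the routes file nests messages under chats (`resources :chats do resources :messages, only: :create end`) so the form posts to `/chats/:chat_id/messages`.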

However, the real magic happens behind the scenes. When a user submits a message, the app processes it, sends it to OpenAI’s API, and displays the AI-generated response.

Message Flow

  1. User Input: A user types a message and clicks send.
  2. Message Processing: The app saves the message in the database and queues a background job.
  3. API Request: The background job calls OpenAI’s API, sending the entire conversation history.
  4. AI Response: OpenAI returns a response, which gets saved and displayed in the chat interface.

Key Components in the Code

The Role of the System Prompt

One important feature is the system prompt, which isn’t visible in the UI but plays a critical role in guiding the AI’s responses. This prompt can instruct the AI to take on a specific role, such as a helpful assistant or a technical expert.
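
For illustration, the system prompt could live in a constant and be prepended to the conversation before it is sent to OpenAI. This is only a sketch; the wording and where it is defined in the original app may differ:

```ruby
# A hypothetical system prompt; adjust the wording to the role you want the AI to play.
SYSTEM_PROMPT = "You are a helpful assistant. Answer clearly and concisely.".freeze

# Build the messages array the API expects: the system prompt first,
# followed by the stored conversation history in chronological order.
def conversation_payload(chat)
  [{ role: "system", content: SYSTEM_PROMPT }] +
    chat.messages.order(:created_at).map { |m| { role: m.role, content: m.content } }
end
```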

How Messages Are Handled

The chat app consists of two main controllers:

  • ChatsController: Manages chat sessions and the UI.
  • MessagesController: Handles user-submitted messages and triggers the AI response.

When a user starts a new chat, the ChatsController creates a new chat session. Messages are then processed through the MessagesController, where they are assigned roles (user or assistant) and stored in the database.
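
A minimal version of the create action could look like this, assuming a Chat has_many :messages and each message stores a role and content (the attribute names are my assumptions based on the description above):

```ruby
# app/controllers/messages_controller.rb
class MessagesController < ApplicationController
  def create
    @chat = Chat.find(params[:chat_id])

    # Save the user's message with the "user" role...
    @chat.messages.create!(role: "user", content: params[:message][:content])

    # ...then queue the background job that asks OpenAI for a reply.
    OpenAIResponse.perform_later(@chat.id)

    redirect_to @chat
  end
end
```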

Calling OpenAI’s API

The actual AI response is handled in a background job (OpenAIResponse), sketched after this list, which:

  1. Initializes an OpenAI client using an API key.
  2. Sends the conversation history, including the system prompt.
  3. Receives and processes streamed responses from OpenAI.
  4. Saves the AI response as a new message in the database.
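
Here is a rough sketch of what such a job could look like using the ruby-openai gem. The model name, attribute names, and streaming handling are assumptions, not the author's exact code:

```ruby
# app/jobs/open_ai_response.rb
class OpenAIResponse < ApplicationJob
  queue_as :default

  # Hypothetical system prompt; see the earlier section on the system prompt.
  SYSTEM_PROMPT = "You are a helpful assistant.".freeze

  def perform(chat_id)
    chat = Chat.find(chat_id)
    client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_API_KEY"))

    reply = +"" # buffer for the streamed chunks

    client.chat(
      parameters: {
        model: "gpt-4o-mini", # assumed model; use whichever you prefer
        messages: conversation_payload(chat),
        stream: proc do |chunk, _bytesize|
          reply << chunk.dig("choices", 0, "delta", "content").to_s
        end
      }
    )

    # Save the assistant's reply as a new message once the stream finishes.
    chat.messages.create!(role: "assistant", content: reply)
  end

  private

  def conversation_payload(chat)
    [{ role: "system", content: SYSTEM_PROMPT }] +
      chat.messages.order(:created_at).map { |m| { role: m.role, content: m.content } }
  end
end
```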

Real-Time Updates

The chat uses Turbo Streams to update the UI in real time. When a new message is added, it automatically appears in the chat window without requiring a full-page reload.
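
One common way to wire this up (again a sketch based on standard turbo-rails conventions, not necessarily the original source) is to broadcast from the Message model so every new record is appended to the chat's message list:

```ruby
# app/models/message.rb
class Message < ApplicationRecord
  belongs_to :chat

  # Append every newly created message to the "messages" container
  # on any page subscribed to this chat's Turbo Stream.
  after_create_commit -> { broadcast_append_to chat, target: "messages" }
end
```

The chat view also needs a `<%= turbo_stream_from @chat %>` tag so the browser subscribes to that stream.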

Performance Enhancements

To improve performance, we could:

  • Stream responses directly to the UI instead of updating the database repeatedly (see the sketch after this list).
  • Use message chunking to create a more natural chatbot-like experience.
  • Optimize database queries to reduce redundant calls.
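
For the first idea, instead of saving the reply only when the stream finishes, each chunk could be pushed straight to the browser. A possible helper, reusing the assumed models above and a hypothetical `current_reply` placeholder element in the chat view, could be called from the job's stream proc:

```ruby
# Hypothetical helper for the background job: push the partial reply
# to the browser over Turbo Streams instead of writing to the database
# on every chunk.
def stream_chunk_to_ui(chat, partial_reply)
  Turbo::StreamsChannel.broadcast_update_to(
    chat,
    target: "current_reply", # placeholder element in the chat view
    html: ERB::Util.html_escape(partial_reply)
  )
end
```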

Getting Started with OpenAI’s API

To integrate OpenAI into your Rails app, you’ll need an API key:

  1. Sign in at OpenAI’s Platform.
  2. Navigate to API Keys and create a new secret key.
  3. Store the key in your .env file and load it in your Rails application (see the snippet below).
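
If you go the .env route, the dotenv-rails gem is one common way to load it (Rails credentials or plain environment variables work too); the key can then be read wherever the client is built:

```ruby
# Gemfile
gem "dotenv-rails", groups: [:development, :test]

# .env (keep this file out of version control)
# OPENAI_API_KEY=sk-...

# Anywhere the client is needed:
client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_API_KEY"))
```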

Final Thoughts

This AI chat application is a great example of how to integrate AI into a Rails app. By combining Rails’ robust backend with OpenAI’s powerful language models, you can build interactive and intelligent applications. Whether you want to create a customer support bot, a productivity assistant, or an AI-powered content generator, the approach shown here provides a strong foundation.

If you're interested in building AI-powered applications for your business, feel free to reach out!

Cezar Halmagean
Speaker, Founder, Consultant with over 16 years of experience in helping growing SaaS companies scale large Ruby on Rails applications. Has written about the process of building Ruby on Rails applications in RubyWeekly, SemaphoreCI, and Foundr.