Build a real-time chat app using FastAPI (Python) for the backend, PostgreSQL to store messages, Redis for real-time fan-out (pub/sub), and a simple React frontend.
Backend (FastAPI): async Python framework with WebSocket support for real-time messaging.
Data layer (PostgreSQL + Redis): reliable message storage plus real-time pub/sub for instant delivery.
Frontend (React): simple, responsive interface that connects via WebSocket for live chat.
High-level system design with scalable components for real-time messaging.
Web/Mobile clients connect via WebSocket and REST APIs for seamless real-time communication.
FastAPI handles WebSocket connections, message routing, and business logic with async performance.
PostgreSQL stores messages reliably while Redis enables real-time pub/sub message distribution.
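To make the client/server contract concrete, here is one possible wire format for the events that travel over the WebSocket, sketched as Pydantic models (Pydantic ships with FastAPI). The field names are assumptions for illustration, not something the architecture above fixes.

# One possible wire format for chat events over the WebSocket.
# These field names are an assumption, not prescribed by the architecture above.
from datetime import datetime
from pydantic import BaseModel


class OutgoingMessage(BaseModel):
    """What a client sends to the server."""
    recipient_id: int
    content: str


class DeliveredMessage(BaseModel):
    """What the server fans out to the recipient."""
    id: int
    sender_id: int
    recipient_id: int
    content: str
    created_at: datetime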
In plain words - why we picked these technologies for your chat app.
FastAPI: fast to write, supports async code and WebSockets for real-time connections.
PostgreSQL: reliable storage for user data and complete message history with ACID compliance.
Redis: quick message passing between server instances with pub/sub for real-time delivery.
Object storage: keeps files like images safe, served via signed URLs for secure access (see the sketch below).
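The stack above never names an object store, so the following is only a sketch assuming an S3-compatible bucket accessed with boto3 (not in the dependency list; poetry add boto3 would pull it in). The bucket name and expiry are placeholders.

# Sketch only: assumes an S3-compatible bucket and boto3, neither of which
# the stack above specifies. Bucket name and expiry are placeholders.
import boto3

s3 = boto3.client("s3")


def signed_image_url(key: str, expires_in: int = 3600) -> str:
    """Return a time-limited download URL for an uploaded image."""
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "chat-uploads", "Key": key},  # hypothetical bucket
        ExpiresIn=expires_in,
    )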
Easy steps to build your chat app from start to finish.
Set up project with poetry/venv and install dependencies
Implement signup/login and real-time messaging endpoints (see the auth sketch after these steps)
Save messages to Postgres and build React chat interface
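The signup/login step needs password hashing and token issuance; here is a minimal sketch built on passlib and python-jose, both of which the install commands below pull in. The secret key, algorithm, and expiry are placeholder choices to swap for real configuration.

# auth.py-style sketch: password hashing + JWTs with passlib and python-jose.
# SECRET_KEY is a placeholder; load a real value from the environment.
from datetime import datetime, timedelta

from jose import JWTError, jwt
from passlib.context import CryptContext

SECRET_KEY = "change-me"  # placeholder
ALGORITHM = "HS256"
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


def hash_password(password: str) -> str:
    return pwd_context.hash(password)


def verify_password(password: str, password_hash: str) -> bool:
    return pwd_context.verify(password, password_hash)


def create_access_token(user_id: int, expires_minutes: int = 60) -> str:
    claims = {"sub": str(user_id), "exp": datetime.utcnow() + timedelta(minutes=expires_minutes)}
    return jwt.encode(claims, SECRET_KEY, algorithm=ALGORITHM)


def user_id_from_token(token: str) -> int | None:
    """Turn the ?token=... query parameter from the WebSocket handshake into a user id."""
    try:
        return int(jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])["sub"])
    except (JWTError, KeyError, ValueError):
        return None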
Copy-and-paste commands to get your chat app running in minutes.
poetry init -n
poetry add fastapi "uvicorn[standard]" sqlalchemy asyncpg redis "passlib[bcrypt]" python-jose
poetry add -D pytest pytest-asyncio
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
docker-compose up --build
For local development with Postgres + Redis + App.
version: "3.8"
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_USER: chatuser
      POSTGRES_PASSWORD: chatpass
      POSTGRES_DB: chatdb
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
  backend:
    build: .
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
    volumes:
      - ./:/code
    ports:
      - "8000:8000"
    depends_on:
      - postgres
      - redis
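The backend service uses build: ., which presumes a Dockerfile at the repo root that the compose file does not show. One plausible minimal version (an assumption, not the only way to package the app):

# Dockerfile - minimal sketch assumed by "build: ." above.
FROM python:3.11-slim
WORKDIR /code
COPY pyproject.toml poetry.lock* ./
RUN pip install poetry && poetry config virtualenvs.create false && poetry install --no-root
COPY . .
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]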
Drop this into app/main.py - very minimal but functional.
from fastapi import FastAPI, WebSocket, WebSocketDisconnect
import json

app = FastAPI()


class ConnectionManager:
    def __init__(self):
        self.active_connections: dict[int, WebSocket] = {}

    async def connect(self, user_id: int, websocket: WebSocket):
        await websocket.accept()
        self.active_connections[user_id] = websocket

    async def disconnect(self, user_id: int):
        self.active_connections.pop(user_id, None)

    async def send_personal_message(self, user_id: int, message: dict):
        ws = self.active_connections.get(user_id)
        if ws:
            await ws.send_text(json.dumps(message))


manager = ConnectionManager()


@app.websocket('/ws')
async def websocket_endpoint(websocket: WebSocket, token: str | None = None):
    user_id = 1  # TODO: validate token -> get user_id (see the auth sketch above)
    await manager.connect(user_id, websocket)
    try:
        while True:
            data = await websocket.receive_text()
            print('received:', data)
    except WebSocketDisconnect:
        await manager.disconnect(user_id)
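The skeleton above only prints incoming frames; the fan-out described later needs a Redis subscriber on every server instance. A sketch using redis.asyncio from the redis package in the dependency list; the chat:events channel name and the REDIS_URL default are assumptions.

# Sketch: each server instance subscribes to one Redis channel and forwards
# events to the recipients connected to *this* instance via ConnectionManager.
# The channel name "chat:events" and the REDIS_URL default are assumptions.
import json
import os

import redis.asyncio as redis

redis_client = redis.from_url(os.getenv("REDIS_URL", "redis://localhost:6379"))


async def publish_message(event: dict) -> None:
    """Called after the message row has been committed to Postgres."""
    await redis_client.publish("chat:events", json.dumps(event))


async def redis_listener(manager) -> None:
    """Run as a background task on startup; `manager` is the ConnectionManager above."""
    pubsub = redis_client.pubsub()
    await pubsub.subscribe("chat:events")
    async for msg in pubsub.listen():
        if msg["type"] != "message":
            continue
        event = json.loads(msg["data"])
        # Only users connected to this instance are in active_connections.
        await manager.send_personal_message(event["recipient_id"], event)

One way to wire it in is to start asyncio.create_task(redis_listener(manager)) from a FastAPI startup hook.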
Clean and concise models for your chat app with SQLAlchemy ORM.
import datetime

from sqlalchemy import Boolean, Column, DateTime, ForeignKey, Integer, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    username = Column(Text, unique=True)
    password_hash = Column(Text)


class Message(Base):
    __tablename__ = 'messages'
    id = Column(Integer, primary_key=True)
    sender_id = Column(Integer, ForeignKey('users.id'))
    recipient_id = Column(Integer, ForeignKey('users.id'))
    content = Column(Text)
    created_at = Column(DateTime, default=datetime.datetime.utcnow)
    delivered = Column(Boolean, default=False)
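To actually persist messages with these models, here is a sketch of the async engine and session setup on asyncpg from the dependency list; the DATABASE_URL default reuses the docker-compose credentials and is otherwise an assumption.

# db.py-style sketch: async engine + a helper that saves a message before it is
# broadcast. Message/Base are the models defined just above; the DATABASE_URL
# default simply reuses the docker-compose credentials.
import os

from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

DATABASE_URL = os.getenv(
    "DATABASE_URL",
    "postgresql+asyncpg://chatuser:chatpass@localhost:5432/chatdb",
)
engine = create_async_engine(DATABASE_URL)
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)


async def init_models() -> None:
    """Create the tables on startup (fine for a demo; use Alembic for real migrations)."""
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)


async def save_message(sender_id: int, recipient_id: int, content: str) -> Message:
    async with SessionLocal() as session:
        msg = Message(sender_id=sender_id, recipient_id=recipient_id, content=content)
        session.add(msg)
        await session.commit()
        await session.refresh(msg)  # populate the generated id
        return msg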
Step-by-step journey of a message from sender to recipient
User sends message via WebSocket to server instance
Message stored in PostgreSQL for durability
Server publishes event to Redis channel for recipient
Every server instance receives the event and forwards it if the recipient is connected to it
Recipient client receives message and sends ACK
Saving each message to PostgreSQL before broadcasting ensures nothing is lost if a server crashes mid-delivery
Redis pub/sub typically delivers events between instances in well under a millisecond on a local network
Multiple server instances can handle thousands of concurrent users
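Putting the flow above into code: a sketch of the receive-side handler that persists first, then publishes, reusing the hypothetical save_message and publish_message helpers from the earlier sketches (it would sit in app/main.py, which already imports json).

# Sketch of steps 1-4 above: persist to Postgres, then publish to Redis; the
# subscriber task on each instance forwards to connected recipients.
# save_message / publish_message are the hypothetical helpers sketched earlier.
async def handle_incoming(user_id: int, raw: str) -> None:
    payload = json.loads(raw)
    msg = await save_message(                  # step 2: durable before broadcast
        sender_id=user_id,
        recipient_id=payload["recipient_id"],
        content=payload["content"],
    )
    await publish_message({                    # step 3: fan out via Redis pub/sub
        "id": msg.id,
        "sender_id": msg.sender_id,
        "recipient_id": msg.recipient_id,
        "content": msg.content,
        "created_at": msg.created_at.isoformat(),
    })
    # Step 5: a client ACK would later flip Message.delivered to True.

Calling handle_incoming(user_id, data) in place of the print() inside the WebSocket loop connects the minimal skeleton to this flow.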