Deep Learning · November 12, 2025

If I Were Starting an AI/ML Project Today — This Is the Only Stack I’d Use

Because you can’t build serious AI projects on scattered notebooks anymore.

Ferdous Rahman

The Problem

Most “AI projects” never make it past the notebook phase.
You experiment, you get a cool demo running, and then — boom — deployment limbo.

What you need is a builder-first stack that treats AI like a backend system, not a science fair project.

If I were starting fresh in 2025, this is the stack I’d use.

The Stack

FastAPI + Uvicorn + Ruff + Async + Makefile + Docker + Postgres + OpenSearch + Ollama + Airflow
All in one repo.
With a notebook to prove you set it up right.

1. FastAPI — the Brainstem

FastAPI is clean, typed, async, and modern. It feels like Flask grew up and learned manners.

  • Structure your src/ with routers, services, repositories, and schemas.
  • Pydantic models validate every request and response, so bad payloads fail loudly at the edge instead of deep inside your code.
  • Uvicorn makes it fly.

You get instant OpenAPI docs, async endpoints, and enough performance to make it production-ready from day one.
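Here's a minimal sketch of that shape, assuming an illustrative items router with ItemIn/ItemOut schemas (the names are placeholders, not prescriptions; Pydantic v2 assumed):

# Minimal sketch: a typed async endpoint behind a router.
from fastapi import APIRouter, FastAPI
from pydantic import BaseModel

class ItemIn(BaseModel):
    name: str
    quantity: int = 1

class ItemOut(ItemIn):
    id: int

router = APIRouter(prefix="/items", tags=["items"])

@router.post("/", response_model=ItemOut)
async def create_item(payload: ItemIn) -> ItemOut:
    # In a real project this would delegate to a service/repository layer.
    return ItemOut(id=1, **payload.model_dump())

app = FastAPI(title="ai-service")
app.include_router(router)

# Run with Uvicorn, e.g. `uvicorn main:app --reload` if this lives in main.py.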

Rule #1: Build APIs like you’ll need them later. Because you will.

2. PostgreSQL — the Reliable Memory

Postgres is the adult in the room.
Pair it with SQLAlchemy (or SQLModel if you’re into minimalism) and Alembic for migrations.

  • Use async drivers (asyncpg or Psycopg3 async).
  • Run migrations, seed data, and configure through .env.

It’s relational, robust, and it’s even starting to speak vectors.
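A minimal sketch of the async setup, assuming SQLAlchemy 2.x over asyncpg; the connection URL and the Document model are placeholders (in a real repo the URL comes from your .env):

# Sketch: async engine, session factory, and a FastAPI-style session dependency.
from sqlalchemy import String
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

DATABASE_URL = "postgresql+asyncpg://app:app@localhost:5432/app"  # placeholder

engine = create_async_engine(DATABASE_URL, echo=False)
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)

class Base(DeclarativeBase):
    pass

class Document(Base):
    __tablename__ = "documents"
    id: Mapped[int] = mapped_column(primary_key=True)
    title: Mapped[str] = mapped_column(String(255))

async def get_session():
    # Yield one session per request; Alembic handles the schema migrations.
    async with SessionLocal() as session:
        yield session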

3. OpenSearch — the Smart Librarian

Need RAG or semantic search? OpenSearch is your open-source hero.

Hybrid retrieval — BM25 plus vector search — works beautifully for small to mid-sized projects.
You can wire it up to store embeddings, text, metadata, and retrieval logs.
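A sketch with the opensearch-py client, assuming a docs index and 384-dimensional embeddings (both arbitrary). BM25 comes from the plain text field; the k-NN query below can be paired with a match query (or an OpenSearch hybrid search pipeline) for hybrid retrieval:

# Sketch: one index holding text (BM25), embeddings (k-NN), and metadata.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

client.indices.create(
    index="docs",
    body={
        "settings": {"index": {"knn": True}},  # enable the k-NN plugin
        "mappings": {
            "properties": {
                "text": {"type": "text"},                        # BM25 over raw text
                "embedding": {"type": "knn_vector", "dimension": 384},
                "source": {"type": "keyword"},                   # metadata
            }
        },
    },
)

def knn_search(query_vector: list[float], k: int = 5):
    # Pure vector search; combine with a `match` query for hybrid retrieval.
    return client.search(
        index="docs",
        body={"size": k, "query": {"knn": {"embedding": {"vector": query_vector, "k": k}}}},
    )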

Lesson from every RAG project ever: if you can’t search cleanly, your model will hallucinate confidently.

4. Ollama — Your Local LLM Buddy

Start local before you start paying per token.
Ollama lets you run open models on your own hardware and connect via REST.
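A sketch of that REST hookup, assuming a model you've already pulled locally (the name here is arbitrary):

# Sketch: call Ollama's local REST API and return the generated text.
import requests

def generate(prompt: str, model: str = "llama3.1") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(generate("Summarize why hybrid retrieval reduces hallucinations."))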

Once you’ve validated the flow, you can always swap in a hosted model (OpenAI, Anthropic, or Mistral).

That’s how you get rapid iteration without monthly invoice anxiety.

5. Airflow — The Unseen Backbone

Every data project eventually needs pipelines.
Use Airflow for ingestion, embedding, retraining, indexing, and whatever else your data lifecycle demands.
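A minimal TaskFlow-style DAG sketch (Airflow 2.4+ assumed; the tasks and schedule are placeholders for your own ingestion and embedding steps):

# Sketch: a daily ingest-then-embed pipeline using the TaskFlow API.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["rag"])
def ingest_and_embed():
    @task
    def ingest() -> list[str]:
        # e.g. pull new documents from a source system
        return ["doc-1", "doc-2"]

    @task
    def embed(doc_ids: list[str]) -> None:
        ...  # compute embeddings and index them into OpenSearch

    embed(ingest())

ingest_and_embed()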

You’ll thank yourself when “just one script” turns into six nightly jobs.

6. Dev Experience — Don’t Wing It

This is what makes your stack repeatable:

  • ruff: keeps your code clean
  • pre-commit: enforces good behavior
  • Makefile: bundles complex commands into one-liners
  • Docker + Compose: spins up your entire environment
  • uv: a fast Python package and project manager that keeps installs quick and reproducible

If you’re serious about collaboration, reproducibility, or just not breaking your own setup, these are non-negotiables.

7. Quality, Observability & Scale

Even if you’re just “trying things,” pretend it’s production.

  • Add /health and /metrics endpoints.
  • Include tracing hooks.
  • Manage configuration properly.
  • Use structured logging.
  • Write type-checked code.

You don’t build scalable software by accident. You build it by pretending scale is coming tomorrow.
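To make that concrete, here's a minimal sketch of a /health endpoint with structured (JSON) logging using only the standard library; in practice you'd likely reach for structlog and a Prometheus exporter for /metrics, so treat this as the shape, not the tool choice:

# Sketch: health endpoint plus JSON-formatted logs.
import json
import logging

from fastapi import FastAPI

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps(
            {"level": record.levelname, "logger": record.name, "msg": record.getMessage()}
        )

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])
log = logging.getLogger("app")

app = FastAPI()

@app.get("/health")
async def health() -> dict[str, str]:
    log.info("health check")
    return {"status": "ok"}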

8. Quick Start

# 1. Clone
git clone yourrepo && cd yourrepo

# 2. Copy env
cp .env.example .env

# 3. Sync deps
uv sync

# 4. Launch stack
docker compose up -d

# 5. Visit
http://localhost:8000/docs

You’ll have FastAPI, Postgres, OpenSearch, Airflow, and Ollama running locally — the whole system live in minutes.

9. Why This Stack Works

Because it’s infra-first.
You’re not duct-taping notebooks. You’re building a backend that can host, serve, and scale your models.

It’s modular enough for solo builders, structured enough for teams, and production-ready when your side project turns serious.

Most importantly, it forces you to think like a builder, not a tinkerer.

10. The Takeaway

Every AI project eventually becomes a software project.
This stack just admits that early.

FastAPI is your interface.
Postgres remembers.
OpenSearch thinks.
Ollama speaks.
Airflow breathes.

Add Docker, discipline, and a dash of curiosity — and you’ve got the foundation of a real AI system, not a science project.

Stop hacking. Start building. The AI era rewards those who ship.
