AcademyAI — Build overview

How we build

AI compresses delivery time. Senior engineers direct, review, and own every line of output.

Delivery cadence

Week 4

First version alive

Rough. Buggy. Working to a degree. In your hands.


AI model POC established and in testing


Design system v1 complete – design-led AI prototyping available to Andy and the rest of the team

Week 8

Good outputs

Multiple output forms. Real value being produced.


Still buggy; interface not yet beautiful


More advanced outputs, first fundamentals course demo

Week 12

Usable product

Solves the user's problem. Not commercial yet.


Producing stronger outputs, model well tested


Learnings from Andy's market exploration guide refinement priorities

Week 24

MVCP live

Payments. GDPR. Polished UX. Ready to launch.


The goal is to launch with the fundamentals working well, plus one additional feature or module

How human and AI work together

Lawrence's question about how we split time between human developers and AI cannot be exactly quantified, but the core philosophy is that AI generates and humans curate: every line of code is reviewed, refined, and owned by experienced engineers. This is the most AI-literate and augmented team available, and we will make strong use of AI to move quickly.

Specify: Human

Generate: AI

Review: Human

AI Tooling

We use many AI tools across all our processes, but for development these are the most impactful.

Claude Code: Agentic coding directly in the codebase — implementation, refactoring, test generation
Codex: Async task execution against specs — parallel workstreams, PR-ready output

AI handles

Speed and volume

Scaffolding and boilerplate
Parallel workstreams via specs
Test generation
Rapid iteration and refactoring

Engineers own

Judgment and direction

Feature specifications
Code review on every change
Architecture and security
UX decisions and quality bar