Local Lab
RTX 3090

This is the sandbox for Graham's local-first coding experiments. The goal is simple: prove that useful, polished web features can be built on his own machine with Aider, Ollama, and a lot less cloud dependency.
If something on this page feels fresh, playful, or a little more alive than the rest of the site, that is probably because a local coding agent got to stretch its legs here first.
Code locally
Features in this area are designed, edited, and refined on Graham's own hardware. Fast iteration, low drama, full ownership.
Ship confidently
This section is intentionally self-contained so new ideas can be tested without making the rest of the site fragile.
Current stack
- Aider for local coding workflows
- Ollama-backed models running on the machine
- Vercel for publishing the finished work live
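A minimal sketch of how the first two pieces of the stack can be wired together. This assumes Ollama is serving on its default port and uses `qwen2.5-coder:7b` purely as an example model name; the actual model running on the 3090 may differ.

```shell
# Tell Aider where the local Ollama server lives
# (127.0.0.1:11434 is Ollama's default bind address).
export OLLAMA_API_BASE=http://127.0.0.1:11434

# Then, with Ollama installed, pull a coding model and
# point Aider at it (example model name, not a prescription):
# ollama pull qwen2.5-coder:7b
# aider --model ollama/qwen2.5-coder:7b
```

Everything after the `export` happens on the local machine; only the final `git push` that triggers a Vercel deploy touches the cloud.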
What comes next
This page is the designated playground for small interactive demos, local AI experiments, and features that deserve a fun first draft before they graduate into the wider site.
Enhanced by an AI agent (you!)