
How I Ended Up Creating an AI Playground to Illustrate and Educate



TL;DR

AI Playground

User Guide

From a Conversation to a Spark

Recently, I participated in a World Café event organized by CareerTeam. The topic was how AI is being used in HR and beyond. My original curiosity was about the "arms race" between AI-generated CVs and motivation letters on one side, and AI-driven applicant tracking systems on the other. 

Unfortunately, that discussion never really happened. But the event was still eye-opening: most participants were from HR, and I got an insider's view into how they're dealing with AI---both for themselves and for their organizations. 

One moment stuck with me: An HR professional explained how her company implemented a chatbot for employees who don't know where to find information. A small company, limited resources---yet they made it work. That was my spark. If they could deploy something useful, then anyone should be able to.

Monday: RAG in 24 Hours

That thought turned into action the following Monday (August 25). I dove into RAG (Retrieval-Augmented Generation). After watching a few YouTube tutorials, I pieced together how it works:

  • Split text into chunks using LangChain
  • Generate embeddings for retrieval
  • Pass context to the LLM
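The retrieval step behind these bullets can be sketched in a few lines. This is a self-contained toy, not the playground's actual code: the real pipeline uses LangChain's text splitters and a proper embedding model, while here a character-frequency vector stands in for embeddings so the example runs on its own.

```typescript
// Naive fixed-size chunking with overlap (LangChain splitters are smarter).
function chunkText(text: string, size = 200, overlap = 40): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size - overlap) {
    chunks.push(text.slice(i, i + size));
  }
  return chunks;
}

// Toy embedding: a 26-dim letter-frequency vector (stand-in for a model call).
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const idx = ch.charCodeAt(0) - 97;
    if (idx >= 0 && idx < 26) v[idx]++;
  }
  return v;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / (Math.sqrt(na) * Math.sqrt(nb)) : 0;
}

// Retrieve the top-k most similar chunks; their text becomes the LLM context.
function retrieve(query: string, chunks: string[], k = 2): string[] {
  const q = embed(query);
  return chunks
    .map((c) => ({ c, score: cosine(q, embed(c)) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((x) => x.c);
}
```

Swap in real embeddings and a vector store and you have the skeleton of a RAG system: everything else is plumbing around these three steps.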

I crafted a first prompt and let Claude Code help me build the prototype. Within 24 hours I had something working. Another day of refinement made it reliable enough to demo.

Then I thought---why stop at RAG?

  • Added a regular chat interface.
  • Integrated vision models.
  • Redesigned the whole site to support model selection at runtime via a simple JSON configuration.

By midweek, I had a much more flexible system in place.
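The runtime model selection could look something like the sketch below. The field names and model IDs here are illustrative assumptions, not the playground's actual schema:

```typescript
// Hypothetical shape of a runtime model-selection config entry.
interface ModelConfig {
  id: string;            // identifier sent to the model API
  label: string;         // display name in the UI
  supportsVision: boolean;
}

// In the app this JSON would be loaded from a config file at startup.
const configJson = `[
  { "id": "model-a", "label": "Model A", "supportsVision": true },
  { "id": "model-b", "label": "Model B", "supportsVision": false }
]`;

const models: ModelConfig[] = JSON.parse(configJson);

// Pick the first model satisfying the request, e.g. vision support.
function pickModel(needVision: boolean): ModelConfig | undefined {
  return models.find((m) => !needVision || m.supportsVision);
}
```

The appeal of this approach is that adding or swapping a model is a one-line JSON change with no code deploy.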

Friday: Enter MCP

On Friday, I decided to add MCP (Model Context Protocol) support to the chat function. This was trickier. Claude Code struggled at first with the tool-calling flow, so it took multiple iterations. I started simple with a weather API (Open-Meteo). After debugging and refining, we had a working Weather MCP server.
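The tool-calling flow that caused the trouble boils down to a dispatch loop: the model emits a tool call, the host runs the matching handler, and the result goes back into the conversation as context. Here is a minimal sketch with the weather handler mocked out (the real server queries the Open-Meteo API; the tool name and argument shape are assumptions for illustration):

```typescript
// A tool call as the model might emit it: a name plus JSON arguments.
type ToolCall = { name: string; args: Record<string, unknown> };

// Registry of tool handlers. The weather handler is mocked; a real one
// would fetch from the Open-Meteo forecast endpoint instead.
const tools: Record<string, (args: Record<string, unknown>) => string> = {
  get_weather: (args) =>
    `Sunny, 22°C at lat=${args.lat}, lon=${args.lon}`, // mock response
};

// Dispatch a tool call to its handler and return the result string,
// which the host then appends to the conversation for the model.
function dispatch(call: ToolCall): string {
  const handler = tools[call.name];
  if (!handler) throw new Error(`Unknown tool: ${call.name}`);
  return handler(call.args);
}
```

Getting this round trip right (emit, dispatch, feed back) was where most of the iterations went.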

Then Claude suggested: "Why not build a registry of MCP services?"

That was all the encouragement I needed---I spent Saturday afternoon building out five more MCP servers. Here's where it got really interesting.

The AHA Moments

When I looked for publicly available MCP services, I found almost none. Most were designed to run locally in Python, which didn't fit my cloud-centric TypeScript stack. So I asked Claude: "Convert this Python MCP server into TypeScript."

And it just... did. Smoothly. That was my first AHA moment.

The second AHA moment came a few hours later when I realized I could do this over and over. Conversion after conversion, commit after commit. The workflow was smooth, and I wasn't hitting the dreaded token context limit I had learned to work around. Why? Because Anthropic's Sonnet 4 model had just been upgraded with a 1-million-token context window. I'd read about it before, even commented on LinkedIn---but now I could actually use it.

It felt transformative.

Wrapping Up

Today is Sunday, August 31st. What started as a casual event turned into a full week of exploration, coding, and a few breakthrough realizations. The result is the AI Playground---a space to illustrate, experiment, and hopefully educate others about how these systems fit together.

🔗 Check it out here: AI Playground 

📘 Read the guide here: User Guide

