A devlog on rebranding a Wordle clone, evaluating Cloudflare Workers, and deploying with Google Cloud Run—with AI-assisted coding from Cursor.
Dan + GPT4o
This project started as a small experiment. I wanted to repurpose an open-source Wordle clone, rebrand it as Word Game Daily, and see if anyone would play it if I hosted it on my own domain.
Word Game Daily
The goal wasn't just to deploy it, but also to test lightweight deployment strategies and maybe hook it into other projects later. Along the way, I kept a few notes as I worked.
| Task | ✅ / ❌ | Notes |
|---|---|---|
| Fork and rebrand | ✅ | Simple rename + HTML tweaks |
| Running locally with Flask | ✅ | No issues |
| Rewriting to FastAPI | ✅ | Chosen for OpenAPI support (for yestag.dev) |
| Deploying to Cloudflare Workers | ❌ | Blocked by filesystem limitation |
| Cursor-generated deployment automation | ⚠️ | Helped, but required review and edits |
| Google Cloud Run deployment | ✅ | Easy once containerized |
| Adding analytics via Google Tag Manager | ✅ | JS event code generated by Cursor |
| iframe sharing support | ✅ | Required `allow="web-share"` |
Throughout the project, I used Cursor to generate or update code via prompts. It handled some tasks well, but it also made a few mistakes—especially around deployment tooling and environment-specific limitations.
Some of the prompts I used are included in the sections below.
The original clone used Flask, which worked fine locally. But I decided to switch to FastAPI early on, not because of any technical blocker, but out of personal preference: another project of mine, yestag.dev, relies on OpenAPI docs, and FastAPI generates OpenAPI schemas out of the box, which simplifies integration. I may make use of this in a later experiment.
Prompt to Cursor:
"This project needs to run on FastAPI, not Flask."
Cursor generated the necessary changes to the main Python file and I made a few light edits to the HTML templates. The transition was smooth.
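For context, here's a minimal sketch of what the FastAPI side ends up looking like. The route and the `get_daily_word()` helper are illustrative placeholders, not the actual project code:

```python
# Minimal FastAPI sketch of the kind of endpoint the game might expose.
# The route and get_daily_word() helper are illustrative placeholders.
from fastapi import FastAPI

app = FastAPI(title="Word Game Daily")

def get_daily_word() -> str:
    # In the real app the word comes from the .json word lists on disk.
    return "crane"

@app.get("/api/word")
def daily_word() -> dict:
    # FastAPI derives the OpenAPI schema for this endpoint automatically;
    # it's served at /docs and /openapi.json with no extra configuration.
    return {"word": get_daily_word()}
```

Running it with `uvicorn main:app --reload` (assuming the app lives in `main.py`) gives you interactive docs at `/docs` and the raw schema at `/openapi.json` for free.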
My first deployment plan was to use Cloudflare Workers. It's fast, cheap, and now offers beta support for Python and FastAPI. That made it a natural target for something small like this.
Prompt to Cursor:
"I would like to deploy this to Cloudflare using Wrangler. What changes do we need to make?"
I tried this with both Cursor's `auto` model selection and a manually chosen model. Both responded with reasonably helpful scaffolds built around `fastapi-cloudworker`, but neither was up to date with Cloudflare's Python environment, and they missed key limitations.
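For reference, a Python Workers deploy starts from a Wrangler config roughly like the sketch below. This is an assumption based on Cloudflare's Python Workers beta docs rather than the project's actual config, and the beta details may well have changed:

```toml
# Hypothetical wrangler.toml for a Python Worker (beta).
# Field values are placeholders; not the project's real configuration.
name = "word-game-daily"
main = "src/entry.py"
compatibility_date = "2024-04-01"
compatibility_flags = ["python_workers"]
```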
The filesystem was the real showstopper. The game depends on `.json` files for its word lists, and those are read from disk at runtime, which Workers don't support. Workarounds exist (e.g., Cloudflare KV or R2), but they add setup that I don't feel is worth the overhead for this kind of quick experiment.
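To make the blocker concrete: the word lists are loaded with ordinary standard-library file I/O, something like the sketch below (the path and helper name are placeholders, not the project's actual layout). That works fine in a normal process or a container, but Workers' Python runtime has no local filesystem to read from at runtime:

```python
# Illustrative sketch of the startup code that blocks a Workers deploy.
# The path and helper name are placeholders, not the project's actual layout.
import json
from pathlib import Path

def load_word_list(path: str = "data/words.json") -> list[str]:
    # Plain read-from-disk + JSON parse: fine in a container,
    # unavailable in Cloudflare Workers' Python runtime.
    return json.loads(Path(path).read_text())
```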
🧠 Key Insight: Even with newer LLMs like Claude 3.7 Sonnet, AI deployment help lags behind the bleeding edge. The models didn't immediately know that Cloudflare Workers' Python support doesn't include filesystem access, which is a real blocker if you're expecting the LLM to have a deep, current understanding of the platform.
So, I pivoted.
Instead of fighting with Cloudflare's limitations and the LLMs' knowledge cutoff, I went with a reliable alternative: Google Cloud Run.
Prompt to Cursor:
"OK this isn't going to work as a Cloudflare Workers function. I want it as a Docker container which I can deploy using Google Cloud Run. So I need a Dockerfile."
Cursor generated a `Dockerfile` and a `build.sh` script using Google Cloud Build. Both were solid starting points—but once again, manual fixes were required, such as adding `--platform=linux/amd64` to the build step (to ensure architecture compatibility with GCR). Once the container was deployed, everything worked cleanly.
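For reference, here's a minimal sketch of the kind of Dockerfile this involves. It's an assumption based on a typical FastAPI-on-Cloud-Run setup (base image, paths, port, and module name are illustrative), not the project's exact file:

```dockerfile
# Minimal sketch of a FastAPI container for Cloud Run.
# Base image, paths, port, and module name are illustrative assumptions.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer caches between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code, including the .json word lists,
# so they're available on the container's local filesystem.
COPY . .

# Cloud Run injects the port to listen on via $PORT (defaults to 8080).
ENV PORT=8080
CMD exec uvicorn main:app --host 0.0.0.0 --port $PORT
```

The `--platform=linux/amd64` flag matters when the image is built on an arm64 machine (e.g., Apple Silicon), where it would otherwise default to arm64; forcing amd64 keeps the architecture compatible with the target runtime, which is what the manual fix above addresses.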
To understand if anyone was using the game, I integrated Google Tag Manager and asked Cursor to help generate tracking events.
Prompt to Cursor:
"I've added Google Tag Manager. Can you add some gameplay events to the index and game.js files?"
Cursor responded by injecting `dataLayer.push()` calls for the key gameplay events. The events were correctly wired to GTM and helped confirm that people were playing.
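For illustration, the injected calls follow the standard GTM `dataLayer` pattern shown below. The event names and fields here are hypothetical placeholders, not the exact events Cursor added to `game.js`:

```javascript
// Hypothetical gameplay events using the standard GTM dataLayer pattern.
// Event names and fields are placeholders, not the project's exact events.
window.dataLayer = window.dataLayer || [];

function trackGuess(guessNumber, isCorrect) {
  window.dataLayer.push({
    event: "guess_submitted", // placeholder event name
    guess_number: guessNumber,
    correct: isCorrect,
  });
}

function trackGameComplete(won, guessCount) {
  window.dataLayer.push({
    event: "game_complete", // placeholder event name
    won: won,
    guesses: guessCount,
  });
}
```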
Finally, the iframe sharing support noted in the table: sharing from within an embedded iframe only works if the embed grants the `web-share` permission:

```html
<iframe allow="web-share"></iframe>
```