February 2026 · 3 Min Read
I Spent Three Months Building Something Nobody Wanted
The story of Glyph, an AI-powered carousel generator for nonprofits—and why it never shipped.
In December 2025, I stumbled across a founder on Instagram who had built a tool to automate carousel posts for his nonprofit. Bold headline, branded colors, simple layout—the same visual template, over and over. It seemed like the kind of repetitive work AI could handle. I thought I could build it better.
So I started Glyph as an honors project. The idea was simple: paste an Instagram handle, let AI analyze the account's visual style, and generate new carousel slides that feel on-brand. During high school I'd interned at Restoring Rainbows, one of the largest youth-led nonprofits, so I felt I had the connections to grow something like this.
I did not do a single minute of user validation before writing code.
The Build
For three months, I refined the pipeline. Glyph scraped posts via Apify, ran them through Gemini for visual analysis, then fed everything into a multi-stage “Creative Director” and “Art Director” orchestration layer. FLUX.2 Pro generated backgrounds on Replicate. Satori and Sharp composited text overlays.
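To make the shape of that orchestration concrete, here is a minimal sketch of how staged async pipelines like this are typically wired together. The stage names, payload types, and stub logic below are illustrative assumptions, not Glyph's actual code; the real stages called out to Apify, Gemini, and Replicate.

```typescript
// Illustrative sketch of a staged generation pipeline; stage names and
// payload shapes are assumptions, not Glyph's actual implementation.
type Stage<I, O> = (input: I) => Promise<O>;

// Compose two async stages so the pipeline reads as a single chain.
function pipe<A, B, C>(first: Stage<A, B>, second: Stage<B, C>): Stage<A, C> {
  return async (input: A) => second(await first(input));
}

interface StyleProfile { palette: string[]; tone: string }
interface SlidePlan { headline: string; background: string }

// Stand-ins for the analysis ("Creative Director") and layout
// ("Art Director") stages; real stages would call external AI services.
const analyzeStyle: Stage<string[], StyleProfile> = async (captions) => ({
  palette: ["#1a1a2e", "#e94560"],
  tone: captions.length > 10 ? "established" : "emerging",
});

const planSlide: Stage<StyleProfile, SlidePlan> = async (style) => ({
  headline: `An ${style.tone} voice`,
  background: style.palette[0],
});

const generateSlide = pipe(analyzeStyle, planSlide);
```

The appeal of this structure is that each stage is independently testable; the trouble, as I'd soon learn, is that correctness of the plumbing says nothing about the quality of what flows through it.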
The Pipeline
The architecture was ambitious—maybe too ambitious. Multi-stage AI orchestration, design-thinking frameworks, percent-based layout coordinates. It was engineered.
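The percent-based coordinates are worth one concrete example: each element's position is stored as a fraction of the canvas and resolved to pixels only at composite time, so one layout plan works at any export size. The field names here are assumptions for illustration, not Glyph's schema.

```typescript
// Resolve percent-based layout coordinates to pixel offsets for compositing.
// Field names are illustrative assumptions, not Glyph's actual schema.
interface PercentBox { xPct: number; yPct: number; wPct: number; hPct: number }
interface PixelBox { left: number; top: number; width: number; height: number }

function toPixels(box: PercentBox, canvasW: number, canvasH: number): PixelBox {
  return {
    left: Math.round((box.xPct / 100) * canvasW),
    top: Math.round((box.yPct / 100) * canvasH),
    width: Math.round((box.wPct / 100) * canvasW),
    height: Math.round((box.hPct / 100) * canvasH),
  };
}

// A headline spanning the middle 80% of a 1080x1350 Instagram canvas:
const headline = toPixels({ xPct: 10, yPct: 40, wPct: 80, hPct: 20 }, 1080, 1350);
// headline.left === 108, headline.top === 540, headline.width === 864
```

Pixel boxes like these can then be handed to a compositor (Sharp's overlay options take `top`/`left` offsets) without the layout stage ever knowing the output resolution.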
And for simple posts, it worked. The pipeline could analyze an account's style and produce something that felt right. I was making real progress.
Where It Broke
Then I tried more complex posts. Multi-element layouts. Nuanced brand photography. Carousels that needed to feel like a human designer understood the organization's mission. It felt like starting from zero.
The pipeline that worked for simple slides couldn't handle the jump. Backgrounds bled colors between elements. And the images themselves just looked off. Not broken, but not right either. One workaround, then another, then another.
AI wasn't failing. It was limited in the way I was trying to use it—I was asking it to replicate something that requires genuine design judgment, and no amount of prompt engineering could bridge that gap. I stopped debugging and started asking myself a different question: Is this struggle actually leading somewhere meaningful?
The Market Reality
That question pushed me to do something I should have done on day one—talk to the people I was building for.
The answer was immediate. Nonprofits have hundreds of volunteers. Making Instagram carousels is exactly the kind of task they delegate for free. Why would they pay for an AI tool to do it?
The struggle wasn't meaningful—not because the technology couldn't improve, but because even a perfect version of Glyph would be solving a problem that didn't exist.
What I Took Away
“Make something people want.”
— Paul Graham
Three months of work and nothing to ship. That stung.
But the lessons were worth every hour: AI orchestration, image compositing, prompt engineering. I pushed against the limits of what the technology could do. And I learned the hardest lesson in building products: talk to users before you write a single line of code.
Glyph never shipped. But I came out of it knowing how to orchestrate multi-stage AI systems—and knowing that the next thing I build starts with a conversation, not a codebase.