When IBM Killed Our Database

These are the highlights from an episode of Tern Stories. You can watch the full conversation with Anthony Sottile on YouTube, Spotify, Apple, or wherever you get your podcasts.
IBM was dropping RethinkDB support. No migration path. No alternatives. Just an end date.
Andy Monroe’s team at LaborChart had five engineers and an entire construction scheduling platform built on a database that was already dead. They’d watched every other RethinkDB hosting provider shut down, leaving only Compose.io. When IBM acquired Compose, they knew it was just a matter of time.
Andy’s air conditioning had died two hours before our interview. It was 90 degrees in Kansas, and as we talked about this moment, the irony wasn’t lost on either of us. The company he’d helped build solved scheduling problems for construction workers—the very people who might have fixed his AC. Now he had a different kind of emergency to handle.
The Day IBM Dropped the Bomb
“We finally got the notification we were dreading. IBM was dropping RethinkDB support.”
They’d known it was coming. RethinkDB had been defunct for years, and Compose.io—LaborChart’s last hosting provider—had been acquired by IBM. Self-hosting wasn’t an option. So Andy and his team took stock.
They considered MongoDB, Couchbase, and DynamoDB—but none matched how they used RethinkDB.
“The way we used RethinkDB was fundamentally relational. Foreign keys between records. Lots of joins.”
Ultimately, Postgres wasn’t just the safe choice—it was the right one. LaborChart’s data model had matured into something Postgres was built for.
The pgwriters Breakthrough
Like most NoSQL stores, RethinkDB enforced no schema. To compensate, Andy had added strong TypeScript types during a previous CoffeeScript rewrite, and that structure was enough to model the Postgres schema. The migration also became a lever to fix years of accumulated mess.
He renamed legacy concepts, cleaned up confusing fields, and even patched a long-lived typo: a field named “recipeint” had made it all the way into the public API. They kept it for compatibility but fixed it in the new schema.
And instead of treating the app as the source of truth, Andy flipped it:
“Let Postgres lead. We generated types from the database schema.”
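One way to picture "letting Postgres lead", using a made-up table (the article doesn't show LaborChart's real schema or which generator they used):

```typescript
// Hypothetical illustration: the SQL schema is the source of truth,
// and application types are generated from it rather than hand-written.
//
//   CREATE TABLE assignments (
//     id         uuid PRIMARY KEY,
//     worker_id  uuid NOT NULL REFERENCES workers (id),
//     project_id uuid NOT NULL REFERENCES projects (id),
//     starts_on  date NOT NULL,
//     ends_on    date
//   );
//
// A schema-to-types step would emit something like:
interface Assignment {
  id: string;
  worker_id: string;
  project_id: string;
  starts_on: string; // ISO date
  ends_on: string | null;
}

// With the type generated, hand-written code can only drift so far:
// a misspelled field ("recipeint"-style) fails to compile.
const example: Assignment = {
  id: "00000000-0000-0000-0000-000000000001",
  worker_id: "00000000-0000-0000-0000-000000000002",
  project_id: "00000000-0000-0000-0000-000000000003",
  starts_on: "2021-06-01",
  ends_on: null,
};
```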
Andy built an abstraction he called “pgwriters”—one per table—to translate RethinkDB-style documents into normalized Postgres writes. Each function knew how to pull apart a JSON blob and insert the right pieces into the right tables. That made them reusable for both backfills and live traffic, with every write following the same path.
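The pattern might look like this sketch, with invented table and field names (the article doesn't show the actual schema):

```typescript
// One "pgwriter" per table: a function that decomposes a denormalized
// RethinkDB-style document into normalized rows, grouped by target table.
type Row = Record<string, unknown>;

// A document roughly as it might have lived in RethinkDB, with
// embedded arrays that become child tables in Postgres.
interface ProjectDoc {
  id: string;
  name: string;
  tags: string[];
  assignments: { workerId: string; role: string }[];
}

// A thin layer around this would turn the returned rows into
// parameterized INSERTs inside a single transaction.
function projectPgWriter(doc: ProjectDoc): Map<string, Row[]> {
  const rows = new Map<string, Row[]>();
  rows.set("projects", [{ id: doc.id, name: doc.name }]);
  rows.set(
    "project_tags",
    doc.tags.map((tag) => ({ project_id: doc.id, tag })),
  );
  rows.set(
    "project_assignments",
    doc.assignments.map((a) => ({
      project_id: doc.id,
      worker_id: a.workerId,
      role: a.role,
    })),
  );
  return rows;
}
```

Because the function only maps document to rows, the same code can serve a one-time backfill and live dual writes without divergence.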
Zero-Downtime Execution
The migration unfolded in deliberate stages. First, they used the pgwriters to backfill Postgres directly from RethinkDB, one table at a time. Because the same code powered dual writes in production, there was no difference between the initial data load and a live write—just a matter of timing.
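A sketch of how one write path can serve both call sites, with invented interfaces (the article names neither the client library nor the signatures):

```typescript
// Invented types: a pgwriter maps a document to rows grouped by table.
type Row = Record<string, unknown>;
type PgWriter<T> = (doc: T) => Map<string, Row[]>;

interface PgClient {
  insert(table: string, rows: Row[]): Promise<void>;
}

// Live traffic: write to RethinkDB as before, then replay the same
// document through the pgwriter as a shadow write.
async function dualWrite<T>(
  doc: T,
  writeRethink: (doc: T) => Promise<void>,
  writer: PgWriter<T>,
  pg: PgClient,
): Promise<void> {
  await writeRethink(doc); // existing source of truth stays first
  for (const [table, rows] of writer(doc)) {
    await pg.insert(table, rows);
  }
}

// Backfill: stream every existing document through the exact same
// writer, so a backfilled row and a live-written row are identical.
async function backfill<T>(
  docs: AsyncIterable<T>,
  writer: PgWriter<T>,
  pg: PgClient,
): Promise<void> {
  for await (const doc of docs) {
    for (const [table, rows] of writer(doc)) {
      await pg.insert(table, rows);
    }
  }
}
```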
Once dual writes were live, they added a validation layer. Every read pulled from both databases, converted the results to JSON, and logged any mismatches. This gave them a live diff of production behavior without exposing customers to risk.
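The validation layer might be sketched like this (names invented; the stable serialization is an assumption, added so JSON key order can't produce false mismatches):

```typescript
// Stable stringify: sort object keys so two semantically equal
// results serialize identically regardless of property order.
function stable(value: unknown): string {
  return JSON.stringify(value, (_key, v) =>
    v && typeof v === "object" && !Array.isArray(v)
      ? Object.fromEntries(Object.entries(v as Record<string, unknown>).sort())
      : v,
  );
}

// Every read fetches from both stores, compares the JSON, and logs a
// mismatch instead of failing the request; customers still see the
// proven RethinkDB result.
function validateRead<T>(
  fromRethink: T,
  fromPostgres: T,
  log: (msg: string) => void,
): T {
  const a = stable(fromRethink);
  const b = stable(fromPostgres);
  if (a !== b) {
    log(`read mismatch: rethink=${a} postgres=${b}`);
  }
  return fromRethink;
}
```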
Only after the logs stayed clean did they flip the feature flag to start reading from Postgres. They sequenced tables carefully, starting with those that nothing else depended on.
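One reading of that sequencing, with invented table names (the article doesn't show the actual ordering logic): flip a table's reads only after every table that references it has already flipped, which amounts to a topological sort over foreign-key edges.

```typescript
// referencedBy maps each table to the tables holding foreign keys
// into it. Tables nothing depends on come out first.
function cutoverOrder(referencedBy: Map<string, string[]>): string[] {
  const order: string[] = [];
  const seen = new Set<string>();
  const visit = (table: string) => {
    if (seen.has(table)) return;
    seen.add(table);
    // Tables that point at this one carry the risk, so they flip first.
    for (const dependent of referencedBy.get(table) ?? []) visit(dependent);
    order.push(table);
  };
  for (const table of referencedBy.keys()) visit(table);
  return order;
}

// assignments references workers and projects, but nothing references
// assignments, so it is the safe table to flip first.
const order = cutoverOrder(
  new Map([
    ["workers", ["assignments"]],
    ["projects", ["assignments"]],
    ["assignments", []],
  ]),
);
// order: ["assignments", "workers", "projects"]
```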
“I remember those first few tables. Are customers about to crash? Pretty much every time—radio silence on Sentry. Exactly what you want.”
Nothing crashed. As they flipped flag after flag, there wasn't a single incident.
The AI-Era Lesson
Andy left before the migration was finished, but the system held. The team shipped it without drama—because the structure was already in place.
That’s the lesson he’s carried into his current work with AI tools like Claude.
“What’s good for humans is good for Claude. What’s good for Claude is good for humans.”
Clear schemas, consistent write paths, and predictable types don't just help humans; they're exactly what AI systems need to reason about code. The migration proved it: structure saves you, whether you're rewriting a data model or prompting a language model.
Find Andy at andymonroe.dev or email him at mail@me.andymonroe.dev.