Fuel at 50% Off

Why Koko stacked $60M of debt selling fuel at a loss.

Hey - It’s Nico.

Welcome to another Failory edition. This issue takes 5 minutes to read.

If you only have one minute, here are the 3 most important things:

  • Koko, a clean-cooking startup in Kenya, has shut down — learn why below

  • How to build AI product sense.

  • Google just released Project Genie, an AI model that creates interactive worlds — learn why this matters below

A huge thanks to today’s sponsor, Oceans. Build a high-output global team at up to 80% less cost with their help.

Build a high-output team without U.S. headcount costs AD

Did you know one U.S. hire can cost the same as a small global team?

That’s where Oceans Talent comes in.

We help startups hire vetted global talent across ops, finance, marketing, EAs, and more at up to 80% less than U.S. hiring costs.

Here’s what you get:

  • Deep discovery on your goals and role requirements

  • A match sourced from 1500+ monthly applicants

  • Vetting beyond resumes across 12 hard and soft skills

  • Integration plan to support the first 100 days with defined expectations, success metrics, and recommendations

  • Dedicated account managers who help you scale your Oceans Talent team as you hire more

400+ companies use Oceans Talent to do more while spending less.

This Week In Startups

🔗 Resources

The automation curve in agentic commerce.

Pilot helps startups run world-class finance, including bookkeeping, payroll, taxes, and fundraising, powered by dedicated US-based experts and AI-driven insights * 

📰 News

OpenAI launches a way for enterprises to build and manage AI agents.

Anthropic releases Opus 4.6 with new ‘agent teams’.

OpenAI launches new macOS app for agentic coding.

Amazon to begin testing AI tools for film and TV production next month.

💸 Fundraising

Lawhive, a startup using AI to reimagine the general practice law firm, raises $60 million.

Business identity startup Duna raises €30 million.

* sponsored

Fail(St)ory

Subsidizing Fuel With Credits

Koko ran a clean-cooking network in Kenya. It sold bioethanol for everyday cooking to households that normally relied on charcoal or kerosene.

In early 2026, it shut down almost overnight.

What made Koko interesting wasn’t the product. It was the business model. The company sold fuel and cookware far below cost, took on serious debt, and expected to make its money from carbon credits sold to buyers overseas.

When that plan broke, everything else followed.

What Was Koko:

If you live in Europe or the US, a “clean-cooking startup” sounds niche. In Kenya, it isn’t. A large share of households still cook with charcoal or kerosene, especially in low-income areas. These fuels are smoky, unsafe indoors, and tied to deforestation.

Koko tried to replace charcoal and kerosene with bioethanol, a cleaner fuel that burns without filling the room with smoke. 

The company built a nationwide network of automated fuel-dispensing machines. Think ATMs, but for cooking fuel. You brought your container, tapped in, refilled, and went home. By the time Koko shut down, there were more than 3,000 of these machines spread across Kenya.

What made Koko unusual was the business model. It sold bioethanol at roughly half the market price, around KES 100 per liter when the going rate was closer to KES 200. It also sold a proprietary cookstove for about $12, even though the reported cost to make and deliver that stove was closer to $115.
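To make that gap concrete, here's a rough back-of-the-envelope sketch using the figures above. The prices come from the story; the exchange rate is a hypothetical illustration, and Koko's actual cost per liter of ethanol isn't reported, so the fuel number is measured against the market price rather than true cost.

```python
# Rough per-unit subsidy math using the approximate figures above.
# The KES->USD rate (~130 KES per USD) is a hypothetical illustration, not a reported figure.

market_price_kes = 200      # going rate per liter of ethanol
koko_price_kes = 100        # Koko's price per liter
fuel_gap_kes = market_price_kes - koko_price_kes   # ~KES 100 below market on every liter
                                                   # (actual cost per liter isn't reported)

stove_cost_usd = 115        # reported cost to make and deliver a stove
stove_price_usd = 12        # what the customer paid
stove_subsidy_usd = stove_cost_usd - stove_price_usd   # ~$103 absorbed per stove, once per household

kes_per_usd = 130                                  # hypothetical FX rate for illustration
fuel_gap_usd = fuel_gap_kes / kes_per_usd          # roughly $0.77 per liter vs. market

print(f"Gap per liter vs. market: ~${fuel_gap_usd:.2f}")
print(f"Subsidy per stove:        ~${stove_subsidy_usd}")
```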

Koko was losing money on every customer, on purpose. The company never expected to make money from Kenyan households. The assumption was that these families could never pay the “real” price for clean cooking, and pretending otherwise would kill adoption.

Instead, Koko planned to get paid through carbon credits. When a household switched from charcoal or kerosene to ethanol, it avoided emissions. Those avoided emissions could be measured, verified, and sold to buyers overseas.

The easiest way to picture it is this: imagine you’re an airline in Europe. You can’t easily cut your emissions, so you buy carbon credits to offset them. Koko wanted to sell you those credits, at a relatively high price, and then use that money to subsidize cooking fuel for families in Kenya.

In practice, the carbon buyer paid for the clean cooking. The Kenyan customer just saw cheap fuel and a stove that worked.

That setup made Koko more like a climate-finance machine than a normal consumer startup. It also meant the whole model depended on being allowed to sell those credits internationally. When that assumption started to wobble, the rest of the business had very little margin for error.
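A hedged sketch of why that access mattered so much: if you assume illustrative (not reported) figures for annual fuel use and avoided emissions per household, you can back out roughly what a carbon buyer had to pay per tonne just to cover the ongoing fuel subsidy. Every input below that isn't in the story is an assumption.

```python
# Hypothetical break-even sketch -- none of the carbon figures here come from Koko's reporting.
# It only illustrates why access to higher-priced compliance buyers was make-or-break.

fuel_gap_usd_per_liter = 100 / 130        # KES 100 gap at a hypothetical ~130 KES/USD
liters_per_household_year = 120           # assumed annual ethanol use per household (illustrative)
subsidy_per_household_year = fuel_gap_usd_per_liter * liters_per_household_year   # ~$92

tonnes_avoided_per_household_year = 3.0   # assumed tCO2e avoided by switching fuels (illustrative)

# Minimum credit price per tonne needed just to cover the ongoing fuel subsidy:
breakeven_price_per_tonne = subsidy_per_household_year / tonnes_avoided_per_household_year
print(f"Break-even credit price: ~${breakeven_price_per_tonne:.0f}/tCO2e")

# If credits only fetch a few dollars per tonne, the math doesn't close; it needs the
# higher-priced channels that the Letter of Authorization gated.
```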

The Numbers:

  • 🗓️ Founded: 2013

  • 🏠 Customers: ~1.5M households served

  • 🏧 Network: 3,000+ fuel ATMs

  • 🧑‍🏭 Team: 700+ employees laid off

  • 💸 Funding: $100M+ in debt and equity

  • 🧾 Debt: $60M+ by early 2026

Reasons for Failure: 

  • The entire business depended on one government letter: Koko needed a Letter of Authorization from the Kenyan government to sell its credits into the compliance pathways it was built around. Without it, the company couldn’t claim the “high-price” carbon revenue that funded the subsidies. When the refusal came in January 2026, the model stopped working immediately.

  • The unit economics were intentionally negative, and they stayed that way: Selling ethanol at half price and stoves far below cost can work if there’s a reliable subsidy source that scales with usage. Koko’s subsidy source was carbon, but the company didn’t control access to that market. So every new household was both a win and a bigger cash drain. At 1.5 million households, you’re carrying a national-scale subsidy program on a startup balance sheet.

  • Debt turned policy delay into a fatal cliff: Koko financed the gap with debt tied to future carbon revenue. Debt is survivable when you have either revenue or time; once the government refused the LoA, Koko had neither, and $60M+ in debt became unserviceable, fast. There’s no graceful unwind when your cost base is physical infrastructure and your revenue is blocked by policy.

Why It Matters: 

  • Koko shows what happens when your revenue depends on policy instead of customers.

  • Subsidies that scale faster than certainty can kill you before they ever pay off.

Trend

Text-to-World

You’re probably numb to “AI can generate X” headlines by now.

So here’s the one that actually made me sit up a bit: Google’s Project Genie doesn’t just generate a scene. It generates a place. A place you can walk around in, mess with, and turn into a tiny game with arrow keys.

I talked about world models a couple months ago. Back then it was mostly theory and demos on Twitter. Now Google ships a thing you can actually try. And it’s a little unsettling.

Why it Matters

  • It’s a real stepping stone toward AGI. An agent needs to predict what happens next when it acts, and world models are how you train that without wrecking real stuff.

  • It’s a simulation shortcut for industries that already pay for simulation. Driving, robotics, training, QA, safety drills, anywhere “let’s test it in a fake world first” is already a budget line.

What it is

Project Genie is Google’s new experiment for building interactive virtual worlds from prompts.

You type “enchanted forest” or “ancient city” or whatever your brain comes up with. You can also feed it an image. Then you drop in a character and start moving around. Arrow keys. Camera controls. The usual.

If this were just video generation, it would be cute and forgettable. The wild part is that it stays coherent while you play.

As you walk forward, the model generates what’s ahead in real time. When you turn around, the stuff behind you doesn’t randomly morph into something else. If there’s a wall, you don’t phase through it. If you hit an object, it reacts like an object, not like a glitch. It’s doing enough physics to feel like a world instead of a slideshow of vibes.

That’s the mind-bending shift: the model isn’t only drawing frames. It’s tracking state.

It has to “remember” what it already showed you, predict what should appear next, and keep the whole thing consistent across time.

Google calls it a “world model.” While LLMs predict the next word, world models predict the next moment. They simulate how an environment evolves and how your actions change it.
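Google hasn’t published Genie’s internals in a form I can reproduce here, but the core loop is easy to sketch: the model carries a latent state, consumes your action each tick, and predicts both the next state and the frame you see, which is what keeps the world consistent when you turn around. The class and method names below are made up for illustration; this is not Genie’s API.

```python
# Minimal sketch of an action-conditioned world model loop.
# WorldModel, encode, step, decode are hypothetical names -- not Genie's actual architecture.

class WorldModel:
    def encode(self, prompt_or_image):
        """Turn a text prompt or image into an initial latent world state."""
        ...

    def step(self, state, action):
        """Predict the next latent state given the current state and a player action."""
        ...

    def decode(self, state):
        """Render the current latent state into a frame the player sees."""
        ...

def play(model: WorldModel, prompt: str, actions):
    # An LLM predicts next_token = f(tokens_so_far); a world model predicts
    # next_state = f(state, action), then renders it.
    state = model.encode(prompt)           # "enchanted forest", "ancient city", ...
    for action in actions:                 # arrow keys, camera moves, ...
        state = model.step(state, action)  # the next moment, conditioned on what you did
        yield model.decode(state)          # frames stay consistent with what came before
```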

The Trend

So why does this matter?

The obvious answer is video games. Since Project Genie came out, gaming stocks have taken a beating: Take-Two got whacked more than 9% in a day, Roblox dropped around 12%, and Unity ate the worst of it, down about 30%.

But gaming is just the loudest first-order effect. The bigger story is that a stable, interactive world model is a new kind of infrastructure. Once you have it, it leaks into everything.

A couple months ago I wrote about Fei-Fei Li’s “Spatial Intelligence” thesis. The point was simple: today’s AI is great at describing the world, terrible at inhabiting it. LLMs can talk about a kitchen perfectly and still fail at basic stuff like distance, occlusion, and what happens when you push a cup off a counter.

Her argument was that the next leap isn’t more words. It’s giving models a sense of space, time, and cause and effect. 

That’s why world models matter. They’re the bridge between “knows language” and “can operate.”

If you want AGI, you don’t just need a model that can answer questions. You need a system that can do the boring parts of being competent in the real world.

World models are a way to train that competence without burning through real-world trials. You can’t teach an agent to be useful in warehouses, hospitals, factories, or roads by letting it crash into things until it learns. 

You need a safe place to practice.

That’s what world models are: a training sandbox where actions have consequences and the world stays consistent. Project Genie is a playable demo of that idea.
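The sandbox idea is also easy to sketch: instead of letting a robot crash into real shelves until it learns, you roll the policy out inside the learned world model and learn from simulated consequences. Every name below is an illustrative stand-in, not any specific library or training recipe.

```python
# Illustrative sketch: training an agent inside a learned world model instead of the real world.
# world_model, policy, reward_fn are hypothetical stand-ins with assumed methods.

def train_in_imagination(world_model, policy, reward_fn, episodes=1000, horizon=50):
    for _ in range(episodes):
        state = world_model.sample_initial_state()       # e.g. a simulated warehouse aisle
        trajectory = []
        for _ in range(horizon):
            action = policy.act(state)
            next_state = world_model.step(state, action)  # consequences happen in simulation
            trajectory.append((state, action, reward_fn(next_state)))
            state = next_state
        policy.update(trajectory)   # mistakes are cheap: nothing real got crashed into
    return policy
```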

Help Me Improve Failory

How useful did you find today’s newsletter?

Your feedback helps me make future issues more relevant and valuable.


That's all for today’s edition.

Cheers,

Nico