
The Product Leaked. Literally.

Why Zero Co couldn’t recover from a leaky rollout.

Hey — It’s Nico.

Welcome to another Failory edition. This issue takes 5 minutes to read.

If you only have one minute, here are the 5 most important things:

Let’s get into it.

This Week In Startups

🔗 Resources

Make product management fun again with AI agents.

Five practical lessons to scale your data products.

📰 News

Google’s Gemini chatbot gets upgraded image-creation tools.

Duolingo launches 148 courses created with AI after sharing plans to replace contractors with AI.

OpenAI explains why ChatGPT became too sycophantic.

💸 Fundraising

Edgerunner AI raises $12M to help military use AI.

ID verification startup Persona raises $200M at a $2B valuation.

Cloud optimization startup Cast AI raises $108 million.

Dub, a startup that lets users copy the stock trades of influencers, raises $30 million.

Fail(St)ory

Zero Co, Zero Left

This week, Zero Co shut down.

The Australian startup had raised over $13 million to reinvent how we buy everyday cleaning products — replacing single-use plastics with sleek, paper-based refills built for reuse.

Six years in, the mission is over.

What Was Zero Co:

Zero Co was built around a simple idea: what if you could buy shampoo, dish soap, and laundry detergent without all the plastic waste?

Their approach combined recycled packaging, refill systems, and a promise to clean up oceans along the way. You’d start with a kit — reusable bottles made from ocean plastic, plus refill pouches made from landfill-bound waste. Once empty, you’d mail the pouches back for cleaning and reuse. The branding was bright, the tone was optimistic, and the mission was loud and clear.

They didn’t stop there. In 2023, they rolled out ForeverFill — a full redesign. Paper-based refills. Sleek new bottles made from 80% recycled materials. Concentrated formulas that used less plastic, weighed less to ship, and lowered freight emissions. They claimed up to 97% less plastic than traditional brands, and costs that would come down over time.

Zero Co didn’t lack ambition. Their goal was to build a $100M-a-year business and donate 1% of revenue to ocean cleanup efforts. In 2021, they made headlines for raising $5M in under six hours on the crowdfunding platform Birchal — a record. VCs joined in with another $6M. They raised another $2M from the crowd in 2024.

It wasn’t a niche project. It was a full bet on making sustainable packaging work at scale. But the economics never caught up.

The Numbers:

  • 📅 Founded in 2019

  • 💰 Raised $13M+ (crowdfunding + VC)

  • 📦 Launched ForeverFill in late 2023

  • 🌊 Removed over 45 million bottles from the ocean

Reasons for Failure: 

  • The original model was hard to scale: Zero Co’s first system asked customers to mail back empty refill pouches so they could be cleaned and reused. It was a nice idea in theory, but expensive and logistically painful in practice. Handling returns, cleaning packaging, and managing reverse logistics added complexity that most DTC brands don’t have to deal with. The company spent two years stuck in this loop, trying to make the model work before eventually pivoting. The 2023 ForeverFill relaunch was meant to solve this by moving to paper refills that could be discarded, but by then, a lot of time and cash had already been burned.

  • The ForeverFill launch didn’t go as planned: ForeverFill was supposed to be a reset — lighter, cheaper, less plastic, and no more return logistics. But soon after launch, customers started reporting broken seals and leaky packaging. Zero Co had to issue refunds and pause parts of the rollout. For a product betting everything on better packaging, this kind of failure hit hard. To his credit, CEO Mike Smith addressed the issue head-on, with full transparency and a genuine tone you don’t often see in these situations.

  • They couldn’t find a growth engine: Zero Co raised fast, but they never really hit their stride on revenue. The CEO admitted as much: “We’ve tried a number of different strategies and tactics over the years to put the business on a sustainable growth trajectory but, unfortunately, have been unable to do so.” 

  • High product complexity in a low-margin category: They weren’t just selling soap — they were designing packaging, managing supply chains, coordinating cleanups, and trying to educate customers about a new way to consume. That’s a lot to juggle for a category where people usually just buy what’s cheapest at the supermarket. Most DTC brands aim for high-margin, high-loyalty products. Zero Co picked a category with none of that.

Why It Matters: 

  • If your product is the packaging, it has to be perfect.

  • Crowdfunding can get you attention — but it’s not a growth strategy.

  • Hardware + logistics + low-margin goods = an uphill battle.

Trend

Decentralized LLMs

Everyone’s talking about bigger models, faster GPUs, and who's got the most training data. But a quiet revolution is brewing in a very different corner of AI — one that doesn’t rely on giant data centers or scraping the internet. 

Two startups are working together to build a large language model that’s trained across a distributed network of user-owned devices, using data people actually control. It’s called Collective-1, and it could signal a whole new way of doing things.

Why It Matters:

  • AI without data centers: Collective-1 is trained across a decentralized network of devices, not inside a warehouse full of GPUs.

  • User-owned data: Instead of scraping the internet, this model is trained on data people contribute willingly — and still control.

  • Open and scalable: Built by two open-source-first startups, this approach opens the door for anyone to train their own model.

How It Works

Right now, if you want to train a serious AI model, you need three things: a mountain of scraped data, a warehouse of high-end GPUs, and a budget with at least eight zeroes. Everything happens inside giant data centers, owned by companies that can afford them. The result? A few companies control the most powerful models in the world.

Collective-1 flips that. It’s trained using federated learning — a method where the model is sent to where the data lives, not the other way around. Think laptops, desktops, idle GPUs. The raw data stays on the device. No one is uploading personal info to a server. It’s all done locally, and only model updates are shared back.
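The loop described above — ship the model to the data, train locally, share only weight updates — is the core of federated averaging. Here's a toy sketch of that idea using a linear model and plain NumPy. This is purely illustrative and not how Flower or Collective-1 is actually implemented; the model, learning rate, and averaging scheme are all simplifying assumptions:

```python
import numpy as np

def local_update(weights, local_data, lr=0.1):
    """One gradient-descent step on a device's private data (linear model).
    The raw data never leaves this function; only new weights are returned."""
    X, y = local_data
    preds = X @ weights
    grad = X.T @ (preds - y) / len(y)  # mean-squared-error gradient
    return weights - lr * grad

def federated_round(global_weights, devices):
    """Send the current model to every device, train locally, then average.
    Real systems (e.g. Flower) also handle sampling, dropouts, and security."""
    updates = [local_update(global_weights.copy(), data) for data in devices]
    return np.mean(updates, axis=0)  # unweighted average of device updates

# Two simulated "devices", each holding private samples of y = 2x.
rng = np.random.default_rng(0)
devices = []
for _ in range(2):
    X = rng.normal(size=(50, 1))
    devices.append((X, 2.0 * X[:, 0]))

w = np.zeros(1)
for _ in range(100):
    w = federated_round(w, devices)
# w converges toward the true slope (2.0) without any device sharing raw data.
```

The key property is visible in `federated_round`: the coordinator only ever sees weight vectors, never the `(X, y)` samples sitting on each device.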

Behind this project are two startups:

  • Vana handles the data layer. It’s a decentralized network where users contribute data — conversations, health metrics, app activity, GPS logs — and retain full ownership through a system of DataDAOs. This gives Collective-1 access to private data that other LLMs cannot train on.

  • Flower Labs built the infrastructure for training. They maintain the largest open-source federated learning framework and have already trained models up to 7B parameters across distributed networks.

To make all this work, Flower Labs developed a tool called Photon. It's what keeps the system from falling apart. Normally, training across thousands of mismatched devices with unreliable internet would be a nightmare. Photon acts like air traffic control — managing communication, optimizing performance, and making sure all the pieces move in sync. It was co-developed with researchers in the UK and China and is already open-sourced.

Collective-1 is their first joint model, but it's just the start. Flower is already working on a 30B-parameter model, and by the end of the year, they aim to hit 100B — matching the scale of models built by OpenAI and Anthropic.

Why This Changes the Game

This isn't just about building one model differently. It’s about rewriting who gets to build models at all.

Today, training frontier AI models is reserved for a handful of well-funded players with deep compute access. Everyone else builds on top of what they're allowed to use. With decentralized training, that could change fast.

Here’s what this unlocks:

  • Startups can build their own models without renting a supercomputer or raising a mega-round.

  • Data becomes a real asset — users contribute it voluntarily and could be rewarded when it’s used.

  • Diversity of models — with access to richer, more personal data, builders can train models that are more useful in specific domains, rather than relying on generic internet text.

And because the tools and frameworks are open source, anyone can start building — not just those who can pay to play. The power shifts from centralized labs to distributed communities.

Collective-1 is the proof of concept. What comes next could open the door to a much more open and competitive AI landscape.

Help Me Improve Failory

How Was Today's Newsletter?

If this issue were a startup, how would you rate it?


That's all for this edition.

Cheers,

Nico