Exposed: Failed Startups Cash In on Slack Chats, Selling Employee Conversations to AI Firms for Up to $100,000
Shuttered companies are turning internal emails, Slack messages, and project data into hot commodities as AI labs hunt for real-world training material — raising fresh privacy concerns.
In an unexpected twist for the fast-growing artificial intelligence industry, failed startups are finding new value in what many once considered digital leftovers: their internal conversations.
From Slack chats and emails to Jira tickets and shared documents, these records are now being sold to AI companies for as much as $100,000, turning years of everyday workplace communication into valuable training material for next-generation AI systems.
At the center of this emerging market is SimpleClosure, a firm that helps startups close down operations. Over the past year, the company has reportedly facilitated nearly 100 deals, connecting defunct businesses with AI labs eager for authentic, real-world data.
What makes this data so attractive is its rawness. Unlike carefully curated datasets, workplace archives capture how people actually communicate — the quick decisions, late-night brainstorming, frustrations, and informal exchanges that define modern work culture. For AI developers, this offers a rare opportunity to train systems that can behave more like human colleagues.
Speaking to Forbes, SimpleClosure’s CEO, Dori Yona, described demand as overwhelming, likening it to a “gold rush” as companies compete for access to high-quality data.
One striking example is Shanna Johnson, former head of cielo24, who sold her company’s entire 13-year digital footprint after it shut down. The archive — filled with everything from internal jokes to detailed project discussions — provided a financial cushion that helped the company wrap up its affairs smoothly.
But as the trend grows, so do concerns about privacy and consent.
Many employees whose messages are now being repurposed for AI training were never aware this could happen. While companies claim to remove personally identifiable information before selling the data, experts warn that complete anonymization is difficult to guarantee. Contextual clues, such as writing style, project details, or descriptions of specific situations, can still make individuals identifiable.
Privacy advocate Marc Rotenberg of the Center for AI and Digital Policy has long warned about the risks tied to digital workplace tools, noting that what feels like private communication can easily become exploitable data.
Legally, the situation is complex but often unfavorable to employees. In most cases, ownership of internal company data rests with the company or its liquidators, leaving workers with little control over how their past communications are used once a business shuts down.
SimpleClosure says it is working to strengthen its data-scrubbing processes before any sale is finalized. Still, the development is already fueling calls for tighter regulations and clearer rules around consent.
For founders facing the difficult reality of shutting down, this new marketplace offers a chance to recover some value from years of effort. For employees, however, it raises a more unsettling possibility — that casual messages sent during a typical workday could one day be sold and used to train machines.
As the demand for authentic human data continues to surge, one thing is becoming clear: in the age of AI, even forgotten workplace conversations may have a second life — and a price tag attached.