You may have glanced at a business headline last week about a corporate restructuring completed by OpenAI, the maker of ChatGPT.

Didn’t read it? You’re forgiven. It’s hard to dive into “Tech Bros Do Boring Thing With Lawyers” when Prince Andrew is getting title-stripped and 42 million Americans are on the verge of going SNAP-less.

But that very boring story had big implications for creators around the world.

Image of OpenAI CEO Sam Altman created with Sora2, the company’s new text-to-video product.

Last week’s corporate restructuring was done largely to clear the path to a potential late-2026 initial public stock offering.

OpenAI is currently valued at $500 billion, the second highest-valued private company in the world. If an IPO were to happen, it could be the first trillion-dollar stock offering ever.

Here’s the deal with IPOs, though. They require a massive amount of public disclosure, legal scrutiny, and document filings. SEC rules are notoriously rigorous. An OpenAI offering would likely be the most-scrutinized IPO in history.

Those SEC rules would force OpenAI to be legally candid about the company’s risk exposure from the dozen-plus copyright infringement lawsuits it faces. Legally candid isn’t the same as truthful, as it involves the language-washing services of Wall Street’s highest-paid law firms. But the risk will be out there in the public eye.

That’s bad for business. Investors hate unknown X-factors that could cost a company billions of dollars. (Anthropic recently settled one copyright lawsuit for $1.5 billion.)

That’s a lot of motivation for OpenAI to end those pesky lawsuits. The easiest way to do that is to cut deals. Settle. Fork over an apology envelope full of cash.

A big potential dealbreaker

What could slow this flood of OpenAI settlements?

Court rulings in favor of tech companies.

We saw one earlier this morning. In London, a UK court ruled that Stability AI did not infringe the copyright of IP rights holders by using troves of Getty Images photographs to train its artificial intelligence model. Although the court slapped Stability’s hand for a minor infringement of Getty’s trademark, this was a clear win for AI developers.

“An AI model such as Stable Diffusion, which does not store or reproduce any copyright works (and has never done so) is not an ‘infringing copy,’” the court ruled.

UK law has no influence in the United States, of course, but the early rulings on the “fair use” of copyrighted material for AI model training have generally leaned in favor of the tech companies. The exception was Anthropic, which was caught training on databases it knew contained pirated books. The company settled in part because of rising investor fears that a copyright ruling against it could bankrupt the company.

Further complications: The AI bubble and time itself

One more factor to consider: OpenAI surely wants to launch its IPO before the AI bubble bursts. The clock on that is ticking, and loudly.

It’s so loud, in fact, that all of this trillion-dollar-IPO talk could be moot by next month. This year’s skyrocketing stock prices have been driven by overheated AI investment. When that bubble bursts it won’t kill AI. But it will chill investor fervor and suck a ton of money out of circulation. Remember that the dot-com bust of 2000 erased overhyped company valuations but not the internet economy itself.

If the bubble pops, OpenAI might delay its IPO. Or the pop could leave OpenAI as one of the few companies left standing after the wipeout. Either way, sooner is better than later when it comes to getting its public shares on the stock exchange floor.

If I were Sam Altman I’d be telling my legal team: Draw up a settlement plan now.


MEET THE HUMANIST

Bruce Barcott, founding editor of The AI Humanist, is a writer known for his award-winning work on environmental issues and drug policy for The New York Times Magazine, National Geographic, Outside, Rolling Stone, and other publications.

A former Guggenheim Fellow in nonfiction, his books include The Measure of a Mountain, The Last Flight of the Scarlet Macaw, and Weed the People.

Bruce currently serves as Editorial Lead for the Transparency Coalition, a nonprofit group that advocates for safe and sensible AI policy. Opinions expressed in The AI Humanist are those of the author alone and do not reflect the position of the Transparency Coalition.

Portrait created with the use of Sora, OpenAI’s imaging tool.
