Many of the problems with AI right now aren’t technology issues, they’re design flaws. I was reminded of that this week when Amazon Web Services (AWS) crashed and left smart-bed users unable to sleep.

AWS powers so much of the digital cloud that Monday’s two-hour outage took down WhatsApp, banking websites, and untold numbers of products hooked up to the global web.

Like mattresses.

There’s a smart-bed system called Eight Sleep that, when connected to the cloud, automatically adjusts the bed’s temperature. When AWS crashed, Eight Sleep products went haywire. Some users found their beds turned into ovens.

The problem of ill-designed or over-designed products traces back to the 1930s, when manufacturers realized that engineering products to fail would “instill in the buyer the desire to own something a little newer, a little better, a little sooner than is necessary.” Thus: planned obsolescence. (The quote is from Brooks Stevens, the mid-century designer who shaped Harley-Davidsons, the Cadillac Eldorado, and the Miller Brewing Company logo.)

With the coming of AI, we’re now stepping into the next version of planned obsolescence: indentured consumerism.

The forever subscription

The most notorious example of this is the universally loathed HP printer, which is designed to stop printing if the user loads anything other than HP-brand ink.

Car makers are getting in the game by requiring monthly subscriptions for features that once were standard or at least pay-once options. Toyota now charges $8 a month for remote start. Tesla’s full self-driving mode will run you $200 every 30 days.

These aren’t just ploys to siphon money from your credit card (although they are that). By linking your product to the cloud, the manufacturer harvests your data 24/7. Like your Nest thermostat, that Eight Sleep system is constantly gathering information about your sleeping habits, movements, and temperature settings.

Your product isn’t just spying on you—it’s using you to grow and harvest data. That data is worth, in the aggregate, billions of dollars to AI tech companies training the next generation of AI models.

Amish intentionality

I recently heard University of Washington law professor Ryan Calo speak at a Society + Technology at UW event in Seattle. Calo’s forthcoming book, Law and Technology: A Methodical Approach, will be published next month.

One of the things he said struck me: “Technology poses as inevitable when in fact it’s deeply contingent.”

Whenever a new technology or product launches, its profit-driven proponents present it as Our Inevitable Future. In fact it’s not. New products fail all the time. Ask Mark Zuckerberg how those Meta Quest VR headsets are selling.

I do think the technology of AI is a lasting step into the future. The uses and products that come from that technology have yet to be decided.

This is where one of Ryan Calo’s case examples comes in handy. He writes about how Amish communities respond to technological change. They aren’t early adopters, obviously. But they do consider the tools of technology with intention and care. The horse-drawn carriages they ride (and some do drive cars) were once cutting-edge tech, after all.

Many Amish communities gather, analyze, and discuss whether and how a new technological tool might fit within their own value system. Calo offers this as an illustration of how each of us might consider, with more care and intentionality, how a new tech tool might fit within our own value system.

Is that thermometer ‘smart,’ or is it ‘pure’?

Earlier this year I needed a clinical thermometer. I was running a fever. It happens. My local pharmacy offered only one brand: the Kinsa “smart” digital thermometer. It required me to download an app on my phone to read my temperature. Which meant I was forced to produce data (to work for Kinsa, essentially) just to use the product I’d paid $31.49 for.

These are the moments—standing there in the Rite Aid aisle—that now require each of us to run Ryan Calo’s Amish test. Am I willing to both pay Kinsa and give them my data for free? Or is it worth it to drive to a different drug store and pay a little extra for a “dumb” thermometer?

I believe product designers and marketers will soon find a way to up-value products that aren’t connected to the cloud. Don’t fall for the false “smart-dumb” dichotomy. Learn from the success of organic branding. Call that non-digital thermometer…maybe…a pure product.

As consumers it’s up to us to stay sophisticated in our understanding of these indentured products and exercise intentional choices in the store aisle. The “pure” product may cost a little more, like organic lettuce. But the product will be yours alone. And you will owe the tech maker nothing.

Illustration: Image generated using Photo Image Realistic GPT.


MEET THE HUMANIST

Bruce Barcott, founding editor of The AI Humanist, is a writer known for his award-winning work on environmental issues and drug policy for The New York Times Magazine, National Geographic, Outside, Rolling Stone, and other publications.

A former Guggenheim Fellow in nonfiction, his books include The Measure of a Mountain, The Last Flight of the Scarlet Macaw, and Weed the People.

Bruce currently serves as Editorial Lead for the Transparency Coalition, a nonprofit group that advocates for safe and sensible AI policy. Opinions expressed in The AI Humanist are those of the author alone and do not reflect the position of the Transparency Coalition.

Portrait created with the use of Sora, OpenAI’s imaging tool.
