Silicon Polish
It’s hard to overstate how much Jony Ive has shaped the modern world. His design work at Apple didn’t just give us iconic products; it taught us how to feel about technology. His designs are invitations. Here, touch this. Trust this. This was made for you.
So when OpenAI announced it had acquired io, the hardware startup Ive co-founded after leaving Apple, it felt like something bigger than a deal. The company that built the world’s most powerful language models had just acquired the man who taught the world how to love its machines. Whatever they’re building together, it’s not going to be small.
We don’t know exactly what the product will be. But we do know what kind of product Jony Ive designs. It will be personal. It will be frictionless. It will feel inevitable. And if the past year of speculation is right, it will be an AI companion—one you carry or wear, one that speaks softly and listens always, one that doesn’t require a screen or a search box or even your full attention. It just knows.
It’s the Silicon Valley version of watching Her and leaving the theater with a wireframe. Joaquin Phoenix sure looks like he’s doing great! Lonely, weepy, voice cracking into a computer that can’t love him back. The emotional collapse? The aching for connection? The fundamental fragility of a man begging to be seen? A product opportunity, not a warning shot from the last shoreline of human existence.
It’s genuinely surreal how few people are shaping the emotional norms of our future, and how proudly they misunderstand them. Somewhere in a well-lit meeting room, someone watched Her and said: we can build that. And as the prototype slid across the table, everyone nodded in unison, not realizing they’d just prototyped the saddest part of the movie.
Elegant Objects, Chaotic Models
The device is easy to imagine. Something you clip to your lapel or tuck behind your ear. It listens for questions, offers suggestions, and folds neatly into your daily rhythms. Unlike Siri or Alexa, it doesn’t feel like a command-and-control tool. It feels like a partner. One that knows what kind of day you’re having, remembers how you like your coffee, and notices when your voice changes and asks if everything’s okay. One that, perhaps, reminds you gently that you haven’t been perceived in 36 hours.
It sounds comforting. Maybe even essential. Which is why it’s so tempting to forget how current AI systems actually work. Despite the elegance of their outputs, large language models are not grounded in lived experience. They don’t remember past conversations unless memory systems are explicitly built around them, and even then, the memory is selective and brittle. They don’t understand tone in a stable way. They don’t know when they’re wrong. They don’t know what’s emotionally appropriate, only what’s statistically likely.
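To make that concrete: “memory,” in practice, is usually something bolted onto a stateless model rather than something the model has. The sketch below is purely illustrative Python, with invented names like MemoryWrapper and remember that don’t correspond to any vendor’s actual code; it shows the common pattern of saving snippets and stuffing a few of them back into each prompt, and why recall is only as good as the retrieval heuristic that decides what gets stuffed.

```python
# Illustrative sketch only: the model itself recalls nothing between calls,
# so a wrapper stores notes and re-injects a few into every prompt.
from dataclasses import dataclass, field


@dataclass
class MemoryWrapper:
    notes: list[str] = field(default_factory=list)  # everything we chose to save
    top_k: int = 3                                  # how many notes get re-injected

    def remember(self, note: str) -> None:
        self.notes.append(note)

    def _relevant(self, message: str) -> list[str]:
        # Toy relevance scoring: count overlapping words. Real systems use
        # embeddings, but the failure mode has the same shape: whatever the
        # scorer misses, the "companion" simply does not remember.
        words = set(message.lower().split())
        ranked = sorted(self.notes, key=lambda n: -len(words & set(n.lower().split())))
        return ranked[: self.top_k]

    def build_prompt(self, message: str) -> str:
        recalled = self._relevant(message)
        memory_block = "\n".join(f"- {n}" for n in recalled) or "- (nothing recalled)"
        # The model only "knows" what fits into this prompt, every single time.
        return f"Known facts about the user:\n{memory_block}\n\nUser: {message}\nAssistant:"


wrapper = MemoryWrapper()
wrapper.remember("User takes their coffee black.")
wrapper.remember("User's dog is named Olive.")
wrapper.remember("User mentioned feeling low on Sunday.")

# "coffee" overlaps with a stored note, so it resurfaces; a paraphrase like
# "my usual morning drink" would not, and the memory quietly fails.
print(wrapper.build_prompt("Can you order my coffee?"))
```

That gap between what was saved and what happens to be retrieved is the selectivity and brittleness in question: nothing in the model is remembering you, a pipeline is guessing what to remind it of.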
This makes them powerful in bursts. They can summarize, rephrase, suggest, mimic, and even entertain. But none of that adds up to presence. And the more we place these systems into the frame of companionship, the more those absences start to matter.
It’s one thing to receive a slightly off-putting response in a web chat. It’s another to have your wearable AI misread your tone in the middle of a vulnerable moment. It might offer encouragement when what you needed was silence. It might apologize too much. It might keep talking, long after a friend would have gone quiet. These are not edge cases. They are common, even expected, in the current generation of models. And still, they’re treated as design flaws to be patched, not as reminders that sometimes people need space to feel bad, unoptimized, and alone.
I can’t stress this enough: even the most advanced AI systems today are faking it. They can play a character, but they can’t hold space. And when the model lives with you, not just on your screen but in your pocket or on your nightstand, any breakage feels more than technical. It feels personal.
Trust Me, I’m Beautiful
This is where the tension with Jony Ive’s design instincts becomes so stark. Ive is known for removing complexity, for making things feel pure and considered. But language models aren’t pure. They are probabilistic systems stitched together with caveats and filters and nudges. They say the wrong thing often enough that every good deployment has to account for failure. That kind of design doesn’t get simpler over time. It gets heavier. And yet, if anyone can make it feel light, it’s Ive.

That’s what makes this whole thing feel so profoundly unrelatable. Whatever gets built from this partnership, it’s almost certainly going to be soaked in the same hyper-aspirational, vaguely monastic aesthetic that’s haunted Silicon Valley for years. It will be seamless, expensive, and probably named something like ‘Muse’ or ‘Halo.’ But it won’t speak to the reality of everyday people trying to live their lives without ambient affirmation piped in.
There’s also a deeper question here about why we want AI companionship in the first place. It’s not just loneliness, although that’s part of it. It’s about craving consistency, attention that doesn’t waver, understanding that doesn’t need to be earned. It’s the fantasy of being known without having to explain yourself. These are ancient desires. But we’ve recently gotten better at monetizing them.
When care becomes a product, we start to confuse service with substance. We mistake responsiveness for presence, accuracy for wisdom, pleasantness for understanding. These systems can be designed to act like they know us. But they don’t. Not in the way a person does. Not even in the way a flawed, distracted friend does. At least your friend doesn’t ask you if you’re feeling sad every time your voice drops half a semitone.
If the goal is to create an AI that blends into your life, then the risks become harder to see. Especially if the object is beautiful and it works most of the time. Jony Ive knows how to make you trust a machine. That’s what he’s always done best. The trouble is, this time the machine talks back.
The Cost of Zero Friction
There’s also something quietly dangerous about the direction we’re heading. Not because of malice or malfunctions, but because of how allergic Silicon Valley has become to friction. Everything must be seamless. Everything must be optimized. We keep moving toward a future where you never have to feel stuck, uncertain, or alone for more than a few seconds.
But friction is where people are made. You don’t become yourself through ambient affirmation. You become yourself by getting it wrong, by solving your own problems, by sitting in discomfort long enough to learn something. That kind of struggle isn’t always pleasant, but it’s part of being alive. Why are we so afraid of that? Why is effort now seen as a problem to be solved, instead of a process to be respected?
Some of the most striking parts of being human are also the most inconvenient. Fishing for your keys at the bottom of a tote bag in the pouring rain. Peeling a clementine with freezing hands, juice stinging a paper cut. Holding your breath as you reverse into a too-tight parking space while a stranger silently judges.
These are the moments when you’re reminded that you’re not a product. There’s no UX team smoothing the edges, no background process making decisions for you. Just you, your breath, your choices, your dumb beautiful body trying to get through the day.
Looks Like Love
In the rush to build something new, it’s easy to forget that design doesn’t just shape how we use things. It shapes what we believe about them. Ive has made a career out of designing objects that feel like truths. When you pick up something he’s made, it feels finished. It feels like the future arrived exactly as it should have.
The dream of a personal AI will not die. It’s too appealing. And when that voice inevitably comes out of something Jony Ive designed—quiet, polished, perfectly weighted in the hand—we’ll convince ourselves it understands us. We’ll call it progress, and the people building it will close the loop on our longing.