It was reasonable to expect that Apple would do with AI what it has done before with so many features and apps: wait, take notes and then redefine. But though it has filed off some of the sharper edges of the controversial technology, the company seems to have hit the same wall as everyone else: Apple Intelligence, like other AIs, doesn’t really do anything.
It does do something. A few things, in fact. But like other AI tools, it seems to be an incredibly computationally demanding shortcut for ordinary tasks. This isn’t necessarily a bad thing, especially as inference (that is, performing the actual text analysis, generation, etc.) becomes efficient enough to move to the device itself.
But Tim Cook told us at the outset of Monday’s “Glowtime” event that Apple Intelligence’s “breakthrough capabilities” will have “an incredible impact.” Craig Federighi said it will “transform so much of what you do with your iPhone.”
The capabilities:
- Rephrase snippets of text
- Summarize emails and messages
- Generate fake emoji and clip art
- Find pictures of people, locations, and events
- Look up things

Do any of those feel like a breakthrough to you? There are countless writing helpers. Summary capability is inherent to nearly every LLM. Generative art has become synonymous with a lack of effort. You can trivially search your photos this way across any number of services. And our “dumb” voice assistants were looking up Wikipedia entries for us a decade ago.
True, there is some improvement. Doing these things locally and privately is definitely preferable. And there are some new opportunities here for people who can’t easily use a regular touchscreen UI. So there is certainly a net increase in convenience.
But literally none of it is new or interesting. There doesn’t appear to have even been any meaningful change to these features since they were released in beta after WWDC beyond the expected bug fixes.
One would have expected “Apple’s first phone built from the ground up for Apple Intelligence” to justify that billing. As it turns out, the 16 won’t even ship with all the features mentioned; they’ll arrive in a separate update.
Is it a failure of imagination? Or of technology? AI companies are already beginning to reposition their models as yet another enterprise SaaS tool, rather than the “transformative” use cases we heard so much about (it turns out those were mostly just repeating stuff they found on the web). AI models can be extremely valuable in the right place, but that place doesn’t seem to be in your hand.
There’s a bizarre mismatch between how commonplace these AI capabilities are becoming and how bombastic the descriptions of them are. Apple has become increasingly prone to the kind of breathless promotion it once put to shame with its restraint and innovation. Monday’s event was among the least exciting in recent years, but the language was, if anything, more extravagant than usual.
Like the other AI providers, then, Apple is participating in the multi-billion-dollar game of make-believe: that these models are transformative and groundbreaking even if almost no one finds them to be so. Because who could justify spending as much as these companies have when the result is that you can do the same things you did five years ago?
AI models may be legitimately game-changing in certain areas of scientific research, some coding tasks, perhaps materials and structural design, possibly (though perhaps not for the better) in media.
But if we are to trust our eyes and thumbs, rather than Cook and Federighi’s reality distortion hour, it sure looks like the ones we’re supposed to be excited about don’t do much that’s useful at all, let alone revolutionary. Ironically, Apple’s announcement has failed to provide AI its “iPhone moment.”