From ChatGPT getting into a spat with Scarlett Johansson to Google rewriting your search results to suggest you stick some glue on your next pizza, AI is the biggest tech story of 2024 by far. We’ve seen voice assistants duetting with each other, hyper-realistic video conjured from a short prompt and Nvidia become a $3 trillion company, all off the back of the hype around large language models and their cutting-edge capabilities. Until today, Apple was pretty much absent from the conversation.
Now that’s changed in a major way at its WWDC 2024 developers conference in Cupertino, California. Announced today and launching later this year alongside iOS 18 for iPhone, Apple Intelligence is the company’s attempt to make good on the promise of contextual, personalised computing that didn’t quite come to pass after Siri’s launch 12 years ago. Turns out shouting at your iPhone to set a timer isn’t all that revelatory when compared to the multimodal capabilities of Google’s Gemini model, as well as Meta’s Llama and Microsoft’s Copilot.
As such, Apple has gone back to the drawing board with Siri to create an assistant that's fit for 2024. “Siri is no longer just a voice assistant, it’s really a device assistant,” said John Giannandrea, Apple’s senior vice president of machine learning and AI strategy, at a press conference in Apple Park’s Steve Jobs Theatre. “You'll be able to type to it and not just use your voice.”
