In brief
- Apple unveiled major updates across its platforms at WWDC 2025, led by a new visual system called Liquid Glass.
- The company emphasized privacy-focused enhancements using Apple Intelligence.
- Updates include new messaging features, real-time translation, improved developer tools, and OS upgrades for iPhone, iPad, Mac, Watch, and Vision Pro.
Apple wrapped its WWDC 2025 keynote on Monday with sweeping updates to its device operating systems and a striking new design. But for all the refinements across iPhone, iPad, Mac, Apple Watch, and Vision Pro, one question lingered: What happened to Apple Intelligence?
When Apple’s big push into AI was introduced at WWDC 2024, CEO Tim Cook described it as a “new chapter” for the company—one that combined Apple’s hardware with the growing momentum of generative AI. Apple Intelligence was meant to place the company in the same league as OpenAI, Nvidia, Google, and Microsoft.
A year later, that promise remains largely unfulfilled, drawing industry-wide criticism and prompting corporate upheaval.
Indeed, its most significant impact on the AI landscape so far might be a research paper published last week, “The Illusion of Thinking,” in which the company outlined the limitations of large language models and warned against overestimating their reasoning capabilities. The paper argued that while LLMs may appear intelligent, they mainly rely on pattern recognition.
Nonetheless, today’s conference opened with Apple’s Senior Vice President of Software Engineering Craig Federighi heralding its AI integration: “We’re making the generative models that power Apple Intelligence more capable and more efficient, and we’re continuing to tap into Apple Intelligence in more places across our ecosystem,” he said.
Federighi announced that Apple is opening its AI infrastructure to developers through a new “Foundation Models Framework” that allows apps to tap directly into the same on-device intelligence that powers Apple’s own software. Updates to Xcode introduce generative tools for developers, including integration with ChatGPT, predictive code completion, and conversational programming via Swift Assist.
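For developers, the pitch is that the same on-device models Apple uses can be reached with a few lines of Swift, no server round trip required. The snippet below is a minimal sketch of what such a call might look like based on the framework Apple described; names such as SystemLanguageModel, LanguageModelSession, and respond(to:) follow the announced API, but exact signatures could shift before the SDK ships.

```swift
import FoundationModels

// Hedged sketch of a Foundation Models Framework call, based on Apple's
// WWDC 2025 announcement; exact API details may differ in the final SDK.
func oneSentenceSummary(of text: String) async throws -> String {
    // Bail out gracefully if the on-device model isn't available
    // (unsupported hardware, Apple Intelligence turned off, etc.).
    guard case .available = SystemLanguageModel.default.availability else {
        return text
    }

    // A session keeps prompt/response context, similar to a chat thread.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )

    // The request runs on-device and returns the model's text output.
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the model runs locally, a call like this works offline and keeps data on the device, which is the privacy angle Apple kept returning to.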
Perhaps Apple was under-promising in the hopes of over-delivering after the debacle of the Apple Intelligence rollout.
Instead, today’s presentation devoted more attention to a sweeping visual redesign of macOS and iOS, bringing greater UX consistency across Apple’s entire product suite. The centerpiece is “Liquid Glass,” a responsive material that adapts to touch, content, and context across devices. The redesign touches everything from the lock screen to system icons, aiming to make transitions between Apple devices more seamless.
Other updates unveiled at WWDC include enhancements to Messages, which now support polls, custom backgrounds, typing indicators, group payments, and improved spam filtering. Live translation enables real-time language translation in Messages, FaceTime, and phone calls using AI.
The Phone app is receiving upgrades, including Hold Assist, a standby mode that waits on hold for you and alerts you when the person you’re trying to reach finally answers, and Call Screening, which prompts callers to identify themselves before connecting. The latter feature, it’s worth noting, came with Google Voice when it rolled out in 2009.
Other updates include:
- iPadOS 26: A new windowing system, an upgraded Files app, and professional-grade audio/video workflows.
- visionOS 26: Spatial computing improvements and support for third-party accessories, including the PlayStation VR2 Sense Controller.
- watchOS 26: A chatty “Workout Buddy” will give you encouragement during your routines.
Apple said the updates will be available in a public beta in July, with full releases coming in the fall.