Apple’s AI strategy refocuses on delivering quality-of-life improvements across shiny new UIs
This WWDC might be Apple’s most important developer conference in a decade.
As I laid out in my article last week about AI companies chasing the hardware dream, Apple has faced a seemingly endless string of troubles over the past few months — the initial tariff hoopla, DMA compliance penalties, the fallout of the App Store anti-steering injunction, and its former design chief Jony Ive teaming up with OpenAI to develop new AI-first devices. There is clearly a lot on the line for Apple at this WWDC: it needs to deliver something that takes back control of its narrative, especially with competitors surging ahead in the AI race.
For what it’s worth, part of the blame also lies with Apple for overpromising on Apple Intelligence features at last year’s WWDC — in particular an LLM-based Siri 2.0 — which it then failed to deliver. During its opening keynote, Apple executive Craig Federighi briefly acknowledged the setback, citing the need for “more time to reach a high-quality bar.” While no significant improvements to Siri were revealed, Apple promised updates “in the coming year.”
If you’ve only gleaned the headlines following the WWDC keynote on Monday, such as this one from the New York Times, you’d likely get the impression that Apple spent more time on the new “Liquid Glass” redesign than on its AI features, choosing to focus on cosmetic upgrades rather than cutting-edge AI innovations. While that is a warranted, if superficial, take, it’s far more interesting to consider the strategy behind Apple’s WWDC announcements, and the implications they carry for brands and marketers alike.
Liquid Glass Design Emphasizes Spatial Physicality
Let’s start with the headline-grabbing UI overhaul. Not only will all of Apple’s operating systems, from iOS to watchOS to macOS, now be numbered after the year ahead, car-model style, starting with 26; they are also all unified under a new “Liquid Glass” design, which features a fluid aesthetic that responds dynamically to its surroundings. Translucent icons and menu bars subtly refract the images and text in the background like curved glass, while system elements glide with softened edges and glow with real-time contextual tinting.
If you’ve tried a Vision Pro headset, you’ll notice that the “Liquid Glass” design is aesthetically aligned with the spatial design language Apple developed for visionOS on its Vision Pro headsets. In other words, this isn’t merely a cosmetic update; it represents a strategic move by Apple to prepare its vast user base for the future of spatial computing and mixed-reality interfaces. Call it “onboarding through osmosis”: Apple is emphasizing the physicality of its UI design, which is a must for mixed-reality UIs where 3D digital elements need to blend in with their physical environment.
For brand marketers and advertisers, Apple’s shift toward the “Liquid Glass” design — and its alignment with visionOS — signals a deeper transition: from screen-based experiences to spatial, tactile ones. This is more than a design refresh; it’s a preparatory step for a post-flat UI world where digital content must feel native to 3D space. This means preparing for spatial content in your brand storytelling. Innovative brands must create brand assets not just as visuals or videos, but as interactive experiences that could live in mixed reality environments.
AI-Driven “Quality-of-Life” Improvements
Besides the shiny new UI, Apple also crammed a lot of new incremental improvements into its upcoming “class of 26” OSs, many of which are powered by Apple Intelligence. Rather than positioning its AI tools as flashy or standalone, Apple has woven them quietly into everyday functions, including enhancing spam call and text screening, live AI translation, and more — all without calling attention to the underlying AI.
Apple’s AI tools now quietly power a range of practical features in retail and travel, from detecting merchant categories to optimize credit card rewards to automatically tracking packages and flights — all seamlessly integrated into the OS. Apple Wallet, in particular, has been upgraded to cement Apple’s growing role in these everyday functions. Users in nine states and Puerto Rico can now use Digital IDs, with the ability to create one from a U.S. passport coming this fall. Boarding passes offer live flight tracking, airport maps, luggage updates via Find My, and direct links for seat upgrades or Wi-Fi purchases.

For Apple Pay transactions, merchants can use category codes and metadata to ensure the user’s preferred payment method appears by default, streamlining checkout. Wallet also now includes a unified view of upcoming payments and uses Apple Intelligence to automatically extract order details from email, displaying them in the app — provided merchants follow formatting guidelines and register through Apple Business Connect.
What’s powerful about this approach is that it introduces a new surface for marketing that’s native, private, and contextually rich. A screenshot of a sneaker could prompt shopping suggestions or store availability. A saved restaurant menu could lead to a reservation or review options. Importantly, all of this can happen within Apple’s ecosystem — without ever leaving the OS — creating a closed-loop experience that’s user-initiated and brand-relevant. For marketers, this opens up new opportunities to reach consumers not through interruption, but through relevance — appearing in moments when users are already engaged and seeking follow-up.
Of course, not all the AI features Apple announced were hits. One particular feature that feels less thought-out is the AI “Workout Buddy” in watchOS, a motivational companion that offers robotic platitudes mid-exercise. Who would want this always-positive, uncanny valley companion for their workouts?
But overall, this quiet, user-first AI strategy reflects Apple’s broader approach: the focus isn’t on winning AI benchmarks like OpenAI or Google. Their strategy centers on developing a practical, system-wide AI solution for the majority of users’ daily needs. It might not satisfy Wall Street, especially in the wake of negative press and investor impatience, but it should play well to Apple’s actual user base: people who just want their phone to feel smarter, not reinvented around AI.
On-Device AI Model: Apple’s Olive Branch to Developers
Apple has not had the best PR with the developer community lately. The longstanding 15–30% commission on App Store purchases continues to frustrate developers, particularly as antitrust scrutiny grows. Many feel Apple exerts too much control over app distribution, payment systems, and business models, forcing developers into a walled garden with few alternatives. The Fortnite-led lawsuit and the resulting anti-steering injunction is a recent win for developers, but it also further calls into question Apple’s commitment to the developer community that built a vibrant app ecosystem and made the iPhone a success.
Seen through this context, Apple’s introduction of a Foundation Model Framework at WWDC 2025 can be seen as a meaningful olive branch. By offering developers free access to its on-device AI model, Apple is lowering the barrier to entry for building AI-enhanced apps while preserving user trust through private, local processing. The framework includes optional safety guardrails and allows developers to integrate AI features — like mail summarization — into third-party apps without bundling full models or relying on cloud infrastructure.
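To make the framework concrete, here is a minimal sketch of what calling Apple’s on-device model looks like for a developer, using the Foundation Models API names shown at WWDC 2025 (a session object with instructions, and an async `respond` call); the `summarizeEmail` function and its prompt are illustrative assumptions, not from the keynote.

```swift
import FoundationModels

// Sketch: summarize an email entirely on-device, with no cloud call.
// API surface follows the WWDC 2025 announcement; verify against
// current documentation before shipping.
func summarizeEmail(_ body: String) async throws -> String {
    // Instructions steer the model's behavior for this session.
    let session = LanguageModelSession(
        instructions: "Summarize the user's email in one sentence."
    )
    // The prompt is processed locally by Apple's on-device model.
    let response = try await session.respond(to: body)
    return response.content
}
```

The appeal for indie developers is exactly what the paragraph above describes: no model weights to bundle, no inference bill, and user data never leaves the device.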
While Apple’s model doesn’t rival GPT-level systems in raw capability, it doesn’t have to. The value lies in its seamless integration, strong privacy posture, and zero-cost access, especially for indie developers who might otherwise struggle with scale or compliance. Early adopters like Kahoot, AllTrails, and Day One are already putting these tools to work, offering quality-of-life enhancements that feel native to the Apple ecosystem.
The Expansion of Apple Intents
Though Apple didn’t launch “Siri 2.0” as many expected, it did announce quiet but critical groundwork for that future. The expansion of App Intents, which allows third-party apps to declare functions that Siri (and other system features) can invoke, is crucial.
App Intents are everywhere — Spotlight, Shortcuts, and Visual Intelligence this year, in addition to Widgets and Control Center last year — and the same mechanism powers the unreleased Siri updates. Snippets let developers surface widget-like functionality within these areas. Since they all run on one system, getting developers on board should be easier.
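For a sense of what “declaring functions the system can invoke” means in practice, here is a sketch of an App Intent using the framework’s standard shape (an `AppIntent` struct with a title, parameters, and a `perform` method). The coffee-ordering scenario and all its names are hypothetical, purely for illustration.

```swift
import AppIntents

// Hypothetical intent for an imaginary coffee app. Declaring it
// lets Siri, Spotlight, and Shortcuts trigger the app's
// functionality without the app being open in the foreground.
struct ReorderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Reorder Last Coffee"

    // Parameters become fillable slots when the system invokes the intent.
    @Parameter(title: "Store")
    var store: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's real ordering logic would run here.
        return .result(dialog: "Reordered your usual at \(store).")
    }
}
```

Once apps expose their actions this way, a future Siri can chain them together — which is precisely the groundwork the announcement lays.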
Siri, much like Amazon’s Alexa, might be suffering a bit from the incumbents’ burden here. Apple faces Siri stability issues due to the challenges of integrating large language models into its existing system. Former Apple employees reportedly attribute this to the complexity of retrofitting LLM technology versus building a generative AI assistant natively, suggesting other companies with GenAI-native assistants had fewer initial integration problems.
Right now, it’s invisible to most users. But over the next year, as more developers plug into this system, Apple will be in a better position to roll out a genuinely useful AI assistant — one that can actually do things inside your favorite apps. This is a foundational layer, not a front-end moment, and that’s very on-brand for Apple.
Want to Learn More?
Overall, Apple is taking a more measured approach towards AI. While steady in its own feature rollouts, Apple is also quietly empowering developers with a foundational model and laying the groundwork for a new Siri with App Intents. For brands, this could be a lesson in restraint in the face of industry-wide hype, and strategically deploying AI to tangibly improve the customer experience rather than chasing AI gimmicks.
If you are keen to learn more about Apple’s WWDC announcements and what they mean for your brand, the Lab is here to help. You can start a conversation by reaching out to our partnerships director Ryan (ryan.miller@ipglab.com).
Apple WWDC 2025 Recap: AI Hype Seen Through the “Liquid Glass” was originally published in IPG Media Lab on Medium.

