Apple unveils new AI features at WWDC 25
Liquid Glass design, Apple Intelligence APIs, and visionOS 26 tools highlight Apple’s 2025 State of the Union for developers.
Apple has yet to deliver on some of the key technology advances that could modernize developers' apps for the AI era.
Follow along with the Gizmodo crew as we unpack everything Apple announces at its annual developer conference in Cupertino, Calif.
Striking a balance between speed and caution, and between ambition and realism, is difficult. Apple may have just nailed it.
Steve Jobs might have loved Liquid Glass; the battle to keep the iPad from turning into a Mac seems to be over; and other takeaways from a keynote that wasn’t so short on news after all.
Apple is seen as behind the pack in AI, and the company had a chance to change that perception with its WWDC keynote on Monday. But shares fell, suggesting investors weren't too impressed with what the company previewed.
We’re getting Live Translation in iOS 26 across a number of apps, improved Visual Intelligence that can now read your screen, Call Screen and Hold for You in the Phone app, and an AI-supercharged Shortcuts app.
Apple announced a slew of artificial intelligence features on Monday, including opening up the technology underlying Apple Intelligence to developers, in a modest update to its software and services that lays the groundwork for future advances.
Users will be able to run the AI models either on device or via Private Cloud Compute to generate responses that feed into the rest of their shortcut.
Apple unveiled a major advancement in its artificial intelligence efforts with the launch of the Foundation Models Framework.
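For developers, the Foundation Models framework surfaces the on-device Apple Intelligence model as a Swift API. The snippet below is a minimal sketch of a prompt round-trip, assuming the LanguageModelSession type and async respond(to:) call previewed in Apple's WWDC25 developer materials; the summarize function and its instructions string are illustrative, not from Apple's documentation.

```swift
import FoundationModels

// Sketch: send a prompt to Apple's on-device foundation model.
// Assumes the LanguageModelSession API surface shown at WWDC25;
// no network call or API key is involved because inference runs locally.
func summarize(_ text: String) async throws -> String {
    // A session holds the instructions and conversation state for the model.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    // respond(to:) performs the generation and returns the model's reply.
    let response = try await session.respond(to: text)
    return response.content
}
```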