
Tim Cook and Craig Federighi at Apple WWDC 2025. (Image: Jason Hiner/ZDNET)
After the WWDC 2025 keynote, it’s official — Apple is going in a different direction on AI than the rest of the industry.
There were no outlandish promises or predicted breakthroughs, unlike last year’s moonshots on “personal context” and a reimagined version of Siri. And there was very little mention of chatting with bots at all.
Also: The 7 best AI features announced at Apple’s WWDC that I can’t wait to use
Instead, Apple is taking the less speculative, more established strengths of large language models (LLMs) and integrating them piece by piece into the iPhone and other devices, often without even needing to mention the word AI.
An AI for iPhone (and other Apple devices)
First and foremost is Apple’s Live Translation feature. Language translation is one of the things that LLMs do really well. In most cases, you have to copy and paste into a chatbot or use a language app like Google Translate to take advantage of those superpowers. In iOS 26, Apple is integrating its Live Translation features directly into the Messages, FaceTime, and Phone apps so that you can use the feature in the places where you’re having conversations.
Also: Apple’s secret sauce is exactly what AI is missing
Next, there’s Visual Intelligence. Apple will now let you use it from any app or screen on your phone by integrating it directly into the screenshot interface. With iOS 26, Visual Intelligence can recognize what’s on your screen, understand the context, and recommend actions. The example shown in the keynote was an event flyer: take a screenshot of it, and Visual Intelligence automatically creates a calendar event from it.
This is actually a step toward an AI agent, one of the most popular — and sometimes overhyped — tech trends of 2025. I’m looking forward to trying this feature and seeing what else it can do. I’ve had good luck with the Samsung/Google version of a feature like this, called Circle to Search. Another new thing Visual Intelligence will let you do in iOS 26 is ask ChatGPT questions about what you’ve captured on your screen. Visual Intelligence can also take the text captured in your screenshot and read it aloud to you or summarize it.
Another of the LLM capabilities that’s been enhanced this year can be seen in Shortcuts, which can now tap into the Apple Intelligence models. For example, you can create a Shortcut that takes any file you save to the Desktop in macOS 26, uses Apple Intelligence to examine the contents of the file (while preserving privacy), and then categorizes it and moves it into one of several folders you’ve named for the kinds of things you work on. You can even automate this to run every time you save a file to the desktop, which again makes this more like an AI agent.
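For developers who want to build something similar outside of Shortcuts, here is a rough Swift sketch of the same idea, assuming the Foundation Models framework Apple opened up to developers this year (covered later in this piece); the folder names and the prompt are placeholders of my own, and the Shortcut itself requires no code at all.

    import Foundation
    import FoundationModels

    // Ask the on-device Apple Intelligence model which folder a newly saved
    // file belongs in. The folder names and the prompt are placeholders.
    func folderName(for fileURL: URL) async throws -> String {
        let contents = try String(contentsOf: fileURL, encoding: .utf8)
        let session = LanguageModelSession()
        let prompt = """
        Classify this document into exactly one of these folders:
        Invoices, Travel, Projects. Reply with the folder name only.

        \(contents.prefix(2000))
        """
        let response = try await session.respond(to: prompt)
        return response.content.trimmingCharacters(in: .whitespacesAndNewlines)
    }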
Also: Apple’s Goldilocks approach to AI at WWDC is a winner. Here’s why
One more way Apple is tapping into LLMs can be seen in the new functionality behind the Share button in iOS 26. For example, you can now take a list of things from a PDF or a web page in Safari, select the text, tap the Share button, and then select the Reminders app. Apple Intelligence will use its generative models to analyze the list and turn each item into a to-do in the category you choose in the Reminders app. If it’s a long list, you can even have the AI break it into subcategories for you, again using the natural language processing (NLP) capabilities of LLMs.
Apple’s AI for developers
Lastly, while most of the leading players in generative AI typically offer both a chatbot for the general public and a coding companion for software developers — since those are two of the things LLMs are best known for — Apple didn’t say much about either at WWDC 2025. With thousands of developers on the campus of Apple Park, it might have seemed like the perfect time to talk about both.
But all Apple would say about the next version of Siri — which has long been in need of a re-think — was that it’s still working on it and that it won’t release the next Siri until it meets the company’s high standards for user experience.
And when it comes to programming companions, Apple did not unveil its own coding copilot for developer tools like Swift and Xcode — after promising Swift Assist at last year’s WWDC. Instead, Apple made a couple of big moves to empower developers. It opened up its own Foundation Models framework to allow developers to tap into the powers of Apple Intelligence — with as little as three lines of code in Swift, Apple claims. Plus, it all happens on-device and at no cost.
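To give a sense of what those three lines look like, here’s a minimal Swift sketch in the session-and-prompt style Apple demonstrated; the prompt text is my own placeholder, and the exact API surface may differ from what ships.

    import FoundationModels

    // Ask the on-device Apple Intelligence model for a single response.
    // The prompt text is just a placeholder.
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Suggest a packing list for a weekend camping trip.")
    print(response.content)

Because the call runs on Apple’s on-device models, there’s no server round trip and no cost to the developer.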
Also: How Apple just changed the developer world with this one AI announcement
And in Xcode 26, Apple will now allow developers to use the generative coding companion of their choice. ChatGPT is integrated by default in Xcode 26 but developers can also use their API keys from other providers to bring those models into Xcode. Apple sees this as a fast-moving space and wants developers to have access to the latest tools of their choice, rather than limiting developers to only stuff built by Apple.
All in all, Apple is making a lot of pragmatic choices when it comes to AI, leaning into the things that LLMs do best, and simply using generative AI to make better features on its phones, laptops, and other devices.
Keep up with all the latest AI developments from Apple and the rest of the AI ecosystem by subscribing to ZDNET’s free Tech Today newsletter.