5 new AI-powered features that flew under the radar at Apple’s launch event




ZDNET’s key takeaways

  • Apple’s event features hidden AI updates across its products.
  • The company is going against the grain of other smartphone makers.
  • These subtle features streamline the user experience across Apple's devices.

As an AI reporter, I have covered every major smartphone launch event in the past year. The line between hardware and software is blurring with each release, and new AI features have become just as noteworthy as the hardware from one company's launch to the next. This week, Apple took a quieter approach to embedding AI in its products. 

Also: Apple iPhone 17 event recap: Reactions to iPhone Air, Apple Watches, AirPods Pro 3, more

After a streak of overpromising and underdelivering in the AI space, Apple went back to basics with its new product drops. The focus of the new smartphones, watches, and AirPods was on the new hardware, including better specs across cameras, battery life, form factor, and more.

But, if you were paying close attention, AI was responsible for some of the newest, most exciting experiences on the devices — even though they weren’t necessarily presented as Apple Intelligence features. 

To help you catch up on what you may have missed, take a look at the roundup of releases below, ranging from the most obvious to the least. 

1. Live Translation in AirPods 

Ironically, the biggest AI upgrade didn't land in the iPhone but rather in the new AirPods Pro 3. Live Translation brings the real-time translation capabilities Apple announced for iOS 26 at WWDC back in June to AirPods. As the name implies, when the feature is activated, users can take part in free-flowing, natural conversations and hear a live translation of what the other person is saying in their ears. Users who don't have AirPods can follow along with a live transcription on their iPhone screen. 

Also: The iPhone 17 lineup arrived with higher price tags – are tariffs to blame?

ZDNET's Jason Hiner demoed the feature live from Apple Park, and he shared with me that he was impressed; the experience exceeded his expectations. In general, LLMs have a deep understanding of language and how people actually speak, which makes them particularly good at translating speech accurately, not just literally but with additional context. That makes this integration likely to be a genuinely helpful use case for AI.

2. Center Stage for your selfies 

With the Center Stage feature, you no longer need to flip your phone to get a horizontal selfie. You can toggle the orientation manually, or the camera can use AI to automatically widen the shot when it detects a group of people and rotate from portrait to landscape to get everyone in the frame. It isn't the flashiest feature, but it's one of those small conveniences that will make daily use a little easier.

3. Hypertension notifications

One of the biggest announcements from the event was the addition of hypertension notifications to the Apple Watch Series 11 and Apple Watch Ultra 3. With the feature, your Apple Watch can alert you when it detects signs of chronic high blood pressure, also known as hypertension. While this is by no means a standalone AI feature, Apple did share that it was developed using advanced machine learning, along with training data from multiple studies. It's neat to see Apple leverage AI to build a feature that can genuinely help people in their everyday lives.

4. New Photographic Styles filter 

Apple’s iPhone 16 lineup featured a new version of Photographic Styles, which allowed users to scroll through different “styles” or subtle filters that adjust the colors and tone in a photo before taking it. Now, according to Apple, there is a new Bright style in iOS 26 that can “brighten skin tones and apply a pop of vibrance across the image.” 

Also: Why I’m breaking the 5-year iPhone upgrade cycle – and I’m not alone

The company said that these Photographic Styles are powered by the Apple Neural Engine, meaning some AI underpins the technology. 

5. Updated Photonic Engine

The iPhone 17 Pro and Pro Max have an updated Photonic Engine, the computational photography pipeline behind how the phone processes your photos. Apple said that “the image pipeline uses more machine learning to preserve natural detail, reduce noise, and significantly improve color accuracy.” Similar to the Center Stage feature, this is another example of how AI can subtly make your everyday phone experience easier. 

Bonus: new chipsets 

Beyond new features, Apple unveiled new chipsets to power its devices across the board. Although this may sound less flashy than perhaps an upgraded Siri, it lays the groundwork for more advanced AI features.

Also: Every iPhone that can be updated to iOS 26 (and when you can install it)

For example, the new A19 and A19 Pro chipsets feature GPUs with Neural Accelerators built into each core, which should enable the devices to run more powerful generative AI models as seamlessly as possible. Another prime example is the S10 chip in the Apple Watch SE 3, which brings on-device Siri processing to the budget model for the first time. 



