TECH

iPhone 16 exclusive AI feature coming soon to iPhone 15 Pro
It's official. Visual Intelligence is finally coming to the iPhone 15 Pro models. And it's all thanks to the newly launched iPhone 16e.
Although the iPhone 15 Pro received Apple Intelligence right alongside the iPhone 16 lineup, it notably missed out on Visual Intelligence. That’s finally set to change though. Apple has officially confirmed that it plans to bring Visual Intelligence to the iPhone 15 Pro and iPhone 15 Pro Max, with the feature possibly arriving with the stable iOS 18.4 update.
The reason Visual Intelligence has so far been missing from the iPhone 15 series comes down to its implementation: the feature could only be triggered using the dedicated Camera Control button. Since the iPhone 15 Pro series lacks that button, and Apple did not initially provide an alternative way to invoke the feature, it remained exclusive to the iPhone 16, iPhone 16 Pro, and iPhone 16 Pro Max. That changes with the iPhone 16e, which ships with Visual Intelligence despite lacking Camera Control.
So, how did Apple do it? As demonstrated during the iPhone 16e announcement, Visual Intelligence can be mapped to the Action Button and added as a shortcut in Control Center. We now have official confirmation from Apple (via Daring Fireball) that both of these new ways to summon Visual Intelligence are coming to the iPhone 15 Pro and iPhone 15 Pro Max, letting owners of those devices try out this handy Apple Intelligence feature for the first time. Apple says it will arrive "in a future software update" but won't say exactly when. It isn't live in the first iOS 18.4 beta yet, but it wouldn't be far-fetched to expect it in the final stable release.
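Apple hasn't published details of how its system-level Visual Intelligence control is wired up, but iOS 18 does let developers surface their own actions in Control Center through WidgetKit controls backed by App Intents, and the Action Button can then be pointed at such a control. The Swift sketch below is a hypothetical third-party example of that general mechanism (ScannerControl and LaunchScannerIntent are made-up names for illustration), not Apple's own implementation:

import WidgetKit
import SwiftUI
import AppIntents

// Hypothetical example: a third-party Control Center button that opens
// an app's camera-based "scanner" screen. Names are illustrative only.
struct ScannerControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.scanner.control") {
            ControlWidgetButton(action: LaunchScannerIntent()) {
                Label("Scan Object", systemImage: "camera.viewfinder")
            }
        }
    }
}

// The App Intent the control triggers; openAppWhenRun brings the app
// to the foreground so the camera UI can be presented.
struct LaunchScannerIntent: AppIntent {
    static var title: LocalizedStringResource = "Launch Scanner"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // A real app would navigate to its camera screen here.
        return .result()
    }
}

Once an app exposes a control like this, the user can add it to Control Center or assign it to the Action Button in Settings, which is broadly the same way Apple surfaces its new Visual Intelligence shortcuts to users.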
For the uninitiated, Visual Intelligence is a lot like Google Lens on Android. It uses a combination of computer vision and generative AI to quickly provide details about places, identify plants and animals, summarize or translate text, and look up items on Google, all just by pointing your camera.
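Apple hasn't said which models power Visual Intelligence, but the kind of on-device recognition involved can be approximated with its public Vision framework. The Swift sketch below is a rough, hypothetical illustration of that idea (the analyze function is made up for this example), not Apple's actual pipeline:

import Vision
import UIKit

// Rough illustration only: on-device text recognition and image
// classification with Apple's Vision framework.
func analyze(_ image: UIImage) throws {
    guard let cgImage = image.cgImage else { return }

    // Recognize any text in the frame (e.g. a sign or menu).
    let textRequest = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print("Recognized text:", lines.joined(separator: " / "))
    }

    // Classify the overall scene (e.g. "dog", "plant", "beach").
    let classifyRequest = VNClassifyImageRequest { request, _ in
        let results = request.results as? [VNClassificationObservation] ?? []
        let top = results.prefix(3).map { "\($0.identifier) (\($0.confidence))" }
        print("Top classifications:", top.joined(separator: ", "))
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([textRequest, classifyRequest])
}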
In general, the feature lets people point their smartphone's camera at an object and get information about it in real time. It was initially available only on Apple's iPhone 16 line, since the only way to activate it was the dedicated Camera Control button.
On those phones, all you had to do was press and hold the button to bring up Visual Intelligence. So even though the Pro models of the iPhone 15 line support Apple Intelligence's AI features, this particular feature was initially left out. With the recent launch of the iPhone 16e, a new approach has worked around that limitation.
Apple has added a Control Center shortcut for the feature, and the expectation is that the company will roll it out to every model that supports its AI features. Likewise, activation can be mapped to the Action Button in a future software update. Apple hasn't shared the details of that update, but it is speculated to be iOS 18.4, and with the beta already available, it may not be long before these phones gain access to Visual Intelligence.
As for Apple Intelligence more broadly, it is worth noting that Portuguese-language support arrived in the first iOS 18.4 beta, while the official stable release is expected in early April.
mundophone