iOS 27 Will Be Able to Complete Photos Using AI
Apple is set to upgrade the built-in photo editor on iOS, iPadOS, and macOS with long-awaited AI features. The iPhone will learn to expand a captured frame using neural networks, sharpen objects outside the focus area, and gain an AI color corrector, Bloomberg's Mark Gurman reports.
The neural network features will work roughly the same way as on Google and Samsung devices, with one key difference: users' photos will not be sent to the cloud.
Android flagships cannot apply AI edits locally; they send every photo to Google's servers for processing. Apple, in contrast, wants all AI enhancements to run locally, directly on the device.
This approach creates two issues. First, no matter how powerful the upcoming A20 Pro chip is, it still won't match the scale of Google's cloud computing. As a result, minor artifacts may occasionally appear when generating frames or using the eraser tool. Privacy, however, is worth the trade-off.
Second, not all devices in Apple’s lineup will be able to handle local AI computations. iOS 27 will be released for all current iPhones, including the iPhone 12 and SE 3, but the AI features will only be available on newer iPhones starting with the iPhone 15 Pro.
The wait for the Apple Intelligence announcement won’t be long: the company will unveil the new versions of iOS, iPadOS, and macOS on June 8 at WWDC26.
