Don’t Fret – Intel Macs Will Still Get This ‘M1-Exclusive’ macOS 12 Feature

When Apple unveiled macOS 12 Monterey last month, it drew a line in the sand between its new leading-edge M1-powered Macs and those models still using Intel chips. Now, however, it appears that this line has been blurred, at least a bit.

There are at least six big new features in macOS Monterey that were expected to require an M1 Mac, but now the latest macOS 12 beta is bringing at least one of them to Intel-flavoured Macs too — Live Text in Photos.

The change comes in the fourth macOS 12 beta, released to developers earlier this week, which was quickly followed by a third public beta of the same build. There wasn’t much else interesting in the latest macOS beta — Apple briefly suggested that Universal Control had finally been enabled, but sadly, that turned out to be premature.

However, buried in the release notes was an oblique confirmation that Live Text was quietly being added to Intel Macs as well.

Apple didn’t mention the word “Intel” in the release notes at all, but “all Mac computers” naturally includes the many Intel variants that Apple has sold in the past — and still sells today.

As Rene Ritchie theorizes, Apple seems to have changed course and added Live Text “based on demand,” but it also helps that Live Text on the Mac doesn’t need to run in real time the way it does on the iPhone and iPad.

Even though most of Apple’s Macs feature a FaceTime camera, even the M1-powered models won’t let you view Live Text through the camera. This seems reasonable considering that the MacBook and iMac cameras are designed for things like video conferencing, and most users aren’t likely to point them at signs and receipts the way they would with an iPhone.

Of course, you’ll still be able to extract Live Text from pictures captured with the Mac’s FaceTime camera — you just won’t be able to do it while looking at a live preview of the image. Instead, you’ll have to load it up in Photos or Preview.

As Ritchie explains, this means that Live Text on the Mac doesn’t really need the M1’s Neural Engine. While M1-powered models will almost certainly still leverage the Neural Engine for even faster processing, Intel versions can simply handle the work “opportunistically.” It might run a bit slower on older Intel Macs, but at least it will be possible.
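Apple hasn’t said how Live Text is implemented under the hood, but its public Vision framework gives a rough sense of what still-image text recognition looks like when there’s no real-time requirement. Here’s a minimal sketch, assuming nothing about Apple’s internal code and using only the documented VNRecognizeTextRequest API, which runs on both Intel and Apple Silicon Macs:

```swift
import Foundation
import Vision

// Minimal sketch of still-image text recognition via Apple's public Vision
// framework. This is not Apple's Live Text implementation; it simply shows
// that OCR on a static photo has no real-time requirement, so it can run
// on Intel Macs as well.
func recognizeText(in imageURL: URL, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        // Keep the top candidate string for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // slower but thorough; fine when not real-time

    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    DispatchQueue.global(qos: .utility).async {
        try? handler.perform([request])
    }
}
```

Notably, nothing in that code path requires the Neural Engine; Vision decides at runtime what hardware to use, which fits Ritchie’s description of M1 Macs getting a speed boost while Intel Macs fall back to the CPU and GPU.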

Live Text in Photos

The real benefit of the new Live Text feature will be found in iOS 15 on the iPhone, where users will be able to extract text right through the iPhone’s Camera app.

You’ll be able to copy text from anything you’re looking at simply by dragging over the text in the photo preview to select it and copy it to your clipboard.

You can even use iOS’ data detectors to place a call directly to a recognized phone number, or navigate to a recognized address.
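Those data detectors are available to developers, too. As a rough illustration (the sample string below is hypothetical, standing in for OCR output), Foundation’s NSDataDetector can pull actionable phone numbers and addresses out of recognized text:

```swift
import Foundation

// Hypothetical OCR output, standing in for text recognized in a photo.
let recognizedText = "Call (408) 555-0123 or visit 1 Infinite Loop, Cupertino, CA."

// Foundation's data detectors flag actionable items in plain text.
let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .address]
let detector = try! NSDataDetector(types: types.rawValue)

let range = NSRange(recognizedText.startIndex..., in: recognizedText)
detector.enumerateMatches(in: recognizedText, options: [], range: range) { match, _, _ in
    guard let match = match else { return }
    switch match.resultType {
    case .phoneNumber:
        print("Phone number:", match.phoneNumber ?? "")
    case .address:
        print("Address:", match.addressComponents ?? [:])
    default:
        break
    }
}
```

This is presumably the same class of detector that iOS 15 runs over text you select in a photo, handing the results off to the Phone or Maps apps.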

It’s a magical feature, and it doesn’t just work in the Camera and Photos apps, either. It’s a system-level capability that should work for any photo you’re viewing, anywhere on your device. There are already a few exceptions in apps that don’t use the standard iOS photo APIs, such as Facebook and Instagram, but support there may simply be a matter of waiting for those apps to be updated.

Most significantly, Apple is doing all the processing directly on the A-series or M-series chip in your iPhone, iPad, or Mac. Your photo data never leaves your device for this purpose, and Apple’s servers don’t analyze your photos for text at all. This is similar to the on-device photo processing for things like faces and objects that Apple has been doing since iOS 10 back in 2016.

Of course, if you sync your photos with iCloud Photo Library, then your photos will be stored on Apple’s servers, but the key is that Apple doesn’t perform any computational analysis on the photos stored there. Every other service that does this kind of analysis relies on powerful cloud servers to do the heavy lifting, meaning that all the text in your photos is living in a database on that company’s server. Apple’s approach is a massive win for privacy.

However, this requirement for on-device processing is precisely why Live Text wasn’t originally going to be supported on Intel Macs. Apple presumably felt that the Intel chips weren’t up to the task, and more importantly, the code to handle this was built for Apple’s own Neural Engine, which is only found in the 2017 A11 Bionic and later A-series chips, and of course, the new M1 chip that powers Apple’s latest Macs. This is the same reason that you’ll need at least an iPhone XS/XR to use Live Text in iOS 15.

In the case of Intel Macs, however, Apple obviously realized that it could batch process photos for Live Text, rather than needing to do it in real-time. It’s not yet clear exactly how this approach may differ under the hood — it’s likely Apple is analyzing photos in the background and pre-storing the text for later — but the result is the same: there will be one more feature for Intel Mac users to enjoy when macOS Monterey launches later this year.
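If that guess is right, a batch pass could look something like the sketch below. This is purely speculative; the folder scan and in-memory cache are illustrative assumptions, not anything Apple has documented:

```swift
import Foundation
import Vision

// Purely speculative sketch of "opportunistic" batch text extraction:
// scan a folder of images in the background and cache the recognized text
// so later lookups are instant. Apple hasn't documented how macOS actually
// schedules this work; the folder scan and cache here are illustrative only.
func buildTextIndex(for folder: URL) -> [String: [String]] {
    var index: [String: [String]] = [:] // filename -> recognized lines of text
    let imageExtensions = ["jpg", "jpeg", "png", "heic", "tiff"]
    let files = (try? FileManager.default.contentsOfDirectory(
        at: folder, includingPropertiesForKeys: nil)) ?? []

    for file in files where imageExtensions.contains(file.pathExtension.lowercased()) {
        let request = VNRecognizeTextRequest()
        request.recognitionLevel = .accurate
        let handler = VNImageRequestHandler(url: file, options: [:])
        try? handler.perform([request]) // synchronous; fine for a background pass
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        index[file.lastPathComponent] = observations.compactMap {
            $0.topCandidates(1).first?.string
        }
    }
    return index
}
```

Batching like this trades latency for compatibility, which would explain why the real-time camera path remains exclusive to devices with a Neural Engine.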
