by Jason Snell & Dan Moren
Since the near-simultaneous arrival of the iPhone 8 and iPhone X in 2017, Apple has been on a mission to split the iPhone product line into two distinct sets of models: a more expensive set that incorporates all the cutting-edge features Apple can dream up, and a set that trades some of that high-tech flash for affordability.
This year’s iPhone 14 and 14 Pro feel like the final resolution of that mission. The iPhone 14 is a very mildly updated version of the iPhone 13, down to it being powered by last year’s A15 processor. But while the iPhone 14 has stood (almost) still, the iPhone 14 Pro has rocketed further ahead. The result is that Apple’s new iPhones for the fall of 2022 are more distinct from one another than ever before.
That’s a good thing for Apple, because the more users are tempted to splurge on the extra features, the more money Apple makes. But I’d argue it’s good for potential buyers too: if you’re choosing to spend more on the pricier phone, you want to know what you’re getting for your money. And if those features don’t impress, you can save it.
There’s not much to say about the iPhone 14. In the past two years, Apple seems to have put most of its effort into separating the Pro line from the base model. I actually preferred the iPhone 12 to the iPhone 12 Pro, because their specs were similar and the cheaper models looked better. I even bought myself an iPhone 13 Mini, despite the limited camera and the iPhone 13 Pro’s addition of a ProMotion display.
But the iPhone 14 is, in terms of specs and appearance, just the iPhone 13. This is the year where Apple has paused the base-model iPhone so that the iPhone Pro models can gain separation. Yes, Apple has taken this opportunity to do a structural redesign that makes the iPhone 14 the most repairable iPhone ever. That’s great news—not just for users, but for Apple, which has to perform in- and out-of-warranty repairs on the thing.
But beyond that, there’s not much here that’s new. The new Photonic Engine image-processing pipeline promises improved low-light camera performance. (I admit to being a bit baffled about what keeps this new feature from coming to phones with almost identical hardware, like the iPhone 13 Pro.) There’s a new front-facing camera with autofocus, and Cinematic Mode now supports 4K video at up to 30fps, with a 24fps option. You can also unlock the iPhone via Face ID in landscape orientation, at long last.
But really, the big new features involve safety: Crash Detection, which uses improved sensors to determine if you’ve been in a car crash and automatically call for help; and Emergency SOS via satellite, which aids in getting a message out if you’re in trouble and don’t have cellular service. I was unable to test the latter feature, since Apple hasn’t turned it on yet, and I decided not to test the former because I’m trying to avoid car crashes, thank you very much.
And really, the most notable thing about the iPhone 14 product line is that there’s now a larger model, the iPhone 14 Plus, replacing the old iPhone mini size. I’m sad that the mini is gone, because I love mine, but I get it—people like big phones. The iPhone 14 Plus isn’t yet available for sale, so I don’t have any observations to make about it—other than to say that since it’s more or less the same size as the Pro Max phones have been the last few years, there aren’t likely to be very many surprises.
Still, does it matter that the iPhone 14 is a bit of a snooze? It doesn’t, because most people who will consider it will be coming from the iPhone X or XS or XR or 11 or 12. When you roll a bunch of years’ worth of upgrades together, the upgrades get very nice indeed. But this year, Apple’s creep toward the future was at a snail’s pace, at least on the iPhone 14.
Now here’s the main event. While the iPhone 14 was sleeping, the iPhone 14 Pro leaped forward. In true Apple fashion, this is an iPhone that’s awfully familiar on the outside (this is the third year for this external design) but compensates with a whole host of internal upgrades. While a handful of features span the entire line—the changes to the camera pipeline, Crash Detection, and Emergency SOS via Satellite—almost everything that’s new in the iPhone this year is in the iPhone 14 Pro and Pro Max.
When you compare the iPhone 14 to the iPhone 14 Pro, you can see Apple’s hardware-differentiation strategy in action. The iPhone 14 Pro has a superior screen, processor, and camera—a trio that goes to the core of the smartphone experience.
But there’s one other place where Apple is differentiating between the models, and it’s a curious one, one that combines hardware and software. It’s Apple taking a hardware liability—the requirement that a phone with an essentially edge-to-edge display needs to have cutout areas for its forward-facing sensors to peek through—and adding software to turn it into a feature.
Apple didn’t want you to think about the notch.
The blank space at the top of the iPhone X’s otherwise nearly edge-to-edge OLED display was a necessary compromise for Apple, but leaving a blank area to house its TrueDepth camera stack still had to hurt. Apple made the best of it—putting status-bar information to the left and right—but the best it could do was really to not give it an official name, pretend it wasn’t there, and hope it would fade into the background.
Which—let’s be honest—it did. I never think about the notch. Over five years it just became the shape of the iPhone screen, and I otherwise ignored it.
Apple, meanwhile, has been plotting its next move, which it has made with the iPhone 14 Pro. It’s hidden the proximity sensor (which locks out the touchscreen and dims the display when you hold the phone up to your ear) under the display glass itself, and shrunk the rest of the sensor stack (the front-facing camera, plus an infrared dot projector and camera for Face ID).
Apple could have asked itself what it wanted to do for its next move and answered, “make an even smaller notch.” Instead, it decided to make two small cutouts and surround them with a new iOS interface element. Apple calls it the Dynamic Island, and the mere act of giving it a name speaks to what Apple wants iPhone users to do: Gaze into the Dynamic Island. Tap on it. Tap and hold it. Glance at it.
It’s a bold move to turn a liability into an asset by drawing attention to an area that’s partially incapable of displaying anything but blackness. The Dynamic Island dances, expanding down and out and performing some impressive gymnastics to make it seem like those sensor cutouts don’t really exist and are just part of the black blob’s ever-shifting identity.
It’s also an example, I think, of Apple’s unique position as the designer of its own software and hardware. The Dynamic Island is not just an iOS feature cooked up in order to enable a new hardware feature, like an addition to the Camera app—it’s a significant evolution of iOS, a whole new concept for displaying ongoing background information, prompted by a specific issue in iPhone hardware.
Apple could’ve taken the easy way out, but instead, it used that small hardware problem as a catalyst to shake up iOS. It’s a bold strategy. I can’t wait to see how it works out.
Unfortunately, judging the Dynamic Island today is not really possible, as it’s a half-finished feature. Some of Apple’s apps support it, and audio apps and VOIP apps that use Apple’s standard APIs get support right out of the box in iOS 16. If I play a podcast in Overcast and then swipe up to the lock screen, a tiny thumbnail of the podcast appears on the left side of the wide oval of the Dynamic Island, and a tiny live waveform representing the audio, color matched to the podcast art, appears on the right side. Tapping the Island takes me back to Overcast; tapping and holding brings up playback controls like the ones you’ve seen before on the iOS lock screen.
In the last few years, Apple has been inventing a lot of new ways to display temporary notifications that aren’t part of Notification Center, and it seems that all of them are being unified into the Dynamic Island. If someone calls you, the Dynamic Island expands to provide that information and let you answer. AirPods connect and display themselves in the Dynamic Island, rather than sliding down from the top of the screen and then popping back up. Flipping the ring/mute switch, connecting to a charger, Focus Mode changes, AirDrop, Personal Hotspot status, navigation directions if you’re actively using turn-by-turn directions in Apple Maps—these events all had different ways of informing you that they occurred, and now they’re all spawned out of the Dynamic Island. (But not Siri, for some reason.)
It’s a good start. But there’s another major piece of the Dynamic Island story that doesn’t exist yet. In June, Apple introduced something called Live Activities, which were explained at the time as a way to keep users updated on ongoing events without sending a hail of notifications. It was a welcome feature, and everyone nodded along knowingly, thinking of all those rumors of an iPhone with an always-on display that would make that lock screen even more valuable real estate.
All that was true, but Live Activities also drives the Dynamic Island. Sports apps will be able to display live scores in the Island. Ride-hailing apps can display the current status of your approaching car. Food delivery apps can let you know where your pizza is. The list will undoubtedly go on and on and on.
While the Dynamic Island is here today, it feels a little empty without the rich support of third-party apps. So much so that I think it’s not really fair to issue a final judgment on the Dynamic Island until we can see how it works with a thriving set of apps that support it.
It feels like Apple has taken a conservative approach with the Dynamic Island—which is not what I expected to think about a dynamic animated blob that permanently lives in the top portion of my iPhone. But in its normal state, the Dynamic Island is pretty low-key. Apple could have opted to take up more screen space and display interactive controls, but it chose to hide that under a tap-and-hold gesture. (It also feels like Apple has the interaction gestures backward—I’d prefer tapping to quickly reveal interactive controls and tapping and holding to pop the entire app open.)
I think erring on the conservative side is probably the right call, given that the Dynamic Island is already a pretty large change to the iPhone interface. It’s taken me some time not to see the dancing waveform of my background music as a distraction, though eventually I got used to it.
For most users, I expect there will be some adaptation time as we begin to grapple with the idea that the iPhone is no longer a one-app-at-a-time device. The Dynamic Island is single-screen multitasking done on the smallest of scales, but it is multitasking nonetheless. I think it’s a necessary step, since so much is happening in the background on our iPhones, but it does add clutter, and Apple seems to want to keep that clutter to a minimum for now.
The Island design language itself can best be described as “whimsical.” It really is sort of like a cartoon character, with sharp animations that make it feel elastic and alive. It will literally bump against other interface elements, like the time, and shake as if it’s made contact with a physical object. It looks especially impressive on the iPhone Pro’s sharp display, with its ProMotion frame rate and a special subpixel anti-aliasing algorithm.
There’s a lot of room for evolution here. With the exception of the cutout space, the Dynamic Island can be whatever Apple makes of it. Starting slow and restrained is probably a good idea, but I think I’ll be disappointed if this is all the Dynamic Island ever is. It needs to grow and evolve, and based on what Apple learns in the next few months as people pick up their iPhones and as new apps come to support Live Activities, I hope to see some new features in iOS 17.
“Why is your phone still on?” my daughter said to me as we sat at dinner. She’s 20, and her phone is probably her most important possession. And yet when she got a glimpse of the iPhone 14 Pro’s new always-on display, she wasn’t impressed—she was put off.
It definitely takes some getting used to. I’m still not entirely over the knee-jerk experience of being concerned that I’ve accidentally left my phone on and need to rectify the problem. But once I get past that, I ask myself: What does leaving the iPhone’s screen on actually accomplish?
Everyone’s iPhone lock screen is going to be different—especially given all the lock-screen customization settings in iOS 16. Some people just want a pretty picture to look at, and the always-on lock screen will supply it, though most of my pictures were dimmed to the point where I could barely make anything out.
Other people might see it as a source of glanceable information, and on that front, Apple has helped out with the iOS 16 addition of lock-screen widgets. Unfortunately, you can only add a handful of widgets in a single block, plus a single text customization above the time. (So much for density. Another line of widgets would be welcome.)
As with the Dynamic Island, the lack of Live Activities support blunts the impact of the always-on display. With a Live Activity on your lock screen, you could monitor the progress of a baseball game or your incoming Lyft ride without waking your phone up. We’ll have to wait and see how that experience changes the screen in future months.
At present, however, I have to say that the always-on lock screen feels… inessential. It’s a nice idea, but I have yet to feel gratified that I was able to glance at my iPhone and see something without reaching for it. It displays the time nice and big, which would be great if I didn’t wear a watch or find myself surrounded by other gadgets that also can tell me the time. iOS 16’s move to roll notifications up into a tighter bundle and place them at the bottom of the screen is a great organizational choice, but it also eliminates the whole idea of just glancing to see what’s come in since the last time I checked.
Beyond the ability to stay on all the time, the iPhone 14 Pro’s display has been upgraded to be brighter. HDR content can now peak at 1600 nits, which translates to noticeably greater dynamic range. It’s an impressive, high-contrast display that I notice every time I look at HDR photos or videos. Apple says the iPhone display will now peak at 2000 nits in outdoor settings, an admirable goal, since it’s often hard to see iPhone content—especially the nuances of HDR—in a bright environment. That said, the iPhone’s OLED screen still seems to dim after a while when it’s used outside, especially in hot conditions. Too often, I’m left trying to read my iPhone screen outdoors while wearing sunglasses, and it’s suddenly much less bright than it should be.
The biggest single hardware upgrade in the iPhone 14 Pro is the main camera, which now has a 48-megapixel sensor, four times the pixels of the iPhone 13 Pro. Apple has for years said (accurately) that counting megapixels is not enough when it comes to measuring the quality of a camera, and the 12MP camera in the iPhone 8 is indeed a far cry from the 12MP camera in the iPhone 13 Pro.
True to its word, Apple has taken its flashy 48MP sensor and made its default mode… a 12-megapixel image. The idea is that Apple’s new “quad-pixel sensor” allows it to gather light from four separate pixels and then combine them to create a 12MP image with superior results, especially in low-light situations. And yes, I saw much less noise in images generated in 12MP mode.
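The quad-pixel idea is easy to demonstrate. Here’s a minimal numpy sketch—my own illustration of generic 2x2 pixel binning, not Apple’s actual pipeline—showing how averaging each 2x2 block of a noisy sensor array yields an image with a quarter of the pixels but measurably less noise:

```python
import numpy as np

def bin_quad_pixels(sensor: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of an (H, W) sensor array into one pixel."""
    h, w = sensor.shape
    # Split rows and columns into (block, within-block) pairs,
    # then average over the two within-block axes.
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A "48MP-style" frame at toy scale: 8x8 pixels of flat gray
# plus simulated sensor noise, binned down to 4x4.
rng = np.random.default_rng(0)
frame = 100.0 + rng.normal(0.0, 10.0, size=(8, 8))
binned = bin_quad_pixels(frame)

print(binned.shape)  # (4, 4) -- a quarter of the pixel count
# Averaging four samples per output pixel reduces the noise, which is
# the whole trade: resolution exchanged for cleaner low-light images.
print(frame.std(), binned.std())
```

Each output pixel averages four independent noise samples, so the noise standard deviation drops by roughly half—the same statistical logic behind Apple’s claimed low-light gains.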
But Apple’s decision is still somewhat puzzling. While you can get a 48-megapixel image out of the iPhone 14 Pro, you have to do it by turning on RAW capture in the Settings app. These RAW captures are slow—it takes a second or more for the camera to be available to take another shot after you snap one—and they’re huge (80 to 100 MB each). But they are also, especially in bright light, spectacularly detailed. Yes, they can be a little noisy, but with a little work in a RAW photo editor (I used Adobe Lightroom Classic), I was able to make great-looking images with levels of detail the likes of which I’d never been able to get out of an iPhone before.
I get that 48MP images are overkill for most snapshots. I get that the files are huge. I get that the 12MP images using the quad-pixel method are less noisy and superior in lots of lighting conditions. I understand Apple’s decision to make quad-pixel shooting the default. But I’m a little surprised that shooting a 48MP HEIF image isn’t an option, and that enabling the 48MP camera requires a visit to Settings.
Perhaps it really just comes down to Apple not trusting iPhone users to use the power of the 48MP image wisely. But I’d rather Apple use its vaunted combination of software and hardware to do something clever, like offering 48MP shooting when the lighting is so good that the benefits of quad-pixel shooting aren’t particularly needed.
There is one mode where Apple allows any user to bypass quad-pixel shooting without any settings changes: the new 2x mode, which literally uses the middle 12 megapixels of the 48MP sensor. The result is a “zoom” that’s made entirely of native pixels without the benefit of the quad-pixel approach. It’s a great added tool in framing shots, essentially like getting a fourth camera. It also shows that lots of images right off the 48MP sensor look great—which is why I think it’s too bad that Apple shied away from giving more direct access to the sensor.
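The 2x mode is conceptually just a center crop of the full sensor. A toy sketch—my own illustration, not Apple’s code—of how cropping the middle half of each dimension keeps a quarter of the pixels at native resolution, which is the 12MP-from-48MP math described above:

```python
import numpy as np

def center_crop_2x(sensor: np.ndarray) -> np.ndarray:
    """Return the middle half of each dimension of an (H, W) array."""
    h, w = sensor.shape
    # Keep rows h/4..3h/4 and columns w/4..3w/4: a quarter of the
    # pixels, covering half the field of view in each direction.
    return sensor[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]

frame = np.arange(64, dtype=float).reshape(8, 8)  # stand-in 8x8 sensor
cropped = center_crop_2x(frame)
print(cropped.shape)  # (4, 4): native pixels, 2x "zoom" by cropping
```

Because no binning or upscaling is involved, every pixel in the crop is a real sensor pixel—which is why the 2x mode is a useful peek at what the bare 48MP sensor can do.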
I didn’t spend a lot of time testing the new Action Mode for video, which is like an ultra-aggressive version of Apple’s standard image stabilization. But what I saw, I liked—it shoots video that looks like you’re using a gimbal, even though you’re not. It’s not something you’d want to shoot with unless you’re running around and bouncing the camera, but I was impressed with the output. (Again, I have to ask: shouldn’t Apple be able to sense that you’re shooting video while bouncing around and activate Action Mode, at least as an option?)
The bottom line is that the iPhone 14 Pro camera is remarkable, but I think most regular iPhone users won’t really take advantage of its most impressive feature. If you’re the kind of person who takes pride in your amateur photography but has never shot in RAW because it seems intimidating and overkill for your needs… it may be time to embrace shooting RAW, because the extra detail of the camera’s 48MP mode is worth it.
For some, one of the highlights of the fall is the unveiling of Apple’s latest chip architecture. In 2020, the cores in the A14 were also the cores around which the M1 line of chips was designed. In 2021, we got the A15—and the M2 processor uses those cores. It’s possible (though not guaranteed) that the A16 processor added to the iPhone 14 Pro (but not the iPhone 14) will be the basis of future iPad and Mac processors as well.
The last few years, Apple’s focus seems to have been more on designing those Mac variations while offering some incremental progress in processor speed from year to year. As Apple itself pointed out, it’s so far ahead in terms of smartphone chip speeds that it’s lapped the competition—it can afford to take its foot off the gas for a year or two while it works on advancing its game in other areas. I also suspect that the silicon brief for the iPhone 14 Pro was more about improving the image pipeline for that 48-megapixel camera sensor.
In any event, I can report that Geekbench found the A16 faster than the A15: by about 10 percent in single-core tests, 15 percent in multi-core tests, and 7.3 percent in GPU-based tests. This is more or less in line with recent year-to-year improvements to the line. The Apple processor machine just seems to keep pushing things forward, 10 percent at a time.
In the last few years, there have been numerous instances where Apple’s hardware effort seems to be a bit ahead of its software team. You’d think that for a product as important to Apple as the iPhone, things might be a little more in sync, but the iPhone 14 and 14 Pro have shipped without one of their highlight features (Emergency SOS via Satellite), and the banner feature of the iPhone 14 Pro—the Dynamic Island—is going to spend weeks, if not months, hamstrung by Apple’s failure to ship the Live Activities API in iOS 16.0.
Those features will get there, eventually. The Emergency SOS feature will almost certainly save lives and reassure iPhone users that they’re never truly out of contact in case of an emergency. The Dynamic Island has had a promising start, and if Apple keeps working on it, I think it could be a major addition to iOS. The new iPhone Pro camera is spectacularly good—but only the most committed iPhone photographers will ever really know how good.
If it’s time for you to upgrade your iPhone, you will be happy with the iPhone 14 Pro. If you’re a dedicated iPhone photographer, the 48-megapixel camera in RAW mode is probably worth serious upgrade consideration. If you are curious about the future of iOS interfaces, the Dynamic Island and always-on display are likely to eventually spread everywhere, but are making their debut here.
While the iPhone 14 is almost being held in stasis, the iPhone 14 Pro feels like the first stirrings of an entirely new generation of iPhone hardware and software. That new generation will probably start in earnest in 2023, but if you want a taste of the future now—or just want to shoot some of those sweet high-resolution images—the iPhone 14 Pro will give you the chance.