Apple’s iPhone 6s camera makes a huge leap in quality | Engadget
Apple has completely overhauled the camera on the new iPhone 6s and iPhone 6s Plus, giving it much higher resolution and 4K video. The new iSight camera now has a 12-megapixel sensor, 50 percent more than the iPhone 6’s shooter. The new iPhones will also be Apple’s first to go beyond 1080p video with Ultra HD 4K. Apple hasn’t touched the pixel count on its last three iPhone models, which stayed at eight megapixels, despite other improvements. However, it clearly felt the need to counter rivals like Samsung’s Galaxy S6, which has an excellent 16-megapixel camera.
Apple got by with eight megapixels on the iPhone 6 and 6 Plus cameras because they reproduce colors accurately, perform well in low light and focus quickly and precisely. The new models don’t sacrifice those qualities for the extra resolution, thanks to an improved sensor and the faster A9 processor, according to Apple’s Phil Schiller. He said, “The goal was to add pixels without degrading quality,” so the sensor has not only 50 percent more pixels, but also 50 percent more focus pixels.
Meanwhile, Apple said it’s handling 4K video differently from most manufacturers by capturing every frame discretely at a full eight megapixels, rather than encoding only the differences between frames. That should result in video that doesn’t break up when the shooter or subject is moving. However, it’ll also make for very large files, so if you’re looking at a base 16GB iPhone model, 4K video could eat up your storage quickly. You also won’t see those extra 4K pixels on the new Apple TV, which is limited to 1080p HD playback.
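To put that in perspective, here’s a back-of-the-envelope storage calculation in Swift. Apple didn’t quote a 4K bitrate on stage, so the 45 Mbps figure and the free-space estimate below are illustrative assumptions, not published specs.

```swift
import Foundation

// Rough storage math for 4K recording on a 16GB iPhone.
// The 45 Mbps bitrate and the free-space figure are illustrative
// assumptions, not Apple-published numbers.
let assumedBitrateMbps = 45.0
let bytesPerMinute = assumedBitrateMbps * 1_000_000 / 8 * 60   // ≈ 337 MB of video per minute

let freeSpaceGB = 11.0    // assumed space left on a 16GB model after iOS and apps
let minutesUntilFull = freeSpaceGB * 1_000_000_000 / bytesPerMinute

print(String(format: "≈ %.0f MB per minute; ≈ %.0f minutes of 4K fill the free space",
             bytesPerMinute / 1_000_000, minutesUntilFull))
```

Under those assumptions, roughly half an hour of 4K footage would exhaust the free space on a 16GB model.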
For selfie junkies, the front-facing camera has been upgraded to five megapixels, a big jump from the iPhone 6’s anemic 1.2-megapixel camera. Apple also managed to bring a “flash” to the front camera via a clever cheat: it turns the iPhone 6s’ Retina display into an illumination system, a feature Apple calls the “Retina flash.” When you take the shot, the display lights up at three times its normal brightness, providing enough illumination for your face at arm’s length.
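Apple hasn’t detailed how Retina flash works under the hood, but the basic trick is one any app could approximate: push the screen to full brightness and cover it with a white view just before the front camera fires. The Swift sketch below shows that idea; the ScreenFlashController class and its method names are our own hypothetical ones, not an Apple API.

```swift
import UIKit

// Sketch: approximating a "screen flash" for a front-camera shot in a
// third-party app. This is not Apple's Retina flash implementation,
// just the same basic idea of lighting the subject with the display.
final class ScreenFlashController {
    private var previousBrightness: CGFloat = UIScreen.main.brightness
    private let overlay = UIView()

    // Cover the screen with a white view and push brightness to maximum.
    func beginFlash(in view: UIView) {
        previousBrightness = UIScreen.main.brightness
        overlay.frame = view.bounds
        overlay.backgroundColor = .white
        view.addSubview(overlay)
        UIScreen.main.brightness = 1.0
    }

    // Restore the original brightness and remove the overlay after capture.
    func endFlash() {
        UIScreen.main.brightness = previousBrightness
        overlay.removeFromSuperview()
    }
}
```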
Earlier in the event, Apple introduced 3D Touch on the new iPhones, and it works with images via a feature called “Live Photos.” The camera has the option turned on by default, so when you take a snap, it extends capture for a few seconds after you hit the shutter. You can then “force press” the screen on your shots to see motion and hear sound. Apple also gave its Watch some love, letting you use Live Photos as an animated watch face. Other apps, like Facebook, will eventually support the feature, too.
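That third-party support comes through the Photos and PhotosUI frameworks, which expose Live Photos to developers. Below is a minimal Swift sketch of how an app could fetch a Live Photo from the user’s library and play it back; it assumes photo-library permission has already been granted, and the function name is ours, not part of the iOS SDK.

```swift
import Photos
import PhotosUI
import UIKit

// Sketch: showing a Live Photo from the user's library in a third-party app.
// Assumes photo-library access has already been authorized.
func showFirstLivePhoto(in containerView: UIView) {
    // Find the first image asset that carries the Live Photo media subtype.
    let assets = PHAsset.fetchAssets(with: .image, options: nil)
    var livePhotoAsset: PHAsset?
    assets.enumerateObjects { asset, _, stop in
        if asset.mediaSubtypes.contains(.photoLive) {
            livePhotoAsset = asset
            stop.pointee = true
        }
    }
    guard let asset = livePhotoAsset else { return }

    let livePhotoView = PHLivePhotoView(frame: containerView.bounds)
    containerView.addSubview(livePhotoView)

    // Request the PHLivePhoto and hand it to the view; a force press on the
    // view (or startPlayback) then plays the captured motion and sound.
    _ = PHImageManager.default().requestLivePhoto(for: asset,
                                                  targetSize: containerView.bounds.size,
                                                  contentMode: .aspectFill,
                                                  options: nil) { livePhoto, _ in
        DispatchQueue.main.async {
            livePhotoView.livePhoto = livePhoto
            livePhotoView.startPlayback(with: .full)
        }
    }
}
```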
The 4K video and images we saw at Apple’s event looked great, but we’ll reserve judgment until we can try the cameras for ourselves. You can read our hands-on for both new iPhones, and don’t forget to check our liveblog for more info.