Intro
(Intro SFX)
- Okay, what exactly is happening with the iPhone's camera?
Like we've done years of blind smartphone camera tests
in bracket format, and the iPhone,
supposedly one of the premium cameras
in the entire smartphone industry,
consistently loses in the first round.
Then we do a scientific version with 20 million+ votes
and it finishes in the middle of the pack.
"And yet, Marques, you named it the fourth time running
Best overall smartphone camera system in 2022
and gave it a trophy.
What's up with that?"
A concerning number of people have started
to notice that the iPhone camera
feels like it's taken a few steps back lately
and I agree with them.
I think we should take a closer look at this.
(relaxed music)
What is a Camera
So first of all, cameras have come a long way
to the point where smartphone cameras
aren't just cameras anymore.
See, back in the day, a camera was a sensor
that would travel around covered all the time,
and when you wanted to take a photo,
you would expose that sensitive bit
to the environment around it,
it would collect the light, and then it would close again.
Then the photo would be a representation
of how much light hit each part of the sensor.
The better the sensor, the more light information it collects,
and the better an image you can get.
These days though,
it's turned into a whole computational event.
Your smartphone sensor is sampling the environment,
not once, but often several times
in rapid succession at different speeds.
It's taking that light information,
and merging exposures.
It's doing tone mapping, noise reduction, HDR processing
and putting it all together
into what it thinks will be the best-looking image.
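Here's a toy sketch of that merge-and-tone-map idea. To be clear, this is my own illustration, not any phone's actual pipeline: the five-pixel "scene," the exposure times, and the simple Reinhard-style curve are all made up for the demo.

```python
import numpy as np

# hypothetical scene radiance: deep shadow next to a bright sky
scene = np.array([0.02, 0.05, 0.5, 4.0, 9.0])

def capture(radiance, exposure_time):
    # one frame: light * time, clipped at the sensor's full-well limit
    # (noise left out to keep the demo deterministic)
    return np.clip(radiance * exposure_time, 0.0, 1.0)

# bracketed burst: the short exposure keeps the sky, the long one lifts shadows
frames = {t: capture(scene, t) for t in (0.1, 1.0, 4.0)}

# merge: divide each frame by its exposure time to recover radiance,
# trusting only unclipped pixels, then average the estimates
estimates = [np.where(f < 1.0, f / t, np.nan) for t, f in frames.items()]
merged = np.nanmean(np.array(estimates), axis=0)

# tone mapping: squeeze the high dynamic range into display range 0..1
tone_mapped = merged / (1.0 + merged)  # simple Reinhard-style curve

print(merged)       # recovers the scene radiance
print(tone_mapped)  # everything fits in 0..1
```

No single exposure could hold both ends of that scene, but the merged estimate covers the full range, and the tone curve is what decides how it finally looks on screen.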
This, of course,
is a very different definition of a picture.
So now it's not just about having the best sensor
that gathers the most light information;
it's at the point where software
makes a much bigger difference
to the way the image looks at the end of the day
than anything else.
Like next time you watch
a smartphone reveal event, for example,
keep an eye on all the new additions that get made
and just how many of them are pure software.
Google Pixel Camera
So Google struck gold when they first
started using the IMX363 sensor
way back in the day with the Pixel 3's camera
because they got their software tuning with it just right,
and it was an instant smash hit.
So they kept using that great camera combo
in every Pixel since then.
The 3, the 3a, the 4, the 4a,
the 5, the 5a, and even the Pixel 6a.
So year after year of new phones,
same sensor, same software tuning combo
because it just worked.
If it ain't broke, don't fix it.
So when you saw the Pixel 6a
win December's scientific blind smartphone camera test,
what you saw was a four-year-old sensor
and software tuning combo that is still so good
that in a postage-stamp-sized comparison
of compressed side-by-side images,
where you can't judge sharpness
or depth of field too much,
just appreciating the basics,
this combo nailed the basics
better than anyone else.
Pixel 6 Camera
Now, when the Pixel 6 came along, stay with me,
Google finally updated its design and branding
and they finally changed to a new sensor
with this new camera system.
So they go from the tried-and-true 12-megapixel
to this massive new 50-megapixel sensor
and it kind of threw a wrench into things.
- So it looks to me that the Pixel is over-sharpening.
I think the one on the left looks too crunchy.
- The camera on the Pixel 6 does have a habit
of making things just look HDR-y.
I dunno if there's a technical term for that.
- [Dan] And if you look at all the photos,
it's clear the Pixel is still doing Pixel things.
- I think Google's still running all
of their camera algorithms at 11,
like when they don't need to anymore.
- Right now, new phones with much bigger sensors
are still processing like their smaller older ones.
Sensor Processing
- The basic principle is this:
they were doing all this processing with the old sensors
as if they were not getting a lot of light,
and then suddenly they had this massive new sensor
which was getting way more light information,
but they were still running all of that processing.
They would still do high-sensitivity stuff,
and then they'd do noise reduction,
because if you have high sensitivity,
you need noise reduction.
But then since you're doing noise reduction,
you need to do sharpening on top of that
to make up for it,
and just overall you're doing way too much,
and so the photos are overprocessed.
So this fancy new phone would come out
with a new camera system,
but you could argue, legitimately,
that the older Pixel still took better-looking photos.
Google Pixel 7
So Google had to go back to the drawing board
and make some adjustments and updates to the software
to dial in this new sensor.
It took a while, but now with the Pixel 7 out,
a full year later, with the same huge 50-megapixel sensor,
they're back on track.
And hey, would you look at that:
the Pixel 7 is right behind the Pixel 6a in the blind camera test.
So when I see iPhone 14 Pro photos
looking a little inconsistent
and a little overprocessed right now,
I see a lot of the same stuff
that Google just went through with the Pixel.
Because the iPhone story is kind of along the same lines:
they used a small 12-megapixel sensor
for years and years and years.
Then the 13 Pro sensor got a little bigger
but this year, the iPhone 14 Pro
is the first time they're bumping up
to this dramatically larger 48-megapixel sensor.
And so guess what?
Some iPhone photos this year
are looking a little too processed
And it's nothing extreme, but it's real,
and they will have to work on it.
I suspect that by the time we get
to iPhone 15 Pro, you know, a year later,
they'll have some new software stuff they're working on.
And I bet there's one new word they use on stage.
You know, we finally have Deep Fusion
and pixel-binning and all this stuff,
I bet there's one new word they use
to explain some software improvements with the camera.
But anyway, I think this will continue improving
with software updates over time
and they'll continue to get it dialed in,
and I think it'll be fine.
But that's only half my theory.
This does not explain why
all the previous 12-megapixel iPhones
also lost in the first round
in all those other bracket-style tests.
And this is a separate issue
that I'm a little more curious about
because as you might recall,
all of our testing photos have been photos of me.
My Photos
Now, this was on purpose, right?
Like we specifically designed the tests
to have as many potential factors
to judge a photo as possible.
If it was just a picture of this figurine
in front of a white wall,
the winner would probably just be whichever one's brighter,
maybe whichever one has a better gold color.
But then if we take the figurine
with some falloff in the background
now we're judging both color and background blur.
Maybe you add a sky to the background,
now you're also testing dynamic range and HDR.
So yeah, with our latest photo, it's a lot.
It's two different skin tones.
It's two different colored shirts.
It's some textures for sharpness,
the sky back there for a dynamic range,
short-range falloff on the left,
long-range falloff on the right.
I mean, with all these factors,
whichever one people pick as the winner
is ideally closer to the best overall photo.
I also wanted the pictures to be of a human,
just because I feel like
most of the important pictures that people take,
the ones they care about most often, are of other humans.
Software vs Reality
But as it turns out,
using my face as a subject for these
revealed a lot about how different smartphones
handle taking a picture of a human face.
Because as I've already mentioned,
these smartphone cameras have so much software now
that the photo that you get when you hit that shutter button
isn't so much reality
as much as it is this computer's best interpretation
of what it thinks you want reality to look like.
And each company makes different choices
and different optimizations
to make their pictures look different ways.
They used to be a little more transparent about it.
Some phones would identify
when you were taking a landscape photo
and pump up any greens they could find in the grass,
or they'd identify any picture with a sky in it
and pump up the blues to make it look nicer.
I did a whole video on smartphone cameras versus reality
that I'll link below the Like button
if you wanna check it out.
But the point is when you snap that photo on your phone,
you're not necessarily getting back a capture
of what was really in front of you.
They're bending it in many ways.
The iPhone's thing is when you take a photo
it likes to identify faces and evenly light them.
It tries every time.
And so this feels like a pretty innocent thing, right?
Like if you ask people,
"What do you think should look good in a photo?"
and you say, "Oh, I'll evenly light all the faces in it."
That sounds fine, right?
And a lot of the time it looks fine.
Shadow
But it's a subtle thing. Like in a photo
where you can clearly see the light is coming from one side,
you can see from the Pixel's camera
there's a shadow on the right side of the face.
With the iPhone though,
it's almost like someone walked up
and added a little bounce fill, (chuckles)
just a nice little subtle bounce fill.
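Just to make the idea concrete, here's a crude sketch of what a "software bounce fill" could amount to. This is not Apple's actual algorithm; the face strip, the target brightness, and the fill function are all hypothetical numbers chosen to illustrate the behavior.

```python
import numpy as np

# hypothetical face strip: one side lit (0.7), the other in shadow (0.25)
face = np.concatenate([np.full(8, 0.7), np.full(8, 0.25)])

def fill_shadows(pixels, target=0.5, strength=0.7):
    # pull only the dark pixels partway toward a target brightness,
    # leaving the already-lit side untouched
    lift = np.clip(target - pixels, 0.0, None) * strength
    return pixels + lift

filled = fill_shadows(face)
# the lit-to-shadow contrast shrinks, flattening the light direction
print(face.max() - face.min(), filled.max() - filled.min())
```

The shadowed side gets lifted while the lit side stays put, so the contrast that told you where the light was coming from gets quietly flattened.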
But sometimes it looks a little off.
Like look, this is the low-light photo test we did
from our blind camera test.
On the left is the Pixel 7 again,
which looks like all the other top dogs.
And on the right is the iPhone 14 Pro,
which finished in the middle of the pack.
It might be hard at first to see why it looks so weird
but look at how they completely removed the shadow
from half of my face.
I am being lit
from a source that's to the side of me,
and that's part of reality.
But in the iPhone's reality, you cannot tell,
at least from my face, where the light is coming from.
Every once in a while you get weird stuff like this.
And it all comes back to the fact
that it's software making choices.
And the other half of that is skin tones.
Skin Tone
So you've heard me say for a few years in a row
that I mostly prefer photos coming from the Pixel's camera,
and we've done lots of tests
where the sample photos are of me,
and you can tell they look good.
Turns out Google's done this thing over the past few years
with the Pixel camera called Real Tone.
It doesn't get that much attention,
but it's making a pretty big difference here.
Historically, a real issue for film cameras back in the day
was that they were calibrated for lighter skin tones
and people with darker skin tones
would typically be underexposed in those pictures.
So now fast forward to today: cameras are all software.
Smartphone cameras are software,
so they can all make adjustments
to account for a variety of skin tones, of course.
But they still all do it to varying degrees.
As you might have noticed,
a lot of phones sold in China
will just brighten up faces across the board,
because that's what people in that region
very often prefer in photos.
Real Tone
Google goes the extra mile
to train their camera software on data sets
that have a large variety of skin tones
to try to represent them correctly across the board.
And that's what they're calling Real Tone.
And Apple's cameras, from what I've observed,
simply like to evenly light faces across the board
and don't necessarily account
for the different white balances and exposures necessary
to accurately represent different types of skin tones,
when I think they totally could.
So basically, it turns out this is a big part
of what we were observing, with the Pixels
and a lot of the phones
that do accurately represent my skin tone
finishing higher in this blind voting thing that we did,
because they happen to do that well,
and that's a thing that people considered
when they voted on them.
I haven't said this a lot,
but I think this is one of the earliest things
I liked about RED cameras.
You know, 8K is great,
the color science is great,
but the way they represent
and render my skin tone accurately,
over a lot of, you know, the Sonys and the ARRIs
and the Canons that I've tried,
that's one of the things
that drew me to these cameras.
So all this software stuff is why photo comparisons
between modern smartphones are so hard.
There are a lot of channels that do a really good job
with the side-by-side photo test, you know,
but even as you're trying to pick one over the other,
you've probably noticed this:
you might like the way one of them
renders landscape photos over the other,
but prefer the way a different one renders photos
with your skin tone, and the way
yet another one renders photos of your pet, for example.
So I'm sure Apple will defend everything they're doing now
with their current cameras as they typically do.
But I'm gonna keep an eye on it,
and I'm also sure they're working
on tuning these new cameras, dialing them in,
and eventually getting it better
with the iPhone 15 and 15 Pro.
So back to the original question
from the beginning of the video,
we can't leave that unanswered, which is:
"All right, Marques, the Pixel 6a
won the blind scientific camera test,
but you still gave the trophy
for the best overall camera system to the iPhone,
the very 14 Pro that we've been talking about
this whole video. Why?"
And if you listened carefully, you already got it,
which is that the scientific test we did
tested one specific thing:
it tested the small, postage-stamp-sized,
you know, general exposure-and-colors thing
with a bunch of different factors,
but sharpness and detail,
with all the compression that we did, weren't tested.
Also, the speed of autofocus
and the reliability of autofocus weren't tested.
The open-close time of the camera app,
how fast and reliably you can get a shot, wasn't tested.
Also, video wasn't tested.
So the microphone quality, video quality,
speed and reliability of autofocus there,
file formats, sharpness, HDR, and all that stuff weren't tested.
Maybe someday we will test all that, but until then,
the lesson learned is that the pretty pictures
that come from the Pixel or whatever phone's in your pocket
are partly photons, but primarily processing.