Testing: LG G4 and Android Smartphone Cameras

By Norman Chan

Or, why Android cameras don't suck.

This is a follow-up to our review of the LG G4 smartphone we posted last week. I wanted to give you a better look at the photos I've been able to take with the G4, and also to elaborate on the state of Android cameras in general. The last two Android phones we've tested--the Samsung GS6 and LG G4--have produced the best photos I've seen from any smartphone, iPhone 6 Plus included. But we're still seeing stories like this Motherboard column proclaiming that Android cameras still suck. It's a hyperbolic headline, but the point of the post has merit: smartphone photo quality is a product of more than just the camera sensor; it depends on factors like optics, post-processing, JPEG compression, and even the screen you're using to view those pictures. Apple has done well optimizing its camera hardware and photo software, while Android is at the mercy of OEMs' hardware choices and in-house camera apps.

For example, the HTC One M9's harsh photo processing hurts its camera performance, but it shouldn't be considered representative of other recent flagships. Samsung and LG are role models when it comes to excellent integration of camera hardware and software--I am loving the photos I've been able to take with them. The GS6 and G4's photo processing algorithms seem to make the most of the raw image data that passes through their respective optics and sensors. A good image processing algorithm is the result of choices and tradeoffs: engineers have to prioritize factors like sharpening, tonal adjustments, compression, processing power, speed, and file size. On Android, one thing that helps is the ability to hand that algorithm the best possible image in the first place by letting photographers configure manual camera settings. It all starts in the camera app interface.

The camera app on the LG G4 is one of the most robust I've used on a smartphone. Users get almost all the settings you would find on an entry-level DSLR: white balance, exposure adjustment and lock, ISO, shutter speed, and even manual focus. That lets you effectively shoot in full manual, aperture, or shutter priority (with a fixed aperture, of course). The manual focus "dial" in the camera app threw me off a bit, since autofocus has been sufficient in most smartphones. But I liked the ability to use it for focusing on densely layered subjects like a bouquet of flowers. With the shallow depth of field offered by the camera's f/1.8 aperture, manual focus was also very useful for macro shots.
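With the aperture fixed, those manual controls boil down to the classic exposure tradeoff: any stop you give up in shutter speed has to come back in ISO. A quick sketch in Python (for illustration only--this is the standard ISO-100-referenced exposure value formula, not anything from LG's app):

```python
import math

def exposure_value(aperture, shutter_s, iso):
    """EV referenced to ISO 100: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(aperture ** 2 / shutter_s) - math.log2(iso / 100)

# Aperture fixed at f/1.8, as on the G4: halving the shutter time while
# doubling the ISO yields the same overall exposure, with less motion blur.
slow = exposure_value(1.8, 1 / 30, 400)   # 1/30s at ISO 400
fast = exposure_value(1.8, 1 / 60, 800)   # 1/60s at ISO 800 -- same EV
print(round(slow, 2), round(fast, 2))     # prints the same value twice
```

That equivalence is exactly what shutter and ISO sliders let you navigate by hand when the phone's auto mode would pick a blurrier or noisier combination.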

That level of manual control lets me evaluate the differences between camera sensors and image algorithms without being thrown off by variability in auto-exposure and auto-focus. Comparing photo detail in low-light environments, for example, isn't a matter of how grainy an image looks at its highest ISO setting. What's more important to me is how the detail in the file looks when I've taken the photo I think is appropriate for that setting. In low light, that can mean manually underexposing so light sources and reflections don't get blown out.

Take this 100% cropped-in comparison between the Galaxy S6 and G4 cameras in the same bar (GS6 on left, G4 on right). I used the "Pro mode" on the GS6 to underexpose and keep the ISO from spiking, but the lack of manual shutter control on the GS6 meant that I couldn't tell how slow the shutter was for that ISO setting. As a result, I got the level of brightness I wanted out of that photo, but at the expense of a tiny bit of blur. On the G4, I was able to coordinate exposure with ISO and shutter, and the result was a photo where you can make out more of the text in the details. Samsung's algorithm also compresses more aggressively than LG's--at the same resolution, the G4's file was twice the size of the GS6's.

Interestingly enough, across several comparison photos between the GS6 and G4, the people I showed them to thought that the GS6 pictures looked better. The G4 JPEGs may store finer detail, but people seemed to like the way the GS6's image algorithm processed color and dynamic range. Samsung's photos look a bit more saturated than LG's, and shadow detail is crushed into the blacks--a tradeoff of dynamic range for "pop." That kind of processing makes a lot of sense for Samsung's brilliant AMOLED screen, even if the photo is less faithful. Since the GS6 can't save RAW photos, we're really comparing apples and oranges, with the upshot being that the GS6's photos look fantastic on the GS6's screen, and the G4's photos look fantastic on the G4's. But if I needed to export a photo from one of these cameras to Photoshop and blow it up--to plaster on a billboard, let's say--I'd prefer a source image from the G4.

Another camera test I like to perform to evaluate image processing and compression is taking a photo of a brick wall. Those photos tax compression algorithms the most--it's difficult to preserve all the detail in the cracks and crevices when crunching a 5312x2988 photo down to 4MB. In comparing brick wall photos taken with the LG G4 and iPhone 6 Plus, I could definitely tell that the G4 was capturing more detail and saving more data. But once again, blind tests shown to friends resulted in the iPhone 6 photo being favored. It's a testament to Apple's image processing algorithm. Minute pixel detail isn't the top priority for the human eye when you're looking at a scaled photo on a smartphone or even a retina laptop screen. What people seem to respond to are edge enhancements and color saturation--attributes that aggregate to determine the "clarity" of a photo. As a side note, this brick wall test also revealed the vignetting apparent in the G4's photos--something that could have been mitigated in the image processing.
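To get a sense of how hard that compression is working, the arithmetic is worth spelling out (a back-of-the-envelope sketch; the 5312x2988 resolution and roughly 4MB file size come from the test above, the rest is standard JPEG bookkeeping):

```python
width, height = 5312, 2988
pixels = width * height                 # ~15.9 million pixels
uncompressed = pixels * 3               # 24-bit RGB, in bytes (~45MB)
jpeg_bytes = 4 * 1024 * 1024            # ~4MB file out of the camera

ratio = uncompressed / jpeg_bytes       # compression ratio
bits_per_pixel = jpeg_bytes * 8 / pixels
print(f"{ratio:.1f}:1 compression, {bits_per_pixel:.1f} bits/pixel")
# prints: 11.4:1 compression, 2.1 bits/pixel
```

Roughly two bits per pixel is firmly in lossy territory, which is why a texture as dense as brick mortar is where the quality of the encoder's choices shows up first.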

Finally, I want to address the issue of RAW photos and their relation to built-in image processing. Google's Camera2 API allows OEMs to build RAW photo saving into their camera apps; the HTC One M9, LG G4, and Nexus 6 have it enabled. These DNG files can be massive, often well past 25MB per photo, which makes them impractical to keep on your phone forever. They're also not the end-all-be-all of smartphone photography. In my tests, I found that saving in RAW and processing a photo later has the potential to drastically improve an image, but in some cases, the camera app's image algorithm yielded a better looking JPEG than a post-processed RAW pic.

Saving in RAW worked best for me in daylight, where I used the extra image data to color correct, pull down highlights, and add radial and graduated filter effects. This photo is a great example of a best-case RAW edit scenario. Where RAW photo editing on the LG G4 ended up being poor was in recovering detail in really low-light scenarios. In the comparison below, the JPEG that came out of the camera is on the left, while the edited RAW version is on the right. Yes, there's technically more detail in the RAW edit (notice the hairlines), but the RAW version also retains the discoloration of ISO noise--something that manual post-processing struggles to remove. The shadows in the in-camera JPEG may have less detail, but at least the dark areas are even. LG's camera algorithm did a better job getting rid of excess data when processing the photo than I could in Lightroom.

All of this is a long way of reminding you that camera hardware specs are just one component that goes into how a smartphone photo looks, and to give you a glimpse of how I think about photography when testing smartphones. To say that Android cameras suck is a gross oversimplification of the current state of smartphone photos; but simplification is something that Apple's cameras and algorithms do get right. If you just want to tap and snap, you're going to be able to get quality photos on the iPhone 6. I just prefer the versatility and high-resolution detail of my photos from the LG G4, which look great on the phone itself and on the desktop.

You can find the full-res test photos I took with the LG G4 in this Google Photos album.