One of my biggest pet peeves in consumer tech writing is the phrase "tack sharp" when used in camera reviews. Those words alone don't qualify anything, and they make me think that whoever conducted the review just took a bunch of random photos, zoomed in on them in a desktop image viewer, and gave them a cursory evaluation. To be fair, testing the image quality of a camera and lens system isn't easy, even with the best of intentions. That's the topic of this technical blog post by LensRentals founder Roger Cicala. Cicala, who last month wrote about the subtle optical variations in different camera lenses, wanted to call attention to the validity and accuracy of lab tests run by camera review sites. Each type of test has its own strengths and weaknesses, but more importantly, how a test is conducted can vary between reviewers and affect the interpretation of results.
The post addresses the two predominant types of lens tests in use: Optical Bench Testing and Computerized Target Analysis. Optical Bench Testing, as its name implies, requires an optical bench: a mechanical rig in which the lens is attached to a computer-controlled actuator and run through a series of programmed optical measurements. It's precise, repeatable, and reliable, but an optical bench system can cost tens of thousands of dollars. Cicala also notes that it only tests a lens focused at infinity (not as useful for macro lenses) and doesn't test a complete camera system (body and lens).
Computerized Target Analysis is the technique used by reputable camera review sites like Imaging Resource and DPreview, both of which are far more thorough and reliable in their testing than any general technology review site. For this type of test, reviewers run sample photos through special (and expensive) analysis software, Imatest and DxO Analyzer, to evaluate sharpness at different focus distances and in different parts of the image (e.g., how sharp the corner of a photo is when the camera system is focused on the center of the frame). Because the proprietary software isn't completely transparent about how it evaluates images, lens reviews using different software can't be directly compared. Says Cicala:
In practice, though, Imatesters may not agree nearly as much as DxO testers. Why? Because Imatest’s flexibility allows much more variation. There are several different test charts you can use (and depending on the chart, what is defined as the corner of the image differs a lot). You can analyze RAW files or jpgs (which have some in-camera sharpening applied). Imatest gives vertical and horizontal resolution separately, indicating astigmatism, and the reviewer can average the numbers, use the higher number, or lower number, etc.
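That RAW-versus-JPEG caveat is worth dwelling on, because even mild in-camera sharpening measurably inflates a resolution score. Below is a rough Python sketch of an edge-based MTF50 measurement, the kind of single-number sharpness summary these tools report. This is not Imatest's actual pipeline; the blur sigma, kernel size, and sharpening amount are all invented illustrative values:

```python
import numpy as np
from math import erf, sqrt

def mtf50(edge):
    """Spatial frequency (cycles/pixel) where the MTF of an edge profile drops to 0.5."""
    lsf = np.diff(edge)                       # line spread function
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                             # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=1.0)
    i = np.argmax(mtf < 0.5)                  # first frequency bin below 0.5
    # linear interpolation between the bracketing bins
    return freqs[i - 1] + (0.5 - mtf[i - 1]) * (freqs[i] - freqs[i - 1]) / (mtf[i] - mtf[i - 1])

# A synthetic edge blurred by a "lens" with sigma = 2 pixels (arbitrary)
sigma = 2.0
edge = np.array([0.5 * (1 + erf(v / (sigma * sqrt(2)))) for v in np.arange(-32, 32)])

# Crude stand-in for in-camera JPEG sharpening: an unsharp mask
kernel = np.exp(-np.arange(-8, 9) ** 2 / (2 * sigma ** 2))
kernel /= kernel.sum()
smoothed = np.convolve(np.pad(edge, 8, mode="edge"), kernel, mode="valid")
sharpened = edge + 1.0 * (edge - smoothed)    # sharpening amount = 1.0, also arbitrary

raw_mtf50 = mtf50(edge)
jpeg_mtf50 = mtf50(sharpened)
print(f"unsharpened MTF50: {raw_mtf50:.3f} cycles/pixel")
print(f"sharpened MTF50:   {jpeg_mtf50:.3f} cycles/pixel")
```

The sharpened profile crosses 50% contrast at a noticeably higher frequency even though the underlying lens blur is identical, which is why analyzing JPEGs instead of RAW files changes the numbers a review reports.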
And it's not just that different reviewers can use different test charts; the quality of the chart itself matters too. One commonly used chart is ISO 12233, the International Organization for Standardization's official standard for measuring camera resolution. A licensed and approved chart for testing purposes has to be purchased from ISO, though since the design isn't copyrighted, testers often just download a high-resolution PDF and print their own chart to photograph. But results from an inkjet-printed test chart aren't comparable to those from an approved (and expensive) linotype print. And for high-resolution DSLRs, Imatest recommends a backlit film chart. The point isn't that using a "lesser" chart invalidates results, but that it can put a ceiling on the usefulness of the test.
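One way to see why chart quality caps a test's usefulness: the photographed edge is blurred by both the lens and the chart's own printing, so the measured response is the product of the two. In the toy sketch below (assumed Gaussian blurs, with an invented 1.5-pixel softness for the inkjet print), a softer chart drags the measured MTF50 down even though the lens is unchanged:

```python
import numpy as np
from math import erf, sqrt

def mtf50(edge):
    """Spatial frequency (cycles/pixel) where the MTF of an edge profile drops to 0.5."""
    lsf = np.diff(edge)                       # line spread function
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                             # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=1.0)
    i = np.argmax(mtf < 0.5)                  # first frequency bin below 0.5
    return freqs[i - 1] + (0.5 - mtf[i - 1]) * (freqs[i] - freqs[i - 1]) / (mtf[i] - mtf[i - 1])

def gaussian_edge(total_sigma):
    """An ideal step edge blurred by a Gaussian of the given sigma (pixels)."""
    return np.array([0.5 * (1 + erf(v / (total_sigma * sqrt(2))))
                     for v in np.arange(-32, 32)])

lens_sigma = 2.0    # arbitrary lens blur
chart_sigma = 1.5   # invented softness of an inkjet-printed edge
# Independent Gaussian blurs combine in quadrature
perfect = mtf50(gaussian_edge(lens_sigma))
soft = mtf50(gaussian_edge(sqrt(lens_sigma**2 + chart_sigma**2)))
print(f"MTF50 with a perfect chart: {perfect:.3f} cycles/pixel")
print(f"MTF50 with a soft chart:    {soft:.3f} cycles/pixel")
```

The same lens scored on the softer chart posts a lower MTF50, so once the chart's blur approaches the lens's, the chart, not the lens, is what's being measured.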
Cicala's point isn't that you can't trust camera and lens reviews, but that you should always consider what exactly was tested and look for qualifications and explanations in every review you read. The more context you can glean about the testing methodology, the better you can factor a review's results into your purchasing decisions. "Tack sharp" on its own shouldn't be taken at face value.