    Beautiful Quadcopter Video Over Prague

    I really enjoyed this aerial tour of Prague, shot with a DJI Phantom and a GoPro on a three-axis stabilizer. The filmmakers promise that they were cautious in their filming, but I think it would still be considered reckless by veteran multi-rotor hobbyists who are struggling with regulatory limbo and dickishness here in the States. It makes me wonder if this genre of video--beautiful aerial montages from a uniquely mobile perspective--is a flash in the pan, and will go away as governments tighten their restrictions on where hobbyists and professional videographers can fly. I was actually surprised to see how few drone or aerial RC videos came out of Burning Man this year. DJI's Eric Cheng shot a great montage, but I expected the desert skies to be littered with these things.

    In Brief: Canon Announces PowerShot G7 X, Long-Awaited 7D Mark II

    It's been a crazy week in technology already, and a few bits of news escaped me until today. Photokina is going on right now, and camera companies are making some pretty big announcements there. New "entry-level" Leica cameras, Instagram-styled Polaroid instant cameras, and plenty of lenses. I honestly have trouble keeping up with all of it. But Canon has two cameras that are extremely notable. First is the PowerShot G7 X, a direct competitor to the Sony RX100 III. It's a compact that uses the same 1-inch type 20MP sensor, but is $100 cheaper than the RX100 III and has a better 24-100mm equivalent f/1.8-2.8 lens. Apparently, the aperture stays wider longer through the zoom range, and Canon goes directly after Sony with a clicky lens dial and a dedicated exposure compensation dial. No EVF, but that's not a big miss. Canon knows where its fans are and is finally addressing the demand for high-quality compacts. Also announced was the long-awaited EOS 7D Mark II, the best APS-C camera you can get from Canon before going full-frame. Like the 7D, it's designed with video shooters in mind, with a fast AF system equipped with 65 cross-type focus points. Recording is still limited to 1080p, but now there's a headphone jack and uncompressed clean HDMI out. I bet it shoots pretty good photos, too.

    Tested In-Depth: Sony RX100 III Compact Camera

    We sit down to discuss Sony's latest high-end compact camera, the RX100 Mark III. Having tested both of this model's predecessors, we evaluate its new features, like the electronic viewfinder and improved zoom lens, as well as how its image quality compares with that of big DSLRs. Here's why it's one of our favorite new cameras to use!

    Testing: Sony RX100 MK III Compact Camera

    I've been on the hunt for a pocket camera to complement my DSLR, and have spent time with cameras whose sensors range from Micro Four Thirds to full-frame. My test of Sony's RX100 Mark II made me seriously consider the trade-off between body size and sensor size. Several things held it back from being ideal for my day-to-day use, but I realized that getting a compact camera with an APS-C or full-frame sensor would compete too directly with my DSLR. Going for portability made more sense for a secondary camera. So for my recent birthday, I ended up buying Sony's RX100 Mark III, based on the praise other photographers have given it. It arrived a little over a week ago, and I've been shooting a lot with it since. And even though I've already committed to the camera, I'm still running it through the practical testing that I would give any new camera to gauge its strengths and weaknesses, and to relay that experience to you. So here's what that shooting experience has been like so far.

    One of the reasons I felt comfortable buying the RX100 III before using it is that it inherits almost all of the things I liked about the RX100 II. That includes size, weight, tilting LCD, image quality, manual controls, and Wi-Fi features. The size and weight are perfect for stowing these cameras in a jacket pocket (though the MK III is slightly thicker and heavier than both previous models). I was already satisfied with the RAW and JPEG image quality from Sony's 20MP 1-inch-type sensor, even if the lens on the MK II was a little lacking. And I have been very impressed with the Wi-Fi connectivity of Sony's cameras, which I used extensively on the a7 and RX100 II. In using this third-generation RX100, the things I specifically wanted to test were the new zoom lens and autofocus speed, as well as the electronic viewfinder.

    The OLED viewfinder is probably the most noticeable addition to the RX100 line, and surprisingly it doesn't add to the heft or bulk of the camera. There's still a built-in flash, and the only thing you lose is the hot shoe that was on the MK II. This EVF pops up on the left side and needs to be extended a little before use, so you can't switch to it as instantaneously as you would a fixed EVF like the one on Fuji's cameras. The eye proximity sensor has proven to be accurate, though. I found the 800x600 resolution (100% coverage, 0.59x magnification) sufficient for framing and focusing, since I use digital peaking assists for finding focus anyway. I know some people who only use EVFs for their shooting, but I typically can't stand the latency--my brain wants the response of an optical viewfinder. But I have been using the EVF on this MK III outdoors and even for reviewing photos. If Sony offered a version of the MK III without the EVF for a lower price, I would've gone with that one. But the $150 price difference between the models accounts for the EVF, the new lens, and the new processor.

    Below are my sample photos taken so far, with notes on what they say about the camera. The photos were not post-processed at all--just RAW files ingested into Lightroom and resized/exported as JPEGs. Click each of them to enlarge.

    Testing: Instagram's Hyperlapse App for iOS

    Instagram today announced and released a new iOS video app called Hyperlapse. It was a pet project of Instagram engineers Thomas Dimson and Alex Karpenko, and impressed Instagram founder Kevin Systrom enough that the company developed it into a full-fledged app. Wired Design's Cliff Kuang has an exclusive story about the app's origins, if you're curious. But after a morning of testing, here's what you need to know about it.

    Hyperlapse is a time-lapse app for iOS, much like Studio Neat's Frameographer or the time-lapse feature built into many smartphones. Unlike those apps, there isn't much to configure--you don't set the interval time between snaps, nor the framerate of your output video. You just hit record and Hyperlapse starts recording, at a default rate of five frames a second (assuming 30fps output). That translates to one second of video for every six seconds of time passing--pretty fast for a time-lapse. But what makes these time-lapses a "hyperlapse" is the stabilization between captured frames, which makes it look like your time-lapse video was shot on a gyro-stabilized gimbal. And technically, your video is gyro-stabilized, since the app uses the iPhone's gyro data to match frame angles and smooth out the video movement. The result is a smoother time-lapse than you'd get by just putting your phone on a tripod, without using complex motion-correction algorithms like Microsoft Research's hyperlapse project.
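
    To make that arithmetic concrete, here's a quick Python sketch of the speed-up math (the 5fps capture rate and 30fps output are the figures cited above; the function names are my own):

        # Hyperlapse speed-up math: frames captured per second vs. playback rate.
        def speedup(capture_fps: float, output_fps: float) -> float:
            """Seconds of real time that pass per second of output video."""
            return output_fps / capture_fps

        def clip_length(real_seconds: float, capture_fps: float = 5, output_fps: float = 30) -> float:
            """Length of the final clip for a given recording duration."""
            return real_seconds / speedup(capture_fps, output_fps)

        print(speedup(5, 30))    # 6.0 -> six seconds of real time per second of video
        print(clip_length(90))   # 15.0 -> a 90-second recording yields a 15-second clip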

    I shot a few Hyperlapse videos to post on Instagram, and frankly wasn't very impressed by the output. The gyro-stabilization works to some extent, but it doesn't do a good job compensating for very shaky movement. You still have to try to keep your hands still or your phone held steady against a fixed object. Also, the video output on my iPhone 5 took a long time to process for a minute-long clip, and compressed the hell out of it. Hyperlapse is really only ideal if you're shooting the Instagram-preferred 15-second clips (about 90 seconds of real time at the default speed), and if you don't care about video compression whisking away HD details. Full clips are saved to the iPhone's camera roll, like the video I uploaded to Vimeo and embedded below. A two-minute clip ended up being only 120MB on my phone, and looked worse than a stationary time-lapse I shot and exported with Frameographer.

    Airplanes Taking Off and Landing in Time-Lapse

    Photographer Milton Tan was granted access to a restricted runway at Singapore Changi Airport to shoot this time-lapse compilation of planes taking off and landing at one of the busiest airports in the world. Tan details the technical aspects of the shoot on his blog, explaining that he used long-exposure shots with a Canon 5D III and a 7D with a range of zoom lenses. Some 7,000 photos were processed in Lightroom and edited in Premiere. This is how I imagine a Star Trek-style spaceport would look in real time, with planes warping off as beams of light. (h/t Petapixel)

    Real-Time Face Tracking and Projection Mapping

    This is one of the coolest things I've seen in a long time. PICS, a Japanese video production company, experimented with face tracking and projection mapping to animate and transform the face of a model in real time. The model's face was marked with tracking dots and painted in reflective make-up, which allowed a computer system to match a 3D animation to her head movements. From afar, the positional matching and low latency of the projection create a mesmerizing and surreal illusion. It's the kind of effect I would love to see used in movies, shot in-camera instead of done in post with CGI.

    Filming The Light and Dark Side of The Godfather

    Gordon Willis, who passed away on May 18, 2014, will always be best known as the cinematographer of The Godfather films. At least one recent poll ranked The Godfather as Hollywood's top movie of all time, and it’s not surprising that Coppola's epic crime drama is still revered after all this time. The incredible scope and power of the story still holds up, and it gave a generation of new actors like Al Pacino, Robert Duvall, and James Caan their career breakthroughs. Not to mention it was one of Marlon Brando’s best roles, and the movie that revived his career.

    The Godfather also made cinema history by introducing a new style of cinematography.

    Before Willis shot The Godfather, movies were vastly overlit so they could be seen in the drive-ins and not disappear into the dark of the night. But Willis’ cinematography was a bold step forward, changing the look of movies forever. Because of The Godfather, studios actually had to make two sets of prints, a lighter one for drive-ins, and a darker one for theaters.

    It’s easy to take this for granted today because dark cinematography is an accepted norm, and with the latest digital cinema cameras you can shoot with almost no available light. But for the time, Willis’ approach was groundbreaking, and many cinematographers followed his lead into the dark.

    Willis had shot several films before The Godfather, including Loving, which was directed by Irvin Kershner (The Empire Strikes Back), and The Landlord, which was directed by Hal Ashby (Harold and Maude). The Godfather was going to be filmed in New York, which meant that Coppola had to hire a cinematographer from the New York unions. Willis was recommended to Coppola by Matthew Robbins, a friend from the Bay Area who went on to write The Sugarland Express for Spielberg, as well as direct the fantasy Dragonslayer. (Robbins knew Kershner from USC, where the latter taught film.) Willis was also picked for the job because Coppola wanted a cinematographer who could capture a period look.

    In interviews, Willis made it clear there was no master plan to change cinema with his approach to the film.

    9 Amazing Photos From The Early Days Of Photography

    It’s hard to imagine that humans have been recording photographic images for almost two centuries. The first successful attempts to fix an image onto a physical medium happened in the 1820s, but it wasn’t until 1839, when Louis Daguerre introduced the first commercial process, that things really took off. Today, we’ll spotlight some incredible images from the 1800s that will blow your mind.

    How RAW Photography Will Change Smartphone Cameras

    One of the things we think Apple does better than other smartphone manufacturers is build great cameras into its phones. It's one of the reasons the iPhone is one of the most popular cameras in the world, period. Based on our experience, the iPhone 5S' camera produces better-looking photos than those of high-end Android phones like the Nexus 5 and HTC One M8, and it's a safe bet that the next iPhone will have yet another camera upgrade. Sony currently supplies the CMOS camera sensor in iPhones, and it's also the supplier of camera sensors in a variety of Android phones. The difference in photo quality between those devices, then, can partially be attributed to the lens systems used. But photo quality is also tied to the imaging software built into the phone's OS. And on that front, Android may take a leap over iOS later this year.

    While Apple is opening up manual camera controls to developers in iOS 8, one feature that's sorely lacking is support for RAW photo capture. And coincidentally, that's one feature that Google is bringing to Android L--support for camera apps to write raw pixel data from the camera sensor as a DNG (digital negative) file. While this may not sound like a big deal for most smartphone users, it is in fact a huge deal for photographers who are doing more than just taking photos to immediately share on social networks. As I've said before, the post-processing of a photo is just as important to the whole of the photography process as the act of snapping the shutter. The ability to save smartphone photos as RAW files instead of just JPEGs is the equivalent of an immediate and free upgrade to the camera, regardless of the sensor make.

    Photo credit: iFixit

    To understand the benefits (and costs) of RAW, let's quickly go over the limitations of JPEG images. JPEG is a lossy file format, using image compression algorithms to reduce the file size of an image while retaining as much detail as possible. Standard JPEG settings allow for a compression ratio of 10:1 from the original image without noticeable reduction in detail, especially on small screens. JPEGs are also most commonly saved with 8-bit color profiles. That means each of its RGB color channels tops out at 256 gradations. 256 levels of brightness for each color channel is plenty for a photo, but camera sensors can actually record much more detail than that. Digital imaging chips can process light coming into the sensor at 12 or 14 bits--light data that is lost when converting a photo to a JPEG. That extra data, when run through a RAW image processor, allows for more flexibility when editing and helps avoid image artifacts like banding.
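
    To put numbers on those bit depths, here's a quick back-of-the-envelope sketch in Python (the bit depths are the ones cited above):

        # Tonal levels per color channel at each bit depth mentioned above.
        for bits in (8, 12, 14):
            levels = 2 ** bits
            print(f"{bits}-bit: {levels:,} levels per channel")

        # 8-bit:  256 levels     (typical JPEG)
        # 12-bit: 4,096 levels   (16x the tonal resolution of 8-bit JPEG)
        # 14-bit: 16,384 levels  (64x)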

    Another limitation of JPEGs is the inconsistency of compression engines between smartphones. The amount of compression used to save a photo on the iPhone is different from that of an Android phone, and it can vary between camera apps. For example, a 13-megapixel photo taken on the new OnePlus One Android phone is compressed to a 1-2MB file at the highest quality setting. The iPhone 5S, using an 8MP sensor also made by Sony, saves JPEG photos that are also around 1.5MB each. (By comparison, the 14MP camera on the Phantom 2 saves 4-6MB JPEGs.) So where did that extra megapixel data go? While some camera apps have JPEG quality settings, the amount of compression isn't always transparent, so you don't know if you're getting the best possible photo you can from your phone.
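
    One way to see the discrepancy is to work out roughly how many bits survive per pixel after compression. Here's a rough Python sketch using the file sizes quoted above (illustrative only; real JPEG file sizes vary shot to shot):

        # Approximate bits per pixel remaining after JPEG compression.
        def bits_per_pixel(file_mb: float, megapixels: float) -> float:
            return (file_mb * 8_000_000) / (megapixels * 1_000_000)

        for name, mb, mp in [("OnePlus One", 1.5, 13), ("iPhone 5S", 1.5, 8), ("Phantom 2", 5.0, 14)]:
            print(f"{name}: ~{bits_per_pixel(mb, mp):.2f} bits/pixel")

        # Uncompressed 8-bit RGB is 24 bits/pixel; these JPEGs keep roughly
        # 1-3 bits/pixel, and each device throws away a different amount.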

    Photo credit: DPReview

    RAW image files eliminate that ambiguity, because they just store the unfiltered image data taken from the camera sensor. And the best part is that saving in RAW isn't a hardware limitation. All digital camera sensors have to pass that raw information through the phone's image processor--it's up to the OS and camera software to give users a way to save that data before it gets lost in the JPEG output. The high-end Nokia Lumia phones have RAW photo capability, and previous phones like the Lumia 1020 were granted RAW file saving with a software update. DPReview ran a comparison of RAW and processed JPEG photos with the Lumia, and I ran my own tests with a small-sensor camera to show you the differences in image detail.
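
    If you want to experiment with this kind of after-the-fact development yourself, here's a minimal Python sketch using the third-party rawpy library (a LibRaw wrapper); the filename and white-balance multipliers are placeholders of my own:

        import rawpy    # pip install rawpy
        import imageio  # pip install imageio

        def develop(path, **params):
            # Decode the RAW sensor data and run it through LibRaw's processor.
            with rawpy.imread(path) as raw:
                return raw.postprocess(**params)

        # Develop the same pixels twice with different white balances--an
        # adjustment that a JPEG, with its baked-in processing, can't cleanly offer.
        as_shot = develop("photo.dng", use_camera_wb=True)
        cooler = develop("photo.dng", use_camera_wb=False,
                         user_wb=[2.0, 1.0, 1.5, 1.0])  # R, G, B, G multipliers

        imageio.imwrite("as_shot.jpg", as_shot)
        imageio.imwrite("cooler.jpg", cooler)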

    Tested: Google Camera vs. Best Android Camera Apps

    So you've picked up a spiffy new Android phone, but the camera interface isn't to your liking. Even if you don't have any strong feelings either way, you may still wonder if there isn't something better out there. The Play Store has plenty to choose from, but most aren't doing anything particularly impressive. A few might be worth your time, though, and of course Google has thrown its hat into the ring recently with a stand-alone photography app. Let's see how the Google Camera app stacks up against the best third-party camera options.

    Photo credit: Flickr user janitors via Creative Commons

    Google Camera

    If you have a Nexus or Google Play Edition device, this is the stock camera app. For everyone else, it's an alternative downloaded from the Play Store. It's a complete redesign of the old default app from AOSP that fixes many of the issues people have been complaining about in the camera UI for years.

    This is by no means a unique feature among your camera options, but Google's camera app finally shows the full sensor frame. Previously, it would crop the top and bottom of 4:3 images in the viewfinder, making it hard to frame the shot (the quick calculation below shows how much of the frame was hidden). It now gives you the choice of wide or square shots (the crop depends on the device). This alone makes it a better app for Nexus users.
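
    For a rough sense of how much the old viewfinder behavior hid, here's a quick Python sketch (it assumes the preview cropped the 4:3 frame to a 16:9 display, which is what the behavior described above implies):

        # Portion of a 4:3 sensor frame hidden when the preview shows a 16:9 crop.
        sensor_w, sensor_h = 4, 3        # aspect-ratio units, not pixels
        shown_h = sensor_w * 9 / 16      # frame height visible at 16:9
        print(f"visible: {shown_h / sensor_h:.0%} of frame height")   # 75%
        print(f"hidden:  {1 - shown_h / sensor_h:.0%} cropped away")  # 25%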

    Some of the "advanced" features we used to see in the stock camera are gone with this new version, which might make it a deal breaker for some people. There's no timer mode, no white balance, and no color control. The user base for these features is probably smaller than the complaints online would make you think, and you DO still have manual exposure control. The rest of the features will probably trickle back in over time.

    In Brief: iOS 8's Manual Camera Controls

    We talked about this on last week's podcast, but I wanted to say more about the camera changes coming to the iPhone in Apple's upcoming iOS 8. The camera app in the new iOS presumably will work on current iPhones, in addition to whatever iPhone models Apple releases this year. New features mentioned in the WWDC keynote include a built-in time-lapse recording mode and a shutter delay timer, but the more exciting capabilities for photographers are the manual ISO, shutter speed, exposure compensation, and white balance settings. According to AnandTech, only exposure bias will make it into the default camera app, while the rest of the manual controls are exposed through the API for third-party apps to surface and exploit.

    Unfortunately, while these new controls are welcome additions to the iPhone, there's been no mention of any RAW shooting capability. The updated iCloud will support RAW file syncing for photos imported from external sources into iPhoto, which gives hope that Apple may announce RAW photo capability in the fall with new hardware. RAW photography is already available on some smartphones like the Nokia Lumia 1020, and has been rumored to be an upcoming Android feature. While smartphone cameras are still limited by their physical sensor size and lens, lossless RAW photography will avoid JPEG compression and let you make important post-processing adjustments like white balance. DPReview's evaluation of RAW processing of Lumia 1020 photos was promising, and I've had success using RAW processing to improve the photos taken by our DJI Phantom 2 Vision+ quad.

    Why Adobe Lightroom Mobile for iPhone is a Killer App

    My favorite program to use on my desktop PC at home and my MacBook Air is Adobe Lightroom. Other than the Chrome browser, it's the most essential app I rely on for day-to-day computing. I could even argue that it's one of the few reasons I still need an x86 PC--to process and develop my RAW digital photos. That's something that smartphones and tablets just aren't good at yet, even with the numerous photo-tweaking apps available and the great displays on devices like the iPad Air. Adobe's goal for Lightroom is to convince photographers of all skill levels that the post-processing of digital photos--the bits-and-bytes equivalent of analog film development--is just as important to photography as the act of setting up and snapping the shutter. It's a sentiment I agree with.

    That's why I was a little disappointed by the release of Lightroom Mobile for iPad earlier this year. The way the app was designed--and the constraints imposed by the iPad and iOS--made it difficult to incorporate the app into my photo-developing workflow. The iPad app was a way to load synced Smart Previews of photos from your desktop to the tablet and do light tweaking or flagging. You could send JPEGs from your iPad's camera roll back to Lightroom, but not RAW files. There was no way to use the iPad as a funnel to ingest RAW photos from your DSLR on the go and have them pop back up on the desktop.

    Today's release of Lightroom Mobile for iPhone doesn't change much, but the shift from tablet to smartphone is quite a big deal (I'll get to why in a moment). Adobe did address some feedback from users of Lightroom for iPad, adding the ability to give star ratings to photos in synced library collections and to set a custom sort order. Functionally, the iPhone version has feature parity with the iPad version, just rescaled for the iPhone's aspect ratio and screen. The app is still free and connects to the updated Lightroom 5.5 through a Creative Cloud subscription.

    Conventional thinking would suggest that Lightroom Mobile is a better fit for the iPad than the iPhone, given the tablet's better screen. But I think smartphones are actually the more natural fit for this application, because they're the devices with the better cameras. The iPhone is the most popular single camera platform in the world, and the photos taken with it are rarely processed the way you would process a RAW photo from a DSLR. With Lightroom Mobile and cloud syncing, all the JPEG photos taken on the iPhone can be piped (at full resolution) back to Lightroom on the desktop for post-processing. That makes it much more useful than Apple's own Photo Stream for organizing and making use of your smartphone photos instead of letting them sit idle on the phone. And it makes the smartphone much more useful as a complementary camera to my DSLR.

    Living with Photography: Testing the Sony a6000

    For the video we shot with The Wirecutter's Tim Barribeau discussing what makes a great entry-level DSLR, I rented Sony's new a6000 mirrorless interchangeable lens camera. It wasn't part of my quest to find a great companion camera for my DSLR, but I wanted to show it on camera as an example of a mirrorless alternative to a relatively cheap DSLR. I've only had the a6000 for a few weeks, and haven't tested it as extensively as the RX100 II or Fuji X100s. While those and the other cameras I've tested so far this year are technically mirrorless cameras (in that they don't have the flipping mirror and pentaprism optical viewfinder of a DSLR), the a6000 is the first mirrorless interchangeable lens camera I've really used since my NEX-5R, and I was curious about the state of MILCs since I last regularly used one. Since the NEX-5R came out in 2012, Sony has launched several mid-range follow-ups, and even nixed the NEX moniker with the a5000 and a6000 cameras.

    The following are some testing notes from my relatively limited time with the a6000. It never served as my primary camera as a daily carry or the main camera for any major events; I just put it in my bag alongside the 6D and used it when I had spare moments. The lens I paired with it was the Zeiss 35mm f/2.8--an $800 piece of glass that costs more than the a6000 body itself. It's roughly the equivalent of a 50mm lens on a full-frame camera, which is a focal length I've grown quite fond of: long enough for flat portraits, wide enough to capture small scenes by taking a few steps backward. Together, this is quite a nice kit--definitely not something I would consider entry-level or a camera to learn manual shooting with. It's not a high-end camera, either, lacking a full range of physical controls and a full-frame sensor. It occupies an interesting middle ground--a sub-$1000 camera aimed at experienced shooters who are either switching to their first MILC or upgrading from something like a first-gen NEX or an older Micro Four Thirds body. It's for people who are considering something like an Olympus OM-D E-M1 or a Canon 60D. In other words, I'm not the target user for this camera.

    Tim from The Wirecutter recently ranked the a6000 as his second-favorite MILC under $1000, placing it below the Olympus OM-D E-M10. Having not used the Olympus, I can't speak to it, but I trust his judgment. I can't recommend one over the other, so consider my testing notes extra experiential anecdotes that may help you make a more informed purchasing decision. Let's start with the size of the camera.

    Testing: Sony RX100 MK II Compact Camera

    The search for a pocketable companion camera for my DSLR continues. For the past month, I've been shooting with the Sony RX100 II, the second of Sony's three point-and-shoot cameras with 1-inch sensors. (Correction: while the sensor is called 1-inch type, its diagonal is actually only about 16mm.) This is the same sensor size found in Nikon's CX line of interchangeable lens cameras, and it's far larger than the sensors you'd find in cheap point-and-shoots and smartphones. For reference, the iPhone 5S has a 1/3.2" (5.68mm diagonal) sensor, while Canon's PowerShot G12 has a 1/1.7" (9.5mm) sensor. A 1-inch sensor is roughly the same size as a Super 16mm film frame, but still considerably smaller than the Micro Four Thirds and APS-C sensors found in higher-end mirrorless cameras.
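
    To give those diagonal figures more meaning, here's a quick Python sketch comparing sensor areas, using commonly published width-by-height dimensions for each format (the labels and layout are my own):

        # Relative sensor areas for the formats mentioned above (mm x mm).
        sensors = {
            'iPhone 5S (1/3.2")': (4.54, 3.42),
            'Canon G12 (1/1.7")': (7.6, 5.7),
            '1-inch type (RX100)': (13.2, 8.8),
            "Micro Four Thirds": (17.3, 13.0),
            "APS-C (Canon)": (22.3, 14.9),
        }

        base_w, base_h = sensors['iPhone 5S (1/3.2")']
        for name, (w, h) in sensors.items():
            area = w * h
            print(f"{name}: {area:.0f} mm^2 ({area / (base_w * base_h):.1f}x the iPhone sensor)")

        # The 1-inch sensor has ~7.5x the area of the iPhone 5S sensor, and
        # APS-C ~21x--area grows with the square of the linear dimensions.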

    While the RX100 II has already been succeeded by the new Mark III model being released this month, that $800 camera is considered a step up, and Sony will keep selling the Mark II at its $650 price. I previously tested the first RX100 when it was released in 2012, and the Mark II's upgrades are more feature-based than improvements in image quality. An improved back-illuminated CMOS sensor claims better low-light sensitivity, but it has the same zoom lens as the first release (which is still available for $500). It also adds features like a tilting LCD, a hot shoe, 1080/24p video recording, and Wi-Fi.

    In my testing of this camera, I wanted to figure out whether it could be a sufficient alternative to my DSLR in places where carrying around a large camera would be impractical. So I took it along to my friend's wedding, on a day trip wandering around the Bay Area, and to an evening event at the local science museum. Each of these was a place where I could have taken the DSLR, but chose not to in order to 1) enjoy the event more and not make it a photography trip, and 2) challenge the RX100 II in situations where people would normally take smartphone photos. Here's what I found.

    In Brief: Photographer Dan Winters' Man Cave

    Even if you've never heard the name, it's very likely that you've seen Dan Winters' work. The famed editorial photographer has shot cover photos for just about every prestigious publication (his Wired magazine photos are a favorite of mine), and his subjects have ranged from actors to musicians to entrepreneurs (team Oculus included!) to heads of state. But Winters' interests extend beyond photography; this Wired gallery of his studio and workspace shows obsessions (in the best sense of the word) with military history, mid-century ephemera, and entomology. The artifacts and handmade props that permeate his creative space are as beautiful as they are varied, yet they project a sense of belonging--together forming a portrait of the man. And yes, that does look like a jetpack on the floor of his shooting gallery. I'd love to be able to see him at work. Winters' friend, the actor Nick Offerman, also wrote this lovely tribute to the photographer for Time back in 2012.

    Living with Photography: Eyes Up Here

    I'm going to run through a simple but practical tip today--something I've been trying out lately that has been working well. It's about being able to properly focus on your subject when you're using your camera's autofocus system. As I've mentioned before, one of the reasons I prefer using a DSLR over a mirrorless camera is the optical viewfinder. The clarity of the real-time image you see through a DSLR's lens, by way of a mirror, cannot be matched by an LCD or EVF, even though those systems let you see exactly what the camera sensor is registering. It doesn't matter if your camera's LCD or EVF has 3.8 million dots; the resolution of reality is limited only by the rods and cones in your eyes. That degree of clarity is a tremendous help with finding focus, with the trade-off being that you don't get the same kind of digital overlay that you would get on an LCD, like edge peaking.

    My method for finding focus is a pretty standard one--using the center focus point in one-shot AF mode to find my focus and then reframing the shot to get the composition I want. There are a few reasons I choose center focus instead of automatic AF point selection, and they may apply to your shooting style. The first is that DSLRs have a limited number of autofocus points. On the Canon 6D, that number is on the low side, at 11. A high-end model like the 5D Mark III has many more (61 points), which makes autofocus much more useful and accurate as a guiding tool. But even with many autofocus points, not all of them have the same abilities. The autofocus points on a DSLR are phase-detect sensors, but some of them can only detect contrast along one dimension (either vertical or horizontal), while some are cross-type, meaning they can detect detail across two axes. The center autofocus point is also typically the best at doing its job; the center point on the 6D is rated to detect detail at a full stop lower than its siblings.

    Additionally, the array of autofocus points in a DSLR is usually grouped toward the center of the frame, in a cross-like pattern. You're not going to get autofocusing ability near the edges of the frame, much less the corners. So while you can manually set which specific autofocus point (or grouping of points) to use for a specific shot, that's only going to work if the subject you want in focus is covered by that array. In the simulated viewfinder image below, the auto-AF system wouldn't be able to focus on either of the players' faces, or even the basketball. Even if your subject is within the AF grid, manually selecting that point can take precious seconds away from the shot, and compromise your ideal composition. By default, DSLRs require that you press the AF point selection button before tapping the directional pad to select your AF point. But even when you turn this requirement off in the camera's menus, I find that it's not as fast as using center focus and then readjusting the framing.

    Image credit: DPReview

    The problem with the center-point autofocus method, though, is that you leave yourself susceptible to losing that focus through the simple act of reframing.

    Borrowing Cameras: How to Test Before You Buy?

    For the past week, I've been testing Sony's RX100 MK II compact camera as part of my quest to find the perfect companion camera for my DSLR. Along with the Fuji X100s and Sony's a7, it's a camera I've borrowed from BorrowLenses, which has graciously allowed me to borrow cameras and lenses for trying out and experiential testing. In each of these cases, I've used the cameras for a solid month, which has been the baseline for how long we want to live with products to get a feel for how they can fit into our daily lives and workflow.

    With cameras, I've found that getting used to a new model isn't like swapping to a new laptop or smartphone; while there are plenty of quantitative attributes to evaluate, the qualitative elements are oftentimes the most important to consider. Image quality as a technical benchmark is of course paramount, but photographers also need to know how the camera feels when they're shooting with it. Ergonomics, button placement, menu design, and playback responsiveness are qualities that affect every photo you take. The fact that the zoom and focus rings on Nikon cameras rotate in the opposite direction from those on Canon cameras is a completely arbitrary design difference, but it could be a meaningful one if you're switching ecosystems. Those qualitative traits don't just vary from manufacturer to manufacturer; they're different for every single camera.

    During my testing of the RX100 MK II, another interesting thing happened--Sony announced the impending release of the RX100 MK III, a successor that improves on the MK II's processor and optics while bringing the price up by over $100. That doesn't make my testing of the MK II moot; in fact, it underscores an important idea that we too easily forget: you don't necessarily need the latest gear. My current use of the MK II won't just provide context for judging whether the MK III's improvements are worth it; it will also show whether last year's camera is good enough for my needs.