    Tested Asks: How are Holograms Made?

    While in New York, Norm stops by Holographic Studios, one of the last remaining independent holography galleries and studios still operating. Its founder, Jason Sapan, has spent almost 40 years practicing the art of holographic imagery. We figured he's the best person to explain exactly what a hologram is, and how holograms are painstakingly made.

    Nikon's D750 Tempts Me to Switch from Canon

    Early last month, Nikon announced its D750 full-frame DSLR camera. It sits between the popular D810 and the entry-level full-frame D610. Those two cameras are Nikon's equivalents of Canon's successful 5D Mark III and 6D, but there's no comparable camera in Canon's lineup to the D750. And now that I've read some early reviews of the D750, that's beginning to worry me as a Canon user. First off, some specifications. The D750 is a 24MP full-frame camera running Nikon's EXPEED 4 processor. It basically combines the 24MP sensor of the D610 with the 51-point autofocus system of the D810. The processor bumps the framerate up to 6.5fps (using appropriate SD cards), and video recording features are adopted from the higher-end D810. New for Nikon full-frame DSLRs are a tilting LCD display and Wi-Fi for photo transfers. It's on sale now for $2400 (body only). The video below does a good job giving an overview of its specs.

    I've been very happy with my Canon 6D, and was looking forward to upgrading to Canon's next 5D release, if that happens in a year or two. For these full-frame cameras, upgrading the body every 3-4 years or so makes sense, since the lenses are where the money's at. But this new review by photographer Ross Harvey gives me a little bit of envy. Harvey demonstrates the tremendous low-light auto-focusing abilities of the D750 in a wedding shoot, and the image quality of photos he shot at ISO 9000 made my jaw drop.

    The best way to use a camera is to adjust your shooting style to the capabilities of your equipment. Camera performance dictates best practices. For example, the full-frame sensor on the 6D and a wide zoom lens let me shoot pretty great low-light photos, but I know I have to frame and compose my shots quickly because of the limited autofocus points. I shoot center-point focus because I can't rely on full auto. A 51-point AF system that can lock focus at -3EV, along with the tilting LCD, would absolutely change my shooting style, or at least expand my shooting options. It's like unlocking new abilities in a photography skill tree.
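    For a rough sense of scale (my own illustration, not a spec-sheet figure), an AF sensitivity rating like -3EV can be translated into approximate scene brightness using the common rule of thumb that illuminance at ISO 100 is about 2.5 x 2^EV lux:

        # Approximate scene brightness for a given exposure value at ISO 100,
        # using the rule of thumb lux ~= 2.5 * 2**EV (an approximation).
        def ev_to_lux(ev):
            return 2.5 * (2 ** ev)

        for ev in (0, -1, -2, -3):
            print(f"EV {ev:>2}: ~{ev_to_lux(ev):.2f} lux")

        # EV  0: ~2.50 lux (deep twilight)
        # EV -3: ~0.31 lux (roughly bright moonlight)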

    Since I'm actually in no rush to buy a new DSLR body, all I can hope is that Canon has a good answer to the D750 in the next year. Based on recent trends, I'm not sure that's going to happen. Canon has been putting a lot of effort into video recording, from the 7D Mark II to its professional Cinema cameras (and their respective lenses). The last Canon product that really excited me was the PowerShot G7 X, and that was a response to Sony in the point-and-shoot market. Nikon is really impressing me with its continued innovation in traditional DSLRs, while Sony has led the way in new camera formats like the A7r.

    In terms of ecosystem, I'm about $4000 invested in the Canon EF format. That's not a lot compared to some photographers, but it makes switching to Nikon or Sony something I can't just do on a whim. For those of you who have switched, how did you go about doing it, and how was the transition?

    Living with Photography: The Nonprofessional

    Surprise: time for a long-overdue Living with Photography column--I've missed writing this regularly. I'm returning this week to talk about something that's been on my mind for a while now--a question I've been struggling to wrap my thoughts around since the summer: what defines a professional photographer? Or more specifically: what's the line that separates an amateur from a professional photographer?

    This question started bothering me at this year's Comic-Con, where I met a bunch of Tested readers at our Incognito party. A few of you brought up the photo galleries I post on the site (mostly praise--thanks!), and there was even a request to schedule a photo shoot back in San Francisco. I had to respectfully decline, because I honestly don't think I would be qualified to do so as a hired photographer. I really don't consider myself a professional photographer.

    In the work I do for the site and in my own time, I've taken thousands of photos of products, events, and people. Through that experience, there are types of photos for which I'm well on my way to the requisite 10,000 hours of practice. If you need a photo of a smartphone for an article, my brain can immediately pull up a dozen different ways to illustrate its features to a viewer. If you need a photo of a cosplayer posing outside the San Diego Convention Center on one of the last weekends of July, I'll know how to frame a few good shots. There are photos in my Lightroom library that I really like, and some that I think could be considered "professional" in quality. But I am not a professional photographer. No way. And every time I get a compliment from people I respect, I feel like an impostor.

    Which leads to my original question, which I want to discuss and explore with you guys: what defines a professional photographer? Maybe the best way to start is to consider the attributes that I don't think define a Professional with a capital P.

    Education is probably the most logical attribute belonging to a professional photographer: the study and practice of photography in an academic setting--whether it's a photojournalism class or one of those photography seminars or retreats run by notable photogs. Education is great, and goes a long way toward giving you a structured understanding of the important technical aspects of taking photos. I've always wanted to take a few weekend classes myself. But I don't think it's a requirement. It's not essential. There are plenty of working photographers who are self-taught or never had any formal training.

    Ah, so maybe that phrase--working photographers--can point us in the right direction. Is a photographer a professional if they've been paid for their work? I suppose that in the strictest sense, making money from photography would define you as a professional. But I don't think that's the case, either. Paid photography says as much about the photographer as it does about the client purchasing the photos. It's subjective. And just because someone liked a photo you took enough to pay to license it doesn't necessarily mean you would be qualified to do the same kind of work again. I've sold two photos before, but they were far from my favorites--they just suited what the licensee needed. Just because you see a great photo on Flickr doesn't mean that the photographer would be capable of taking an assignment to produce the same caliber of work. Photography is fickle, and new technology has made it easier than ever to take a good photo without explicitly knowing what you're doing.

    That's not to say I don't take the photos I shoot for Tested seriously. When I shoot product photos for stories, YouTube thumbnails, or behind-the-scenes materials, my mind is absolutely "on-assignment." So can the definition of a professional photographer be something literal: a sense of responsibility and professionalism? Again, I think that falls short of a proper definition. Professionalism and a purposeful approach to photography are valued qualities of a professional photographer, but not what I would consider essential for professional practice. We're getting closer, I know it.

    How about that Gladwellian notion of mastery, then? Is it the number of photos taken, or the number of hours spent practicing the craft, that makes you a professional photographer? I don't doubt that 10,000 hours spent taking photos would give anyone a technical mastery of photography, but that's still talking about experience in terms of quantity, not quality. There's likely a strong correlation between quantity and quality that converges as you reach a certain amount of experience, but it's still too abstract an association to satisfy a concrete personal definition. As someone who isn't a professional photographer, I want an objective definition that doesn't feel like an arbitrary goal.

    So after much thought, here's my proposed definition of a professional photographer--the standard I hold myself against as an amateur:

    In Brief: GoPro Announces Hero4 Line of Action Cameras

    Another year, another GoPro release (how many people actually upgrade every year?). This generation of the ubiquitous action cams builds on last year's strengths--more high-speed fps recording options and better 4K video. On the high end, the $500 GoPro Hero4 Black now shoots 4K video at 30fps (double that of the Hero3 Black), as well as 120fps at 1080p (and other resolution/framerate options). The $400 Hero4 Silver has the same recording capabilities as last year's Black edition, but now includes a touchscreen on the back for viewfinding and control. GoPro also now has a budget option in the $130 Hero, which records 1080p at 30fps and is also waterproof. 120fps at 1080p is appealing, but I care more about the usability improvements. The controls have apparently been reworked for faster access to recording settings, and new night shooting modes add manual control over the camera shutter. We'll likely buy one for testing, but we haven't had the best experience using GoPros for our own productions. For long videos like the Still Untitled podcast, the GoPros have overheated a few times.

    In Brief: FAA Begins Granting Production Companies Drone Waivers

    Last Thursday, the FAA announced that it has begun granting video production companies exemptions to its unmanned aircraft systems (UAS) regulations. Six companies now have permission to use quadcopters and drones for production purposes, after convincing the FAA that their operations would meet a minimum standard for safety. Operators at these companies, for example, would hold private pilot certificates, keep the aerial systems within line of sight at all times, and restrict flights to designated "sterile areas" on set. The FAA would still have to inspect the aircraft before each flight, and nighttime aerial production is still prohibited. But this establishes a precedent and a procedure for commercial companies to seek regulatory exemptions for drone flights from the FAA. Forty more requests are being considered, and the FAA is encouraging interested firms to work with their respective industry associations to create the safety manuals and operating procedures required for new exemptions. In other quadcopter news, DHL has begun a monthlong trial of autonomous aerial delivery of medicine and supplies to a sparsely populated island off the coast of Germany.

    Beautiful Quadcopter Video Over Prague

    I really enjoyed this aerial tour of Prague, shot with a DJI Phantom and a GoPro on a three-axis stabilizer. The filmmakers promise that they were cautious in their filming, but I think it would still be considered reckless by veteran multi-rotor hobbyists who are struggling with regulatory limbo and dickishness here in the States. It makes me wonder if this genre of video--beautiful aerial montages from a uniquely mobile perspective--is a flash in the pan that will go away as governments tighten restrictions on where hobbyists and professional videographers can fly. I was actually surprised by how few drone or aerial RC videos came out of Burning Man this year. DJI's Eric Cheng shot a great montage, but I expected the desert skies to be littered with these things.

    In Brief: Canon Announces PowerShot G7 X, Long-Awaited 7D Mark II

    It's been a crazy week in technology already, and a few bits of news escaped me until today. Photokina is going on right now, and camera companies are making some pretty big announcements there. New "entry-level" Leica cameras, Instagram-styled Polaroid instant cameras, and plenty of lenses. I honestly have trouble keeping up with all of it. But Canon announced two cameras that are extremely notable. First is the PowerShot G7 X, a direct competitor to the Sony RX100 III. It's a compact that uses the same 1-inch-type 20MP sensor, but is $100 cheaper than the RX100 III and has a better 24-100mm equivalent f/1.8-2.8 lens. Apparently, the aperture stays wider for longer through the zoom range, and Canon goes directly after Sony with a clicky lens dial and a dedicated exposure compensation dial. No EVF, but that's not a big miss. Canon knows where its fans are and is finally addressing the high-quality compact market. Also announced was the long-awaited EOS 7D Mark II, the best APS-C camera you can get from Canon before going full-frame. Like the 7D, it's designed for video shooters, with a fast AF system equipped with 65 cross-type focus points. Recording is still limited to 1080p50, but now there's a headphone jack and uncompressed clean HDMI out. I bet it shoots pretty good photos, too.

    Tested In-Depth: Sony RX100 III Compact Camera

    We sit down to discuss Sony's latest high-end compact camera, the RX100 Mark III. Having tested both predecessors to this model, we evaluate its new features like the electronic viewfinder and improved zoom lens, as well as its image quality compared to big DSLR cameras. Here's why it's one of our favorite new cameras to use!

    Testing: Sony RX100 MK III Compact Camera

    I've been on the hunt for a pocket camera to complement my DSLR, and I've spent time with cameras with sensors ranging from Micro Four Thirds to full-frame. My test of Sony's RX100 Mark II made me seriously consider the trade-off between body size and sensor size. There were several things that held it back from being ideal for my day-to-day use, but I realized that getting a compact camera with an APS-C or full-frame sensor would compete too directly with my DSLR. Going for portability made more sense for a secondary camera. So for my recent birthday, I ended up buying Sony's RX100 Mark III, based on the praise other photographers have given it. It arrived a little over a week ago, and I've been shooting a lot with it since. And even though I've already committed to the camera, I'm still running it through the practical testing I would give any new camera, to gauge its strengths and weaknesses and to relay that experience to you. So here's what that shooting experience has been like so far.

    One of the reasons I felt comfortable buying the RX100 III before using it is that it inherits almost all of the things I liked about the RX100 II. That includes size, weight, tilting LCD, image quality, manual controls, and Wi-Fi features. The size and weight are perfect for stowing these cameras in a jacket pocket (though the MK III is slightly thicker and heavier than both previous models). I was already satisfied with the RAW and JPEG image quality from Sony's 20MP 1"-type sensor, even if the lens on the MK II was a little lacking. And I have been very impressed with the Wi-Fi connectivity of Sony's cameras, which I used extensively on the a7 and RX100 II. In using this third-generation RX100, the things I specifically wanted to test were the new zoom lens and autofocus speed, as well as the electronic viewfinder.

    The OLED viewfinder is probably the most noticeable addition to the RX100 line, and it surprisingly doesn't add to the heft or bulk of the camera. There's still a built-in flash, and the only thing you lose is the hot shoe that was on the MK II. The EVF pops up on the left side and needs to be extended a little bit before use, so you can't switch to it as instantaneously as you would a fixed EVF like those on the Fuji cameras. The eye proximity sensor has proven to be accurate, though. I found the 800x600 resolution (100% coverage, 0.59x magnification) sufficient for framing and focusing, since I use digital peaking assists anyway for finding focus. I know some people who only use EVFs for their shooting, but I typically can't stand the latency--my brain wants the response of an optical viewfinder. Still, I have been using the EVF on this MK III outdoors and even for reviewing photos. If Sony offered a version of the MK III without the EVF for a lower price, I would've gone with that one. But the $150 price difference between the models accounts for the EVF, the new lens, and the new processor.

    Below are my sample photos taken so far, with notes on what they say about the camera. The photos were not post-processed at all, just RAW files ingested in Lightroom and resized/exported as JPEGs. Click each of them to enlarge.

    Testing: Instagram's Hyperlapse App for iOS

    Instagram today announced and released a new iOS video app called Hyperlapse. It was a pet project of Instagram engineers Thomas Dimson and Alex Karpenko, and impressed Instagram founder Kevin Systrom enough that the company developed it into a full-fledged app. Wired Design's Cliff Kuang has an exclusive story about the app's origins, if you're curious. But after a morning of testing, here's what you need to know about it.

    Hyperlapse is a time-lapse app for iOS, much like Studio Neat's Frameographer or the time-lapse feature built into many smartphones. Unlike those apps, there isn't much to configure--you don't set the interval time between snaps, nor the framerate of your output video. You just hit record and Hyperlapse starts recording, at a default rate of five frames a second (assuming 30fps output). That translates to one second of video for every six seconds of time passing--pretty fast for a time-lapse. But what makes these time-lapses a "hyperlapse" is the stabilization between captured frames, making it look like your time-lapse video was shot on a gyro-stabilized gimbal. And technically, your video is gyro-stabilized, since the app uses the iPhone's gyro data to match frame angles and smooth out the video movement. The result is smoother time-lapses than you'd get by just putting your phone on a tripod, without using complex motion-correction algorithms like Microsoft Research's hyperlapse project.
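    To make the arithmetic above concrete, here's a quick sketch of how capture rate and playback framerate translate into a speedup factor. The 5fps capture and 30fps output come from the description above; the 12x figure is my assumption about the app's optional higher speed multipliers.

        # Time-lapse speedup math: frames captured per real-time second vs.
        # frames played back per second of output video.
        capture_fps = 5      # default capture rate described above
        playback_fps = 30    # output video framerate

        speedup = playback_fps / capture_fps   # 6.0x at the default rate

        def real_time_covered(output_seconds, speedup):
            """Seconds of real time condensed into an output clip of this length."""
            return output_seconds * speedup

        print(real_time_covered(15, speedup))  # 90 s of real time in a 15 s clip
        print(real_time_covered(15, 12))       # 180 s (~3 min) at an assumed 12x setting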

    I shot a few Hyperlapse videos to post on Instagram, and frankly wasn't very impressed by the output. The gyro-stabilization works to some extent, but it doesn't do a good job compensating for very shaky movement. You still have to keep your hands still or hold your phone steady against a fixed object. Also, a minute-long clip took a long time to process on my iPhone 5, and the app compressed the hell out of it. Hyperlapse is really only ideal if you're shooting the Instagram-preferred 15-second clips (about three minutes in real time), and if you don't care about video compression whisking away HD details. Full clips are saved to the iPhone's camera roll, like the video I uploaded to Vimeo and embedded below. A two-minute clip ended up being only 120MB on my phone, and it looked worse than a stationary time-lapse I shot and exported with Frameographer.

    Airplanes Taking Off and Landing in Time-Lapse

    Photographer Milton Tan was granted access to a restricted runway at Singapore Changi Airport to shoot this time-lapse compilation of planes taking off and landing at one of the busiest airports in the world. Tan details the technical aspects of the shoot on his blog, explaining that he used long-exposure shots with a Canon 5D III and a 7D with a range of zoom lenses. 7,000 photos were processed in Lightroom and edited in Premiere. This is how I imagine a Star Trek-style spaceport would look in real time, with planes warping off as beams of light. (h/t Petapixel)

    Real-Time Face Tracking and Projection Mapping

    This is one of the coolest things I've seen in a long time. PICS, a Japanese video production company, experimented with face tracking and projection mapping to animate and transform the face of a model in real time. The model's face was marked with tracking dots and painted with reflective make-up, which allowed a computer system to match a 3D animation to her head movements. From afar, the positional matching and low latency of the projection create a mesmerizing and surreal illusion. It's the kind of effect I would love to see used in movies, shot in-camera instead of done in post with CGI.

    Filming The Light and Dark Side of The Godfather

    Gordon Willis, who passed away on May 18, 2014, will always be best known as the cinematographer of The Godfather films. At least one recent poll ranked The Godfather as Hollywood's top movie of all time, and it’s not surprising Coppola's epic crime drama is still revered after all this time. The incredible scope and power of the story still holds up, and it gave a generation of new actors like Al Pacino, Robert Duvall and James Caan their career breakthroughs. Not to mention it was one of Marlon Brando’s best roles, and the movie that revived his career.

    The Godfather also made cinema history by introducing a new style of cinematography.

    Before Willis shot The Godfather, movies were vastly overlit so they could be seen in the drive-ins and not disappear into the dark of the night. But Willis’ cinematography was a bold step forward, changing the look of movies forever. Because of The Godfather, studios actually had to make two sets of prints, a lighter one for drive-ins, and a darker one for theaters.

    It’s easy to take this for granted today because dark cinematography is an accepted norm, and with the latest digital cinema cameras you can shoot with almost no available light. But at the time, Willis’ approach was groundbreaking, and many cinematographers followed his lead into the dark.

    Willis had shot several films before The Godfather, including Loving, which was directed by Irvin Kershner (The Empire Strikes Back), and The Landlord, which was directed by Hal Ashby (Harold and Maude). The Godfather was going to be filmed in New York, which meant that Coppola had to hire a cinematographer from the New York unions. Willis was recommended to Coppola by Matthew Robbins, a friend from the Bay Area who went on to write The Sugarland Express for Spielberg, as well as direct the fantasy Dragonslayer. (Robbins knew Kershner from USC, where the latter taught film.) Willis was also picked for the job because Coppola wanted a cinematographer that could capture a period look.

    In interviews, Willis made it clear there was no master plan to change cinema with his approach to the film.

    9 Amazing Photos From The Early Days Of Photography

    It’s hard to imagine that humans have been recording photographic images for almost two centuries. The first successful attempts to fix an image onto a physical medium happened in the 1820s, but it wasn’t until 1839, when Louis Daguerre introduced the first commercial process, that things really took off. Today, we’ll spotlight some incredible images from the 1800s that will blow your mind.

    How RAW Photography Will Change Smartphone Cameras

    One of the things we think Apple does better than other smartphone manufacturers is build great cameras into its phones. It's one of the reasons the iPhone is one of the most popular cameras in the world, period. Based on our experience, the iPhone 5S' camera produces better-looking photos than those of high-end Android phones like the Nexus 5 and HTC One M8, and it's a safe bet that the next iPhone will have yet another camera upgrade. Sony currently supplies the small CMOS camera sensor in iPhones, and it's also the supplier of camera sensors in a variety of Android phones. The difference in photo quality between those devices, then, can partially be attributed to the lens systems used. But photo quality is also tied to the imaging software built into the phone's OS. And on that front, Android may take a leap over iOS later this year.

    While Apple is opening up manual camera controls to developers in iOS 8, one feature that's sorely lacking is support for RAW photo capture. And coincidentally, that's one feature that Google is bringing to Android L--support for camera apps to write raw pixel data from the camera sensor as a DNG (digital negative) file. While this may not sound like a big deal for most smartphone users, it is in fact a huge deal for photographers who are doing more than just taking photos to immediately share on social networks. As I've said before, the post-processing of a photo is just as important to the whole of the photography process as the act of snapping the shutter. The ability to save smartphone photos as RAW files instead of just JPEGs is equivalent to an immediate and free upgrade to the camera, regardless of the sensor.

    Photo credit: iFixit

    To understand the benefits (and costs) of RAW, let's quickly go over the limitations of JPEG images. JPEG is a lossy file format, using image compression algorithms to reduce the file size of an image while retaining as much detail as possible. Standard JPEG settings allow for a compression ratio of 10:1 from the original image without noticeable reduction in detail, especially on small screens. JPEGs are also most commonly saved with 8-bit color profiles. That means each of its RGB color channels tops out at 256 gradations. 256 levels of brightness per color channel is plenty for viewing a photo, but camera sensors can actually record much more detail than that. Digital imaging chips can process the light coming into the sensor at 12 or 14 bits--data that is lost when converting a photo to a JPEG. That extra data, when run through a RAW image processor, allows for more flexibility when editing and helps avoid image artifacts like banding.
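    To put numbers on that bit-depth gap, here's a quick illustration (my own, not from the article's sources) of how many tonal levels each bit depth can represent per color channel:

        # Tonal levels per color channel at different bit depths.
        # 8-bit is what a JPEG stores; 12- and 14-bit are typical sensor readouts.
        for bits in (8, 12, 14):
            levels = 2 ** bits
            print(f"{bits:>2}-bit channel: {levels:>6} levels")

        #  8-bit channel:    256 levels
        # 12-bit channel:   4096 levels
        # 14-bit channel:  16384 levels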

    Another limitation of JPEGs is the inconsistency of compression engines between smartphones. The amount of compression used to save a photo on the iPhone is different from that of an Android phone, and it can vary between camera apps. For example, a 13-megapixel photo taken on the new OnePlus One Android phone is compressed to a file between 1 and 2MB at the highest quality setting. The iPhone 5S, using an 8MP sensor also made by Sony, saves JPEG photos that are also around 1.5MB each. (By comparison, the 14MP camera on the Phantom 2 saves 4-6MB JPEGs.) So where did all that extra pixel data go? While some camera apps have JPEG quality settings, the amount of compression isn't always transparent, so you don't know if you're getting the best possible photo you can from your phone.
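    A back-of-envelope calculation makes the point: comparing the uncompressed size of an 8-bit RGB image to the JPEG each phone actually saves shows roughly how much each pipeline throws away. The file sizes are the ones quoted above; the three-bytes-per-pixel figure assumes plain 8-bit RGB with no further overhead.

        # Approximate ratio of uncompressed 8-bit RGB data to the saved JPEG.
        def compression_ratio(megapixels, jpeg_mb, bytes_per_pixel=3):
            uncompressed_mb = megapixels * bytes_per_pixel  # 1 MP * 3 bytes ~= 3 MB
            return uncompressed_mb / jpeg_mb

        print(f"OnePlus One (13MP, ~1.5MB JPEG): ~{compression_ratio(13, 1.5):.0f}:1")  # ~26:1
        print(f"iPhone 5S   (8MP,  ~1.5MB JPEG): ~{compression_ratio(8, 1.5):.0f}:1")   # ~16:1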

    Photo credit: DPReview

    RAW image files eliminate that ambiguity, because they simply store the unfiltered image data from the camera sensor. And the best part is that saving in RAW isn't a hardware limitation. All digital camera sensors have to pass that raw information through the phone's image processor--it's up to the OS and camera software to give users a way to save that data before it gets lost in the JPEG output. The high-end Nokia Lumia phones have RAW photo capability, and earlier phones like the Lumia 1020 were granted RAW file saving with a software update. DPReview ran a comparison of RAW and processed JPEG photos with the Lumia, and I ran my own tests with a small-sensor camera to show you the differences in image detail.

    Tested: Google Camera vs. Best Android Camera Apps

    So you've picked up a spiffy new Android phone, but the camera interface isn't to your liking. Even if you don't have any strong feelings either way, you may still wonder if there isn't something better out there. The Play Store has plenty to choose from, but most aren't doing anything particularly impressive. A few might be worth your time, though, and of course Google has thrown its hat into the ring recently with a stand-alone photography app. Let's see how the Google Camera app stacks up against the best third-party camera options.

    Photo credit: Flickr user janitors via Creative Commons

    Google Camera

    If you have a Nexus or Google Play Edition device, this is the stock camera app. For everyone else, it's an alternative downloaded from the Play Store. It's a complete redesign of the old default app from AOSP that fixes many of the issues people have been complaining about in the camera UI for years.

    This is by no means a unique feature among your camera options, but Google's camera app finally shows the full sensor frame. Previously, it would crop the top and bottom of 4:3 images in the viewfinder, making it hard to frame the shot. It now gives you the option of taking wide or square shots (the crop depends on the device). This alone makes it a better app for Nexus users.
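    As a rough illustration of why that old behavior was so annoying: if a 4:3 sensor frame is previewed inside a 16:9 viewfinder (a common widescreen preview, assumed here for the sake of the example), a quarter of the frame's height never shows up on screen.

        # How much of a 4:3 still is hidden by a 16:9 preview that matches its width.
        sensor_aspect = 4 / 3     # native still-photo aspect ratio
        preview_aspect = 16 / 9   # widescreen preview aspect ratio (assumed)

        visible_fraction = sensor_aspect / preview_aspect   # 0.75
        print(f"Visible: ~{visible_fraction:.0%} of frame height; "
              f"cropped: ~{1 - visible_fraction:.0%}")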

    Some of the "advanced" features we used to see in the stock camera are gone with this new version, which might make it a deal breaker for some people. There's no timer mode, no white balance, and no color control. The user base for these features is probably smaller than the complaints online would make you think, and you DO still have manual exposure control. The rest of the features will probably trickle back in over time.

    In Brief: iOS 8's Manual Camera Controls

    We talked about this on last week's podcast, but I wanted to talk more about the camera changes coming to the iPhone in Apple's upcoming iOS 8. The camera app in the new iOS presumably will work on current iPhones, in addition to whatever iPhone models Apple releases this year. New features mentioned in the WWDC keynote include a built-in time-lapse recording mode and a shutter delay timer, but the more exciting capabilities for photographers are the manual ISO, shutter speed, exposure compensation, and white balance settings. According to AnandTech, only exposure bias will make it into the default camera app, while the rest of the manual controls are exposed through the API for third-party apps to surface and exploit.

    Unfortunately, while these new controls are welcome additions to the iPhone, there's been no mention of any RAW shooting capability. The updated iCloud will support RAW file syncing for photos imported into iPhoto from external sources, which gives hope that Apple may announce RAW photo capabilities in the fall with new hardware. RAW photography is already available on some smartphones like the Nokia Lumia 1020, and it has been rumored to be an upcoming Android feature. While smartphone cameras are still limited by their physical sensor size and lenses, lossless RAW photography avoids JPEG compression and lets you make important post-processing adjustments like white balance. DPReview's evaluation of RAW processing of Lumia 1020 photos was promising, and I've had success using RAW processing to improve the photos taken by our DJI Phantom 2 Vision+ quad.

    Why Adobe Lightroom Mobile for iPhone is a Killer App

    My favorite program on both my desktop PC at home and my MacBook Air is Adobe Lightroom. Other than the Chrome browser, it's the most essential app I rely on for day-to-day computing. I could even argue that it's one of the few reasons I still need an x86 PC--to process and develop my RAW digital photos. That's something that smartphones and tablets just aren't good at yet, even with the numerous photo-tweaking apps available and the great displays on devices like the iPad Air. Adobe's goal for Lightroom is to convince photographers of all skill levels that the post-processing of digital photos--the bits-and-bytes equivalent of analog film development--is just as important to photography as the act of setting up a shot and snapping the shutter. It's a sentiment I agree with.

    That's why I was a little disappointed by the release of Lightroom Mobile for iPad earlier this year. The way the app was designed--and the constraints imposed by the iPad and iOS--made it difficult to incorporate it into my photo-developing workflow. The iPad app was a way to load synced Smart Previews of photos from your desktop onto the tablet and do light tweaking or flagging. You could send JPEGs from your iPad's camera roll back to Lightroom, but not RAW files. There was no way to use the iPad as a funnel to ingest RAW photos from your DSLR on the go and have them pop up back on the desktop.

    Today's release of Lightroom Mobile for iPhone doesn't change much, but the shift from tablet to smartphone is quite a big deal (I'll get to why in a moment). Adobe did address some feedback from users of Lightroom for iPad, adding the ability to give star ratings to photos in synced library collections, as well as custom sort orders. Functionally, the iPhone version has feature parity with the iPad version, just rescaled for the iPhone's aspect ratio and screen. The app is still free and connects to the updated Lightroom 5.5 through a Creative Cloud subscription.

    Conventional thinking would suggest that Lightroom Mobile is a better fit for the iPad than the iPhone, given the tablet's better screen. But I think smartphones are actually the more natural fit for this application, because they're the devices with the better cameras. The iPhone is the most popular single camera platform in the world, and the photos taken with it are rarely processed the way you would process a RAW photo from a DSLR. With Lightroom Mobile and cloud syncing, all the JPEG photos taken on the iPhone can be piped (at full resolution) back to Lightroom on the desktop for post-processing. That makes it much more useful than Apple's own Photo Stream for organizing and making use of your smartphone photos instead of letting them sit idle on the phone, and it makes the smartphone a much more useful complementary camera to my DSLR.

    Living with Photography: Testing the Sony a6000

    For the video we shot with The Wirecutter's Tim Barribeau discussing what makes a great entry-level DSLR, I rented Sony's new a6000 mirrorless interchangeable lens camera. It wasn't part of my quest to find a great companion camera for my DSLR, but I wanted to show it on camera as an example of a mirrorless alternative to a relatively cheap DSLR. I've only had the a6000 for a few weeks, and I haven't tested it as extensively as the RX100 II or Fuji X100S. While those and the other cameras I've tested so far this year are technically mirrorless cameras (in that they don't have the flipping mirror and pentaprism optical viewfinder of a DSLR), the a6000 is the first mirrorless interchangeable lens camera I've really used since my NEX-5R, and I was curious about the state of MILCs since I last regularly used one. Since the NEX-5R came out in 2012, Sony has launched several mid-range follow-ups, and it even nixed the NEX moniker with the a5000 and a6000 cameras.

    So the following are some testing notes from my relatively limited time with the a6000. It never served as my primary camera for daily carry or as the main camera for any major events; I just put it in my bag alongside the 6D and used it when I had spare moments. The lens I paired with it was the Zeiss 35mm f/2.8--an $800 piece of glass that costs more than the a6000 body itself. It's roughly the equivalent of a 50mm lens on a full-frame camera, which is a focal length I've grown quite fond of: long enough for flat portraits, wide enough to capture small scenes by taking a few steps backward. Together, this is quite a nice kit--definitely not something I would consider entry-level or a camera to learn manual shooting with. It's not a high-end camera, either, lacking a full range of physical controls or a full-frame sensor. It occupies an interesting middle ground--a sub-$1000 camera aimed at experienced shooters who are either switching to their first MILC or upgrading from something like a first-gen NEX or an older Micro Four Thirds body. It's for people who are considering something like an Olympus OM-D E-M1 or a Canon 60D. In other words, I'm not the target user for this camera.

    Tim from The Wirecutter recently ranked the a6000 as his second-favorite MILC under $1000, placing it below the Olympus OM-D E-M10. Having not used the Olympus, I can't speak to it, but I trust his judgment. I can't recommend one over the other, so consider my testing notes extra experiential anecdotes that may help you make a more informed purchasing decision. Let's start with the size of the camera.