Breathing New Life Into Early X-Trans With Capture One

Following up on my last post, I’ve been going through some old RAFs in Capture One. The difference I’m seeing compared to what I can get out of Lightroom is shocking. Check out this terribly underexposed sunset image, captured with an X-E1 in 2013.


My exposure compensation was still set at -2.0 from a previous shot, so I certainly wasn’t expecting to get anything useful out of that frame, but look at the difference in what I’m able to pull out of the exposure in Capture One 12 compared to Lightroom Classic (or whatever the hell they’re calling it these days).

Now check out this crop.

Capture One’s sharpening settings are at their defaults (I would typically reduce them slightly). In Lightroom, I used the oft-advised high Detail setting of 85 and a relatively low Amount of 25. Any higher on the Amount and Lightroom starts sharpening noise. Increasing the Luminance Noise Reduction blurs detail.

From the farm buildings to the detail in the foliage to the bizarre 8-bit-looking grass, Lightroom’s limitations with these older sensors are astounding. No wonder I decided to shoot JPEG.

Fuji have come a long way with their sensors, and I have already acknowledged that Lightroom started doing a lot better when X-Trans III came out. The point is, if you’ve decided to make the jump yourself and you’ve got some older X-Trans 1 RAFs in the archives,1 it could well be worth reprocessing them in Capture One.→

Luckily I got my exposure set a little better on my next attempt.

  1. Or you’re simply happy with an older body. For a lot of people, 16 megapixels is plenty. ↩︎

Switching to Capture One

It finally happened. After years (half a decade, even) of being unsatisfied with Lightroom’s handling of X-Trans RAF files, and an underlying dissatisfaction with my workflow workarounds, I’ve completely made the switch from Lightroom to Capture One Pro 12→1 for all of my personal2 photography.

The transition period was actually quite short, just a couple of months or so. I started out importing all my current RAFs into Capture One while maintaining my Lightroom library, doubling up my asset management in case I decided to move back. Well, my last import was only to Capture One. Those images, and future ones, will likely never see Lightroom.

Adobe has certainly made some progress since the first and second generations of X-Trans, but it’s the definition of “too little, too late” in my book. In my hopelessly dated RAW Converter comparison, I concluded that Capture One delivered superior results. A year later, when Lightroom 6 came out, I left little doubt that Capture One produces cleaner images with better detail.

So why did I wait so long?

Honestly I’m not sure, but version 12 of Capture One does have some additional niceties that have made the switch easier, and Phase One’s commitment to Fujifilm cameras really helps.

Back to the RAF

The other big change to my workflow is that I’ve gone RAF-only for my personal photography. One big reason is that Capture One’s interpretations of Fuji’s Film Simulation Modes and lens correction profiles are so good, I don’t feel the need to capture both RAF and JPEG. Unlike Lightroom, Capture One automatically applies the Film Simulation Mode I shot with when it imports my RAFs. This gives me the best of all worlds: I can make creative choices while I’m shooting, I can totally change my mind about those choices later, and I don’t have a second set of files to reference to remember what I was thinking when I captured the image.

Another big reason: exposing to the right with the ETERNA Film Simulation Mode is a fantastic way to get the most out of your exposures, but that’s another article.

Have you heard about these things called presets?

I never got too deep into the preset world in Lightroom. There were just too many, and most seemed dreadfully overpriced, all too similar,3 and prone to breaking or failing to adapt when Adobe updated their rendering engine. I feel like that market has settled down a little, and the better quality preset shops have added (or are starting to add) Capture One versions as well.

It can be easy for presets to become a crutch, but they can also just as easily be a starting point for discovering your own style. I’m enjoying editing photos more than I have in years, so clearly it’s the right thing for me right now.

Have I looked back?

Hardly. I miss a few things, like the dedicated Dehaze slider, once in a while, but in truth you can get similar, arguably more natural results using the Luma Range mask, an absolutely stellar feature. Speaking of sliders, I also find myself reaching for the Blacks and Whites sliders, but I’m falling deeper in love with Levels and Curves as a result. Ironically, while I loved Lightroom’s separation of Blacks, Shadows, Highlights, and Whites, when I go back to Lightroom for an old photo, I find myself missing Capture One’s Brightness slider.

Lightroom’s Library module is quite a bit more robust than Capture One’s equivalent, at least in my experience so far. Filtering photos quickly is vastly superior in Lightroom. Capture One’s commercial photography origins likely didn’t call for the kind of filtering that’s available in Lightroom, but I love, nay need, to be able to quickly view photos taken with a particular lens, at a specific focal length, at certain apertures, etc. Even if I didn’t have this website, I’d still like to be able to see which images came from a certain lens without having to create a bunch of Smart Albums that clutter up my Library/Collection view. This may or may not matter in your workflow, and I’m certain that with more time, and perhaps a tweak or two to how I work, things will get faster in this area.

I appreciate how efficient Capture One’s UI is, but I do wish I could hit Command J to add or change the information that’s up on screen. I also used to resist the vertical image browser, but on my iMac, it actually allows my image to be displayed quite a bit larger.

I miss Selects (or Picks) and Rejects in Capture One, but I’ve adapted to colour labels, which have more efficient keyboard shortcuts, especially if you’re on a full-size keyboard.4

Performance-wise, I’m not having much trouble on my 2015 27-inch iMac. I’ve run into the occasional memory crash when waking the computer up, but never while actually working. Capture One seems to use more of the computational resources available to it. My processors are actually working hard enough that the fans spin up on occasion.5

The image is what matters though, and even after all these years, Adobe just can’t compete. Maybe you can get similar results using the new Enhance Details, but for now the cost in time and storage space is way too high when there’s software that can do much better with RAFs natively.

Should you switch?

If you shoot RAFs, you should seriously consider it. And there couldn’t be a better time: Capture One Pro 12→ is 50% off for just a couple more days. I can’t say that pricing didn’t also help ease the transition. I went straight to Pro. Layer Editing alone is more than enough reason, but the Advanced Colour Editing is no joke. I grabbed the 6 Style Packs, which bumped the total savings up to 60%. All told, I’m really happy with the change.

  1. The Fujifilm Edition, natch. ↩︎
  2. The Lightroom library I use for product shots for this site will still be used for the time being as I only shoot JPEGs for those. ↩︎
  3. Wedding, engagement, and pregnancy photos from 5–10 years ago are likely to be pretty easy to pick out. ↩︎
  4. Plus key = Green for Pick, Minus key = Red for Reject, Asterisk = Yellow, and I’ve set the Equals key to Clear rating. ↩︎
  5. This is a good thing as far as I’m concerned. I want software to use the processors I paid for. It does make that silent iMac Pro I’ve been waiting for an upgrade to more appealing. ↩︎

Adobe, Enhance Details, DNG Files, and Film Simulations

Way back on episode 40 of the FujiLove podcast—which, if you are reading this site and you like podcasts, you should probably be subscribed to—Jens and Billy Luong of Fuji Guys fame had Adobe Product Manager Sharad Mangalick on to talk about updates to Lightroom and ACR, and specifically the new “Enhance Details” feature.

Enhance Details

For those who don’t yet know, Adobe’s own dialogue box says:

“Enhance Details uses machine learning1 to improve details and reduce artifacts in most RAW files. The enhanced results will be saved as a DNG file.”

It’s a computationally intensive process that performs best with a fast GPU. The estimate Adobe gives me on my Late 2015 5K iMac is 5 seconds for a single image, and that seems about right. What’s really unfortunate is the file size: the enhanced DNG version of a 24.6 MB compressed RAF out of an X-H1 is 117.7 MB, nearly five times larger.

The results out of this feature would have to be pretty spectacular for me to consider taking on that kind of additional data.2 I’m seeing some decent results in my own limited testing, but weirdly, the Enhance Details dialogue preview totally misrepresents what the feature is actually capable of in some cases. I had one image that looked significantly worse in the dialogue box preview—more false detail, crunchy texture where there shouldn’t be any—and I was ready to call Adobe out on it. After I actually rendered the image though, it did look a fair bit better than the default rendering.

It’s fantastic that Adobe are putting in the time and effort to support X-Trans—they could easily have made this a Bayer-only feature since they obviously support many more Bayer cameras than they do X-Trans. I’m looking forward to the day this feature is part of Lightroom proper, vs. behaving more like a plug-in, but one comment from Sharad on the podcast really jumped out at me:

“Fuji’s proprietary RAW file, the RAF, it doesn’t have an openly documented specification that Adobe can use to add the additional information.”

This is why Adobe has to create separate (huge) DNG files. An obvious question: why doesn’t Fuji open the gates enough to allow Adobe to write this additional information to the RAF or a sidecar file so we don’t need to go through this cumbersome workflow? I’m sure Billy has asked Japan this question, and either his hands are tied or there’s a really good reason that he agrees with.3 Either way, I wish we got an inkling as to what the issue is there.

Another obvious question is why other applications are able to generate better results without having to create a separate file. What does Phase One know that Adobe doesn’t? Adobe talk about having to balance performance with resolution, but Lightroom isn’t exactly blowing away the competition when it comes to performance with RAFs.

For the time being, unless you’re all in on Adobe and a Creative Cloud subscription maxes out your budget allotment for photo editing (which is perfectly reasonable), a strong case can still be made for better, more specialized tools like Iridient Developer for those really important or really challenging images.

Film Simulation Modes

In the back half of the interview, the trio speak about Film Simulation Modes and how Adobe works with Fujifilm to ensure they have the same understanding of how each Film Simulation Mode should affect an image. This has been the official story for years now, and back when it was first told, Adobe’s interpretations of Fuji’s Film Simulation Modes were nowhere close to what we saw out of camera. To my eye, they’ve gotten better, but my recent experimentation with Capture One 12 suggests Phase One is quite a bit closer. Indeed, Phase One’s interpretations of Fuji’s Film Simulation Modes have so far been strikingly good. I’m on the cusp of switching to Capture One for much of my X-Trans processing for a few reasons; not needing to create whole separate files to get better detail out of my images, and the Film Simulation Modes, are two big ones.

  1. What doesn’t use “machine learning” these days? Anyone else already tired of that? ↩︎
  2. In the early days of DNG, I remember reading about photographers converting to DNG wholesale with some going so far as to then delete the original RAW files. I’m awfully glad I never considered this kind of asset management. ↩︎
  3. Billy also does a remarkable job navigating these waters. He’s in a tough spot having to balance transparency and trade secrets, but he comes across as really quite genuine. Having spoken with Billy myself quite a few times in the past, I can say he truly does take user feedback to heart, and when he says he’ll take something back to Japan, he means it.

    I’m really digging this addition to FujiLove. It can come across as an advertisement at times, but it’s counterbalanced with honest upgrade recommendations to listeners who could perhaps skip a generation. ↩︎

Firmware and App Updates

X-T2

Fujifilm released a firmware update for the X-T2 a couple days ago. Here’s a plainer language version of the key fixes:

  1. Tethering support. There is a lot to parse here, and it sounds as though some of it preemptively addresses software that is yet to be released.
  2. Buttons and dials can now be locked during shooting.
  3. A fix for shutter speed info not displaying under specific settings.
  4. A fix for overexposure when AF-C and Face Detection are selected.
  5. A fix for poor AF performance when using the XF 18-135mm F3.5-5.6 WR at the telephoto end.
  6. This is a tough one. It sounds like the camera would freeze during menu selections for PC auto save.
  7. A fix for an issue when using a shoe-mounted flash in CH burst mode.
  8. A fix for the Nissin i40 flash not firing.

iOS

An update to the Camera Remote app was released yesterday to address iOS 10 issues. The app was getting hammered in the reviews, last I checked. Hopefully this helps.

Mobile Workflows

While we’re on the topic of mobile-only workflows, Hendrik Hazeu has a nice write-up on how he’s gone lighter and ditched the PC for his post-processing. It’s a process he’s been refining, so you can learn from his progression.

Hendrik has taken his workflow a little further than I have, largely because I don’t personally worry about metadata when working mobile, but I do think about implementing his RAF+JPEG strategy at times. Making selects and processing images in camera can get tedious though. I love the technique of creating neutral images from RAFs that are perfectly suited to creative post-processing, something I often do on the desktop but have yet to implement on a tablet or smartphone. The advantage of rendering the JPEGs in-camera vs. a dedicated program on the desktop is that you get Fuji’s ridiculously good corrections and Lens Modulation Optimizer applied to your images. His post is yet another reminder of how much I want Fuji’s in-camera RAF processing available via a mobile app.

Image by Hendrik Hazeu

Affinity Photo

If traffic stats are anything to go by, the majority of my readers are Mac users, so apologies in advance to my Windows visitors for this Mac-only post. Having written that, this could be another reason to consider a shift in desktop platform.

Affinity Photo→ has been available on the Mac App Store since July 9th. I was waiting for a trial like they have for Affinity Designer,→ but with the end of the launch sale fast approaching, I took the plunge and bought both apps without being able to test out Photo.

I’m only a day in, but so far I’m very impressed. These applications are super fast. On my Early 2013 Retina MacBook Pro, Affinity opens PSDs and AI files (saved with PDF compatibility) faster than Photoshop and Illustrator, respectively. The “Edit in...” command from Lightroom is also significantly quicker when sending the file to Affinity Photo compared to Photoshop.

It might be the initial excitement talking, but as it stands right now, I’m hopeful that what amounted to two months’ worth of Adobe’s Creative Cloud will allow me to cancel my discounted subscription when it comes up for renewal at the end of next month. The timing couldn’t be better. My current plan is to eventually purchase what I can only assume will be the last standalone upgrade for Lightroom, and transition away from Adobe for home use entirely after that.

Now, you might be thinking, “That’s great and all, but what about my Photoshop plug-ins?”

Tadaaa!!!

Yep, Nik plug-ins work just fine with Affinity Photo. A quick workaround, as noted below, is required.


Initially I had some trouble figuring out exactly how to set the Plugin search folders and Plugins support folders, but a quick tweet to @MacAffinity led me to this forum post, and the settings below.

To get “/” you need to point Affinity Photo to your boot volume. Hit Command+Shift+C, then double click your start-up disk.


There’s lots more to consider, but pretty much all of the big holes I’ve been worried about so far have been filled. Wacom support, check. Proper Curves adjustment? Yup. All the blend modes you can handle, with live preview so you don’t have to use some weird key command to cycle through them? You betcha.

I’m really just scratching the surface here, but I’ll continue to compare and contrast Affinity Photo with Photoshop over the next little while. If you’re on a Mac and looking for a way to ditch the subscription, Affinity might be your way out. Creative Cloud offers many more applications, of course, but as a photographer, art director, and designer, I mostly work with Photoshop, Illustrator, and InDesign. With Affinity, I’ve got the first two replaced, and InDesign has been the least used of the three for some time now. It would be fantastic if Affinity’s next project were a serious page layout application.


Lightroom 6.1 / CC 2015.1


I originally set out to write up a quick blog post about the relatively marginal differences in how the two latest versions of Lightroom handle sharpening. We’re still in limbo while Adobe “collaborates with Fujifilm in investigating methods to improve fine detail rendering and overall edge definition,” after all.

It somehow morphed into what I hope was an interesting exercise in confirming the optimal methods of sharpening in Lightroom (Amount vs. Detail, which slider will emerge victorious???), the difference between Clarity and the new Dehaze feature, as well as whether or not Lightroom is the best tool for the job of extracting detail from RAFs.

If you’re up for a fairly long post detailing subtle differences via loads of fancy new before-and-after slider images, check out my latest Extras piece. It focuses on detail for now, but I hope to add an example of the “reduced colour blur” once I find a suitable image.

Lightroom 5.7 vs. 6.1 for X-Trans

The Evolution of Mobile


Fuji’s Camera Receiver app was pretty cool when it came out. Being able to email a street photography subject’s photo to them on the spot is awesome. Then the Camera Remote app arrived, and that took things to another level. I’ve used that app to capture images for this site1 and for work. It’s so much easier than going to and from the camera to adjust settings, set the timer, run back to the front of the camera, repeat. It’s a fantastic app when it works (most users have no trouble, a few have all kinds).2 With that written, I hope Fuji are putting serious development time into the app, and mobile connectivity in general.

Here’s What I’d Like to See:

  1. I want to be able to push the WiFi button on my camera, launch the app, and be connected. No selecting networks (or at the very least, offer to disconnect me from my current network), no accepting the connection, and so on. It should just work. Additionally, I’d love the option to have photos pushed to my phone without user interaction via some sort of tethering.3
  2. Remove the limit of 30 photos at a time, and give me an option to import all new photos. The task of tapping each image and being restricted to 30 at a time is tedious.
  3. Allow me to switch between Functions (Remote, Receiver, Browse, Geotagging) without disconnecting the camera, which results in a power-cycling and reconnection juggling act.
  4. I’d love to be able to pair the app with my cameras to sync, back up, and restore camera and custom settings across all of them. Those custom settings banks, by the way, should be nameable, and transferable from camera to camera too.
  5. All the available in-camera processing should be available in the app. Whether the rendering happens in-camera or on the device (more on that in a minute), I don’t really care for now. I just want access to real Fuji colour, curves, profiles, and film knowledge in post, on my iPhone.
  6. For new flagship cameras, another possibility is a touch screen that allows us to make these adjustments on the rear LCD of the camera (until editing on our device is possible) and then push the result to our phones via that tethering. Fuji should not be like all those “Smart TVs” and connect directly to our social networks, offer us weather info and stock prices, or play Netflix. We have devices that do that well already.

Connectivity is going to be as important a feature as whatever next generation sensor is in Fuji’s cameras. They’ve nailed image quality, colour, and optics. They need to nail the ease of use customers expect from devices that capture photos. Being able to connect via WiFi is great, but it’s not as seamless as it could be. I leave photos on my camera with the intention of connecting later,4 and I often forget until I copy photos over via the SD card reader in my Mac, like an animal. Maybe I’m just the lazy exception, but I doubt it.

Here’s Why

Since I started shooting Fuji, like many others, I’ve largely abandoned RAFs for my workflow. For a while, I was shooting RAW + JPEG, but more often than not — I’m talking 90% of the time — I would end up deleting the RAF. Part of this is being satisfied with how JPEGs are rendered in-camera, and the other part is a shift in mindset from “RAW tinkerer” to “shoot and (mostly) be done with it.” I still enjoy post-processing, but I really like being able to do it on my phone wherever and whenever I want.

Workflow

Due to this shift, I’ve been in workflow limbo for the last 18 months. Vacation photos have been copied to Lightroom and forgotten about, while daily photos are sometimes left on my SD card for weeks on end. What changed recently is Apple’s Photos app. I’m attempting to move away from Lightroom for my daily hobby shooting,5 and my SD card stays in my camera as photo transfers are done using the Camera Remote app. This is why the 30 photo limit is getting painful. I’ll still capture RAW + JPEG when I’m out to “make a picture,” but for the most part these days, Film Simulation Bracketing + iPhone editing gets me most of what I want. It’s great, but I want more.

Instant On

This is also why I want it to be as quick and easy as possible to connect my phone to my camera. There really ought to be just two steps: the WiFi button and launching the app. Even better, make the app intelligent enough to be “paired” with whatever Fuji cameras are owned, and connect auto-magically when the app is launched.6

RAF Processing

The next level is for the app to see RAF files and prompt me to choose my Film Simulation Mode via taps on screen. Then I should be able to make selections on all the image aspects that are currently handled in-camera — Dynamic Range, White Balance, Noise Reduction, Highlights, Shadows, and Sharpness — followed by a “Done” button that pushes the resulting JPEG to my iPhone’s Camera Roll. Again, the actual processing could still be handled in-camera7 if Fuji can’t or don’t want to port their secret sauce to another platform; that’s all the more reason to make connecting flawless.

Conclusion

Fuji has been on quite a tear with their X-Series system. The hardware release schedule continues to astonish, and they’re well on their way to becoming the preeminent mirrorless camera company. For Fuji, or any camera manufacturer, to continue to be reached for instead of the “good enough” smartphone, they need to put serious resources into mobile connectivity to make it as easy as possible for users to get their superior photos — selfies, eggs benedict→ and all — off the camera and into their social world.

Perhaps what I’m asking for has already been considered, maybe even attempted. Maybe it’s impossible. I have a feeling it just hasn’t been a high priority. If it was, the app would probably be optimized for iPhone 6 Plus by now. I hope app development hasn’t stalled completely.

  1. If only it was around when I made my Versus image.
  2. For those having trouble, here’s how I’ve had success with iOS devices:

    1. Tap Settings > WiFi, then push the WiFi button on your camera.
    2. Your camera should show up in the list of available networks. Tap it.
    3. Once your phone has connected to the camera’s WiFi network, launch the remote app, select a function (Remote, etc.) and/or tap connect; you may then need to accept the connection on the camera.
    4. You should be good to go from here, but you may need to hit the “OK” button on the camera to establish the initial connection. A prompt should pop up on the camera’s screen.
  3. This could even be a notification saying the app has detected new photos on my camera, and asking if I would like to import them. Bluetooth may be required for this sort of communication.
  4. That happens much more in the winter when I don’t want to take my gloves off to fumble with devices, but if I could hit a button, tap an app, or just accept a notification and be done, I’d be much more likely to transfer photos sooner.
  5. Whether or not this ultimately works is another story. I intend to write a post dedicated to this in the near future, but I’m already finding challenges, namely, being able to quickly and easily view all the photos captured with a particular device or lens.
  6. Connectivity could go even further. Photos could bypass internal storage of any kind entirely, and move straight from the buffer or a cache to a mobile device with adequate storage, then up to the cloud.

    It’s not hard to imagine a day when our cameras become “dumb boxes” with exquisite lenses attached to them that capture and push sensor data to a mobile device where vendor (Fuji)-specific demosaicing and post processing algorithms can be applied to the images before being saved to the camera roll. Maybe one day.

    Oh, and Apple could really help out by making their damn SD card reader compatible with their own phones.

  7. RAF processing can already be done in-camera, but the process is, well, clunky.