Adobe, Enhance Details, DNG Files, and Film Simulations

Way back on episode 40 of the FujiLove podcast—which, if you are reading this site and you like podcasts, you should probably be subscribed to—Jens and Billy Luong of Fuji Guys fame had Product Manager Sharad Mangalick from Adobe on to talk about updates to Lightroom and ACR, and specifically the new feature “Enhance Details”.

Enhance Details

For those who don’t yet know, Adobe’s own dialogue box says:

“Enhance Details uses machine learning1 to improve details and reduce artifacts in most RAW files. The enhanced results will be saved as a DNG file.”

It’s a computationally intensive process that performs best with a fast GPU. The estimate Adobe gives me on my Late 2015 5K iMac for a single image is 5 seconds, and that seems about right. What’s really unfortunate is the file size: the enhanced DNG version of a 24.6 MB compressed RAF out of an X-H1 weighs in at 117.7 MB.

The results out of this feature would have to be pretty spectacular for me to consider taking on that kind of additional data.2 I’m seeing some decent results in my own limited testing, but weirdly, the Enhance Details dialogue preview totally misrepresents what the feature is actually capable of in some cases. I had one image that looked significantly worse in the dialogue box preview—more false detail, crunchy texture where there shouldn’t be any—and I was ready to call Adobe out on it. After I actually rendered the image though, it did look a fair bit better than the default rendering.
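To put that overhead in perspective, here's a quick back-of-the-envelope sketch using the file sizes above. The ratio will obviously vary by camera, scene, and compression settings, so treat these as ballpark figures only:

```python
# Rough storage overhead of Enhance Details DNGs, using the example
# sizes mentioned above (24.6 MB RAF in, 117.7 MB DNG out).
RAF_MB = 24.6
DNG_MB = 117.7

ratio = DNG_MB / RAF_MB                  # roughly 4.8x the original file
extra_per_image_mb = DNG_MB - RAF_MB     # roughly 93 MB of new data per image

def extra_storage_gb(num_images: int) -> float:
    """Additional storage (GB) if every image gets an enhanced DNG."""
    return num_images * extra_per_image_mb / 1024

print(f"{ratio:.1f}x larger, +{extra_per_image_mb:.1f} MB per image")
print(f"1,000 images: +{extra_storage_gb(1000):.0f} GB")
```

In other words, enhancing even a modest library of a thousand keepers adds on the order of 90 GB of DNGs alongside the original RAFs, which is why the results need to be spectacular to justify the feature.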

It’s fantastic that Adobe are putting in the time and effort to support X-Trans—they could easily have made this a Bayer-only feature since they obviously support many more Bayer cameras than they do X-Trans. I’m looking forward to the day this feature is part of Lightroom proper, vs. behaving more like a plug-in, but one comment from Sharad on the podcast really jumped out at me:

“Fuji’s proprietary RAW file, the RAF, it doesn’t have an openly documented specification that Adobe can use to add the additional information.”

This is what results in Adobe having to create separate (huge) DNG files. An obvious question is why Fuji doesn’t open up the gates enough to allow Adobe to write this additional information to the RAF or a sidecar file so we don’t need to go through this cumbersome workflow. I’m sure Billy has asked Japan this question, and either his hands are tied or there’s a really good reason that he agrees with.3 Either way, I wish we got an inkling as to what the issue is there.

Another obvious question is why other applications are able to generate better results without having to create a separate file. What does Phase One know that Adobe doesn’t? Adobe talk about having to balance performance with resolution, but Lightroom isn’t exactly blowing away the competition when it comes to performance with RAFs.

For the time being, unless you’re all in on Adobe and a Creative Cloud subscription is your budget for photo editing (which is perfectly reasonable), a strong case can still be made for better, more specialized tools like Iridient Developer for those really important or really challenging images.

Film Simulation Modes

In the back half of the interview, the trio speak about Film Simulation Modes and how Adobe works with Fujifilm to ensure they have the same understanding of how each Film Simulation Mode should affect an image. This has been the official story for years now, and back when it was first told, Adobe’s interpretations of Fuji’s Film Simulation Modes were nowhere close to what we saw out of camera. To my eye, they’ve gotten better, but my recent experimentation with Capture One 12 suggests Phase One is quite a bit closer. Indeed, Phase One’s interpretations of Fuji’s Film Simulation Modes have been strikingly good so far. I’m on the cusp of switching to Capture One for much of my X-Trans processing for a few reasons; not needing to create whole separate files to get better detail out of my images, and better Film Simulation Modes, are big ones.

  1. What doesn’t use “machine learning” these days? Anyone else already tired of that? ↩︎
  2. In the early days of DNG, I remember reading about photographers converting to DNG wholesale with some going so far as to then delete the original RAW files. I’m awfully glad I never considered this kind of asset management. ↩︎
  3. Billy also does a remarkable job navigating these waters. He’s in a tough spot having to balance transparency and trade secrets, but he comes across as really quite genuine. Having spoken with Billy myself quite a few times in the past, I can say he truly does take user feedback to heart, and when he says he’ll take something back to Japan, he means it.

    I’m really digging this addition to FujiLove. It can come across as an advertisement at times, but it’s counterbalanced with honest upgrade recommendations to listeners who could perhaps skip a generation.

Firmware and App Updates


Fujifilm released a firmware update for the X-T2 a couple of days ago. Here’s a plainer-language version of the key fixes:

  1. Tethering support. There is a lot to parse here, and it sounds as though some of it preemptively addresses software that has yet to be released.
  2. Buttons and dials can now be locked during shooting.
  3. A fix for shutter speed info not displaying under specific settings.
  4. A fix for overexposure when AF-C and Face Detection are selected.
  5. A fix for poor AF performance when using the XF18-135mm F3.5-5.6 R LM OIS WR at the telephoto end.
  6. This is a tough one. It sounds like the camera would freeze during menu selections for PC auto save.
  7. A fix for an issue when using a shoe-mounted flash in CH burst mode.
  8. A fix for the Nissin i40 flash not firing.

An update to the Camera Remote app was released yesterday to address iOS 10 issues. The app was getting hammered in the reviews, last I checked. Hopefully this helps.

Mobile Workflows

While on the topic of mobile-only workflows, Hendrik Hazeu has a nice write-up on how he’s gone lighter and ditched the PC for his post-processing. It’s a process he’s been refining, so you can learn from his progression.

Hendrik has taken his workflow a little further than I have, largely because I don’t personally worry about metadata when working mobile, but I do think about implementing his RAF+JPEG strategy at times. Making selects and processing images in camera can get tedious though. I love the technique of creating a neutral image from RAFs that’s perfectly suited to creative post-processing, something I often do on desktop but have yet to implement on tablet or smartphone. The advantage of rendering the JPEGs in-camera vs. a dedicated program on desktop is that you get Fuji’s ridiculously good corrections and Lens Modulation Optimizer applied to your images. His post is yet another reminder of how much I want Fuji’s in-camera RAF processing available via a mobile app.

Image by Hendrik Hazeu

Affinity Photo

If traffic stats are anything to go by, the majority of my readers are Mac users, so apologies in advance to my Windows visitors for this Mac-only post. Having written that, this could be another reason to consider a shift in desktop platform.

Affinity Photo→ has been available on the Mac App Store since July 9th. I was waiting for a trial like they have for Affinity Designer,→ but with the end of the launch sale fast approaching, I took the plunge and bought both apps without being able to test out Photo.

I’m only a day in, but so far I’m very impressed. These applications are super fast. On my Early 2013 Retina MacBook Pro, Affinity opens PSDs and AI files (saved with PDF compatibility) faster than Photoshop and Illustrator, respectively. The “Edit in...” command from Lightroom is also significantly quicker when sending a file to Affinity Photo as compared to Photoshop.

It might be the initial excitement talking, but as it stands right now, I’m hopeful that what amounted to two months’ worth of Adobe’s Creative Cloud will allow me to cancel my discounted subscription when it comes up for renewal at the end of next month. The timing couldn’t be better. My current plan is to eventually purchase what I can only assume will be the last standalone upgrade for Lightroom, and transition away from Adobe for home use entirely after that.

Now, you might be thinking, “That’s great and all, but what about my Photoshop plug-ins?”


Yep, Nik plug-ins work just fine with Affinity Photo. A quick workaround, as noted below, is required.

Initially I had some trouble figuring out exactly how to set the Plugin search folders and Plugins support folders, but a quick tweet to @MacAffinity led me to this forum post, and the settings below.

To get “/” you need to point Affinity Photo to your boot volume. Hit Command+Shift+C, then double-click your start-up disk.

There’s lots more to consider, but pretty much all of the big holes I’ve been worried about so far have been filled. Wacom support? Check. Proper Curves adjustment? Yup. All the blend modes you can handle, with live preview so you don’t have to use some weird key command to cycle through them? You betcha.

I’m really just scratching the surface here, but I’ll continue to compare and contrast Affinity Photo with Photoshop over the next little while. If you’re on a Mac, and looking for a way to ditch the subscription, Affinity might be your way out. Creative Cloud offers many more applications, of course, but as a photographer, art director, and designer, I mostly work with Photoshop, Illustrator, and InDesign. With Affinity, I’ve got the first two replaced, and InDesign has been the least-used of the three for some time now. It would be fantastic if Affinity’s next project was a serious page layout application.


Lightroom 6.1 / CC 2015.1

I originally set out to write up a quick blog post about the relatively marginal differences in how the two latest versions of Lightroom handle sharpening. We’re still in limbo while Adobe “collaborates with Fujifilm in investigating methods to improve fine detail rendering and overall edge definition,” after all.

It somehow morphed into what I hope was an interesting exercise in confirming the optimal methods of sharpening in Lightroom (Amount vs. Detail, which slider will emerge victorious???), the difference between Clarity and the new Dehaze feature, as well as whether or not Lightroom is the best tool for the job of extracting detail from RAFs.

If you’re up for a fairly long post detailing subtle differences via loads of fancy new before-and-after slider images, check out my latest Extras piece. It focuses on detail for now, but I hope to add examples of the “reduced colour blur” once I find a suitable image.

Lightroom 5.7 vs. 6.1 for X-Trans

The Evolution of Mobile

Fuji’s Photo Receiver app was pretty cool when it came out. Being able to email a street photography subject’s photo to them on the spot is awesome. Then the Camera Remote app arrived, and that took things to another level. I’ve used that app to capture images for this site1 and for work. It’s so much easier than going back and forth to the camera to adjust settings, set the timer, run back to the front of the camera, repeat. It’s a fantastic app when it works (most users have no trouble, a few have all kinds).2 With that written, I hope Fuji are putting serious development time into the app, and mobile connectivity in general.

Here’s What I’d Like to See:

  1. I want to be able to push the WiFi button on my camera, launch the app, and be connected. No selecting networks (or at the very least, ask to disconnect me from my current network), accepting the connection, etc. It should just work. Additionally, I’d love the option to have photos pushed to my phone without user interaction via some sort of tethering.3
  2. Remove the limit of 30 photos at a time, and give me an option to import all new photos. The task of tapping each image and being restricted to 30 at a time is tedious.
  3. Allow me to switch between Functions (Remote, Receiver, Browse, Geotagging) without disconnecting the camera, which results in a power-cycling and reconnection juggling act.
  4. I’d love to be able to pair the app with my cameras to sync, back up, and restore settings across all of them. Those custom settings banks, by the way, should be nameable, and transferable from camera to camera too.
  5. All the available in-camera processing should be available in the app. Whether the rendering happens in-camera or on the device (more on that in a minute), I don’t really care for now. I just want access to real Fuji colour, curves, profiles, and film knowledge in post, on my iPhone.
  6. For new flagship cameras, a touch screen that lets us make these adjustments on the rear LCD of our camera (until editing on our device is possible) and then push the results to our phones via that tethering is another possibility. Fuji should not be like all those “Smart TVs” and connect directly to our social networks, offer us weather info, stock prices, or play Netflix. We have devices that do that well already.

Connectivity is going to be as important a feature as whatever next generation sensor is in Fuji’s cameras. They’ve nailed image quality, colour, and optics. They need to nail the ease of use customers expect from devices that capture photos. Being able to connect via WiFi is great, but it’s not as seamless as it could be. I leave photos on my camera with the intention of connecting later,4 and I often forget until I copy photos over via the SD card reader in my Mac, like an animal. Maybe I’m just the lazy exception, but I doubt it.

Here’s Why

Since I started shooting Fuji, like many others, I’ve largely abandoned RAFs for my workflow. For a while, I was shooting RAW + JPEG, but more often than not — I’m talking 90% of the time — I would end up deleting the RAF. Part of this is being satisfied with how JPEGs are rendered in-camera, and the other part is a shift in mindset from “RAW tinkerer” to “shoot and (mostly) be done with it.” I still enjoy post-processing, but I really like being able to do it on my phone wherever and whenever I want.


Due to this shift, I’ve been in workflow limbo for the last 18 months. Vacation photos have been copied to Lightroom and forgotten about, while daily photos are sometimes left on my SD card for weeks on end. What changed recently is Apple’s Photos app. I’m attempting to move away from Lightroom for my daily hobby shooting,5 and my SD card stays in my camera as photo transfers are done using the Camera Remote app. This is why the 30 photo limit is getting painful. I’ll still capture RAW + JPEG when I’m out to “make a picture,” but for the most part these days, Film Simulation Bracketing + iPhone editing gets me most of what I want. It’s great, but I want more.

Instant On

This is also why I want it to be as quick and easy as possible to connect my phone to my camera. It really ought to be two taps: the WiFi button and launching the app. Even better, make the app intelligent enough to be “paired” with whatever Fuji cameras are owned, and connect auto-magically when the app is launched.6

RAF Processing

The next level is for the app to see RAF files, and prompt me to choose my Film Simulation mode via taps on screen. Then, I should be able to make selections on all image aspects that are currently handled in-camera — Dynamic Range, White Balance, Noise Reduction, Highlights, Shadows, and Sharpness — followed by a “Done” button that pushes the resulting JPEG to my iPhone’s Camera Roll. Again, the actual processing could still be handled in-camera7 if Fuji can’t or don’t want to port their secret sauce to another platform for some reason; that’s all the more reason to make connecting flawless.


Fuji has been on quite a tear with their X-Series system. The hardware release schedule continues to astonish, and they’re well on their way to becoming the preeminent mirrorless camera company. For Fuji, or any camera manufacturer, to continue to be reached for instead of the “good enough” smartphone, they need to put serious resources against mobile connectivity to make it as easy as possible for users to get their superior photos — selfies, eggs Benedict→ and all — off the camera, and into their social world.

Perhaps what I’m asking for has already been considered, maybe even attempted. Maybe it’s impossible. I have a feeling it just hasn’t been a high priority. If it was, the app would probably be optimized for iPhone 6 Plus by now. I hope app development hasn’t stalled completely.

  1. If only it had been around when I made my Versus image.
  2. For those having trouble, here’s how I’ve had success with iOS devices:

    1. Tap Settings > WiFi on your phone, then push the WiFi button on your camera.
    2. Your camera should show up in the list of available networks. Tap it.
    3. Once your phone has connected to the camera’s WiFi network, launch the remote app, select a function (Remote, etc.) and/or tap connect; you may then need to accept the connection on the camera.
    4. You should be good to go from here, but you may need to hit the “OK” button on the camera in order to establish the initial connection. A prompt should pop up on the camera’s screen.
  3. This could even be a notification saying the app has detected new photos on my camera, and asking if I would like to import them. Bluetooth may be required for this sort of communication.
  4. That happens much more in the winter when I don’t want to take my gloves off to fumble with devices, but if I could hit a button, tap an app, or just accept a notification and be done, I’d be much more likely to transfer photos sooner.
  5. Whether or not this ultimately works is another story. I intend to write a post dedicated to this in the near future, but I’m already finding challenges, namely, being able to quickly and easily view all the photos captured with a particular device or lens.
  6. Connectivity could go even further. Photos could bypass internal storage of any kind entirely, and move straight from the buffer or a cache to a mobile device with adequate storage, then up to the cloud.

    It’s not hard to imagine a day when our cameras become “dumb boxes” with exquisite lenses attached to them that capture and push sensor data to a mobile device where vendor (Fuji)-specific demosaicing and post processing algorithms can be applied to the images before being saved to the camera roll. Maybe one day.

    Oh, and Apple could really help out by making their damn SD card reader compatible with their own phones.

  7. RAF processing can already be done in-camera, but the process is clunky.


Backblaze

Last year I decided that the local backup strategy I had in place for my photos wasn’t cutting it. Not to mention, leaving for vacation with my computer and external hard drive backups “hidden” didn’t leave me feeling particularly at ease. A break-in or fire could have resulted in my gear—and thus, all my photos—being gone forever.

I decided it was time to give Backblaze a try. For those who don’t already know, Backblaze is remote backup for everything on your computer, including any connected external hard drives, for $5/month. The price was never an issue; $60 a year for unlimited remote backup and crazy fast retrieval of my files is a no-brainer. The problem I was faced with was my initial backup, which clocked in at just under 1 terabyte.

I’m not gonna lie, for me it sort of sucked. In Canada, we’re faced with mostly terrible options when it comes to ISPs, with comically low bandwidth caps and even worse upload speeds. I initially tried to manage my backup to stay within my monthly allotted bandwidth. It was impossible. What I ended up doing was paying for unlimited bandwidth during the time it took for my initial backup to complete. It gets worse. If I dared use the entirety of my puny upload limit of 3 whole MB/second, my 35 MB/second download speed would grind to a halt. This makes absolutely no sense, but that’s what we get for trying to use, for legitimate reasons, the service we’re paying for. This meant keeping my upload speed to 1 or maybe 2 MB/second for the majority of my initial upload. That was painful. The initial backup took well over a month, but that was 100% the fault of my awful ISP and the fairly large amount of data I had to push upstream. I’ve heard many reports of people pushing many times the data I had in well under a week.
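For a sense of scale, here's a rough calculation of what pushing that initial backup upstream looks like at those throttled speeds. This assumes a steady, uninterrupted connection and reads the speeds as megabytes per second; real-world interruptions, nightly pauses, and ISP weirdness stretch it out considerably:

```python
# Rough time to push ~1 TB upstream at the throttled upload
# speeds mentioned above, assuming a perfectly steady connection.
BACKUP_MB = 1_000_000  # just under 1 TB, expressed in MB
SECONDS_PER_DAY = 86_400

def days_to_upload(speed_mb_per_s: float) -> float:
    """Days of continuous uploading at a given sustained speed."""
    return BACKUP_MB / speed_mb_per_s / SECONDS_PER_DAY

for speed in (1, 2):
    print(f"{speed} MB/s: ~{days_to_upload(speed):.0f} days of continuous uploading")
```

Even the best case here is the better part of a week of round-the-clock uploading, so once you factor in throttling to protect download speeds and the machine not running 24/7, a month-plus initial backup is entirely believable.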

Now that my initial backup has completed, Backblaze could not be easier. I don’t even think about it. I leave my max upload speed at around 2 MB/second, and take solace in the fact that I have a complete remote backup of all my stuff. I’ve tested the retrieval process a bunch of times, and it works flawlessly. The only minor inconvenience is having to pause my backup on occasion while watching Netflix. That terrible ISP again.

So why am I writing about this? Well, if there’s one thing I know my readers have, it’s photos. Losing all your photos would suck. I also like Backblaze so much that I can recommend them without hesitation. If you’re thinking about getting yourself a remote backup, and you should, consider using a link to my newest affiliate, Backblaze. A full year will probably cost less than your monthly cell phone bill, and I guarantee you will feel better knowing your digital stuff is safe. I know I do.

The “Best” X-Trans RAW Converter

In perhaps my most fussy article to date, I go to great lengths to determine what the “best” RAW converter is for X-Trans sensors. As it stands, I’ve only examined how these applications treat a typical wide-angle image shot with a FUJINON XF 14mm ƒ/2.8 on an X-E1. I will add more images as time permits.

See for yourself what the best RAW converter for X-Trans is.