A review of the Stellar Phoenix Photo Recovery software

Having lost photos and videos in the past, I am fairly cautious about my media these days. I keep local and remote backups and I use hardware that writes my data redundantly onto sets of drives, so that I don’t lose anything if one of the drives goes down. I have also purchased data recovery software, just in case something goes bad: I own both Disk Warrior and Data Rescue.

When someone from Stellar Phoenix contacted me to see if I’d be interested in looking at their Photo Recovery software, I agreed. I wanted to see how it compared with what I have. In the interest of full disclosure, you should know they gave me a license key for their paid version of the software.

I put it to a test right away, on what I deemed the hardest task for data recovery software: seeing if it could get anything at all from one of the drives I pulled out of one of my Drobo units.

As you may (or may not) know, Data Robotics, the company that makes the Drobo, uses their own proprietary version of RAID called BeyondRAID. While this works fine for the Drobo and is simple for Drobo owners to use, it also means that data recovery software typically can’t get anything off a single drive pulled from a Drobo drive set. Indeed, after several hours of scanning, Stellar Phoenix’s software couldn’t find any recoverable files on the drive. I expected as much, because I know specialized, professional-grade software is needed for this, but I gave it a shot anyway; who knows, someday we may be able to buy affordable software that can do it.

[Screenshot: The Seagate 8TB drive is the one I pulled out of the Drobo]
[Screenshot: What the software found is data gibberish; there were no MP3 or GIF files on that drive]

Now onto the bread and butter of this software: recovering photos and videos from SD cards. I made things harder for it again, because I wanted to see what I’d get. I put a single SD card through several write/format cycles by using it in one of my cameras. I took photos until I filled a portion of the card, downloaded them to my computer, put the card back in the camera, formatted it and repeated the cycle. After I did this, I put the software to work on the card.

Before I tell you what happened, I need to be clear about something: no camera or SD card that I know of follows any hard and fast rules about where (more precisely, in which sectors) new data gets written after you format the card. The camera may very well write the bits of new photos and videos right over the bits of the photos and videos you took before formatting, and recovering those specific, overwritten files is virtually impossible. What I’m trying to tell you is that what I did makes file recovery a crapshoot: you don’t know what you’re going to get until you run the software on the card.
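If you’re wondering how recovery software can find photos on a formatted card at all, the general technique is called file carving: the raw contents of the card are scanned for the byte markers that begin and end known file types, and whatever lies between them gets saved. Here’s a minimal sketch of that idea in Python. To be clear, this is not Stellar Phoenix’s actual algorithm, just an illustration, and the card image path at the bottom is a hypothetical dd dump of the card.

```python
# Minimal file-carving sketch (an illustration, not Stellar Phoenix's algorithm):
# scan a raw image of the card for JPEG start/end markers and save whatever
# lies between them.

SOI = b"\xff\xd8\xff"   # JPEG start-of-image marker
EOI = b"\xff\xd9"       # JPEG end-of-image marker

def carve_jpegs(card_image, out_dir, max_size=30 * 1024 * 1024):
    data = open(card_image, "rb").read()   # fine for a small test image
    count, pos = 0, 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            break
        end = data.find(EOI, start)
        if end == -1 or end - start > max_size:
            pos = start + 1                # no plausible end marker; keep scanning
            continue
        with open(f"{out_dir}/carved_{count:04d}.jpg", "wb") as f:
            f.write(data[start:end + 2])   # include the end marker
        count += 1
        pos = end + 2
    return count

# carve_jpegs("sdcard.img", "recovered")   # hypothetical dd image of the SD card
```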

When I did run it, it took about 40 minutes to check the card and it found 578 RAW files, 579 JPG files and 10 MOV files. Since I write RAW+JPG to the card (I have my camera set to record each photo in both RAW and JPG format simultaneously), I knew those files should be the same images, and they were.

[Screenshot: The software found photos and videos from several sessions and dates]
[Screenshot: As you can see from the dates, they ranged from March 11 to February 13]

I then told the software to save the media onto an external drive, so I could check what it found.

[Screenshot: It took about 30-40 minutes to recover the data]

When I checked the files, I saw that it had recovered two sets of JPG files. Each contained 579 files, but the file names in one of the sets began with “T1-…”; those were the thumbnails of the images. All of the JPG files were readable on my Mac. It was a different story with the RAW files. It recovered three sets of RAW files, each containing 578 files. The first set was readable by my Mac. The second set, marked with “T1-…”, wasn’t readable at all and the files were tiny, around 10KB each; they were the thumbnails of the RAW files. The third set, marked with “T2-…”, was readable, but the files were only around 1MB apiece; they were the mRAW files the camera writes automatically, at a resolution of 3200×2400 pixels. A typical RAW file from the camera I used for this test is 12-14MB and 4032×3024 pixels. It’s kind of neat that the mRAW (or sRAW) files were recovered as well.
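In case you want to script the triage I did by hand, here’s a small sketch that groups the recovered files by prefix and extension and reports counts and average sizes. The “recovered” folder name and the “T1-”/“T2-” prefixes simply reflect what the software produced on my machine, so treat them as assumptions.

```python
# Group recovered files by prefix (T1-, T2-, or none) and extension,
# then report how many there are and how big they typically are.
from collections import defaultdict
from pathlib import Path

groups = defaultdict(list)
for f in Path("recovered").rglob("*"):          # hypothetical output folder
    if f.is_file():
        prefix = f.name[:2] if f.name.startswith(("T1-", "T2-")) else "full"
        groups[(prefix, f.suffix.lower())].append(f.stat().st_size)

for (prefix, ext), sizes in sorted(groups.items()):
    avg_mb = sum(sizes) / len(sizes) / 1_048_576
    print(f"{prefix:>4} {ext:>5}: {len(sizes):4d} files, average {avg_mb:.2f} MB")
```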

Now, I took 3,328 photos with that camera between February 13th and March 11th. The software recovered 578 of them, so that’s a 17% recovery rate. Granted, I made it very hard for it by writing to the card in several cycles and reformatting after each one. If I only look at the last set of photos recorded to the card, before the last reformat, I see that I took 523 photos on March 10th and 3 photos on March 11th. The software recovered 525 photos from March 10th (so there’s some doubling up of images somewhere) and 2 photos from March 11th. The missing March 11th image did turn up among the recovered JPG files, though, so for that last session the recovery rate was effectively 100%.

In all fairness, there is free software out there that can do basic recovery of images from SD cards and other media, so the quality of a piece of software like this is determined by how much media it recovers when the free stuff doesn’t work. I believe I made things hard enough for it, and it still recovered quite a bit of data. That’s a good thing.

Let’s not forget about the video files. Those were written to the card with another camera and they ranged in dates from November 3-6, 2017. I’m surprised it recovered any at all. It gave me 10 video files, out of which 5 were readable, so that’s a 50% recovery rate.

Just for kicks, I decided to run Data Rescue on the SD card as well. It also found 579 JPG files and 578 RAW files. All were readable by my Mac. It also found 10 video files, but none of them were readable. However, I have Data Rescue 3, which is quite old by now. Data Rescue 5 is out, but I haven’t upgraded yet. It’s possible the newer version would have found more files.

Price-wise, Stellar Phoenix Photo Recovery comes in three flavors: $49 for the standard version (this is the one I got), $59 for the professional version (it repairs corrupt JPG files) and $99 for the premium version (it repairs corrupt video files in addition to the rest).

The one thing I didn’t like is that the Buy button stayed in the software even after I entered the license key they gave me. As for the rest, it’s fine. It crashed once during testing, but not while it was actually recovering data. The design is intuitive, and at $49 this is software you should definitely keep around in case something bad happens to your photos or videos. It may not recover everything you lost, but whatever you get back is much better than nothing, which is exactly what you’ll get if you have no recovery software at all. It’s also a good idea to own more than one brand of this kind of software if you can afford it, because you never know which one will help you more until you try them all. And believe me, when you’re desperate to get your data back, you’ll try almost anything…

Remember, back up your data and have at least one brand of data recovery software in your virtual toolbelt. Stay safe!

A review of Google’s Backup and Sync


Google launched this new service in the second half of 2017. I remember being prompted by the Google Drive app to install an upgrade, and after it completed, I noticed a new app called “Backup and Sync” had been installed, and the Google Drive app had become an alias.

[Screenshot]

The new app sat there unused for some time, until I discovered its new capability: backing up and syncing other folders on my computer, not just the Google Drive folder. This was, and is, good new functionality for Google, because it ties in nicely with its Photos service, which already offered the ability to back up all of the photos and videos taken with mobile devices to the cloud through the Google Photos mobile app. I’ve been using Google Photos for several years, going back to when it was called Picasa Web.

I set it to back up all of my photos and videos, allowing Google to compress them so I could back up the whole lot. (It’s the “High quality (free unlimited storage)” option selected in the screenshot posted below.)

[Screenshot]

I already back up all of my data with Backblaze, which I love and recommend, but it doesn’t hurt to have a second online backup of my media, even if it gets compressed. Having lost some 30,000 images and videos a few years back, I know full well the sting of losing precious memories and when it comes down to it, I’d rather have a compressed backup of my stuff than none at all.

[Screenshot]

The thing is, there are shortcomings and errors with this new service from Google, which I will detail below. The backup itself was fast. Even though I have several terabytes of personal media, they were uploaded within a week. So that’s not the issue. After all, Google has a ton of experience with uploads, given how much video is uploaded to YouTube every single day.

[Screenshot]

As you can see from the screenshot posted above, it was unable to upload quite a few files. The app offers the option of uploading RAW files in addition to the typical JPG, PNG and videos, but it couldn’t upload RAW files from Olympus (ORF), Adobe (DNG) and Canon (CR2). They were listed among the over 2700 files that couldn’t be backed up.

[Screenshot]

I ended up having to add the extensions of RAW, PSD, TIFF and other files to an “ignore” list located within the app preferences. This is the full list I’ve added there so far: DNG, TIFF, RAF, CRW, MOV, PSD, DB, GRAPHDB, PLIST, and LIJ. It seems there’s a file size limit on images and on videos, because most of my large images (stitched panoramas) and videos of several GB or more didn’t get uploaded. That’s a problem for an app that promises to back up all your media.
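For what it’s worth, here’s a rough sketch of how you could estimate, before trusting Backup and Sync with a folder, how much of it falls on such an ignore list. The extension list mirrors the one I added in the app’s preferences, and the library path is a made-up example.

```python
# Tally the files in a folder that match my Backup and Sync "ignore" list,
# to see how much data won't be backed up. The path is hypothetical.
from pathlib import Path

IGNORED = {".dng", ".tiff", ".raf", ".crw", ".mov", ".psd",
           ".db", ".graphdb", ".plist", ".lij"}

count = total_bytes = 0
for f in Path("/Volumes/Photos").rglob("*"):    # hypothetical photo library
    if f.is_file() and f.suffix.lower() in IGNORED:
        count += 1
        total_bytes += f.stat().st_size

print(f"{count} files ({total_bytes / 1e9:.1f} GB) fall on the ignore list")
```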

There were also quite a few crashes. The app crashed daily during the upload process, and even now it crashes every once in a while. I set up my computer to send crash reports to Apple and to the app developers, so I assume Google got them and will at some point issue an update that fixes those bugs.

I also kept running out of space on my Google account. Given that I’d set the app to compress my images so I’d get “free unlimited storage”, and I’d also set it to back up only my images and videos, this didn’t and doesn’t make sense. Add to this the fact that it tries, unsuccessfully, to back up all sorts of other non-image files (see the paragraph above where I had to add all those extensions to the ignore list) and once again, this app seems like it’s not fully baked. I ended up having to upgrade my storage plan with Google to 1 TB, so it’s costing me $9.99/month to back up most (not all) of my images and videos, compressed, to a service that offers “free, unlimited storage”. The app says I’ve now used 408 GB of my 1 TB plan. Before I started backing up my media, I was using about 64 GB or so between Gmail and Google Drive. So about 340 GB are being mysteriously used by invisible files that I can’t see in Google Photos or Google Drive, but that are obviously stored somewhere by the Backup and Sync app.

Remember, this is Google. They have a ton of experience with apps, with images and with videos, so why did they push this out when it still has all these issues?

Fun with technology

I’ve had multiple Drobo units since 2007. To this day, I still enjoy adding a hard drive to a Drobo. It’s one of those tasks that can be an ordeal with other storage hardware, but on a Drobo it’s been made fun through proper planning and design.

It lets you know when it’s low on space, you order a drive, and when it arrives, you look at the app, which tells you exactly what size drive is in each bay. Pressing a small lever on the side of a bay releases its drive, which slides out. You put the new one in, the Drobo immediately checks it and formats it, then begins striping the data set across it. By the way, the screenshot below shows my Drobo 5D.

[Screenshot]

I love this process. It’s so simple and so fun! The Drobo doesn’t care what hard drive you buy, as long as it’s larger than what you already had. It allows you to grow the capacity of your Drobo in time, as the prices for newer, bigger hard drives decrease, without any sort of headaches. This is technology done right.

A comparison of CrashPlan and Backblaze

I’ve been a paying CrashPlan customer since 2012 and my initial backup still hasn’t finished. I’ve been a paying Backblaze customer for less than a month and my initial backup is already complete. 

I’m not a typical customer for backup companies. Most people back up about 1 TB of data or less. The size of my minimum backup set is about 9 TB. If I count all the stuff I want to back up, it’s about 12 TB. And that’s a problem with most backup services.

First, let me say this: I didn’t write this post to trash CrashPlan. Their backup service works and it’s worked well for other members of my family. It just hasn’t worked for me. This is because they only offer a certain amount of bandwidth to each user. It’s called bandwidth throttling and it saves them money in two ways: (1) they end up paying less for their monthly bandwidth (which adds up to a lot for a company offering backup services) and (2) they filter out heavy users like me, who tend to fill up a lot of their drives with unprofitable data. My guess (from my experience with them) is that they throttle heavy users with large backup sets much more than they throttle regular users. The end result of this bandwidth throttling is that, even though I’ve been a customer since 2012 — at first, I was on the individual backup plan, then I switched to the family plan — my initial backup never completed and I was well on track to never completing it.

When I stopped using CrashPlan’s backup services, out of the almost 9 TB of data that I need to back up constantly, I had only managed to upload 0.9 TB in FOUR YEARS. Take a moment and think about that, and then you’ll realize how much bandwidth throttling CrashPlan does on heavy users like me.

[Screenshot: After four years of continuous use, I backed up a grand total of 905.7 GB to CrashPlan]

To be exact, counting the various versions of my data that had accumulated on the CrashPlan servers over those four years, I had a total of 2.8 TB stored with them, but even if you count that as the total, 2.8 TB in FOUR YEARS is still an awfully small amount.

[Screenshot: Space used on CrashPlan’s servers: 2.8 TB]

Tell me honestly, which one of you wants this kind of service from a backup company? You pay them for years in a row and your initial backup never finishes? If a data loss event occurs and your local backup is gone (say a fire, flood or burglary), you’re pretty much screwed and you’ll only be able to recover a small portion of your data from their servers, even though you’ve been a faithful, paying customer for years… That just isn’t right.

I talked with CrashPlan techs twice in those four years about this very problematic data throttling. Given that they advertise their service as “unlimited backup”, this is also an ethical issue: the backup isn’t truly unlimited if it’s heavily throttled and you can never back up all of your data. The answer was the same both times, even the wording, which makes me think it was scripted: they said that in an effort to keep costs affordable, they have to limit the upload speeds of every user. The first time I asked, they pointed out that their Business plan has higher upload speeds; in other words, they tried to upsell me. Both times, they also advertised their “seed drive service”, which was a paid product (they stopped offering it this summer). The gist of it was that they shipped customers who asked for it a 1 TB drive, so you could back up to it locally and then send it back to them to jumpstart your cloud backup. Given my need to back up at least 9 TB of data, this wasn’t a useful option either.

[Screenshot: This is false advertising]
[Screenshot: This is also false advertising]

Some of you might suggest that I didn’t optimize my CrashPlan settings so that I could get the most out of it. I did. I tried everything they suggested in their online support notes. In addition to tricking out my CrashPlan install, my computer has been on for virtually all of the last four years, in an effort to help the CrashPlan app finish the initial backup, to no avail.

Another thing that bothered me about CrashPlan is that it would go into “maintenance mode” very often, and given the size of my backup set, this would take days, sometimes weeks, during which it wouldn’t back up. It would endlessly churn through its backup versions and compare them to my data, pruning out stuff, doing its own thing and eating up processor cycles with those activities instead of backing up my data.

[Screenshot: Synchronizing block information…]
[Screenshot: Compacting data… for 22.8 days…]
[Screenshot: Maintaining backup files…]

I understand why maintenance of the backups is important. But what I don’t understand is why it took so long. I can’t help thinking that the cause may be the Java-based backup engine that CrashPlan uses. It’s not a Mac-native or Windows-native app; it’s a Java app wrapped in Mac and Windows versions. And most Java apps aren’t known for their speed. It’s true that Java apps can be fast, but developers often don’t bother optimizing the code, or so some people claim in online forums.

Another way to look at this situation is that CrashPlan has a “freemium” business model. In other words, their app is free to use for local (DAS or NAS) backup or offsite backup (such as to a friend’s computer). And one thing I know is that you can’t complain about something that’s given freely to you. If it’s free, you either offer constructive criticism or you shut up about it. It’s free and the developers are under no obligation to heed your feedback or to make changes because you say so. As a matter of fact, I used CrashPlan as a free service for local backup for a couple of years before I started paying for their cloud backup service. But it was only after I started paying that I had certain expectations of performance. And in spite of those unmet expectations, I stuck with them for four years, patiently waiting for them to deliver on their promise of “no storage limits, bandwidth throttling or well-engineered excuses”… and they didn’t deliver.

Here I should also say that CrashPlan support is responsive. Even when I was using their free backup service, I could file support tickets and get answers. They always tried to resolve my issues. That’s a good thing. It’s important to point this out, because customer service is an important aspect of a business in the services industry — and online backups are a service.

About three weeks ago, I was talking with Mark Fuccio from Drobo about my issues with CrashPlan and he suggested I try Backblaze, because they truly have no throttling. So I downloaded the Backblaze app (which is a native Mac app, not a Java app), created an account and started to use their service. Lo and behold, the 15-day trial period wasn’t yet over and my backup to their servers was almost complete! I couldn’t believe it! Thank you Mark! 🙂

I optimized the Backblaze settings by allowing it to use as much of my ISP bandwidth as it needed (I have a 100 Mbps connection), and I also bumped the number of backup threads to 10, meaning the Backblaze app could upload over 10 simultaneous connections to their servers. I did have to put up with a slightly sluggish computer during the initial backup, but for the first time in many years, I was able to back up all of my critical data to the cloud. I find that truly amazing in and of itself.
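Here’s a back-of-the-envelope sketch of why more threads matter so much on a long-haul connection like mine: a single TCP connection is roughly limited to its window size divided by the round-trip time, so you need several connections in parallel to fill a 100 Mbps pipe. The round-trip time and window size below are assumptions for illustration, not measurements.

```python
# Rough single-connection throughput ceiling: window size / round-trip time.
# Both values below are assumed for illustration, not measured.
def max_mbps_per_connection(window_bytes, rtt_seconds):
    return window_bytes * 8 / rtt_seconds / 1e6

rtt = 0.18               # assumed Romania-to-California round trip, in seconds
window = 256 * 1024      # assumed effective TCP window, in bytes

per_conn = max_mbps_per_connection(window, rtt)
print(f"~{per_conn:.1f} Mbps per connection")        # ~11.7 Mbps
print(f"~{per_conn * 10:.0f} Mbps with 10 threads")  # roughly enough to fill the line
```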

[Screenshot: This is what I did to optimize my Backblaze installation]

As you can see from the image above, I got upload speeds over 100 Mbps when I optimized the backup settings. During most of the days of the initial upload, I actually got speeds in excess of 130 Mbps, which I think is pretty amazing given my situation: I live in Romania and the Backblaze servers are in California, so my data had to go through a lot of internet backbones and through the trans-Atlantic cables.

The short of it is that I signed up for a paid plan with Backblaze and my initial backup completed in about 20 days. Let me state that again: I backed up about 9 TB of data to Backblaze in about 20 days, and I managed to back up only about 1 TB of data to CrashPlan in about 4 years (1420 days). The difference is striking and speaks volumes about the ridiculous amount of throttling that CrashPlan puts in place for heavy users like me.
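If you do the arithmetic on those two numbers, the average upload rates implied by each initial backup come out roughly like this (decimal terabytes, assuming round-the-clock uploading):

```python
# Average upload rate implied by each initial backup, in megabits per second.
def avg_mbps(terabytes, days):
    return terabytes * 1e12 * 8 / (days * 86400) / 1e6

print(f"Backblaze: {avg_mbps(9, 20):.1f} Mbps")      # ~41.7 Mbps sustained
print(f"CrashPlan: {avg_mbps(0.9, 1420):.2f} Mbps")  # ~0.06 Mbps sustained
```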

I also use CrashPlan for local network backup to my Drobo 5N, but I may switch to another app for that as well, for two reasons: it’s slow (and spends a lot of time doing maintenance on the backup set), and it won’t let me use Drobo shares mapped through the Drobo Dashboard app, which is the more stable way of mapping a Drobo’s network shares. CrashPlan refuses to see those shares and requires me to map the network shares manually, which isn’t as stable a connection and leads to disconnects and multiple mounts, and that in turn trips up CrashPlan. I’m trying out Mac Backup Guru, which is a Mac-native app, is pretty fast, and does let me back up to Drobo Dashboard-mapped shares. If this paragraph doesn’t make sense to you, that’s okay; you probably haven’t run into this issue. If you have, you know exactly what I’m talking about.

Now, none of this stuff matters if you’re a typical user of cloud backup services. If you only have about 1 TB of data or less, any cloud backup service will likely work for you. You’ll be happy with CrashPlan and you’ll be happy with their customer service. But if you’re like me and you have a lot of data to back up, then a service like Backblaze that is truly throttle-free is exactly what you’ll need.

The value of a good backup

While working on the fifth episode of RTTE, I learned first hand the value of a good backup. The hard drive on my editing computer (my MacBook Pro) died suddenly and without warning. Thankfully, my data was backed up in two geographically different locations.

The day my hard drive died, I’d just gotten done with some file cleanups, and was getting ready to leave for a trip abroad. I shut down my computer, then realized I needed to check on a couple things, and booted it up again, only this time, it wouldn’t start. I kept getting a grey screen, meaning video was working, but it refused to boot into the OS. And I kept hearing the “click of death” as the hard drive churned. I tried booting off the Snow Leopard DVD, but that didn’t work either. I’d tested the hard drive’s SMART status just a couple of weeks before, and the utility had told me the drive had no problems whatsoever.

I had reason to worry for a couple of reasons:

  1. The laptop refused to boot up from the OS X DVD, potentially indicating problems beyond a dead hard drive. I do push my laptop quite a bit as I edit photos and video, and I’d already replaced its motherboard once. I was worried I might have to spend more than I wanted to on repairs.
  2. All of the footage for the fifth episode of RTTE was on my laptop. Thankfully, it was also backed up in a couple of other places, but still, I hadn’t had reason to test those backups until now. What if I couldn’t recover it?

I had no time for further troubleshooting. I had to leave, and my laptop was useless to me. I left it home, and drove away, worried about what would happen when I returned.

A week later, I got home and tried to boot off the DVD again. No luck. I had to send it in, to make sure nothing else was wrong. In Romania, there’s only one Apple-authorized repair shop. They’re in Bucharest, and they’re called Noumax. I sent it to them for a diagnosis, and a couple of days later, I heard back from them: only the hard drive was defective, from what they could tell.

I was pressed for time. I had to edit and release the fifth episode of RTTE, and I also had to shoot some more footage for it. I didn’t have time to wait for the store to fix the laptop, so I asked them to get it back to me, while I ordered a replacement hard drive from an online store with fast, next-day delivery (eMag).

The hard drive and the laptop arrived the next day. I replaced the hard drive, using this guide, and also cleaned the motherboard and CPU fans of dust, then restored the whole system from the latest Time Machine backup. This meant that I got back everything that was on my laptop a few hours before it died.

I’d have preferred to do a clean OS install, then install the apps I needed one by one, then restore my files, especially since I hadn’t reformatted my laptop since I bought it a few years ago, but that would have been a 2-3 day job, and I just didn’t have the time. Thankfully, OS X is so stable that even a 3-year old install, during which I installed and removed many apps, still works fairly fast and doesn’t crash.

Some might say, what’s the big deal? The laptop was backed up, and you restored it… whoopee… Not so fast, grasshopper! The gravity of the situation doesn’t sink in until you realize it’s your work — YEARS of hard work — that you might have just lost because of a hardware failure. That’s when your hands begin to tremble and your throat gets dry, and a few white hairs appear instantly on your head. Even if the data’s backed up (or so you think), until it’s restored and you can see it’s all there, you just don’t know whether you’ll get it back.

I’ve worked in IT for about 15 years. I’ve restored plenty of machines, desktops and servers alike. I’ve done plenty of backups. But my own computer had never gone down; I’d never had a catastrophic hardware failure like this one. So even though I’ve been exposed to this kind of thing before, I just didn’t realize how painful it is until it happened to me, and I didn’t appreciate the value of a good backup until now.

So, here’s my advice to you, as if you didn’t hear it plenty of times in the past… BACK UP YOUR COMPUTER!

If you have a Mac, definitely use Time Machine. It just works. It’s beautifully simple. I’ve been backing up my laptop with Time Machine to the same reliable drive for years. It’s this little LaCie hard drive.

But the LaCie drive might fail at some point, which is why I also back up my data with CrashPlan. For this second backup, I also send my data to a geographically different location. Since we live in Romania these days, I back up to my parents’ house in the US, where the backup gets stored on a Drobo. The backup is also encrypted automatically by CrashPlan, which means it can’t be read if it’s intercepted along the way.

It’s because of my obsessive-compulsive backup strategy that I was able to recover so quickly from the hardware failure. Thankfully, these days backups are made so easy by software like Time Machine and CrashPlan that anyone can keep their work safe. So please, back up your data, and do it often!

One more thing. You know the old saying, every cloud has a silver lining? It was true in my case. When I ordered the new drive for my laptop, I was able to upgrade from its existing 250GB SATA hard drive with an 8MB buffer and 5400 rpm to a spacious 750GB SATA hard drive with a 32MB buffer and 7200 rpm, which means my laptop now churns along a little faster, and has a lot more room for the 1080p footage of my shows. 🙂

An update on CrashPlan

Updated 11/01/16: I’ve revised my opinion of CrashPlan. See here for the details.

Back in April, I wrote about CrashPlan, a wonderful, multi-platform piece of software that lets you back up to friends’ computers for free. I said that I used it to do trans-Atlantic backups, from my computer in Romania to my parents’ computer back in the USA.

It’s been a while since then, so I thought I’d give you a quick update. I’m still using it, and I still like it. It works.

When I wrote the other post, I mentioned I kept hitting some bandwidth ceiling somewhere along the line between Romania and the US, around 2 Mbps. Somehow, that ceiling has since disappeared. I’m now getting speeds up to 25 Mbps, though it’s usually around 5-7 Mbps.

So if you’re in need of a way to back up to a remote location, inexpensively, then CrashPlan is the way to do it. If you don’t have a friend who’s willing to help, that’s okay, you can back up to the CrashPlan servers for $4.50/month, or you can get the family plan, which covers all of your household computers, for $8.33/month. That’s very reasonable, and it’s definitely worth it.

Metadata: DNG vs RAW

Generally speaking, I prefer the Adobe DNG format over the proprietary RAW format produced by a camera, because I like the fact that it’s more or less future-proof. With a DNG file, the metadata resides inside the file, like with a JPG, but the format is lossless, like a RAW file and unlike a JPG.

Even though it’s a “publicly available archive format”, adoption has been limited, and I would like to see more camera manufacturers embrace it so I can feel more comfortable using it. Companies like Hasselblad and Leica already have, and you can shoot directly in DNG format on some of their cameras, but until the big camera manufacturers like Canon and Nikon adopt it, it won’t have the mass acceptance it needs to ensure its long-term survivability.

Still, I have begun to convert the RAW files in my photo library to DNG. By my count, I have converted about 30% of my 77,000 photographs to DNG format, and I am converting more of them every day. Let’s hope Adobe sticks to its word in the future and I’m not left holding the bag, having locked my photos into a format that might become obsolete.

Long-term benefits and potential caveats aside, I should point out a more immediate disadvantage of DNG compared to RAW. It has to do with metadata.

Yes, it’s true that with a RAW file, you’re stuck working with your metadata in a sidecar XMP file, and that file may get corrupted or lost, taking with it your metadata and the processing directives for Camera Raw or Lightroom or whatever you use to process your photos. With a DNG, everything resides inside the file. There’s no XMP file, which is a good thing, most of the time.

But consider backing up your library. Let’s say, for the sake of argument, that you’ve got 20,000 photos to back up (which is what I’m doing right now) and you’ve made a minute change to the metadata of every one of them, only a single EXIF or IPTC field. The backup software won’t care how small the change was: you’ll have to back up 20,000 DNG files, each (in my case) between 12 and 24 MB. That’s going to take a LOT longer than backing up 20,000 XMP sidecar files of only 15-25 KB each, because with RAW files those sidecars are the only files that change when I update the EXIF or IPTC data.
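Here’s the rough arithmetic behind that claim, using my own average file sizes as the assumptions:

```python
# Re-uploading 20,000 edited DNGs vs. 20,000 edited XMP sidecars,
# using rough average sizes (~18 MB per DNG, ~20 KB per XMP).
dng_gb = 20_000 * 18 / 1024            # ~351 GB to back up again
xmp_gb = 20_000 * 20 / 1024 / 1024     # ~0.4 GB to back up again
print(f"DNG workflow: ~{dng_gb:.0f} GB, RAW+XMP workflow: ~{xmp_gb:.1f} GB")
```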

That’s one area where RAW trumps DNG. I’m willing to overlook it if DNG will indeed prove to be a future-proof format, but that remains to be seen.