Lightroom – subscribe or not?

For some time now I’ve been a happy user of Adobe Lightroom. I bought it back when Lightroom 4 was released, skipped version 5, then paid to upgrade to 6.

Since then, Adobe has discontinued the perpetually licensed version. The only way to legally obtain Lightroom now is to pay £120–240 per year for one of their Creative Cloud subscriptions.

Unfortunately the new subscription model is a rather poor fit for my needs.

I want to state upfront that I don’t object to subscription-based pricing for software in general. It makes a lot of sense from a development point of view, as maintenance and support costs don’t go away once the product is shipped. But in my opinion Adobe has reached too far, and is trying to steer customers towards cloud solutions for reasons that don’t really align with those customers’ best interests.

Subscription models done right

For a good approach to subscriptions, I’d like to draw your attention to JetBrains (makers of the software development tools I use). They switched to a subscription model in 2015, a little while after Adobe’s subscription-based Creative Cloud launched.

The section of their announcement titled “reasons for our move to subscriptions and concerns raised” is worth a read and makes a good case for subscriptions. The TL;DR version is that customers get new features sooner, and it’s easier to prioritise important under-the-hood improvements that wouldn’t typically make good marketing material or bullet points on a “what’s new” list.

Despite this, the initial feedback was extremely negative. To their credit, JetBrains listened and improved the offer: customers can keep using, in perpetuity, the version that was current when their subscription was taken out, and repeat subscriptions are discounted.

In my view this is fair, even if you don’t like subscription models. You pay an annual subscription, but if you no longer wish to pay for updates you can continue to use the software as it was at the time you subscribed. The bet is that most customers will find the upgrades released during those 12 months worth the money they paid. I certainly do; the improvements are steady, and I’ve kept my subscription active ever since.

What’s wrong with Adobe’s Lightroom subscriptions?

A number of things actually, but it all boils down to value for money.

The first is that, unlike JetBrains, you don’t get to use the base version in perpetuity. Once your subscription expires, Adobe disables import and editing. You can browse and export your previously edited files, but no further changes are possible.

The second is the lack of flexibility.

I am not a particularly frequent user. I’m not a professional photographer; I use Lightroom to process photos from trips away and the occasional event, such as a wedding.

It all adds up to 5 or 6 sessions a year, a few hours at a time, and it’s not unusual for me to go 3–4 months without opening it. But Adobe doesn’t let you subscribe for a month here and there. Nope, you have to pay for a minimum of 12 months every time.

It feels a bit like Adobe is taking all the benefits of the subscription model here while giving little to the customer (in all fairness the 12-month term is true of JetBrains as well, but it’s compensated for by being able to use the base version in perpetuity).

The Options

The relevant plans are the Photography plans. Everything else is far more expensive and includes things I don’t need, so let’s look at the three photography options (I’ll quote annual rather than monthly prices, because they are all annual plans):

Adobe Creative Cloud plans
  • Photography Plan (20GB) – £119.76/yr
    • Lightroom
    • Lightroom Classic
    • Photoshop
    • 20GB of cloud storage
  • Photography Plan (1TB) – £239.64/yr
    • Lightroom
    • Lightroom Classic
    • Photoshop
    • 1TB of cloud storage
  • Lightroom Plan (1TB) – £119.76/yr
    • Lightroom
    • 1TB of cloud storage

Of these, the 20GB Photography Plan is the closest fit. The product I’m after is Lightroom Classic, and I have no need for cloud storage (I use a local NAS, backed up to Backblaze B2).

But let’s look at the individual items:

  1. Lightroom
    • This is not the same as the old standalone Lightroom; it’s a cut-down, simplified app/web version that I don’t want or need. It’s a cloud-first solution that stores your photos online, hence the bundled storage. Being able to log in and edit your photos from anywhere is nice, but the problem for me is the lock-in: it’s much harder to move to a competing solution than it is with offline tools that organise, store and edit files on your local computer.
  2. Lightroom Classic
    • This is what was previously known simply as “Lightroom”. It’s the same offline desktop product we know and love, and is what any professional or power user would want to use.
  3. Photoshop
    • Possibly the cheapest way to get Photoshop… but I don’t need it.

The best fit for me is the Photography Plan (20GB) for £120/year.

My previous expenditure on Lightroom was £106.48 for an outright purchase of Lightroom 4 in 2013, plus £59.09 to upgrade to Lightroom 6 in 2015.

Considering I still use it today, that’s £165.57 over 6 years, so £27.60/year or £2.30/month.

If it were still possible to buy a standalone version, I would have done so by now, so it’s probably fair to factor in another £60 upgrade, which brings it to £225.57 over 6 years. That’s £37.60 per year, or £3.13/month.

Whichever way you slice it, Adobe is asking me to spend at least three times more to use Lightroom than I have in the past.
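For anyone who wants to check my working, here is the same arithmetic in code form (all the figures come from above; the extra £60 upgrade is my assumption, not a price Adobe ever quoted):

```python
# The cost comparison above, worked through in code. All figures come from
# the post; the extra £60 upgrade is an assumption, not an Adobe price.
YEARS_OF_USE = 6
SUBSCRIPTION_PER_YEAR = 119.76  # Photography Plan (20GB)

scenarios = {
    "actual spend (LR4 + LR6 upgrade)": 106.48 + 59.09,           # £165.57
    "with an assumed extra £60 upgrade": 106.48 + 59.09 + 60.00,  # £225.57
}

for label, total in scenarios.items():
    per_year = total / YEARS_OF_USE
    ratio = SUBSCRIPTION_PER_YEAR / per_year
    print(f"{label}: £{per_year:.2f}/yr (£{per_year / 12:.2f}/mo); "
          f"the subscription costs {ratio:.1f}x as much")
```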

A no-brainer for professionals

If I were a professional photographer who used Lightroom every day, I’d subscribe in a heartbeat. By way of comparison, JetBrains’ IntelliJ is a tool I use every day. It also costs £119 for the first year, the same as the Photography Plan (though the price drops in subsequent years).

For a tool I use daily it’s a bargain, and if I were a full-time photographer it would be decent value, even if I never used the cloud-based Lightroom and its storage.

The problem is that, as an occasional user, the extras don’t justify the cost. I’m being asked to pay for two products I don’t want, and cloud storage I have no use for.

Au revoir Lightroom

If it wasn’t already apparent, I’m going to vote with my wallet and decline to upgrade to Creative Cloud. I’ll keep using Lightroom 6 for now, as my DSLR is even older and its raw files are still supported, but it’s starting to show its age: I’m already getting warnings on my Mac that it will need updating to work with future versions of macOS.

So I’m in the market for an alternative. Has any ex-Lightroom user found an offline, non-cloud photo editing and organisation tool that they’d recommend? Organisation by filesystem is a requirement, and compatibility with Lightroom’s sidecar preset files would be a huge bonus!

Domain Expert vs Generalist

When should you use a blunt generalist tool, and when should you use a sharper domain-specific tool?

I posted a question on Serverfault recently, and received a relevant answer that wasn’t quite what I was looking for:

Systemd – How do I automatically reload a unit, when another oneshot service is fired by timer?

In my reply I thanked the author for the answer, but mentioned that I think systemd is the right place to do this sort of thing. He replied that systemd is “absolutely the wrong place” to do this sort of thing, which is pretty strong language!

I think we’re approaching this from different perspectives here, so let’s break the problem down in general terms.

Continue reading

The 80/20 Rule Applied to Personal Finance

It’s hard to watch The Big Short and not come away thinking that the odds are stacked against you as a would-be individual investor. It’s a great film that makes some very valid points, but it leaves you wondering:

Surely, if there are all these hedge funds mismanaging their clients’ money, and getting a seat at the big boys’ table requires vast amounts of capital, there’s a gap in the market for cooperatively run mutual funds that actually act in their clients’ interests?

It turns out that there are already companies in this space, but the chances are you wouldn’t hear about them from a financial advisor.

Continue reading

Improving your privacy with a custom email domain

This blog post is a follow-up to It’s Time to Ditch Gmail. It began as a review of Fastmail, and my experience of moving to it from Gmail, but I quickly found myself going on a tangent. Since privacy was the main driver in my decision to move to Fastmail, and using a custom domain is one of the ways that I protect my privacy, I figured it was important enough to warrant its own post.

One of the factors that made it easier to move away from Gmail is my use of a custom domain for most of my mail. Before moving to Fastmail, this domain was tied to a G Suite account which forwarded everything to my standard Gmail account. This made switching in anger much easier, as I had fewer accounts to log in to and update my email address on, and those still pointing directly at Gmail tended to be older, low-value accounts that I no longer use anyway.

In this article though, I want to take a detour to explain why I use a custom domain, and how it can aid your privacy. Continue reading

It’s Time to Ditch Gmail

I haven’t written much about privacy on this blog, despite often behaving, by some people’s standards, like a paranoid schizophrenic where my data is concerned. Until fairly recently I ran a rooted phone with XPrivacy installed, which is about as private as you can get without ditching smartphones altogether. These days I’ve gone back to a stock, un-rooted phone, partly because Android permissions have improved (although you do have to be careful with apps targeting older APIs), and partly because rooting is more of a risk and a burden to me as a user. Also, some apps actively attempt to block rooted devices, for quite legitimate (if, I would argue, misguided) reasons.

Anyway, I could go on for hours about Android privacy, but the subject of this post is Gmail. We all know that Google mines your personal data for targeted advertising purposes. But when giving data to companies, there’s a balance between functionality that is useful to you, and commercialising your data for purposes that, often, are not in your best interest.

While Gmail was once an innovative service, I’d argue that the scales have long been tipped in favour of commercialisation, and that today the data cost of Gmail outweighs its value as a service. Continue reading

Provisioning Vault with Code

A couple of years ago, HashiCorp published a blog post, “Codifying Vault Policies and Configuration”. We used a heavily modified version of their scripts to get us going with Vault.

However, there are a few problems with the approach, some of which are noted in the original post.

The main one is that if we remove a policy from the configuration, applying it again will not remove the corresponding objects from Vault. Essentially the approach is additive only: it will modify existing objects and create new ones, but it never removes objects that are no longer declared, which is arguably just as important.

Another problem is that shell scripts inevitably have dependencies, which you may not want to install on your shell servers. Curl, in particular, is extremely useful for hackers, and we don’t want to have it available in production (in our environment, access to the Vault API from outside the network is not allowed).

Finally, shell scripts aren’t easy to test, and don’t scale particularly well as complexity grows. You can do some amazing things in bash, but once it gets beyond a few hundred lines it’s time to break out into a proper language.

So that’s what I did.

The result is a tool called vaultsmith, and it’s designed to do one thing: take a directory of JSON files and apply them to your Vault server.
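To make the idea concrete, here is a minimal sketch of that declarative, delete-aware sync. This is not vaultsmith’s actual code, just an illustration in Python against Vault’s documented HTTP API, assuming a hypothetical layout where each policies/<name>.json file holds the policy document for the ACL policy of the same name:

```python
# Minimal sketch of a declarative policy sync (not vaultsmith itself).
# Assumes a hypothetical layout of ./policies/<name>.json files, each
# containing the policy document for the ACL policy of the same name.
import json
import os

import requests

VAULT_ADDR = os.environ.get("VAULT_ADDR", "http://127.0.0.1:8200")
HEADERS = {"X-Vault-Token": os.environ["VAULT_TOKEN"]}
BUILT_IN = {"root", "default"}  # never touch Vault's built-in policies


def declared_policies(path="policies"):
    """Map of policy name -> policy document, read from the JSON files on disk."""
    policies = {}
    for filename in os.listdir(path):
        if filename.endswith(".json"):
            with open(os.path.join(path, filename)) as f:
                policies[filename[:-len(".json")]] = json.load(f)
    return policies


def existing_policies():
    """Names of the ACL policies currently present in Vault."""
    resp = requests.get(f"{VAULT_ADDR}/v1/sys/policies/acl",
                        params={"list": "true"}, headers=HEADERS)
    resp.raise_for_status()
    return set(resp.json()["data"]["keys"]) - BUILT_IN


def sync():
    declared = declared_policies()

    # Create or update everything declared on disk (the additive part).
    for name, document in declared.items():
        requests.put(f"{VAULT_ADDR}/v1/sys/policies/acl/{name}",
                     headers=HEADERS,
                     json={"policy": json.dumps(document)}).raise_for_status()
        print(f"applied {name}")

    # The part the shell scripts were missing: remove anything in Vault
    # that is no longer declared in the directory.
    for name in existing_policies() - set(declared):
        requests.delete(f"{VAULT_ADDR}/v1/sys/policies/acl/{name}",
                        headers=HEADERS).raise_for_status()
        print(f"removed {name}")


if __name__ == "__main__":
    sync()
```

The second loop is the bit the original shell scripts never did: anything present in Vault but absent from the directory gets removed.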

Continue reading

Upstream Bug? Fix it.

This is a blog post I originally wrote more than two years ago, in reaction to “spirited debates” I was having with developers. I didn’t post it at the time, but perhaps I should have! Anyway, the ideas within are as true to me now as they were then, so I thought I’d post it today after a bit of revision.

How many times have you had a developer shrug their shoulders at you and say “it’s an upstream bug”?

I heard it today, and it is so, so wrong that it is practically an admission of guilt.

Do you say that to your customers when their personal data is leaked from your database? When your app crashes their device? No? Good, because it’s your problem.

It’s great that you can use third party libraries to do your job more efficiently, but doing so does not absolve you of responsibility if the product breaks. You made the decision on what library to use, and you are ultimately responsible for delivering functionality. Continue reading

Life after Crashplan

Crashplan’s email to home customers

If you’re reading this and don’t know me personally, you’re probably aware that Crashplan decided to “sunset” their Crashplan Home offering on August 22nd last year. No new subscriptions are being taken, and it will cease to exist from August 2018. Unfortunately, my subscription expired in December.

I was hugely satisfied with Crashplan, and thought it was by far the best online cloud backup solution on the market for the average home user.

  • It offered free peer-to-peer backups, which meant I could back up my devices to my own server, or even trade encrypted backups with friends.
  • The client for backing up to your own devices was free, and the cost for online cloud backups was a very reasonable $150 USD for 12 months of unlimited backup storage.
  • By virtue of being written in Java, the client was available for Windows, Mac and Linux (I have all 3).
  • It supported headless operation, albeit with a bit of jiggery-pokery, i.e. editing the client config file to point to another agent via an SSH tunnel. This meant I could run it on my home NAS device, which naturally stores my important data (photos, mainly).
  • There were no limits on the number of devices backed up, and no per-device charges.

Naturally, I was disappointed when they announced they were discontinuing it. “No worries!” I thought; there must be something else out there. As it turns out, Crashplan Home was almost too good to be true. Continue reading

From Ivy Bridge to Threadripper Part 1 – A Water Cooling Retrospective

Some of the links in this article are Amazon affiliate links, which pay me a commission if you make a purchase.

I could have bought a plain old Ryzen, a Core i7, or even another Core i5. But with Intel sitting on its hands for the past 5 years in the face of no competition, I decided it was time to splash out and reward AMD for not only investing in CPUs again, but making an interesting high-end desktop product while not nickel-and-diming its customers over PCI-E lanes.

And so, I bought a 1920X.

I don’t really need 12 cores. Other than general browsing, my PC is used for work (coding) plus a bit of gaming, and a gaming CPU this is not. Running multiple VMs and M.2 devices without slowing down will be nice, but this build is mostly overkill for my needs. And that’s really the point! Continue reading