Security features are features too!
All aspects of the user experience (UX) must be considered first-class priorities for any feature that hopes to provide effective security.
What does it mean to implement a security feature?
If you haven’t read our previous post on the Shared Responsibility Model, quickly glance through the short introduction. It clearly outlines the responsibilities of service providers and end-users in their joint effort to address authentication (“who are you?”) and authorization (“what are you allowed to do?”) challenges.
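If the distinction between those two questions is fuzzy, this minimal sketch may help. Every name in it is invented for illustration; a real system would use salted password hashing and far more granular permissions.

```python
# Hypothetical sketch of the two questions above; every name here is
# invented for illustration only.
import hmac
from dataclasses import dataclass, field


@dataclass
class User:
    username: str
    password_hash: str            # a real system stores a salted hash
    roles: set = field(default_factory=set)


def authenticate(user: User, presented_hash: str) -> bool:
    """Authentication: 'who are you?' Verify the claimed identity."""
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(user.password_hash, presented_hash)


def authorize(user: User, required_role: str) -> bool:
    """Authorization: 'what are you allowed to do?' Check permissions."""
    return required_role in user.roles
```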
One obvious responsibility of service providers is to implement strong, easy-to-use security features for end-users. Users cannot enable features that don’t exist.
“Well, yeah, of course,” you’re thinking to yourself. “Obviously, users cannot use a feature that doesn’t exist. We’ll implement the feature, then they can use it. Problem solved.”
Sure, it seems obvious at first glance, but things start to go off the rails as soon as we consider the word implement; it likely means something different depending on whom you ask. Go ahead! Ask a software engineer what it means to implement a security feature. I bet they give you a different answer than a designer would. You’ll probably get a different response from a technical writer, a marketer, or a customer success teammate too.
Not so obvious after all.
People will give a response that is heavily influenced by their role within the organization. Software engineers are likely to think about the software architecture and what code they’ll need to write. Designers are probably thinking about the overall workflow and how to organize the user interface. Technical writers are brainstorming how to explain this feature to users in a way they will understand; marketers are dreaming up content for emails, tweets, and blog posts; and support staff are ready to list the resources they’ll need to help customers who inevitably have questions when something doesn’t work as intended.
Though there is bound to be overlap between answers, everyone will highlight a different problem that needs to be solved. So, who is right? Unsurprisingly, they all are!
Security and user experience (UX)
Implementing any type of feature requires an interdisciplinary set of skills to design, create, and maintain it. Don Norman, co-founder and principal of Nielsen Norman Group and Director of the Design Lab at UC San Diego, coined the term user experience (UX).
"User experience" encompasses all aspects of the end-user's interaction with the company, its services, and its products.
Every feature that a company implements is an attempt to achieve a specific business goal: increased sales, higher conversion rates, more social sharing, and so on. The overall user experience, not just the technical implementation, will determine whether the feature succeeds or fails to achieve those goals. This is by no means a novel idea. A product manager’s job is to champion the needs of end-users and make sure that the correct internal departments are working together to create a user experience that will achieve those business goals.
Unfortunately, security features often do not follow the same lifecycle as “core” product features and are bolted on as an afterthought. Perhaps you’ve experienced this firsthand. Imagine a company that hasn’t focused on the security of its product (shocking, I know!) and is now presented with a long list of compliance requirements during a security audit. The company does not want to slow progress on core product features, so it decides to allocate a small team to address the security features required for compliance. An engineer gets to work with the resources available, writes some code, runs some tests, and reports that the security feature is working as required. Management happily checks one box off the compliance checklist and gets ready to move on to the next issue…
Wait! What about all of that other stuff that we just talked about? Where are the contributions from the designers, tech writers, marketers, and support staff necessary to implement the other critical components of the user experience? Unfortunately, the motivating goal in this (all too realistic) hypothetical example was simply to check a box on a compliance checklist, not to achieve a meaningful business objective. It is critical to treat security features the same as any other feature: clearly define the business goal and the overall user experience required to achieve it.
The ultimate goal when it comes to auth is to eliminate unauthorized access and to keep data secure.
A sound technical implementation is, of course, a critical step towards achieving that goal, but it is not the entire solution. Users need to know that the security feature exists in the first place (hi, marketers!) or they will never consider utilizing it. They need to understand why it is important and how to use it (hi, tech writers!) or they will conclude that it’s not worth their time to try it out. The overall workflow needs to be convenient and the user interface needs to be easy to understand (hi, designers!) or users will likely get frustrated. And, when something doesn’t work as intended, users need somewhere to go to get help (hi, customer success team!) or they will likely just give up entirely.
If users give up entirely, then how much effective security did the technical implementation provide? Basically, none.
Pretty Good Privacy (PGP) has pretty terrible UX
There is likely no better example of the importance of UX in a security solution than Pretty Good Privacy (PGP). Initially created by Phil Zimmermann in 1991, PGP is, as Wikipedia explains, an “encryption program that provides cryptographic privacy and authentication for data communication.” Come again? Put simply, PGP allows people to, among other things, send encrypted email to one another.
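To make that concrete, here is a rough sketch of what sending a PGP-encrypted message can look like programmatically, using the third-party python-gnupg wrapper around GnuPG. This is only an illustration: it assumes gpg is installed and that the recipient (the hypothetical alice@example.com) already has a public key in the local keyring.

```python
# Sketch of PGP encryption via the python-gnupg wrapper around GnuPG.
# Assumes gpg is installed and the recipient's public key has already
# been imported into the local keyring -- acquiring and verifying that
# key is, as we'll see, exactly the part that trips users up.
import gnupg

gpg = gnupg.GPG()  # uses the default GnuPG home directory

encrypted = gpg.encrypt(
    "meet me at noon",
    recipients=["alice@example.com"],  # must match an imported public key
)

if encrypted.ok:
    print(str(encrypted))  # ASCII-armored ciphertext, safe to paste into an email
else:
    print("encryption failed:", encrypted.status)
```

Even this “simple” path hides key generation, key distribution, and trust decisions, all of which fall squarely on the user.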
Matthew Green, a well-known cryptographer and professor at Johns Hopkins University, wrote about PGP and highlighted the impact of its release, saying: “It’s hard to explain what a big deal PGP was. Sure, it sucked badly to use. But in those days, everything sucked badly to use. Possession of a PGP key was a badge of technical merit. Folks held key signing parties. If you were a geek and wanted to discreetly share this fact with other geeks, there was no better time to be alive.”
It’s great to hear that PGP solved a problem for the geeks in 1991, but that was 27 years ago. Surely, things have improved since then? Indeed, some things have changed. The PGP protocol was formalized as an open standard, which was updated as recently as 2007. The resulting standard is managed by the OpenPGP working group, which claims that it is “the most widely used email encryption standard” even today. It is used by journalists, activists, and many other privacy-conscious individuals all around the world and is still widely considered to be the gold standard in terms of security strength.
How have things worked out for the non-“geeks”, though? Was PGP ever widely adopted by the masses? Nope, not even a little bit. Motherboard explains:
PGP has never taken off among non-techies because it's inherently hard to use, which makes it very easy to make a mistake that nullifies the good crypto behind it.
For an apt explanation of just how much PGP did not take off, we turn to Moxie Marlinspike, the co-author of the Signal protocol, co-founder of Open Whisper Systems, and former head of the security team at Twitter. In his musings about PGP, Moxie writes: “Even though GPG has been around for almost 20 years, there are only ~50,000 keys in the “strong set,” and less than 4 million keys have ever been published to the SKS keyserver pool ever. By today’s standards, that’s a shockingly small user base for a month of activity, much less 20 years.”
The main reason that PGP failed as a general security solution is simply that it is really, really difficult to use. Quite frankly, the UX sucks, even for technical folks like myself. Non-techies don’t really stand a chance of getting it right.
Moxie agrees. “I think of GPG as a glorious experiment that has run its course. The journalists who depend on it struggle with it and often mess up ('I send you the private key to communicate privately, right?'), the activists who use it do so relatively sparingly ('wait, this thing wants my finger print?'), and no other sane person is willing to use it by default.”
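That fingerprint complaint is worth making concrete. Before trusting a key, a user is expected to compare its full 40-character fingerprint with the key’s owner over some separate, trusted channel. Here is a rough sketch, again using the python-gnupg wrapper, of what the software actually surfaces:

```python
# Sketch of the fingerprint-verification burden PGP pushes onto users,
# again via the python-gnupg wrapper. Listing keys is easy; knowing
# which fingerprints to trust, and verifying them out-of-band, is not.
import gnupg

gpg = gnupg.GPG()

for key in gpg.list_keys():
    # The user is expected to compare these 40 hex characters with the
    # owner over the phone, or in person at a "key signing party".
    print(key["uids"][0], "->", key["fingerprint"])
```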
Matthew Green, the cryptographer from Johns Hopkins, arrives at the same conclusion, claiming that “it’s time for PGP to die.” On the matter of UX, he comments:
“But as they say: a PGP critic is just a PGP user who’s actually used the software for a while.”
So, to summarize: the problem PGP set out to solve was allowing anyone to send encrypted email. Phil Zimmermann explained in 1999, in his own words, that “PGP empowers people to take their privacy into their own hands. There has been a growing social need for it. That's why I wrote it.” PGP had a strong technical implementation, but the horrible UX prevented it (and the plethora of derivative products and tools) from seeing any mass adoption over the past 20+ years. Even the tech elite who know what PGP is and how it works have given up on it as a ubiquitous solution to the encrypted email problem.
The ultimate irony occurred in 2015 when Motherboard pointed out that “even the inventor of PGP doesn’t use PGP.” The irony is not lost on Zimmermann.
Hopefully, you agree that PGP serves as a clarifying example of the importance of UX in a security solution.
Any motivated service provider eager to meet their responsibility of implementing strong, easy-to-use security features for end-users should be applauded. Just remember that all aspects of the user experience (UX) must be considered first-class priorities for any feature that hopes to provide effective security.
Thanks to Jordan Fischer, Greg Busanus, Kelly Shultz, Reuven Gonzales, and Ray Gonzales for reading drafts of this.