When it comes to privacy and data protection, many businesses with either a physical or a digital presence keep their eyes on compliance first and foremost – and that’s a big reason consumers find it so easy to lose trust in those businesses.
The year 2017 was a lesson in how focusing on the bottom line to the exclusion of nearly everything else can invite business disaster. We saw horrifying breaches – tech brands including Uber, Yahoo and Imgur all disclosed significant data privacy incidents, though the Equifax breach in September was the most alarming of all. Fortune went so far as to wonder out loud in a headline whether The Equifax Breach Could End the Credit Industry as We Know It.
But it wasn’t just broken promises to respect customer data that caused loss of trust in many prominent brands. Newfangled Internet of Things (IoT) products are raising fresh concerns by blurring the boundary between consumer and ecosystem. Indeed, when connected in-home devices are used to manage (meaning track!) everyday actions – from unlocking the front door to adjusting the baby-monitoring camera – people feel acutely the risk that they’re handing over the most intimate details of their lives to device and cloud providers such as Google and Amazon.
Unfortunately for consumer businesses, privacy compliance doesn’t necessarily mean you’re “doing privacy” in a way that’s meaningful for people. When individuals choose not to share data with you – whether by omission or by never engaging with you in the first place – it’s hard to know the business value you missed from those relationships that went dark.
Consider that a key motivation for the EU’s General Data Protection Regulation (GDPR) is to strengthen its Digital Single Market. The GDPR recognizes that this goal can’t be achieved without the trust of individuals. But even the short bursts of “GDPR-compliant legalese” that will quickly become de rigueur for explaining what opt-in consent means won’t leave consumers with a warm and fuzzy feeling that they actually control their data privacy destiny.
Given where we stand – with astounding breaches of public trust in our immediate past and enforcement of GDPR provisions looming in the immediate future – I propose the following three rules for testing whether your product or solution offers a meaningful level of privacy and trustworthiness. Warning: While the rules are simple, they may not be easy to follow or implement. But businesses seeking to go beyond simple compliance and produce genuine value for customers through better data privacy need to start somewhere. Here we go:
Rule 1: Make the Right Choice Be the Easiest Choice
Too often – whether through lax security planning or poor product design – online services and applications don’t provide a clear, simple path for users to keep their data private. Product designers need to ask themselves questions like this one: Does our application encourage customers to give their passwords away to third-party apps rather than integrating a secure app authorization standard like OAuth? If your answer is “yes,” that’s bad! Yes, connecting apps to online services can enable the flow of customer data. But presenting users with confusing places to retype login information for partner sites breeds suspicion. In a world where all marketing is social, how badly do you want to retain the respect of that person? Further, screen-scraped app connections make it harder for your customers to withdraw their consent, impeding GDPR compliance. Making the right choice be the easiest choice is the best way to keep your relationships with customers and end users on the up-and-up.
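To make the contrast concrete, here is a minimal sketch of the OAuth 2.0 authorization code flow using the open source requests-oauthlib Python library. All of the URLs, client credentials and the scope are placeholders for whatever a real partner service would actually issue; treat this as an illustration of the pattern, not a drop-in integration.

```python
# Sketch: OAuth 2.0 authorization code flow instead of password sharing.
# Placeholder values throughout – a real partner service issues its own
# client ID/secret, endpoints and scopes.
from requests_oauthlib import OAuth2Session

CLIENT_ID = "my-client-id"
CLIENT_SECRET = "my-client-secret"
AUTH_URL = "https://partner.example.com/oauth/authorize"
TOKEN_URL = "https://partner.example.com/oauth/token"

oauth = OAuth2Session(
    CLIENT_ID,
    redirect_uri="https://myapp.example.com/callback",
    scope=["read:profile"],  # Ask only for the access you need.
)

# Step 1: send the user to the partner's own login page – they never
# type their password into *your* app.
authorization_url, state = oauth.authorization_url(AUTH_URL)
print(f"Please authorize at: {authorization_url}")

# Step 2: after the user consents, the partner redirects back with a
# one-time code, which we exchange for a revocable access token.
redirect_response = input("Paste the full callback URL here: ")
token = oauth.fetch_token(
    TOKEN_URL,
    client_secret=CLIENT_SECRET,
    authorization_response=redirect_response,
)
print("Access granted without ever seeing the user's password.")
```

The point of the pattern is that the user authenticates only at the partner’s own login page, and your app receives a narrowly scoped, revocable token instead of a password – which is exactly what makes withdrawing consent tractable later.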
Rule 2: Offer “Privacy Actions” That Bring Users Immediate, Unequivocal Benefits
Too many privacy tools suffer from privacy geek thinking. Consumers who don’t experience significant convenience or value – or whom you disappoint – will show you their backs pretty quickly. Privacy features that only enable tinkering around the edges of a person’s privacy stance, that are legalistic in nature or that impose experiential costs often don’t add up to a win, and may even add up to a trust fail.
Some examples of this rule in action:
1. Ad-blocking extensions help manage privacy in a way that browser users just have to trust is working – but they also add value by blocking annoying, blinking, slow-to-load ads. That’s a great way to deliver immediate and obvious benefits.
2. Turning on the Do Not Track setting in a browser depends on others (websites and apps) to bring it to life, as the sketch after this list illustrates. Since unilateral end user action doesn’t buy much, it’s a harder sell to entice people to make an extra trip to turn on the setting.
3. Some tools that kill browser cookies in the name of privacy can also kill features many consumers love: embedded YouTube videos, Facebook comment widgets and more. Debugging how to get these features working again can be painful – painful enough that people give up on the whole notion of cookie management.
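To show the Do Not Track asymmetry concretely, here is a minimal server-side sketch, assuming Python and the Flask web framework; the endpoint and the analytics helper are hypothetical. The browser’s only power is to send a “DNT: 1” header – whether anything changes is entirely up to each site’s code.

```python
# Sketch: honoring the Do Not Track header on the server side.
# The /article endpoint and record_page_view helper are hypothetical.
from flask import Flask, request

app = Flask(__name__)

def record_page_view(req):
    """Hypothetical stand-in for an analytics/tracking call."""
    print(f"tracked visit from {req.remote_addr}")

@app.route("/article")
def article():
    # The user's only power is sending the header; acting on it is
    # voluntary and happens (or doesn't) right here.
    if request.headers.get("DNT") != "1":
        record_page_view(request)  # Track only when DNT is absent or off.
    return "Article content"

if __name__ == "__main__":
    app.run()
```

If a site simply never writes that `if` check, the user’s setting has no effect at all – which is why a feature whose benefit depends on others’ goodwill is such a hard sell.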
Rule 3: Make What People Actually Want to Do Possible
A debate has been raging in the health care community over “data blocking” or “information blocking” – the practice whereby clinicians or provider organizations withhold health data from individual patients. While the evidence is strong that this blocking is happening, the intent behind these episodes isn’t clear: some ascribe it to liability concerns, while others attribute it to worries about a lack of transparency in increasingly automated health care practices. Some critics contend data blocking is inevitable in a fee-for-service health care system. Other stakeholders maintain that steadier regulatory guidance, with accompanying investment in electronic health record (EHR) technology, will resolve these conflicts. Which points to my third and final rule: Make what people actually want to do possible. Health data resonates so strongly because patients and their family members want and deserve clear visibility into this most personal set of information, and the ability to decide what’s best for themselves and their own bodies: “No data about me without me,” as the saying goes. The principle applies to any kind of personal data.
The health care world has long captured, on paper, a patient’s ability to authorize the release of data to, say, a relative. Such forms are called consent directives. Little by little, health IT is working to turn paper-based consent into electronic flows enabled by APIs, mobile apps and the like, covering many more use cases than the original ones. In fact, the notion of electronic consent directives perfectly captures Rule 2: it enables a “privacy action” that provides exactly the tailored data-sharing benefit the patient was seeking. Many consumers already know a similar experience from inviting someone to a Google Doc by hitting the Share button and indicating that access is only for viewing, not editing, the document. The User-Managed Access (UMA) standard and its companion Health Relationship Trust (HEART) profiles, targeted at the health care field, provide a standardized way to apply such capabilities to other services and apps.
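To give a flavor of what this looks like in practice, here is a hedged Python sketch of the first step an app might take under UMA 2.0: registering a patient’s record as a protected resource with a view-only scope – roughly the electronic analogue of a paper consent directive. The authorization server URL, endpoint path and token below are placeholders; real deployments discover the actual endpoints from the server’s UMA metadata document.

```python
# Sketch: registering a protected resource under UMA 2.0 (Federated
# Authorization). All URLs and tokens are placeholders.
import requests

AS_BASE = "https://as.example.com"  # Hypothetical UMA authorization server.
PAT = "eyJ..."  # Protection API access token, obtained via OAuth beforehand.

# Register the resource with the scopes the patient may later grant.
# The endpoint path varies per deployment; discover it from AS metadata.
resp = requests.post(
    f"{AS_BASE}/uma/resource_set",
    headers={"Authorization": f"Bearer {PAT}"},
    json={
        "name": "Allergy record for Alice",
        "resource_scopes": ["view"],  # View-only, like a read-only Doc share.
    },
)
resource_id = resp.json()["_id"]  # Registration response returns an _id.
print(f"Registered protected resource {resource_id}")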
Organizations that provide meaningful levels of privacy and trustworthiness guided by these rules are far more likely to thrive and succeed in the more stringent GDPR business environment. Yes, these rules require a commitment to go beyond mere compliance. They call on product designers, marketers and service providers to embrace a willingness both to trust individuals with their own personal data and to seek a deep understanding of digital transformation and its opportunities. But hey, if it were easy, everyone would do it, right?
About the Author
Eve Maler is vice president of innovation and emerging technology in ForgeRock’s Office of the Chief Technology Officer. She is a renowned strategist, innovator and communicator on digital identity, security, privacy and consent, with a focus on fostering successful ecosystems and individual empowerment. Eve drives privacy and consent innovation for the ForgeRock Identity Platform, enabling user-controlled and compliant data sharing across web, mobile and IoT contexts. She founded and leads the User-Managed Access (UMA) standards effort and guides the ForgeRock implementation of UMA and other privacy and consent solutions. She also directs the company’s engagement in interoperability standards such as Health Relationship Trust (HEART) and provides expert advice to public and private forums such as the Facebook/Ctrl-Shift research on A New Paradigm for Personal Data and the U.S. Health and Human Services API Task Force. Eve was formerly with Forrester Research, PayPal and Sun Microsystems; at Sun she co-founded the SAML standard. Previously she co-invented XML. In the dim past she co-authored Developing SGML DTDs: From Text to Model to Markup. Eve enjoys singing bluesy-funky rock ‘n’ roll.