Interviews

Software that respects user privacy

by Mark Rowe

On May 25, the European Union (EU) began enforcing the General Data Protection Regulation (GDPR), which requires companies to behave responsibly in their collection and management of personal data. Penalties for noncompliance include fines of up to four percent of global revenue, adding to the damage that data breaches can do to customer confidence and institutional reputation. The rate of change in IT is only accelerating, and businesses need to move fast to keep up with innovation and stay competitive. While there is broad consensus on the need for data privacy, only some companies have shifted away from speed and growth at all costs and toward building software that respects user privacy, writes Ian Huston, Data Scientist and Associate Director at software firm Pivotal Dublin.

Users are no longer ignoring privacy settings. In fact, they’re demanding accountability and organising movements to quit offending platforms. Given the change in user sentiment, businesses shouldn’t focus simply on complying with GDPR — or any other privacy regulation for that matter. Regulations can be amended, repealed, or superseded at any time. Instead, companies should design radically private software that fundamentally respects the privacy of its users. When privacy is a top priority, customers will trust businesses, and that trust will enable them to move faster. Here are my tips on how to build radically private software:

1. Acquire data progressively and only when needed

Only collect data you need, and only collect it when you need it. It’s no longer acceptable for software to ask users to grant every permission under the sun without explaining why. For example, disclose at the outset why you’re asking for a user’s contacts, rather than asking them to grant liberal permission to email their contacts however, and whenever, your business wants.
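
This just-in-time approach can be sketched in code: instead of demanding every permission at start-up, the software asks for a specific permission only at the moment a feature needs it, and always states why. The names below (`PermissionStore`, `invite_friend`, and so on) are illustrative, not a real platform API.

```python
# Minimal sketch of just-in-time permission requests (all names illustrative).

class PermissionStore:
    """Tracks which permissions the user has granted."""

    def __init__(self):
        self._grants = {}

    def request(self, permission: str, reason: str, prompt=input) -> bool:
        # Ask only when the feature actually needs the data, and always say why.
        if permission in self._grants:
            return self._grants[permission]
        answer = prompt(f"Allow access to {permission}? Reason: {reason} [y/n] ")
        granted = answer.strip().lower() == "y"
        self._grants[permission] = granted
        return granted


def invite_friend(perms: PermissionStore, prompt=input) -> str:
    # Permission is requested here, at the point of use, not at app start-up.
    if perms.request("contacts", "to let you pick a friend to invite", prompt):
        return "contact picker opened"
    return "feature unavailable without contacts access"
```

The key design choice is that `invite_friend` carries its own reason string: the app cannot touch contacts without simultaneously telling the user why.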

Businesses should be able to honestly answer “why,” “when,” and “how” they’re leveraging a user’s data, and specifically “what” data they’re using. Imagine a social media company explaining why it needs to collect a user’s phone call metadata, including names, phone numbers, and the length of each call made and received. “Why, we presumed that a 14-minute call with your cousin means you’re closer than a 12-minute call with your aunt.” Does the company really need to know the length of your calls, or does it just need to know who you talk to most often?

2. Explain data use and how that benefits the user

GDPR doesn’t allow the use of illegible terms and legalese that’s hard to understand. If a shopping app asks for a user’s name, address, and email address, it should tell that user — in plain, understandable terms — that it needs this information because it needs to know where to make deliveries. What’s more, it should state clearly that it’s not going to use the data to send spam mail and that it won’t sell user email addresses to a third party that will try to market stuff to them.

As Tim Berners-Lee tweeted in March: “General rules for us all: Any data about me, wherever it is, is mine and mine alone to control. If you are given the right to use data for one purpose, use it for that purpose alone. Get users’ active and informed consent for these uses of their data.”

3. Get active and informed consent for all data use

Radically private software means that if users don’t give informed consent, companies can’t use their data at all. It also means that if users give consent for one use, businesses can’t apply that consent to some other use.

What’s more, radically private software means that consent must be active. A pre-checked consent box is not acceptable under the GDPR because the user has not actively selected it. Users need to give active consent by checking the box themselves.
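
One way to enforce this in software is to make the consent record itself refuse anything that was not an affirmative act. The sketch below is illustrative, assuming a simple `record_consent` helper rather than any real consent-management library.

```python
# Sketch: consent is only valid when actively given; defaults never count.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class Consent:
    purpose: str
    granted: bool
    actively_given: bool  # False would mean e.g. a pre-ticked box
    timestamp: str


def record_consent(purpose: str, granted: bool, actively_given: bool) -> Consent:
    if granted and not actively_given:
        # A pre-checked box is not consent under the GDPR; refuse to record it.
        raise ValueError("Consent must be an affirmative act by the user")
    return Consent(purpose, granted, actively_given,
                   datetime.now(timezone.utc).isoformat())
```

Because the check lives in the data layer rather than the UI, a pre-ticked box anywhere in the product simply cannot produce a stored “yes.”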

Additionally, consent must be informed. No more user-interface tricks, like making the button to give consent big and red while the button to withhold data is small and grey. Those antics will no longer cut it. Being open and transparent about consent is critical for businesses to build trust with customers and help them better navigate privacy issues when they arise.

4. Issue a data receipt for each transaction

When users agree to share their data or give some other consent, businesses should consider emailing them a receipt for that consent, like a sales receipt. The benefit to the developer is that they have a record of the consent and can reference it via the email they sent the user. A data receipt can also ensure that both the company and the user are on the same page when it comes to understanding privacy policies.
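
A data receipt can be as simple as a small, tamper-evident record: the user, the purpose, the fields collected, and a hash that both parties can later compare. The sketch below assumes a hypothetical `make_data_receipt` helper; it only builds the receipt, with the emailing left out.

```python
# Sketch: a consent "receipt" with a deterministic ID derived from its contents.
import hashlib
import json
from datetime import datetime, timezone


def make_data_receipt(user_email: str, purpose: str, data_fields, issued_at=None) -> dict:
    """Build a consent receipt that can be emailed to the user and archived."""
    body = {
        "user": user_email,
        "purpose": purpose,
        "data_fields": sorted(data_fields),
        "issued_at": issued_at or datetime.now(timezone.utc).isoformat(),
    }
    # Hashing the canonical JSON gives both sides a stable reference number:
    # if either copy is altered, the IDs no longer match.
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "receipt_id": digest[:16]}
```

Sorting the fields and keys makes the receipt ID independent of the order in which data was listed, so company and user copies always agree.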

5. Make opting out easy

As the privacy landscape evolves, more people will decide to withdraw some consents, and developers should offer that option. People are now choosing to #DeleteFacebook rather than opt out of data sharing because deleting is easier and they lack granular options for what they can opt in to or out of. Companies need to make it easier for people to opt out and to explain what opting out means.

And if people decide to leave a service, they should be able to take their data with them. This is already happening in the banking industry, where regulation requires banks to make customer account data available in easy-to-use formats so that customers can switch banks more easily. For example, a banking app might include a budgeting feature that shows customers how much they spend on food, rent, entertainment, and so on. The customer owns this data and, if they switch banks, can bring it to their new bank. Enabling this sort of data portability is important in radically private software.
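
In practice, data portability means exporting a user’s records in open, machine-readable formats. A minimal sketch, assuming a hypothetical `export_user_data` function and a flat user record:

```python
# Sketch: export a user's data in portable formats (JSON and CSV).
import csv
import io
import json


def export_user_data(user_record: dict) -> dict:
    """Return the user's data in both JSON and CSV, ready to take elsewhere."""
    as_json = json.dumps(user_record, indent=2, sort_keys=True)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(user_record.keys())    # header row: field names
    writer.writerow(user_record.values())  # one row of values
    return {"json": as_json, "csv": buf.getvalue()}
```

Real exports would cover nested records and many rows, but the principle is the same: open formats the next provider can read, not a proprietary dump.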

6. Encrypt data in transit

Should you be encrypting all data, even when it’s in transit? The answer is yes. The barriers to entry are now so low that it’s unacceptable for a developer to put up a non-secure website. Everybody knows that you shouldn’t write your own encryption; therefore, businesses should be using strong industry standards for security instead.
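
As a concrete illustration, Python’s standard library already provides verified TLS, so there is no excuse for plaintext transport or home-grown crypto. The function name `fetch_securely` below is illustrative; the `ssl` and `urllib` calls are the real standard-library APIs.

```python
# Sketch: always use TLS for data in transit, and never disable verification.
import ssl
import urllib.request


def fetch_securely(url: str) -> bytes:
    if not url.startswith("https://"):
        raise ValueError("Refusing to send data over plain HTTP")
    # ssl.create_default_context() verifies the server certificate and hostname
    # against the system trust store -- an industry standard, not home-grown crypto.
    context = ssl.create_default_context()
    with urllib.request.urlopen(url, context=context) as resp:
        return resp.read()
```

The point is the refusal path: a radically private application fails loudly on an insecure URL rather than silently falling back to plaintext.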

7. Educate users about how automated decisions are made

A great example of educating users on automated decisions is the Stitch Fix Algorithms Tour, which transparently shows how “Your online personal stylist” uses data. It tells customers why it’s gathering their data: because it wants to be a matchmaker, connecting customers with styles they’ll love and probably would not have discovered on their own. No one can accuse Stitch Fix of hiding its data collection efforts and intents. This helps the company build trust and win over customers.

This type of transparency has been a legal requirement in certain industries for a while. For instance, when a company makes a credit decision about a customer, it needs to explain the factors that went into the decision — for example, “The reason you’ve been denied credit is you have too many credit cards and you’re overdrawn too often.” Such explanations are not new; what is new is the complexity of the decision-making process and the resulting complexity of the explanations.
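
The same idea can be built into software by having the decision function return its reasons alongside its verdict. This is a toy rule set invented for illustration, not a real credit-scoring model.

```python
# Sketch: a toy credit rule that returns its decision together with the reasons.
def credit_decision(applicant: dict) -> dict:
    reasons = []
    if applicant.get("open_credit_cards", 0) > 5:
        reasons.append("too many open credit cards")
    if applicant.get("overdrafts_last_year", 0) > 3:
        reasons.append("account overdrawn too often")
    approved = not reasons
    # The reasons travel with the decision, so the explanation shown to the
    # customer is always the one the system actually used.
    return {"approved": approved, "reasons": reasons or ["criteria met"]}
```

With complex models the hard part is producing faithful reasons at all, but the interface principle holds: never return a verdict without its explanation.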

8. Understand that truly anonymising data is almost impossible

A recent example of anonymised data causing problems comes from Strava, a mobile app that tracks your exercise via your phone’s GPS. Last November, it released a heat map showing the activity of all users around the world. One analyst quickly pointed out that the map could be cross-referenced with a map of military bases to discern regular jogging routes, patrols, even the location of forward operating bases in Afghanistan.

Organisations often release anonymous data with good intentions. They want people to be able to do cool stuff with it. But it’s almost always possible to de-anonymise the data and trace individuals within it, so removing classic personal data such as names, addresses, and phone numbers is not enough. With data points like location and time of day, it’s still possible to construct a pretty good picture of a user.
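
A tiny sketch makes the risk concrete: even with names removed, joining a few quasi-identifiers against outside knowledge can single one person out. The data and the `reidentify` helper below are invented for illustration.

```python
# Sketch: "anonymised" rows can still be linked back to a person when
# quasi-identifiers (location, time of day) are matched with outside knowledge.
def reidentify(anonymous_rows, known_facts):
    """Return the rows matching everything already known about one person."""
    return [row for row in anonymous_rows
            if all(row.get(k) == v for k, v in known_facts.items())]


runs = [
    {"pseudonym": "a1", "start": "06:00", "grid": "N52.1,W8.3"},
    {"pseudonym": "b2", "start": "06:00", "grid": "N53.3,W6.2"},
    {"pseudonym": "c3", "start": "19:30", "grid": "N53.3,W6.2"},
]
# Knowing only a colleague's neighbourhood and morning routine is enough
# to pin their pseudonym -- and with it, their entire activity history.
matches = reidentify(runs, {"start": "06:00", "grid": "N53.3,W6.2"})
```

Here two innocuous facts narrow three “anonymous” rows down to exactly one, which is why stripping names, addresses, and phone numbers is not enough.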

9. Communicate clearly about a data breach

Clear communication is the key to trust. Breaches are almost inevitable these days, and while almost no company goes to this degree today, true transparency would mean telling users up front what will happen when a breach occurs: the process that will kick in within the first 24 hours, the team that will be pulled together, where information to keep customers up to date will be posted, and how customers can quickly opt out.

Privacy is not a policy companies should implement just to meet a legal requirement like the GDPR. And it’s not enough to do the minimum. Privacy is now front of mind for every consumer and, done well, it can work to a company’s advantage. Businesses can win the trust of customers and get a leg up on their competition if they build radically private software.
