When Mark Zuckerberg made his case earlier this year for how Facebook would become a more positive force in its users’ lives, the application of artificial intelligence played a big part.

One promise: to use AI to identify people who might be thinking of suicide, then intervene to try to save them.

This is one aspect of Facebook membership, however, that will pass Europeans by. The social network said this week that it was ready to go live with its suicide-prevention algorithms around the world — except in Europe. Privacy regulations there have made the company think twice about extending this new feature.

Leave aside, for a moment, the slightly unnerving idea that Facebook is analysing your mental state, or that it might be turning from Big Brother into Big Nanny. Or the risk of false positives (it would be creepy to be mistakenly labelled a suicide risk).

The combination of big data and machine learning promises to make many digital services more useful. Limits on data use are also a necessary safeguard, though they inevitably also restrict the kinds of services that are available.

In Europe, rules on personal data are becoming more restrictive with the General Data Protection Regulation coming into force next May. This is billed as a necessary corrective, handing people power over how information about them is used. But at what cost?

Facebook’s attempts at suicide prevention are a particularly conspicuous example. But there are many more mundane, everyday benefits of personal data analysis that European businesses — and, by extension, consumers — could miss out on.

Consider the public records database maintained by business information service LexisNexis in the US.

It is a mountain of 65bn records about most Americans, amassed from 10,000 sources. Details include university attendance, gun and pilot licences, court records, credit history, and much more besides. With so many data points about each person, it becomes easier to work out when individuals with slight differences in their records are really one and the same person — useful for fraud detection — or to predict behaviour.

That mass of information is applied in many different ways. It is used by all US car insurance companies, for instance, leading to lower prices for people thought likely to be safer drivers (and higher prices for the riskier). State governments use it to identify fraudulent income tax refund claims.

A database like this could not be assembled in Europe. Some data sources are off limits. Also, the use of some types of information is restricted to the original purpose it was collected for. A legacy of rules designed to prevent governments amassing too much knowledge about their citizens has put limits on the way agencies share information.

Besides laying down a general provision limiting how data are used, the GDPR imposes a new restriction on machine learning: Europeans will have the right to ask how an automated decision about them has been made. That sounds like a victory for humans fighting back against all-powerful algorithms. But in the case of the most advanced “black box” deep learning systems, even their creators find it hard to understand how the machines reach their conclusions.

These are not simple problems. The Equifax data breach was a wake-up call about the risk of amassing huge amounts of information about a large part of the population in one place, particularly when few realised it was even being collected.

There are concerns, rightly, about whether data are used ethically, particularly when their use results in different prices being offered to different groups of people, or in limits on access to things like credit and housing.

There is also an obvious tendency for companies to complain about any new rules, even if they later find a way to work within them. With data becoming such a powerful input in many businesses, there is a lot at stake.

But Facebook’s decision to bypass Europe with its suicide-warning algorithms is a reminder that data regulation is not cost-free. For businesses in Europe looking to develop digital products and services — and for the many others that rely on such services to make their own operations more efficient — the stakes are rising.

richard.waters@ft.com
