On 4 August 2005, the police department of Memphis, in the US state of Tennessee, made so many arrests over a three-hour period that it ran out of vehicles to transport the detainees to jail.
Three days later, 1,200 people had been arrested across the city – a new police department record. Operation Blue Crush was hailed a huge success.
By 2011 crime across the city had fallen by 24%. “Crush” policing is now mimicked across the globe.

Crush stands for “Criminal Reduction Utilising Statistical History”. Translated, it means predictive policing.
Or, more accurately, police officers guided by algorithms. Criminologists and data scientists at the University of Memphis first developed the technique using IBM predictive analytics software.
They compiled crime statistics from across the city over time and overlaid them with other datasets – social housing maps, outside temperatures and so on – then instructed algorithms to search for correlations in the data to identify crime “hot spots”.
The police then flooded those areas with patrols.
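The Memphis team's actual IBM-based models are far richer than anything shown here, but the core “hot spot” idea – aggregate incident locations and rank the busiest areas – can be sketched in a few lines. The coordinates below are invented, purely for illustration:

```python
from collections import Counter

# Hypothetical incident reports as (x, y) map coordinates -- invented
# data, not real Memphis crime statistics.
incidents = [
    (3.2, 7.1), (3.4, 7.3), (3.1, 7.8), (3.3, 7.2),  # a dense cluster
    (9.0, 1.5), (8.8, 1.2),                          # scattered reports
    (5.5, 4.0),                                      # an isolated report
]

def hot_spots(points, cell_size=1.0, top_n=2):
    """Bin incidents into square grid cells and return the busiest cells."""
    cells = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in points
    )
    return cells.most_common(top_n)

print(hot_spots(incidents))  # the (3, 7) cell tops the ranking with 4 reports
```

In practice the interesting step is the one this sketch omits: correlating those counts with the overlaid datasets (housing, temperature) to predict where the next cluster will form, rather than merely describing where the last one was.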
Not everyone is comfortable with the idea. Critics have dubbed it “Minority Report” policing, a reference to the science fiction film in which police use psychics to make arrests to pre-empt crime.
Algorithms have increasing influence on our lives. And, as their ubiquity spreads, so too does the debate around who, if anyone, is policing their use.
Such concerns were sharpened further by the revelations about how the US National Security Agency (NSA) has been using algorithms to interpret the colossal volumes of data it collects from its covert dragnet of global telecommunications.
“For datasets the size of those the NSA collect, using algorithms is the only way to operate for certain tasks,” says data journalist James Ball.
“The problem is how the rules are set: it’s impossible to do this perfectly. If you’re, say, looking for terrorists, you’re looking for something very rare. Set your rules too tight and you’ll miss lots of potential terror suspects. But set them more broadly and you’ll drag lots of entirely blameless people into your dragnet.”
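The trade-off Ball describes is the familiar one between false negatives and false positives, and it can be illustrated with a toy threshold rule. The labels and risk scores below are wholly hypothetical; no real screening system works this crudely:

```python
# Invented records: a human-readable label and a hypothetical "risk score".
# The flagging rule is simply: flag anyone whose score meets the threshold.
records = [
    ("genuine_suspect", 0.95),
    ("innocent", 0.80),
    ("innocent", 0.55),
    ("genuine_suspect", 0.50),
    ("innocent", 0.20),
]

def flagged(threshold):
    """Return the labels of everyone the rule would flag at this threshold."""
    return [label for label, score in records if score >= threshold]

# A tight rule misses one of the two genuine suspects...
print(flagged(0.9))  # ['genuine_suspect']
# ...while a broad rule drags two blameless people into the dragnet.
print(flagged(0.4))  # ['genuine_suspect', 'innocent', 'innocent', 'genuine_suspect']
```

Because genuinely rare targets sit among vast numbers of innocents, even a small false-positive rate at a broad threshold swamps the true matches – which is exactly the oversight problem the critics raise.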
From dating websites and financial trading floors, through to online retailing and internet searches (Google’s search algorithm is now a more closely guarded commercial secret than the recipe for Coca-Cola), algorithms are increasingly determining our collective futures.
“Bank approvals, store cards, job matches and more all run on similar principles,” says Ball. “The algorithm is the god from the machine powering them all, for good or ill.”
“By far the most complicated algorithms are to be found in science, where they are used to design new drugs or model the climate,” says Panos Parpas, a quantitative analysis expert at Imperial College London. “But they are done within a controlled environment with clean data.
The difficulties come when they are used in social sciences and financial trading, where there is less understanding of what the model and output should be. Scientists take years to validate their algorithm, whereas a trader has just days to do so in a volatile environment.”
Most investment banks now have a team of computer science PhDs coding algorithms, says Parpas. “In financial trading, the various algorithms all follow each other, meaning you get results such as a flash crash.
They use them to speed up the process and to break up big trades to disguise them from competitors when a big investment is being made. It’s an ongoing, live process.
In currency trading, an algorithm lasts for about two weeks before it is stopped because it is surpassed by a new one.”
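The order-splitting Parpas mentions can be sketched in its simplest possible form: divide one large parent order into many roughly equal child orders. This is only a caricature of the idea – real execution algorithms vary slice sizes and timing, precisely so that competitors cannot spot the pattern:

```python
def slice_order(total_shares, n_slices):
    """Split a large order into n child orders of near-equal size.

    A deliberately simple, evenly-sized scheme for illustration only;
    production execution algorithms randomise size and timing.
    """
    base, remainder = divmod(total_shares, n_slices)
    # Distribute the remainder one share at a time across the first slices,
    # so every share is accounted for and no slice differs by more than one.
    return [base + (1 if i < remainder else 0) for i in range(n_slices)]

print(slice_order(100_000, 7))
```

Even this toy version shows why the practice matters: seven orders of ~14,000 shares reveal far less about a fund's intentions than a single 100,000-share order hitting the book at once.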
The idea that the world’s financial markets are now largely determined by algorithmic vagaries is unsettling enough for some. But the bigger questions surrounding algorithms centre on governance and privacy.
How are they used to access and interpret “our” data? And by whom?
“Most of us assume that ‘big data’ is munificent. The laws in the US and UK say that much of this [the NSA revelations] is allowed; it’s just that most people don’t realise it yet.
But there is a big question about oversight. We now spend so much of our time online that we are creating huge data-mining opportunities,” says Ian Brown of Oxford University’s Cyber Security Centre.
There can be a fine line, though, between “good” and “bad” algorithms, Brown adds: “I don’t find the NSA revelations particularly scary. At the moment, they just hold the data.
Even the best data scientists would struggle to know what to do with the data. But they could really screw up someone’s life with a false prediction about what they might be up to.”