Wednesday, January 5, 2011

Dangerous reliance on flawed computer databases

Dumb cops treat them as an oracle when they should be regarded as no more reliable than any other kind of evidence

We humans have a persistent fear that the machines we endow with artificial intelligence will one day turn against us. Of course, deep down we know such concerns are irrational. Life is much easier if we accept that even though it might have burnt the bread, the toaster is basically on our side and doing its best.

Our natural instincts dulled, we let our guard down. And so, if you truly fear technology, expect to be dismissed as a Luddite or worse. I know all this, and yet I truly fear technology. Specifically, I fear how we rely on it; how we outsource our duty of care to computers that in fact rely on us to do their work properly.

When police and other law enforcement agencies, which have the power to deprive us of our liberty, place absolute trust in imperfect systems, the resulting injustice can be terrible and very difficult to remedy.

The Herald recently reported that a long-running glitch in the NSW government computer system is causing young people to be arrested and detained for breaching non-existent or expired bail conditions. Often these people must wait until they are brought before a court before they are released.

For more than three years, the Public Interest Advocacy Centre, the Public Interest Law Clearing House and Legal Aid NSW have been trying to resolve this. But still the cases have mounted up, leading to the repeated injustice of wrongful detention and a government compensation bill that runs into millions of dollars.

Even when a detained youth has tried to explain the true situation - in one case, his mother offered to fax court documents containing the correct information to the police - the authorities have doggedly relied on the police IT system. Because they presumed their technology to be infallible, these errors went uncorrected and caused significant injustice.

An IT system relies on people to input the data. But from time to time, we fallible humans enter the information wrongly; sometimes it doesn't go in at all. While it's convenient to assume the computer is always right, that assumption should never prevail over clear evidence to the contrary.

There is also another, more subtle problem with IT systems. Their design constrains our actions - often more effectively than any law ever could. This principle does not just apply to IT, but to other forms of design as well. Take, for example, road safety. If the government wants to limit drivers' speed on a suburban road to 40km/h, the conventional method would be to impose a speed limit. If policed rigorously, this will probably improve compliance, but many people would continue to speed.

A far more effective (and cheaper) solution is to change the design of the road: to build speed humps, roundabouts and so on. This can achieve near-total compliance because the road itself makes it physically difficult to exceed the limit.

The same is true in IT systems. This can be a good thing: a well-designed system will ensure that important considerations are not forgotten by public servants who are often busy and under pressure.

However, it also means your options can be limited by the choices made by the government's computer programmer. You can be prevented from doing something, not because the law prohibits it, but because there is no such option in the drop-down menu.
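To see how this plays out in software, consider a minimal sketch (in Python, with invented names; no real police system is being described) of a data-entry form whose drop-down list simply has no entry for the lawful state of affairs:

# Hypothetical sketch: a form that accepts only the options its
# programmer predicted. The status values are invented for illustration.
ALLOWED_BAIL_STATUSES = {"active", "breached", "revoked"}   # no "expired" entry

def record_bail_status(status: str) -> None:
    """Save a bail status; anything outside the drop-down list is rejected."""
    if status not in ALLOWED_BAIL_STATUSES:
        raise ValueError(f"{status!r} is not an option in this system")
    print(f"Recorded bail status: {status}")

record_bail_status("breached")        # accepted
try:
    record_bail_status("expired")     # the law allows it; the menu doesn't
except ValueError as err:
    print(err)

The officer at the keyboard is not necessarily being obstinate; the system gives them nowhere to put the truth.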

The tragic case of David Iredale, the young bushwalker who died in the Blue Mountains in 2006, is a case in point. When he realised he was lost and in trouble, David called the ambulance service from his mobile phone and was repeatedly asked by the operator to provide a street address. Being in the middle of the bush, he could not. Nevertheless, the operator stuck to the system as designed.

The inquest into David's death disclosed that the ambulance service's call-response system required a street address. The absurdity of requiring such information in all circumstances is manifest. Such failures become more common as we rely on rigid IT systems that make no allowance for circumstances the original programmers never predicted.
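The same rigidity can be sketched in a few lines (again hypothetical Python; the function and field names are mine, not the ambulance service's actual software):

# Hypothetical sketch of a rigid call-intake form; not the real system.
from typing import Optional

def log_emergency_call(caller: str, street_address: Optional[str]) -> dict:
    """Record a call; the form refuses to proceed without a street address."""
    if not street_address:
        # A caller lost in the bush has nothing to type here, and the
        # system offers no way past this check.
        raise ValueError("A street address is required")
    return {"caller": caller, "address": street_address}

try:
    log_emergency_call("lost bushwalker", None)
except ValueError as err:
    print(err)    # the call is blocked, however urgent

A more forgiving design would accept whatever location detail the caller can actually give (coordinates, a landmark, a trail name) and flag the record for follow-up, rather than treating the missing field as a reason to stop.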

Of course, the solution to these problems is not to abandon technology. Instead, we need to be more realistic about the strengths and limitations of the systems we rely on, and to ensure that they are carefully monitored so they do not produce injustice.

SOURCE
