The tyranny of technology

When police and other law enforcement agencies place absolute trust in imperfect systems, the resulting injustice can be terrible and very difficult to remedy, writes Edward Santow.

Commentary

We humans have a persistent fear that the machines
we endow with artificial intelligence will one day turn against us. Of course, deep
down we know such concerns are irrational. Life is much easier if we accept that
even though it might have burnt the toast, the toaster is basically on our side
and doing its best.

Our natural instincts dulled, we let our guard
down. And so, if you truly fear technology, expect to be dismissed as a Luddite
or worse.

I know all this, and yet I truly fear technology.
Specifically, I fear how we rely on it; how we outsource our duty of care to
computers that in fact rely on us to do their work properly.

When police and other law enforcement agencies,
which have the power to deprive us of our liberty, place absolute trust in imperfect
systems, the resulting injustice can be terrible and very difficult to remedy.

Last week, the Sydney Morning Herald
reported
that a long-running glitch in the NSW Government computer system is causing
young people to be arrested and detained for breaching non-existent or expired bail
conditions. Often these people must wait until they are brought before a court before
being released.

For over three years, the Public Interest
Advocacy Centre (PIAC), the Public Interest Law Clearing House and Legal Aid
NSW
have been trying to resolve this systemic problem. But still the cases have
mounted up, leading to the repeated injustice of wrongful detention and a government
compensation bill that runs into the millions of dollars.

Now PIAC is working with leading law firm
Maurice Blackburn to prepare a class action to challenge these decisions. Our
aim is to obtain redress for many young people who have been wronged and, just
as importantly, to stop this from happening in future.

So what is the problem? Clearly, there is a
problem with the NSW Government’s IT system. In detaining these young people,
the police rely on the information in their computer system. It’s just that this
information is too often wrong or out of date.

Even when a detained youth has tried to explain
the true situation – in one case, the young person’s mother offered to fax to
the police the court documents containing the correct information – the
authorities have doggedly relied on the police IT system. By presuming their
technology to be infallible, the authorities have caused significant injustice.

The IT system relies on people to input the
data. But from time to time, we fallible humans enter the information wrongly;
sometimes it doesn’t go in at all. While it’s convenient to assume the computer
is always right, that assumption should never prevail over clear evidence to
the contrary.

Now that there’s overwhelming evidence showing that
the system is flawed, it would be unconscionable to leave it unchanged. No
democratic state should rely on a system that leads to so many unlawful
detentions.

There’s also another, more subtle problem with
IT systems. Their design constrains our actions – often more effectively than
any law ever could.

This principle doesn’t just apply to IT, but to
other forms of design as well. Take, for example, road safety. If the
government wants to limit drivers’ speed on a suburban road to 40 km/h, the conventional
method would be to impose a speed limit. If policed rigorously, this will
probably improve compliance, but many people would continue to speed.

A far more effective (and cheaper) solution is
to change the design of the road: to
build speed humps, roundabouts and so on. This can create total compliance
because you physically can’t drive over the speed limit.

The same is true in IT systems. This can be a
good thing: a well-designed system will ensure that important considerations
are not forgotten by public servants who are often busy and under pressure.

However, it also means that your options can be limited
by the choices made by the government’s computer programmer. You can be
prevented from doing something, not because the law prohibits it, but simply
because there’s no such option in the drop-down menu.

The tragic case of David Iredale – a young
bushwalker who died in the Blue Mountains in 2006 – is a case in point. When he
realised he was lost and in trouble, David called the Ambulance Service from
his mobile phone and was repeatedly asked by the operator to provide a street
address. Being in the middle of the bush, he could not. Nevertheless, the
operator stuck to the system as designed.  

The inquest into David’s death disclosed that
the Ambulance Service’s call-response system required a street address. The
absurdity of requiring such information in all circumstances is manifest. Such
problems are more common when we rely on rigid IT systems that cannot accommodate
circumstances beyond those anticipated by the original programmers.

Of course, the solution to these problems is not
to abandon technology. Instead, we need to be more realistic about the
strengths and limitations of the systems we rely on, and to ensure that they
are carefully monitored so as not to induce injustice.

This commentary was published in the Sydney Morning Herald on 4 January 2010.

Edward Santow
is CEO of the Public Interest Advocacy Centre.
