Tech-savvy investigators are ready to put algorithms under the microscope if companies let them - Action News
Science


Much as companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

In the nascent field of algorithm auditing, researchers evaluate the behaviour of decision-making code

Given the growing role algorithms play in so many parts of our lives, such as those used by Facebook (one of its data centres pictured here), we know incredibly little about how these systems work. (Jonathan Nackstrand/AFP/Getty Images)

Decades before you could buy a plane ticket on your phone, there were computerized reservation systems (CRS). These were rudimentary information systems used by travel agents to book customers' flights. And they had one devious flaw.

By the early 1980s, 80 per cent of travel agencies used the systems operated by American and United airlines. And it didn't take long before the two airlines realized they could use that dominance to their advantage: namely, by writing code designed to prioritize their own flights on CRS screens over those of their competitors.

Naturally, U.S. aviation regulators weren't pleased, and the companies were ordered to cut it out. But the case, described in a 2014 paper by researcher Christian Sandvig, lives on today as one of the earliest examples of algorithmic bias.

It's a reminder that algorithms aren't always as neutral or well-intentioned as their creators might think, or want us to believe, a reality that's more evident today than it's ever been.

Facebook founder and CEO Mark Zuckerberg said last month he doesn't want anyone to use his company's platform, which serves content according to complex, unseen algorithms, "to undermine democracy." (Justin Sullivan/Getty Images)

In U.S. courts, reports generated by proprietary algorithms are already being factored into sentencing decisions, and some have cast doubt on the accuracy of the results. Sexist training sets have taught image recognition software to associate photos of kitchens with women more than men.

And perhaps most famously, Facebook has been the target of repeated accusations that its platform, which serves content according to complex algorithms, helped amplify the spread of fake news and disinformation, potentially influencing the outcome of the 2016 U.S. presidential election.

Yet, given the important role algorithms play in so many parts of our lives, we know incredibly little about how these systems work. That's why a growing number of academics have established the nascent field of algorithmic auditing. Much as companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

Algorithmic auditors

For now, it's mostly researchers operating on their own, devising ways to poke and prod at popular software and services from the outside, varying the inputs in an effort to find evidence of discrimination, bias or other flaws in what comes out.
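To give a sense of what that outside-in probing looks like, here is a minimal sketch of an input-perturbation audit. The `loan_model` scorer below is entirely hypothetical, a stand-in for whatever opaque system a real auditor would query; the idea is simply to flip one attribute on otherwise-identical inputs and count how often the decision changes.

```python
# Black-box audit sketch: treat the model as an opaque function, vary one
# input attribute at a time, and compare outcomes. `loan_model` is a
# hypothetical stand-in; a real audit would call an external system or API.

def loan_model(applicant):
    # Hypothetical opaque scorer with a hidden proxy-variable penalty.
    score = applicant["income"] / 1000 + applicant["credit_years"] * 2
    if applicant["postal_zone"] == "B":  # proxy variable hiding a bias
        score -= 15
    return score >= 50

def paired_audit(model, applicants, attribute, value_a, value_b):
    """Flip a single attribute on otherwise-identical inputs and return
    the fraction of cases where the model's decision changes."""
    flips = 0
    for base in applicants:
        a = dict(base, **{attribute: value_a})
        b = dict(base, **{attribute: value_b})
        if model(a) != model(b):
            flips += 1
    return flips / len(applicants)

applicants = [
    {"income": 40_000 + 5_000 * i, "credit_years": 3, "postal_zone": "A"}
    for i in range(10)
]

disparity = paired_audit(loan_model, applicants, "postal_zone", "A", "B")
print(f"decision flips when only postal_zone changes: {disparity:.0%}")
```

A nonzero flip rate here is only a signal, not proof of unfairness; real audits pair this kind of probing with statistical tests and domain-specific fairness definitions.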

Some of the field's experts envision a future where crack teams of researchers are called in by companies, or perhaps on the order of a regulator or judge, to more thoroughly evaluate how a particular algorithm behaves.

There are signs this day is fast approaching.

Last year, the White House called on companies to evaluate their algorithms for bias and fairness through audits and external tests. In Europe, algorithmic decisions believed to have been made in error or unfairly may soon be subject to a "right to explanation," though how exactly this will work in practice is not yet clear.

A Harvard project called VerifAI is in the early stages of defining "the technical and legal foundations necessary to establish a due process framework for auditing and improving decisions made by artificial intelligence systems as they evolve over time."

Mathematician Cathy O'Neil, author of the book "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy," founded an algorithm consulting company last year to help data-savvy companies manage risk and use algorithms fairly. (Cathy O'Neil)

Harvard is one of a handful of schools, including Oxford and Northwestern, with researchers studying algorithmic audits, and a new conference devoted to the subject will kick off in New York next year.

Outside academia, consulting giant Deloitte now has a team that advises clients on how they can manage "algorithmic risks." And mathematician Cathy O'Neil launched an independent algorithm consultancy of her own last year, pledging "to set rigorous standards for the new field of algorithmic auditing."

Scrutinizing secret code

All of this is happening amidst rising political backlash against some of the most powerful tech companies in the world, whose opaque algorithms increasingly shape what we read and how we communicate online with little external scrutiny.

One of the challenges, says Solon Barocas, who researches accountability in automated decision-making at Cornell University, will be determining what, exactly, to scrutinize and how. Tech companies aren't regulated the same way as other industries, and the mechanisms already used to evaluate discrimination and bias in areas such as hiring or credit may not easily apply to the decisions that, say, a personalization or recommendation engine makes.

And in the absence of oversight, there's also the challenge of convincing companies there's value in letting in algorithmic auditors. O'Neil, the mathematician and a well-known figure in the field, says her consulting firm has no signed clients "yet."

Barocas thinks companies "actually fear putting themselves in greater risk by doing these kinds of tests." He suggests some companies may prefer to keep themselves and their users in the dark by not auditing their systems, rather than discover a bias they don't know how to fix.

But whether companies choose to embrace external audits or not, greater scrutiny may be inevitable. Secret and unknowable code governs more parts of our lives with each passing day. When Facebook has the power to potentially influence an election, it's not surprising that a growing number of outside observers want to better understand how these systems work, and why they make the decisions they do.