There is a widely held belief that because math is involved, algorithms are automatically neutral.

This misconception lets bias go unchecked, and lets companies and organizations avoid responsibility by hiding behind algorithms.

This is called mathwashing: when power and bias hide behind the facade of 'neutral' math.

This is how it works. There are two things you should realize:

1.
PEOPLE DESIGN ALGORITHMS

They make important choices like:

Which data to use.

How to weigh it.
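The weight of these design choices can be sketched in a few lines. This is a hypothetical example (the applicants, features, and weights are all invented): two designers score the same loan applicants, and simply choosing different features and weights flips who comes out on top.

```python
# Invented example: two designers rank the same loan applicants.
# All names, numbers, features, and weights are made up for illustration.

applicants = [
    {"name": "A", "income": 40_000, "years_at_job": 12, "zip_risk": 0.1},
    {"name": "B", "income": 90_000, "years_at_job": 1,  "zip_risk": 0.9},
]

def score_v1(a):
    # Designer 1 chooses to weigh income heavily.
    return 0.8 * (a["income"] / 100_000) + 0.2 * (a["years_at_job"] / 20)

def score_v2(a):
    # Designer 2 adds a 'zip code risk' feature -- a human choice that
    # can smuggle historical discrimination into a 'neutral' formula.
    return 0.4 * (a["income"] / 100_000) + 0.6 * (1 - a["zip_risk"])

rank_v1 = [a["name"] for a in sorted(applicants, key=score_v1, reverse=True)]
rank_v2 = [a["name"] for a in sorted(applicants, key=score_v2, reverse=True)]

print(rank_v1)  # ['B', 'A'] -- income-heavy weights put B on top
print(rank_v2)  # ['A', 'B'] -- a different feature choice flips the ranking
```

Neither formula is "more mathematical" than the other; the ranking is decided by the human choices of which data to use and how to weigh it.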

2.
DATA IS NOT AUTOMATICALLY OBJECTIVE

Algorithms work on the data we provide. Anyone who has worked with data knows that data is political, messy, often incomplete, sometimes fake, and full of complex human meanings.

Even if you have 'good' and 'clean' data, it will still reflect societal biases.
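A minimal sketch of this, with invented numbers: suppose historical hiring records are complete and error-free, yet past managers hired 90% of their staff from one group. A naive model that learns from these 'clean' records faithfully reproduces the bias rather than measuring merit.

```python
# Invented example: perfectly 'clean' historical hiring records
# still encode past managers' bias. All data here is made up.
from collections import Counter

# Who got hired in the past: 90 from group_x, 10 from group_y.
past_hires = ["group_x"] * 90 + ["group_y"] * 10

hire_counts = Counter(past_hires)
total = sum(hire_counts.values())

def predicted_fit(group):
    # A naive model: 'people like past hires are good hires'.
    # It learns the historical bias, not any real quality.
    return hire_counts[group] / total

print(predicted_fit("group_x"))  # 0.9
print(predicted_fit("group_y"))  # 0.1
```

The records contain no errors, yet the 'objective' model gives one group nine times the score of the other, because the past it learned from was itself biased.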

NEW TECHNIQUES LIKE "MACHINE LEARNING" ARE MAKING MATHWASHING A BIGGER PROBLEM.


We can and we must deal with this:

1.
DEMAND ALGORITHMIC TRANSPARENCY

It shouldn't be mission impossible to find out how and why a decision about you was made.

If you're deploying algorithmic systems, learn about their limitations. Hire an ethics expert to do an algorithmic audit.

2.
COMPARE ALGORITHMS TO THE LAW

Algorithms shouldn't be seen as simply a type of tool. Algorithms are a type of law.

Demand to know how 'what is good' is decided upon. In a democracy we decide this together. You should have a say. 

This is especially true in areas like "predictive policing" and "recidivism risk assessment", where algorithms are brought into our justice system with little accountability or oversight.

Code is Law!


3.
THINK CRITICALLY

Understanding the limits of algorithms will help you judge their judgements.

By their very definition data and algorithms reduce a complex reality to a simpler view of the world. Only the parts of the world that are easily measurable can be used.

We each have a responsibility to avoid the 'religion of the algorithm', to see that people and situations have more facets than these simplified, reductionist visions would have us believe.

Resist the temptation of the algorithm. Always keep people in the judgement loop. In Europe this is now the law.


"Big data doesn't eliminate bias, we're just camouflaging it with technology."

- Cathy O'Neil

Algorithms are not neutral.
Algorithms are opaque.

Just because it uses math doesn't make it science.

•  PEOPLE MAKE MISTAKES
•  PEOPLE ARE BIASED
•  PEOPLE HAVE AGENDAS
•  PEOPLE ARE NOT NEUTRAL

•  ALGORITHMS MAKE MISTAKES
•  ALGORITHMS ARE BIASED
•  ALGORITHMS HAVE AGENDAS
•  ALGORITHMS ARE NOT NEUTRAL

The term 'mathwashing' was coined by Fred Benenson.