The Algorithm Conundrum
Algorithms are everywhere. From car insurance companies to digital advertising through to Facebook and Instagram news feeds, every aspect of our culture has been built on big data. Unfortunately, this innovative and digitally-driven future is also concentrated in the hands of a rich elite.
Across this 90-minute documentary, Coded Bias follows documentarian Shalini Kantayya’s journey to discover the truth surrounding facial recognition software and its algorithmic bias. At least, that’s where it begins. The documentary goes a lot deeper than that, diving into related topics including big data, world-consuming algorithms and more.
Against these challenging and difficult topics, Coded Bias homes in on three particular hotspots for this discussion – America, England and China. What begins as a look at racial bias through facial recognition software soon broadens into something much larger: a damning assessment of the authoritarian world we’re seemingly tumbling willingly toward.
The documentary itself is broken up nicely, separating science fiction from the science in an accessible way. The narration is smart too, breaking things down into easy-to-understand knowledge and a call to action for us as individuals to fight back against those in control.
As someone who lives in the UK, it absolutely baffles me that we’re even having a discussion here about a Protest Bill coming into force, essentially making it illegal to “disrupt the peace”, with the threat of 5 years’ imprisonment if you do. This is a very frightening move toward that aforementioned authoritarianism, reinforced here by the documentary’s footage of the facial recognition software the Met Police use.
Juxtaposing all of this is the case study of China, with specific emphasis on the Hong Kong protests and China’s transparency with its residents over how it’s using their faces and data.
Alongside that, however, are many talking heads debating the “Wild West” culture in the US at the moment, where there are no regulations or federal laws governing algorithms or facial recognition software.
Where the documentary is particularly interesting though is in its depiction of bias. Or coded bias, if you will. With machine learning driven by what’s happened in humanity’s history, it’s perhaps unsurprising to hear reports of sexism and racial inequality.
Numerous reports here show the shocking reality of algorithmic systems outright rejecting job applications from women. Alongside that, there’s also an AI system that rated black men and women as more likely to re-offend after being released from prison – at least according to the system in place anyway.
The biggest problem here, and something the documentary does touch on, is the regulation – or lack thereof – surrounding these machines. Sometimes a human touch is needed, but it seems humanity is losing that in favour of a world run by machine learning and big data. It’s scary stuff, and this movie certainly pulls no punches when it comes to showcasing the grim future we could be looking at.
Where Coded Bias is less effective, however, is in its surface-skimming of both sexism and racism. Sure, they’re here, but what begins as a simple question driven by this very bias soon spirals and opens up into so many more avenues – too many, perhaps, for a 90-minute doc.
At the end of the day, data rights are human rights. If we have rules in place for the latter, should it not make sense that we construct the same for the former? And if we really are that far down the rabbit hole, what will humanity’s future look like? It seems George Orwell’s 1984 should be renamed 2021: Big Data is watching you!