Code is political, algorithms are weapons of math destruction 1

Benjamin Cadon

We hear a lot about them, but we never see them. What are these algorithms? These invisible and tantalizing creatures that slip into our minds and inhabit our pockets. What are their intentions?

Formally speaking, an algorithm is nothing more than an innocuous series of operations fed by data to produce a result. Nevertheless, algorithms automate the resolution of sets of complex problems 2, and that is how some of them become high-level Artificial Intelligences, thanks to companies that stuff them with data, kindly provided by us for free.
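To make this formal definition tangible, here is a deliberately trivial sketch in Python (the function and the data are invented for illustration): a short series of operations that eats a list of page visits and spits out a ranking, which is all an algorithm is before data and intentions give it power.

```python
# A deliberately trivial "algorithm": a series of operations fed by data,
# producing a result. The data here are invented for illustration.
from collections import Counter

def rank_pages(visits):
    """Count visits per page and return pages ordered by popularity."""
    counts = Counter(visits)
    return [page for page, _ in counts.most_common()]

visits = ["home", "blog", "home", "shop", "blog", "home"]
print(rank_pages(visits))  # ['home', 'blog', 'shop']
```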

A bestiary 3 of algorithms

There is nothing like a comparison for knowing what they eat, identifying them and better understanding their role in a society of informaticized humans. They were not born of an electrical spark at the bottom of a sulphurous sea of data. Their progenitors are the human beings who write the lines of code that produce a programme that carries within it a political and social project dictated by a public or private sponsor.

Algorithms are never “neutral” or impartial. They focus on carrying out the mission assigned to them, usually by western males from the higher classes, cradled by capitalism.

It is also important to mention that a stupid algorithm fed with lots of good data will be more successful than the famous artificial intelligence, even if the latter has sharper claws. How can we not cite those American ogres, the GAFAM (Google, Apple, Facebook, Amazon and Microsoft) or BATX, their alter-egos on the other side of the Pacific (the Chinese giants: Baidu, Alibaba, Tencent and Xiaomi). Their metabolism is based on the collection, with our help, of the maximum amount of data about our smallest acts and gestures, “increasing” our day-to-day with a large number of mobile apps and connected objects which are supposedly meant to make our lives easier.

Algorithms that eat our personal data

The resulting algorithms are polymorphous. They have grown up observing us from afar, spying on our online activities and on the places we frequent most. They then rose above our interactions, the better to determine who holds authority, leaving behind the logic of the popular vote in favour of classification by merit.

Then, in a third moment, they entered our digital intimacy, analysing the quality and frequency of our exchanges in order to assess our reputation and trace our affinities.

Finally, they hide from view in order to better predict the tiniest of our desires, in order to be able to shape them.

| | To one side | Above | Within | Below |
| --- | --- | --- | --- | --- |
| Example | Audience measurement, Google Analytics, advertising tabs | Google PageRank, Digg, Wikipedia | Number of friends on Facebook, retweets on Twitter, notes and opinions | Recommendations on Amazon, behaviour-based advertising |
| Data | Visits | Relationships | Likes | Tracking |
| Population | Representative samples | Votes, census, communities | Social networks, affinities, declarative | Implicit individual behaviours |
| Type of calculation | Vote | Classification by merit | Benchmark | Machine learning |
| Principle | Popularity | Authority | Reputation | Prediction |

According to Dominique Cardon in “À quoi rêvent les algorithmes”. 4

These different generations of algorithms still live together, side by side, and are easily recognisable in that they very efficiently provide us with many services. They try to make us pay our “digital dividend” 5 because they discretize our existence, cutting it into the finest possible slices, in order to extract all monetizable information 6.

Every State breeds a terrifying ogre that works in surveillance. The interests of this ogre frequently mix with those of its friends the commercial ogres, as it shamelessly raids their stores, with their approval 7. Its insatiable appetite leads it to stalk those places with the most data traffic. It is assumed that it should be able to find a terrorist in a haystack, although it often suffers from myopia and obesity, proving more efficient at stealing political and industrial secrets than at trapping the bad guys before they take action.

Algorithms that eat public data

The different administrative strata of the public authorities also cultivate flowering gardens of many-flavoured data: biometric, fiscal, environmental, urban, professional, or even health-related.

Apparently neutral and objective, these public algorithmic creatures are presented as the solution to unequal treatment at the hands of the occasional arbitrary civil servant. Nevertheless, they can turn entire families into Kafkaesque insects caught in the typewriter of the film Brazil 8. In fact, it is they who determine which school our children should go to, whether we can benefit from social subsidies, which jobs we can apply for, and whether our menstrual cycle is ripe for procreation.

The traders in personal data kindly offer to help public bodies digitalise and clone the most beautiful plants in the public garden, be they cultural flowers or medicinal herbs. Like the traders, the public authorities are passing from observation to prediction, not only to optimise garbage collection, but also to send police forces where a crime is most likely to be committed, thanks to their algo-dogs PredPol, CompStat or HunchLab 9.

Algorithms that eat money

Thomas Peterffy is a financier who dedicated himself to replacing the brokers and their manual operations with automated machines. In 1987, on seeing that the number of orders placed by Peterffy was surprisingly high, those in charge of the markets sent an inspector, who, where he expected to find a room filled with white men shouting and sweating, found nothing more than an IBM computer connected to a single official Nasdaq terminal 10. So it was that in 1987, algorithms were launched onto the financial markets.

These days, algo-trading is everywhere, and the serene, algorithmic blinking of the information networks has replaced the hysterical traders. However, even these digital financial creatures have allowed themselves to be overtaken by high-frequency algo-traders, which move at the speed of light. They build routes to arrive at the sale faster than the others 11, making profits on every operation. They currently find refuge in the many “dark pools” that the banks have been able to create thanks to a paradoxical relaxing of regulations. In a lucrative comfort occasionally shaken by “Flash Crashes” 12, the diversity of algorithmic species keeps increasing (Blast, Stealth, Sniffer, Iceberg, Shark, Sumo... 13), along with the complexity of their strategies, making the “markets” more and more illegible and uncontrollable, even though they are assumed to be regulated by the stroke of invisible hands.

Evidently, all this impacts what we call “the real economy”, that is to say, people's lives. For example, when Syrian hackers compromised the Associated Press's Twitter account and posted an alarmist tweet about the White House, it was immediately read by the algo-trading robots, causing the stock market to lose 136 billion dollars in just 3 minutes 14.

A new algorithmic creature has emerged in the finance jungle, in the form of a worm that duplicates itself in all the receiving computers and gets fatter as it is used, devouring, as it passes, an impressive amount of electricity 15. It is called a “blockchain” 16 and it has made itself known through “Bitcoin”, the first dematerialised crypto-currency to do without a central banking body attached to a State. Today bitcoin is worth 28 billion dollars 17.
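To picture how such a worm chains its records together, here is a minimal sketch (a simplified assumption, not Bitcoin's real data format): each block commits to the hash of the previous one, so that altering any past transaction breaks the whole chain and betrays the tamperer.

```python
# Minimal sketch of a hash-chained ledger: each block commits to the one
# before it, so tampering with history is detectable. Simplified on purpose;
# real blockchains add proof-of-work, networking and consensus on top.
import hashlib
import json

def make_block(transactions, previous_hash):
    block = {"transactions": transactions, "previous_hash": previous_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["coinbase -> alice: 50"], previous_hash="0" * 64)
block_1 = make_block(["alice -> bob: 10"], previous_hash=genesis["hash"])

# Any change to the genesis block would change its hash and orphan block_1.
print(block_1["previous_hash"] == genesis["hash"])  # True
```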

Luckily, initiatives like Ethereum 18 have allowed the worms to mutate so that they not only register transactions but also drive databases and “intelligent” applications (“smart contracts”). This encouraged projects such as The DAO 19 (Decentralized Autonomous Organisation), a decentralised investment fund with no directors, where everyone participates in decision-making in proportion to the capital they hold. This fund quickly attracted investors, to the tune of 150 million dollars.
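The principle of decision-making in proportion to capital held can be sketched in a few lines (a simplified illustration; The DAO's actual contract was written in Solidity, not Python):

```python
# Simplified illustration of capital-weighted voting, the principle behind
# a DAO: each participant's vote counts in proportion to the tokens they hold.
def tally(token_balances, votes):
    """token_balances: holder -> tokens; votes: holder -> True (yes) / False (no)."""
    yes = sum(tokens for holder, tokens in token_balances.items() if votes.get(holder))
    total = sum(token_balances.values())
    return yes / total  # fraction of capital voting yes

balances = {"alice": 600, "bob": 300, "carol": 100}
votes = {"alice": True, "bob": False, "carol": True}
print(tally(balances, votes))  # 0.7 -> proposal passes if the threshold is met
```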

Nevertheless, a malicious joker managed to make off with a third of it by exploiting a fault (some call it a feature) in the code, irreparably inscribed on the body of The DAO hosted by Ethereum. Should the sick worm's rings be cut out? Or should it be killed to create a new one? The latter is the solution that was adopted to enable investors to recover their money, after many “political” discussions, despite the libertarian principle that “the code makes the law”. This raises important legal questions, particularly for defining responsibility in a distributed network 20 or imagining forms of governance for this “code” that, in some domains, is replacing the law in the U.S.

There are other algorithmic creatures fond of money which seek to replace human work, maximising productivity and reducing costs, thus contributing to an ever greater concentration of capital. The major companies understand this well: Foxconn announces the replacement of almost all its employees with a million robots 21, and the law firm BakerHostetler hires ROSS, an artificial intelligence, to study complex legal files faster 22. The “death of work” has been declared 23, yet it seems that the economic and social regime able to sustain it has yet to emerge in the (near) future.

Algorithms that eat human brains

The final family to be identified in our bestiary of algorithms is made up of those whose will is to fill the human brain and those that, on the contrary, ultimately aspire to replace it. Artificial Intelligences must be fed with data in order to be able to replace humans in a wide range of processes. This is something Google does with its reCAPTCHA 24 project, those illegible images that we are asked to decipher and transcribe to show the server that we are not robots but humans, passing the Turing test in reverse 25. The great innovation of reCAPTCHA is that the fruit of your responses goes directly to feed artificial intelligence and the evolution of Google programmes: deciphering text to improve the digitalization of books, identifying house numbers to refine mapping, and now identifying images containing animals or road signs to make car autopilots less myopic. The accumulated results are becoming more and more relevant, and they represent millions of hours of human labour 26.
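The mechanism can be sketched as follows (a hypothetical reconstruction of the principle, not Google's code): keep only the transcriptions on which enough humans agree, and add them to the training set of the machine that will one day do without those humans.

```python
# Hypothetical sketch of the reCAPTCHA principle: humans solve challenges,
# and answers on which enough of them agree become labelled training data.
from collections import Counter

def harvest_labels(answers_per_image, min_votes=3):
    """answers_per_image maps an image id to the answers typed by users."""
    training_set = {}
    for image_id, answers in answers_per_image.items():
        answer, votes = Counter(answers).most_common(1)[0]
        if votes >= min_votes:          # consensus among humans
            training_set[image_id] = answer
    return training_set

answers = {
    "house_42.jpg": ["42", "42", "42", "4Z"],
    "sign_07.jpg": ["stop", "st0p"],
}
print(harvest_labels(answers))  # {'house_42.jpg': '42'}
```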

As for the algorithm that contributes to feeding our brains, it is, like its friend the personal data collector, becoming ever more elaborate and subtle. It feeds our brains daily by way of a search engine that shows us the right place to go, the most precise information, the most emblematic video. At the beginning of 2017, in 92.8% of cases that search engine was Google. This makes it a cultural dictator in a totally new hegemonic position (and what is the competition doing?!). Not appearing in the first pages of results is like not existing. Yet the Google search algorithm is a jealously guarded industrial secret and can only be countered by the right to be forgotten 27.

From the surreal experiments conducted by researchers in the laboratory that is Facebook 28, on 61 million users during the 2010 U.S. congressional elections, it is known that controlling political messages has a direct influence on the people made into unwitting guinea pigs, as well as on their friends and friends of friends.

False news reports that crush the truth on the social networks ultimately swell the ranks of post-truth. What political line do the algorithms that govern the content of our “walls” take? Incorporating solutions to problems of incitement to hatred and harassment on these platforms too quickly will place the algorithms and their controllers in the official position of controlling the morals of a large part of society.

One might think that, in order to reach the point of technological singularity 29 sooner, our digital creatures are crouching in the shadows, plotting to make us servile.

Algorithmic governance 30 would thus be a new mode of governing behaviour, the fruit of shifts in our relationships with others, with the group, with the world and with the very sense of things which, thanks to or despite the digital turn, have fundamental repercussions on the way norms are created and, with them, obedience 31.

When an algorithm eats from the human brain, this can also lead to the clinical death of the human in question. This can be said of the algorithms that predefine the victims of killer drones, even if the drones are piloted by men and women. How do the algorithms of a driverless car choose the lesser evil, or the smaller number of deaths, when an accident cannot be avoided? Cyber war flies low over our occupied networks, each country sharpening algorithms ever more insidious and lethal than the enemy's.

How do we know if an algorithm is bad or good?

Is a bad algorithm one which turns video surveillance cameras into an army of blood-thirsty botnets that come down in droves to strangle the servers? Is a good algorithm one which reminds me of my friends' birthdays? Setting the criteria is not so simple, because we have to consider interdependence between algorithms, the data they use and the intentions behind them. Nevertheless, it can be hoped that a good algorithm will comply with the following:

  • it should be “auditable” and therefore consist of open and documented source code;
  • it should be “open” and therefore feed only on “open data” sets that are complete and “harvestable” by others, which does not exclude differentiated access, paid for in the case of certain commercial uses;
  • it should be “loyal and fair” without the capacity to create discrimination or injustice (social 32, gender-based 33, etc.) nor to damage human beings 34;
  • it should be “transparent” 35, capable of conducting systematic audits of its own operations and evolution (if it has learning or predictive capabilities) and capable of subjecting itself to citizens' control (see the sketch after this list);
  • it should be “modifiable” and ready to respond to complaints that could require changes to the function of the algorithm.
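By way of illustration, here is a minimal sketch of what “auditable” and “transparent” could mean in practice (the decorator, the toy eligibility rule and the field names are invented): every decision the algorithm takes leaves a trace, with its inputs and the version of the code, that an external auditor or a citizen could replay and contest.

```python
# Illustrative sketch: wrap a decision function so that every call leaves an
# auditable trace (inputs, output, code version, timestamp). Names invented.
import datetime
import functools
import json

CODE_VERSION = "2017.03-example"
AUDIT_LOG = []

def auditable(decision_fn):
    @functools.wraps(decision_fn)
    def wrapper(**inputs):
        result = decision_fn(**inputs)
        AUDIT_LOG.append({
            "timestamp": datetime.datetime.utcnow().isoformat(),
            "code_version": CODE_VERSION,
            "inputs": inputs,
            "decision": result,
        })
        return result
    return wrapper

@auditable
def grant_subsidy(income, dependants):
    """Toy eligibility rule, standing in for a real public-sector algorithm."""
    return income < 20000 or dependants >= 3

grant_subsidy(income=15000, dependants=1)
print(json.dumps(AUDIT_LOG, indent=2))
```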

In this search for algorithmic morality it is also necessary to mention the “ports”, the APIs (Application Programming Interfaces), which permit these digital creatures to go hunting for data on other servers and services, to set traps or to lay bait... These APIs can be considered the industry's new patents, a form of patenting turned against open-source software. The ports can be opened or closed at the strategic discretion of the owner, and tolls can be set up when an algorithm's traffic becomes abundant and such monetization becomes opportune.

In the public sphere and civil society, we can imagine that the above-mentioned criteria of openness, transparency, accountability and modifiability might some day be respected. This is harder to imagine in the lucrative private sphere, where data and the algorithms that consume it are considered “the oil of the future” 36.

Thus a group of American researchers and some “giants” of the digital world have tried to formulate “principles for accountable algorithms” 37 and have come together to launch a partnership on the ethics of artificial intelligence 38. This is a good way of telling politicians and concerned citizens that the private sector can “anticipate and administer” this complexity with positive results, so there really is no need to legislate.

Nevertheless, the issue is not so much to demand transparency for the code of the algorithms as for their aims. As these are not limited to commercial communication, it is necessary to deploy the law as a means of coercion 39. We can take comfort in the participatory debate that took place in France around the “Law for a Digital Republic”, which has led to an obligation of transparency for all algorithms used by the public authorities 40, or in INRIA's “TransAlgo” initiative 41, which aspires to assess the accountability and transparency of information robots.

Sovereign algorithmic futurutopias

So, how do we pass from an algorithmic beast we must suffer to a pet that we feed? Let us compost some earthworms to trace the biotechnological ramifications that could lead humans and technology to live in silicon harmony. How can we take our destinies back into our own hands and regain our mental autonomy, our technological sovereignty, today steered by algorithms in a space of social control?

Code is a political object, as is this “digital” world filled with algo-bots that invade our realities. As political objects, they can therefore be attacked with the classic weapons: militancy, lobbying and awareness-raising directed at political power, attempts to influence and deepen regulatory processes, and support for initiatives that contribute to the autonomy and happiness of humankind. It is equally important to demand a greater rôle for civil society in the regulation and norms of the Internet, and in the adoption of standards for network technology 42, which function as the equivalent of articles of a country's constitution.

At an individual level, it is necessary, without a doubt, to “de-googlise” the Internet 43. That means, as the Framasoft association proposes, supporting the hosting of autonomous, transparent, open, neutral and solidarity-based services (see, for example, the KITTENS initiative 44), or self-hosting 45 on a modest mini-server. It is also possible to camouflage oneself using end-to-end encryption, although this is not always easy to adapt or adopt (PGP and email); and, depending on the situation, one can resort to creating interference, hiding the “true” data among fictitious but credible data, which a friendly algorithm can provide in abundance (a sketch of such a decoy generator follows).
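A minimal sketch of that last tactic (the decoy topics and the ratio are invented for illustration): drown the “true” queries in a stream of plausible but fictitious ones, so that the profile an observer reconstructs no longer means much.

```python
# Illustrative sketch of obfuscation by decoy traffic: real queries are mixed
# into a stream of plausible but fictitious ones. Topics are invented examples.
import random

DECOY_TOPICS = [
    "weather tomorrow", "bread recipe", "train timetable",
    "history of jazz", "how to repot a cactus", "local cinema listings",
]

def obfuscate(real_queries, noise_ratio=5):
    """Return real queries hidden among noise_ratio times as many decoys."""
    stream = list(real_queries)
    stream += [random.choice(DECOY_TOPICS) for _ in range(noise_ratio * len(real_queries))]
    random.shuffle(stream)
    return stream

for query in obfuscate(["symptoms I worry about", "union meeting address"]):
    print(query)  # each query, real or fake, would be sent to the search engine
```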

From the point of view of public powers, there is work to be done: the road to ethical transparency is open, and they just need to be firmly pushed down it. Of course, these days you need a strange haircut and makeup 46 to escape facial recognition systems 47. Biometric files, the interlinking of public databases and the digital excesses of the now-permanent state of emergency invite us not to put all our bytes in one basket.

It is also possible to take part in feeding garbage to these “algo-AIs”, just like the Twitter users who managed to make Microsoft's AI Tay sexist, racist and pro-Hitler in less than a day 48.

We could imagine instead raising little “algo-ponies” that would exclaim, with a wave of their multi-coloured manes, against a background of green fields of data, that “friendship is magic!”.

Cheesiness aside, it is perhaps necessary to propose a digital intermediary, a “proxy”, between us, our data and the public and private actors that host them. This intermediary could comfortably host Eliza 49, my strictly personal AI, which feeds on my activities and preferences in order to help me share data and content anonymously, donate them to public bodies in the general interest, or encrypt and hide them so that I can exchange with my friends who did not manage to get out of the commercial social networks. Distributed in everyone's pocket, personal AIs could become symbiotic, in agreement with their tutors, telling micro-fictions to humanity, anchored in the political and cultural context, with a view to building harmonious realities where algorithms, humans, nature and the inorganic world can cohabit peacefully.

1. This title refers to the book by Cathy O’Neil: *Weapons of Math
Destruction: How Big Data Increases Inequality and Threatens
Democracy*. Crown, 2016.
2. In this futuristic short story by Isaac Asimov, the United States has
converted to an "electronic democracy" where the computer Multivac selects
a single person to answer a number of questions. Multivac then uses the
answers and other data to determine what the results of an election would
be, avoiding the need for an actual election to be
held. https://en.wikipedia.org/wiki/Franchise_%28short_story%29
3. https://fr.wikipedia.org/wiki/Bestiaire
4. Dominique Cardon: *À quoi rêvent les algorithmes. Nos vies à l’heure des
big data*. Le Seuil, 2015.
5. Evgeny Morozov and Pascale Haas: *Le mirage numérique: Pour une
politique du Big Data*. Les Prairies Ordinaires, 2015.
6. http://centenaire-shannon.cnrs.fr/chapter/la-theorie-de-information
7. https://fr.wikipedia.org/wiki/PRISM_%28programme_de_surveillance%29
8. Terry Gilliam: Brazil (1985). http://www.imdb.com/title/tt0088846/
9. Cathy O’Neil: *Weapons of Math Destruction: How Big Data Increases
Inequality and Threatens Democracy*. Crown, 2016.
10. Some days later, the market authorities stipulated that orders had to
come from the keyboard of the terminal and gave Peterffy a week to
disconnect from the IBM computer. In that time, Peterffy hired engineers to
build a camera-eye to read the screen and send the information to the IBM
brain, whose electromagnetic hands could then type the orders into the
terminal via the keyboard.
11. Sniper In Mahwah: Anthropology, market structure & the nature of
exchanges. https://sniperinmahwah.wordpress.com/
12. The Flash Crash of 6th May 2010 analysed by Nanex:
http://www.nanex.net/20100506/FlashCrashAnalysis_Intro.html and
https://www.youtube.com/watch?v=E1xqSZy9_4I
13. Alexandre Laumonier: *6/5*. Zones Sensibles Éditions, 2014.
http://www.zonessensibles.org/livres/6-5/
14. https://www.washingtonpost.com/news/worldviews/wp/2013/04/23/syrian-hackers-claim-ap-hack-that-tipped-stock-market-by-136-billion-is-it-terrorism/
15. This creature is so costly (a single operation requires as much
electricity as an average American home uses in a day and a half), that it
is principally based in China and is currently very slow.
http://motherboard.vice.com/read/bitcoin-is-unsustainable
16. https://marmelab.com/blog/2016/04/28/blockchain-for-web-developers-the-theory.html
17. Capitalisation and everyday movements of crypto-currencies:
http://coinmarketcap.com/
18. https://www.ethereum.org/
19. https://en.wikipedia.org/wiki/The_DAO_(organization)
20. Primavera De Filippi: “Ethereum: Freenet or Skynet?”. Berkman Center, 2014. https://cyber.harvard.edu/events/luncheon/2014/04/difilippi
21. http://www.theverge.com/2016/12/30/14128870/foxconn-robots-automation-apple-iphone-china-manufacturing
22. https://www.washingtonpost.com/news/innovations/wp/2016/05/16/meet-ross-the-newly-hired-legal-robot/
23. Bernard Stiegler: *La Société automatique. 1. L’avenir du travail*. Fayard, 2015.
http://www.philomag.com/les-livres/fiche-de-lecture/la-societe-automatique-1-lavenir-du-travail-11454
24. https://www.google.com/recaptcha/intro/index.html
25. https://en.wikipedia.org/wiki/Turing_test
26. http://www.bizjournals.com/boston/blog/techflash/2015/01/massachusetts-womans-lawsuit-accuses-google-of.html
27. https://www.google.com/webmasters/tools/legal-removal-request?complaint_type=rtbf
28. “A 61-million-person experiment in social influence and political
mobilization”. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3834737/
29. https://fr.wikipedia.org/wiki/Singularit%C3%A9_technologique
30. Antoinette Rouvroy and Thomas Berns: “Gouvernementalité algorithmique et
perspectives d'émancipation: Le disparate comme condition d'individuation
par la relation?”. Politics of algorithms.  Web-metrics.  *RESEAUX*, Vol.31,
n.177, pp. 163-196 (2013). http://works.bepress.com/antoinette_rouvroy/47/
31. ifapa.me is a collective dedicated to researching and subverting the
effects of the mathematization and quantification of daily life in
necrocapitalist societies: http://www.ifapa.me/
32. https://www.washingtonpost.com/opinions/big-data-may-be-reinforcing-racial-bias-in-the-criminal-justice-system/2017/02/10/d63de518-ee3a-11e6-9973-c5efb7ccfb0d_story.html
33. http://www.genderit.org/feminist-talk/algorithmic-discrimination-and-feminist-politics
34. https://fr.wikipedia.org/wiki/Trois_lois_de_la_robotique
35. http://internetactu.blog.lemonde.fr/2017/01/21/peut-on-armer-la-transparence-de-linformation/
36. Documentary “Le secret des 7 soeurs”: http://secretdes7soeurs.blogspot.fr/
37. http://www.fatml.org/resources/principles-for-accountable-algorithms
38. http://www.lemonde.fr/pixels/article/2016/09/28/intelligence-artificielle-les-geants-du-web-lancent-un-partenariat-sur-l-ethique_5005123_4408996.html
39. http://www.internetactu.net/2016/03/16/algorithmes-et-responsabilites/
40. https://www.service-public.fr/particuliers/actualites/A11502
41. https://www-direction.inria.fr/actualite/actualites-inria/transalgo
42. The Internet Engineering Task Force (IETF): http://www.ietf.org/
43. http://degooglisons-internet.org/
44. http://chatons.org/
45. http://yunohost.org/
46. https://cvdazzle.com/
47. http://www.lemonde.fr/pixels/article/2016/10/19/inquietudes-autour-de-la-reconnaissance-faciale-aux-etats-unis_5016364_4408996.html
48. https://www.theguardian.com/technology/2016/mar/24/tay-microsofts-ai-chatbot-gets-a-crash-course-in-racism-from-twitter
49. http://elizagen.org
