In 2019, people were worried that algorithms now know us better than we know ourselves. No idea captures this concern better than surveillance capitalism, a term coined by American author Shoshana Zuboff to describe a gloomy new era in which the likes of Facebook and Google provide popular services while their algorithms harvest our digital footprints.
Paradoxically, Zuboff’s concern does not extend to the financial-market algorithms that have replaced many people on trading floors. Automated algorithmic trading began in the early 21st century, first in the US and soon afterwards in Europe.
An important driving force was high-frequency trading, which operates at blinding speeds, down to billionths of a second. It offers investors the prospect of outpacing their rivals while also helping to provide liquidity to a market, ensuring there is always someone willing to buy and sell at a given price. High-frequency trading now accounts for well over half of the volume in both stock and futures markets. In other markets, such as foreign exchange, algorithms have a smaller but still significant presence, with no sign that this will diminish in future.
A life of their own
People still program the algorithms and design their trading strategies, although the rise of deep learning is starting to threaten even this role. But once the algorithms are live in the markets, they act on their own, without human intervention, interacting with one another in opaque and often unexpected ways.
At first glance, they don’t have much in common with us. They cannot think or feel, and despite the hype surrounding machine learning, it remains questionable whether it is accurate to describe them as intelligent. Like human traders, however, they make decisions, observe others making decisions, and adjust their behaviour in response.
At speeds many times faster than any person could ever manage, these algorithms effectively form expectations about each other’s expectations when placing their buy and sell orders.
For example, one algorithm may seek to manipulate another’s expectations of price movements by sending a large number of orders to buy or sell a particular asset. The first algorithm then quickly cancels its orders, hoping it has tricked its rival into betting the wrong way on the direction of the market.
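The mechanics of that manipulation can be sketched in a few lines. The toy code below (all class and variable names are hypothetical, and the "naive trader" is a deliberately simplistic stand-in, not any real trading system) shows how a burst of one-sided orders can skew a rival's estimate of market direction even if those orders are cancelled moments later:

```python
# Toy sketch: a naive trader that infers market direction from the
# recent balance of buy vs sell orders it observes in the book.

class NaiveTrader:
    """Forms an expectation of price direction from observed order flow."""

    def __init__(self):
        self.buys = 0
        self.sells = 0

    def observe(self, side):
        # Update the running tally each time a new order appears.
        if side == "buy":
            self.buys += 1
        else:
            self.sells += 1

    def expected_direction(self):
        # More buy orders seen -> expect the price to rise.
        return "up" if self.buys > self.sells else "down"


trader = NaiveTrader()

# Genuine order flow: slightly more sells than buys.
for side in ["sell", "sell", "buy"]:
    trader.observe(side)

# A rival floods the book with buy orders it never intends to fill...
for side in ["buy"] * 10:
    trader.observe(side)
# ...and cancels them moments later. The cancelled orders vanish from
# the book, but this naive trader has already updated its expectation.

print(trader.expected_direction())  # prints "up"
```

The point of the sketch is that the expectation is formed from *observed* orders, so cancellation after the fact does not undo the deception.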
Interestingly, sociologists regard this kind of mutual anticipation as a central feature of what it means for people to be social, and they have long seen markets as highly social arenas. In the heyday of trading floors, correctly reading the social signals of other traders (a grimace or a smile, restless gestures, even the hubbub of the floor) often marked the difference between riches and ruin.
But if machines can be social, how similar is their sociality to ours? There are obvious differences, of course. While the human traders of the past often knew each other well and frequently socialised together after work, algorithms trade anonymously. When orders to buy or sell assets are sent to an exchange, no other trader knows whether they come from a person or a machine.
Indeed, this is precisely why they have to form expectations about each other. Face-to-face cues are no longer available, so whole strategies have been developed to try to work out whether a set of orders was placed by the same algorithm, and then to predict what its next moves will be.
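One simple cue such strategies can exploit is timing: machines often submit orders with a regularity that human hands rarely produce. The sketch below is a hypothetical illustration of that idea only (the function name, threshold, and the very notion of using inter-arrival regularity as the sole signal are illustrative assumptions, not a description of any real surveillance method):

```python
# Hypothetical heuristic: flag a sequence of orders as machine-like when
# the gaps between their arrival times are almost perfectly regular.

from statistics import pstdev


def likely_same_algorithm(timestamps_ms, tolerance_ms=1.0):
    """Return True if successive gaps between order timestamps (in
    milliseconds) vary by less than tolerance_ms, a clock-like
    regularity suggestive of a single automated sender."""
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    if len(gaps) < 2:
        return False  # too few orders to judge regularity
    return pstdev(gaps) < tolerance_ms


# Orders arriving every 50 ms, to the millisecond: machine-like.
print(likely_same_algorithm([0, 50, 100, 150, 200]))  # prints True

# Irregular, human-like arrival times.
print(likely_same_algorithm([0, 340, 910, 2750]))     # prints False
```

Real detection efforts would combine many such signals (order sizes, venues, price levels), but the sketch shows how behavioural fingerprints can substitute for the missing face-to-face cues.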
To evade such attempts, algorithms are often designed so that other algorithms cannot recognise them as algorithms. As Scottish sociologist Donald MacKenzie puts it, they can engage in imitation strategies and try to give a particular presentation of their ‘self’ to the public. These are, again, traits that sociologists have long considered essential aspects of social life.
Together with colleagues, I have spent the last few years in major financial hubs interviewing traders, developers, regulators, exchange officials and other financial professionals about these trading algorithms. This has revealed some other intriguing similarities between human and automated traders.
Developers readily acknowledge that as soon as their algorithms start interacting with others, they become entangled and behave unpredictably, as if caught up in a crowd. Sociologists have been studying since the late 19th century how people are swept up by crowds and surrender their autonomy to “social avalanches”, but so far we have largely ignored the fact that financial machines do something similar.
The “flash crash” of May 6, 2010 illustrates what I mean. In four and a half minutes, the frenzied interaction of fully automated trading algorithms sent US markets into a tailspin, generating losses of about $1 trillion (£768 billion) before the plunge ended almost as abruptly as it had begun.
Many of these trades were later cancelled as “clearly erroneous”. Certainly no trader or developer had set out to create this massive price swing, but decades of sociological research tell us that such behaviour is to be expected in large crowds. We need to understand how our financial algorithms interact with one another before their crowd dynamics make our markets unreliable.
Of course, not all forms of social interaction are admirable or beneficial. Like humans, algorithms interact with one another in ways ranging from caring and peaceful to cold and violent: from providing liquidity and maintaining market stability to placing manipulative orders and triggering wild trading frenzies.
Grasping these interactions is key not only to understanding contemporary trading but also to preventing future crashes. Algorithms are talking to each other in more and more fields today. Understanding how crowds of them behave promises to shed light on areas where they are only just coming into their own: self-driving traffic systems or automated warfare, for example. It may even alert us to the avalanches that await.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
What we can learn about ourselves from studying financial trading algorithms (2020, January 16)
retrieved 16 January 2020
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without written permission. The content is provided for information purposes only.