Be your own boss: Examining algorithmic burdens on ride-hailing platform drivers in Lagos


By Daniel Arubayi for the OII’s #DigitalInequality blog series

It is another day as a platform driver roaming the streets of Lagos, Nigeria, waiting for a ping on a smartphone signalling a trip assignment from the platform's algorithm (Figure 1).

driver sits inside car in busy street

Figure 1: Roaming the streets of Lagos for ride-hailing gigs. Source: Author’s fieldwork, November 2018

This has become the reality of platform drivers globally: ride-hailing platforms, through algorithms, control and manage the labour process under the guise of autonomy and flexibility for their workers. In places like Lagos, there is evidence of what Birhane (2020) identifies as algorithmic colonialism. This refers to how the current platformisation of traditional labour frontiers enables economic and cultural domination by platforms, which accumulate profit from rents on assets such as intellectual property and use big data for surveillance and social control. Uber, the global forerunner of ride-hailing platforms, has opened the door for other platforms in Lagos such as Bolt, Oga-taxi and several others, with the utopian aim of empowering the unemployed and underemployed towards entrepreneurship or, as they label it, to ‘be your own boss’. Platforms’ classification of drivers as ‘driver-partners’, ‘entrepreneurs’ or ‘bosses’ reinforces the idea of drivers as independent contractors. In tandem, drivers in Lagos have internalised such classifications as prestigious, a clear delineation from traditional taxi drivers and a reflection of how they want riders to see them; they call themselves ‘pilots’ and ‘comrades’. However, platforms’ algorithms are powerful: are drivers really their own bosses?

Min Kyung Lee and colleagues coined the term algorithmic management in 2015 to refer to the administration and control of the labour process of gig work through software algorithms that rely on big data and surveillance. Algorithmic management has replaced the traditional, direct supervision of the labour process familiar to conventional taxi drivers with what Foucault (2008) identifies as the visible and unverifiable power of panopticism. In this case, drivers know that the labour process is being managed and watched through the app, yet the information collected remains opaque to them, and this information asymmetry limits the control they have over the labour process. Algorithms are Janus-faced: on the one hand, they promise and assign work to drivers; on the other, they transfer new responsibilities to them – what I call algorithmic burdens. Algorithmic management shapes efficiency, productivity and the overall labour process. It is a manipulative approach that shifts the burden of risk and critical decision-making onto drivers. At the same time, it obscures and neglects drivers’ everyday realities, such as navigating the chaotic city of Lagos and confronting and interacting with social vices, riders and law enforcement agents. Consequently, the reliance on emotionless and rigid algorithms contributes to the exploitation of labour.

In Lagos, I observed a number of these burdensome impacts on ride-hailing gig work. A key example is the unfair deactivation of drivers’ profiles on the Bolt platform due to low ratings, which amounts to arbitrary discipline and punishment. Ride-hailing platform drivers are often unsure of what constitutes a low rating, and challenging one is usually a one-sided process. At the time of my data collection in 2018 and 2019, drivers rated below 4.6/5 on the Bolt platform and 4.5/5 on the Uber platform were automatically blocked (see Figure 2 for an example) [i]. One of my interviewees, Uche, a B.Sc. holder who started working for Uber and Bolt in early 2018, explained how he was once unfairly blocked on the Bolt app [ii].

According to Uche (August 2019), he politely explained and pleaded with a woman to cancel a trip because it was late at night and the requested destination was not safe. The woman insulted him and threatened to report him. In his words:

The next day, they just blocked me due to low ratings. Most times, they won’t tell you why they are blocking you; they just said low rating and blocked me for 24 hours. 

screenshot of blocked account message

Figure 2: A driver with a low rating is blocked on Bolt. Source: Author’s fieldwork, September 2019

In this scenario, the driver is expected to take the trip at the expense of his own safety, which in some cases could lead to robbery or death. Algorithmic management limits the narrative of drivers’ realities, and platforms’ reliance on this data produces unfair judgements, leading to arbitrary discipline and punishment. The mismatch in realities is compounded by the difficulty drivers face in using communication channels to rectify such unfairness. In my study in November 2018, another interviewee, Etuk, called for platforms to review drivers’ histories and to call drivers to hear both sides of the story before deactivating them [iii]. This resonates with research confirming the need for human managers as a complement to algorithmic management, capable of resolving problems beyond code (Lee et al., 2015).
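To make the mechanics of this kind of automatic deactivation concrete, here is a minimal sketch, in Python, of how a rating-threshold rule might work. It is my own illustration under the thresholds reported above, not the platforms’ actual code: the Driver structure, the review_driver function and the wording of the message are assumptions; only the thresholds and the 24-hour block come from the fieldwork.

```python
# A minimal, hypothetical sketch of a rating-threshold deactivation rule.
# Thresholds are those reported in the fieldwork above (Bolt 4.6/5, Uber 4.5/5
# at the time of data collection); everything else is my own illustration,
# not the platforms' actual code.

from dataclasses import dataclass

RATING_THRESHOLDS = {"bolt": 4.6, "uber": 4.5}  # platform -> minimum average rating
BLOCK_HOURS = 24                                # temporary block, as Uche described

@dataclass
class Driver:
    name: str
    platform: str
    ratings: list[float]  # recent rider ratings, 1-5 stars

    @property
    def average_rating(self) -> float:
        return sum(self.ratings) / len(self.ratings)

def review_driver(driver: Driver) -> str:
    """Return the message the driver sees; the threshold itself stays hidden."""
    if driver.average_rating < RATING_THRESHOLDS[driver.platform]:
        # The driver is told only that the rating is 'low' -- not which trips
        # counted, how ratings were weighted, or what the cut-off is.
        return f"Account blocked for {BLOCK_HOURS} hours due to low rating."
    return "Account active."

# Example: one retaliatory one-star rating from a disputed trip tips the average.
uche = Driver(name="Uche", platform="bolt", ratings=[5, 5, 5, 5, 1])
print(uche.average_rating)   # 4.2
print(review_driver(uche))   # Account blocked for 24 hours due to low rating.
```

The point of the sketch is the asymmetry: the threshold and the aggregation of ratings sit on the platform’s side of the code, while the driver only ever sees the final message.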

So, tell me, who is the boss here? If ride-hailing gig work is genuinely flexible and autonomous, why is it so difficult for drivers to challenge algorithmic decisions? Drivers should not be assigned the burden of deciphering decisions and risks based purely on data, especially when they have little or no input into how their labour is managed. Instead, platforms should start addressing the information asymmetries that vary across contexts, and empower drivers to understand why and how certain decisions are made.

 

Endnotes

[i] Note that these rating thresholds are subject to yearly review and may change over time. Currently, drivers rated below 4.4/5 are blocked on the Bolt platform; Uber’s threshold remains the same.

[ii] Uche (pseudonym) is one of the 25 platform drivers sampled in my study in Lagos, August 2019.

[iii] Etuk (pseudonym) is also one of the 25 platform drivers interviewed in Lagos, November 2018.

 

Selected References

Birhane, A., 2020. Algorithmic Colonization of Africa. SCRIPT-ed, 17(2), pp.389-409.

Foucault, M., 2008. ‘Panopticism’ from Discipline & Punish: The Birth of the Prison. Race/Ethnicity: Multidisciplinary Global Contexts, 2(1), pp.1-12. Retrieved May 10, 2021, from http://www.jstor.org/stable/25594995

Lee, M.K., Kusbit, D., Metsky, E. and Dabbish, L., 2015. Working with Machines: The Impact of Algorithmic and Data-Driven Management on Human Workers. In: CHI ’15 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Seoul; New York, NY: ACM, pp.1603-1612.

Möhlmann, M. and Zalmanson, L., 2017. Hands on the Wheel: Navigating Algorithmic Management and Uber Drivers’ Autonomy. In: Proceedings of the International Conference on Information Systems (ICIS 2017), December 10-13, Seoul, South Korea.

