RGS-IBG Session Report – Digital Representations of Place: Urban Overlays and Digital Justice

By Mark Graham, Martin Dittus, Muki Haklay, Yu-Shan Tseng, Gillian Rose, and Rob Kitchin, who presented talks, and the attendees of the subsequent workshop on “Digital Representations of Place: Urban Overlays and Digital Justice” at RGS-IBG 2018 in Cardiff on Wednesday 29 August 2018.

Over the last few decades, our cities have become increasingly digital. Urban environments are layered with data and algorithms that fundamentally shape our geographic interactions: impacting how we perceive, move through, and use space. A growing body of knowledge documents the unexpected societal impact such digital representations can have, for example when they favour the interests of one privileged group (such as tourists) at the expense of others. It has thus become imperative to ask questions about who owns, controls, shapes, and has access to those augmented and hybrid digital/physical layers of place. Now that over half of humanity is connected to the internet, do we see greater levels of representation of, and participation from, previously digitally disconnected populations? Or are our digitally dense environments continuing to amplify inequalities rather than alleviate them?

We are a group of researchers and professionals who engage with these questions in our work. Through our diverse experiences we have come to recognise the significant and often unexpected effects digital representations of place have on people’s understanding of the world, and their actions in the world. We seek to systematise this knowledge, and to provide guidance for practitioners, researchers, and policy-makers to address imbalances and inequalities in representation.

We came together for a workshop at RGS-IBG 2018 to reflect on these questions, with a focus on three central concerns:

  • What kinds of imbalances and inequalities are inherent in digital representations of place, and by which means can we observe them?
  • What concepts and framings can we use to describe these undesirable effects of digital representations of place?
  • What kinds of mitigation strategies can we offer to system designers and policy makers who seek to avoid these undesirable outcomes?

This session report offers a first glimpse of the many aspects we discussed in our sessions. It is not intended to be an exhaustive account. Rather, we seek to offer some minimal guidance to researchers and practitioners who are embarking on new projects relating to digital representations of place. We hope that this report provides you with starting points for your inquiry, and serves as a fruitful source of provocations.

Imbalances and inequalities

What kinds of imbalances and inequalities are inherent in digital representations of place, and by which means can we observe them?

It is hard to talk about this in a general sense; much depends on each specific scenario. For this report we will simply provide some general examples to illustrate the broader point.

One of the recurring themes in our discussions was the central tension between system designers and system users: the interests of these two groups can come into conflict, and they frequently do. This may become evident in certain design choices, for example regarding the privacy properties of a digital system. Sometimes there are tensions between different participating groups with different capacities and interests. For example, we discussed scenarios involving citizen activist groups who campaigned on urban issues using digital platforms. In some cases, we found a social disconnect between activists and the affected citizens, two groups who might use different language, prioritise differently, socialise differently, etc. Similarly, there are tensions between users and producers of information, for example around questions of visibility (“do we want to be seen?”). In all these cases, we invite readers to reflect on who gets to prevail when tensions arise.

In some cases, such imbalances are the result of the new arrangements by which digital representations are produced. In others, they are reflections of existing pre-digital circumstances. In one such scenario we encountered different groups mobilising online to address an emerging concern of governmental repression. Although these groups were largely aligned in their interests, they belonged to distinct social classes in a society characterised by strong social stratification: different segments of society would not meet, and as a result would have only limited exposure to each other’s experiences and activities. One large group of lower-class citizens was directly affected by government policy; yet they were almost entirely disconnected from a second group, composed of politically embedded and powerful actors. Each of these groups engaged with the issue online in different ways, but due to social stratification they were unable to combine their efforts.

Concepts and framings

What concepts and framings can we use to describe these undesirable effects of digital representations of place?

The above examples illustrate that it is useful to talk about undesirable outcomes in terms of equity, and not merely inequality. The concept of equity captures considerations of transparency (is the process clear to all?), agency (who can influence the process and the outcomes?), power (who gets to influence these conditions?), and others. An equitable process is one which gives those who are affected by digital representations a seat at the table, and ensures their concerns are heard and responded to. When talking about the imbalances of digital representations it is worth considering any concepts that allow us to describe equitable and inequitable arrangements.

Other concepts and framings we discussed:

  • The concept of place, and its relationship to memory, culture, identity and belonging, etc; place as a site of contestation; and the complex and nuanced ways in which place is now constituted through digital means.
  • The hybridity of online-offline systems (Haraway, others). To what extent do we still want to distinguish the “virtual” and the “physical”? How do these sides/sites relate to each other?
  • Complex shifts in the separation of the public and private, the familial (Haraway, others).
  • Performativity (Austin, Butler, others): “language which effects change in the world and functions as a form of social action”.
  • The body as a participant, a carrier; the body as enhanced by and modified through technology (Haraway, others).
  • Forms of labour involved in the creation of representations, including social labour and communicative action (Habermas).
  • Capital as an inherently inequitable driver of action, and associated notions of competitive advantage, forms of confidentiality, the marketplace and competition as a dominant frame, forms of managing risk and benefit, the significant externalities of large digital platforms, etc.
  • Ecology and landscape to describe the broader context of technical systems.
  • Temporality. How and by whom is time constructed and structured in this system?
  • Semiotic and linguistic tools that allow us to talk about the material: the data, coordinates, pixels, lines, etc.
  • Categories and binaries. How is the world described in these digital representations?
  • Ontology bias: are there algorithmic encodings of existing human biases? (O’Neil)
  • The many dimensions of difference across which participants can find themselves in tension: urban-rural, class, caste, age, race, ethnicity, gender, social/political/economic capital, …
  • The subaltern in critical theory (Gramsci, Spivak, bell hooks, others), “populations which are socially, politically, and geographically outside of the hegemonic power structure”.
  • Decision-making power: who has influence when conflicting tensions arise, who gets to be the final arbiter? To what extent can outcomes be contested?
  • Visibility and invisibility. Which voices are allowed to dominate? Linked to this, the politics of aesthetics (Rancière): “the delimitation of the visible and the invisible, the audible and the inaudible, the thinkable and the unthinkable, the possible and the impossible”.
  • Epistemic communities (Haas): “a transnational network of knowledge-based experts who help decision-makers to define the problems they face, identify various policy solutions and assess the policy outcomes”.
  • The complex global-local linkages that emerge out of all these relationships.
  • Hidden and visible infrastructure that establishes these linkages.
  • … and many more.

Mitigation strategies

What kinds of mitigation strategies can we offer to system designers and policy makers who seek to avoid these undesirable outcomes?

We found that in order to develop mitigation strategies against imbalanced digital representations, one needs to first articulate the perceived problem clearly and explicitly. Too often, time is spent discussing potential solutions to design issues without first clearly establishing which concern they are intended to address.

This is exacerbated by the finding that certain solutions require a broader conversation around organisational change because they do not fit well within existing institutional agendas, which can further complicate dialogue.

We further believe that there cannot be a simple checklist to address imbalances of digital representation. Instead, we believe that one needs to treat every scenario on its own merit. This requires discourse, thought, theory, understanding of the domain/context, and other ways of engaging with the setting.

When trying to address participation inequalities, where not all groups are equally represented, a good first step is to spend time better understanding and addressing barriers to participation (see Haklay, others). Further, don’t regard participation merely as an option or an opportunity; instead, reframe it as a right or even a responsibility. What is necessary so that every affected person can exercise their right to participate?

We discussed the inherent inequity of algorithmic systems: sociotechnical systems are often opaque in their workings, and designed by an expert elite. This is inherently inequitable. To counteract this, we should ask that human oversight always remain a central part of such systems, and we should demand that outcomes are legible to and interpretable by the people affected by algorithmic processes. One might call this an “unblackboxing” of these systems.

Finally, we found that any arising tensions often exist by design, even if unintentionally so, because digital representations are meeting grounds for disparate groups. This has a number of important implications.

First, one needs to acknowledge that there likely is no disinterested party – all participating groups (including system designers) have potential conflicts of interest. This is really a problem of governance, and reveals an important limit in the ways in which many digital platforms operate today. We see a need for institutions which can accompany digital platforms and foster constructive discourse across the existing inequalities of power, access, etc. This can take many forms: ombudspeople, regulatory oversight, legal and ethical frameworks, independent observers, bridge-building institutions, …

Second, one needs to acknowledge that there are always inherent inequalities between participating groups. How can we resolve this? This becomes a matter of social justice, and there is no single approach to addressing it. Instead, there is a wide range of potential models: egalitarianism, utilitarianism, libertarianism, Marxism, feminism, … (David Smith, Kitchin, others). Any choice of a social justice model needs to be made in consideration of the specific context and the subjective preferences of participants.

Third, one needs to acknowledge that it may not always be possible to resolve all tensions. We recommend developing practices of constructive coexistence, rather than forcing an outcome or ignoring the tensions. There are many feasible models. One useful guiding concept is the notion of the pluriversal in decolonial theory (Mignolo), as a counterpoint to the universal (the common assumption that there is one approach that can address all concerns). Technology design is often universalist, and blind to the cultural assumptions embedded in it. Can we instead rearticulate the process so that multiple articulations of a representation can coexist, and so that any tensions can be experienced and interacted with?