Modern Colonialism: Intersectional Experiences in Emerging Technologies

By Natasa Daria Constantin, a recent Law graduate and participant in Carolina Alonso Bejanaro's class Issues in the Legal History of Race (2021).


Driven by the Eurocentric fetish of occupying foreign territories and instituting colonies, the Western powers invented novel social classifications of race and gender, thereby laying the basis for the white supremacist and patriarchal paradigm that we now call the United States of America.

The shadows of those social classifications still linger in American thinking and penetrate other countries through the process of globalization. In this blog post we are going to explore modern-day colonialism and illustrate how intersectionality interacts with Big Tech, Artificial Intelligence, algorithmic decision-making, and the law, while contending that the development of technology has followed a racist, sexist, and bigoted trajectory because of the influence of the coloniality of gender on the current status quo.

What are intersectionality and the coloniality of gender?

The modern capitalist system was built on the colonizing European states' need to categorize people, to quantify them, and to establish domination. I argue that this has led to the 'scientification' of the discipline now understood as international law.

As Maria Lugones points out, "the invention of race is a pivotal turn as it replaces the relations of superiority and inferiority established through domination". It has led to the classification of people of color as "less" in biological and moral terms. Moreover, to exert full control over indigenous peoples, the European bourgeoisie refined the concept of "gender" and violently enforced it upon native communities, which already had intricate social, cultural, and spiritual stratifications of their own. Those stratifications, however, did not differentiate individual autonomy and capacity on the basis of genitalia.

On the topic of intersectionality, Kimberle Crenshaw's analysis of scholarly and judicial antidiscrimination frameworks is valuable. She argues that "the intersectional experience of [Black women] is greater than the sum of racism and sexism" and points out how grossly it is omitted from feminist rhetoric and theory. Her call for consideration remains valid: even in the postcolonial, multicultural state of the US, the social classifications of race and gender persist in antidiscrimination politics and activism.

I believe these notions have taken on a scientific veil through repeated rhetoric and systemic oppression and are currently accepted as universal truths. Engaging in a critical and intellectual collective exercise to reconceptualize them is extremely difficult in the United States' polarized society. Effectively, this leads to Black women's experiences being gaslighted by a legal and theoretical system that attacks the colonial stratifications of race and gender instead of rethinking their roots.

Some authors draw attention to this argument, hypothesizing that racism and sexism are often treated as mutually exclusive modes of discrimination because there is an "assumption of a common experience in sexism that does not need an explanation" (Tobach 1994), whereas the topic of racism has been heavily researched. Other authors term this "gender essentialism, the notion that a unitary, essential women's experience can be described independently of race, class, sexual orientation" (Harris 1990). I believe that at the heart of these considerations lies the post-colonial presumption that discrimination occurs along a racial axis on which the white man is the starting point, followed by his female counterpart, with the Black woman at the end point.

How does modern-day colonialism work?

On a global scale, technology is evolving along with unprecedented risks and harms that it can inflict on people. We are approaching the peak of the Fourth Industrial Revolution, and it cannot be disputed that the latest developments in Artificial Intelligence have brought great benefits to society in areas such as manufacturing and healthcare. However, I argue that the process of creating and deploying new technologies on the market de facto mirrors Eurocentric colonial, discriminatory, and sexist norms. The cumulative harm is felt by women of color, LGBTQ+ individuals, and gender non-conforming people.

Systemic sexism is being reinforced online, in the way technologies operate and by companies such as Google, YouTube, and Twitter. In this discourse of technological inequality and gender bias, the rebuttal of sexism "has followed the same path […] by focusing on gender as the main unit of analysis". The same pattern Crenshaw pointed to repeats itself in modern colonialism. Moreover, with the increased use of robotic process automation, women are at a higher risk of losing their jobs to white men. Women of color are excluded from the technology space and occupy fewer senior-level positions in the AI and Machine Learning industries than cisgender men. I believe this is a direct effect of the coloniality of gender/power model, which renders non-European, non-heterosexual individuals as "less" than the norm.

Just as the white man whipped sisters of color for disobedience, some of the biggest corporations driving technological change exclude them for voicing their opinions. For example, Timnit Gebru is a computer scientist specializing in algorithmic bias who was fired from her position as a leader of the "Ethical AI" team at Google in California. As a Black woman with a complex understanding of AI, she sought to publish research papers about the dangers of rapid innovation.

I argue that Google is a leading player in the modern colonialist system, one that forces Western ideals and political concepts upon the world at a global scale, thereby advancing American/Eurocentric ideas about the primacy of one race over another and of one sex over another. Recent examples from the company's past illustrate this statement. Research in the online advertisement space has shown that typical Google searches perpetuate historical biases: a search for an African American-sounding name is more likely to generate an ad suggesting an arrest record. Moreover, platforms such as Google and LinkedIn have been found to display executive, higher-paid positions to white men more often than to women.

A tremendous variety of technology-facilitated discrimination occurs at the expense of people with intersectional identifying factors. The way most social media operates, based on algorithms that learn one's preferences, implies that the voices of the disenfranchised and stigmatized are silenced and that people remain trapped in ideological bubbles.
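
To make this concrete, below is a minimal Python sketch of an engagement-driven recommendation loop, under the assumption that platforms boost whatever already earns engagement. The topic names, starting weights, and update rule are invented for illustration; no platform's actual ranking code is public. The sketch shows a "rich get richer" dynamic: a topic that starts with slightly more exposure attracts more engagement, which earns it still more exposure, until marginal voices are crowded out.

    import random

    # Hypothetical topics and starting exposure; "mainstream" begins with a
    # small head start, which marginalized topics rarely have.
    weights = {"mainstream": 1.2, "black_feminist": 1.0}

    def recommend():
        # Sample a topic in proportion to its current weight.
        r = random.uniform(0, sum(weights.values()))
        for topic, w in weights.items():
            r -= w
            if r <= 0:
                return topic
        return topic

    for _ in range(5000):
        topic = recommend()
        # Engagement is more likely for already-dominant topics, and each
        # engagement further boosts that topic's future exposure.
        if random.random() < weights[topic] / sum(weights.values()):
            weights[topic] *= 1.01

    print(weights)  # the head-start topic typically ends up dominating exposure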

Inevitably, the impact of these business models is twofold: i) non-white and non-mainstream feminist topics disappear in the amalgam of online content, and ii) racism and sexism proliferate in a virtual space. Hate speech in the online environment is often neglected by the law because this type of abuse has some unique features, such as "the possibility for cross-jurisdictional abuse, the ability for abusers to remain anonymous, the constant access to the survivor through connected devices". It is not uncommon for this type of violence to spill over into real life, with Black women targeted because of their identity factors. Yet this is not the biggest threat so far: AI and machine learning algorithms are heavily biased against people of color and women, and the law still neglects this aspect, focusing more on practical commercial considerations. On 21 April 2021, the European Commission launched a proposal for the first-ever legal framework on AI, which does include ethical and harm considerations but makes no mention of bias against women and individuals with non-conforming identity factors.

How does discrimination occur?

According to the UK Government's review into bias in algorithmic decision-making, we can infer that technological discrimination against women of color occurs on four levels:

  • Historical bias;
  • Data selection bias;
  • Algorithmic design bias;
  • Human oversight.

Machine learning works by taking a historical data set for training, fitting it into a mathematical structure that optimizes for desired outcomes, and then assessing the effectiveness of the resulting model. This reproduces historical colonial inequalities because most programmers who feed data into the machines are white and heterosexual and omit diversity from the data. Alternatively, sometimes the data available reflects predominantly white experiences, which leads to the technological exclusion of certain categories. For example, a past recruiting tool at Amazon learned that male candidates were suitable for jobs because it relied on the résumés of employees hired in the decade prior to its creation. It automatically penalized résumés containing the word "women's", as in "women's chess club captain".
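
A minimal sketch can make this failure mode concrete. The toy résumés, tokens, and scoring rule below are invented for illustration; Amazon's actual tool is not public. Because the historical hires skew male, a token correlated with women ("womens") is learned as a negative signal even though it says nothing about competence:

    from collections import Counter

    # Toy historical hiring data: (résumé tokens, was hired?). Past hires
    # skew male, so tokens associated with women appear mostly in rejections.
    history = [
        (["java", "lead", "captain"], True),
        (["python", "lead"], True),
        (["java", "mentor"], True),
        (["python", "womens", "captain"], False),
        (["java", "womens", "mentor"], False),
    ]

    hired, rejected = Counter(), Counter()
    for tokens, was_hired in history:
        for t in tokens:
            (hired if was_hired else rejected)[t] += 1

    def score(tokens):
        # Naive per-token score: positive if the token co-occurred with hires.
        return sum(hired[t] - rejected[t] for t in tokens)

    # Two equally qualified candidates; only the "womens" token differs.
    print(score(["python", "lead", "captain"]))            # 2
    print(score(["python", "lead", "womens", "captain"]))  # 0 -- penalized

The model is "accurate" on its own biased history; nothing in the training pipeline flags the problem unless someone asks which tokens act as proxies for gender.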

The YouTube Lawsuit

Why isn't the law upheld in this scenario? I argue that in the case of the US, the law acts as a protective shield over Big Tech companies that lobby the government and bring revenue to the economy. Notably, Kimberle Crenshaw formulated intersectionality while analyzing how US courts have refused to interpret the stories of Black women plaintiffs. In DeGraffenreid v General Motors, five African American women were denied status as a special class protected against the cumulative discrimination arising from both their gender and race. At that point, the judge was bound by the lack of precedent in this area of discrimination and by the evidence presented by General Motors.

The similarity of DeGraffenreid to a recent case on algorithmic discrimination speaks volumes. Last year, five African American YouTube content creators brought a class action lawsuit against Google. They allege that the YouTube algorithm unjustly applied the 'Restricted Mode' filter to videos of theirs that did not violate any of the platform's guidelines.

They claim that this technology targets individuals with intersectional characteristics, de facto engaging in racial profiling. The filter reportedly targets terms such as "BLM", "KKK", and "police brutality", and even the names of victims of police shootings. In this way, Google engages in modern-day colonial oppression by unjustly demonetizing Black women who use their voices to express the struggles of their community. The law protected these companies once again by upholding Section 230, holding that the companies' right to restrict access to "objectionable" material takes precedence over the right of Black women not to be algorithmically discriminated against.
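
A minimal sketch shows how such context-blind filtering could behave. The blocklist and video titles below are invented for illustration; YouTube's actual Restricted Mode classifier is not public. The point is that pure keyword matching cannot distinguish content documenting racism from content promoting it:

    BLOCKLIST = {"blm", "kkk", "police brutality"}

    def is_restricted(title):
        # Flag a video if its title contains any blocklisted phrase,
        # regardless of whether it documents or promotes the subject.
        lowered = title.lower()
        return any(phrase in lowered for phrase in BLOCKLIST)

    # A creator documenting her community's struggles is filtered just like
    # actual hate content -- the match is purely lexical.
    print(is_restricted("My experience at a BLM march"))         # True
    print(is_restricted("Surviving the KKK: a family history"))  # True
    print(is_restricted("Cute cat compilation"))                 # False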

In conclusion, this blog entry has been a brief introduction to the scholarly, judicial, and technological considerations that have developed incrementally around the paradigms invented by colonial thinking, which effectively classified non-white individuals by race and gender. If you want to learn more about this issue, which is likely to remain relevant in the coming years, check the sources list below.

** This blog post is a fragment of original work produced as part of university assignments and is therefore protected by copyright.


Sources:

  • Crenshaw K, 'Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics' (1989)
  • Dunn S, ‘Technology-Facilitated Gender-Based Violence: An Overview’ [2020] 1(1) Centre for International Governance Innovation
  • Hayasaki E, ‘Women vs the Machine’ [2017] 1(222) Foreign Policy
  • Lugones M, 'Heterosexualism and the Colonial/Modern Gender System' [2007] 22(1) Hypatia 191
  • Quijano A, 'Coloniality of Power, Eurocentrism, and Latin America' [2000] 1(3) Nepantla: Views from South
  • Scott K and Garcia, 'Techno-Social Change Agents: Fostering Activist Dispositions Among Girls of Color' [2016] 15(1) Meridians
  • Sweeney L, ‘Discrimination in Online Ad Delivery’ [2013]

News and Documentaries:

  • Dastin J, 'Amazon scraps secret AI recruiting tool that showed bias against women' (Thomson Reuters, 11 October 2018)
  • Gardner E, 'YouTube Alleged to Racially Profile Via Artificial Intelligence, Algorithms' (The Hollywood Reporter, 17 June 2020)
  • Tett G, 'After Google drama, Big Tech must fight against AI bias' (Financial Times, 24 February 2021)
  • Kantayya S (dir), Coded Bias (Netflix documentary, 2020)