These days, AI is like Fast Fashion

Yim Register (they/them)
9 min read · Jan 26, 2024


Trendy. Cheap. Everyone wants it. But at what cost?

Fast Fashion: the rapid design, production, distribution, and marketing of cheaply made garments copied from the latest styles and mass-produced by low-cost laborers, creating a significant amount of waste and negative impacts on our environment and on fast fashion workers.

⚠️Disclaimer⚠️: my number one priority is for people to be housed, healthy, mentally well, and able to support their families and participate in their communities. Just as people get work and income through the fast fashion industry, we all are subject to market pressure to produce AI products. This is not an attack on people caught up in either industry, it is a call for a collective shift of priorities.

Let’s Make This Not Depressing

I am a PhD candidate studying AI. I’ve been focused on AI for the past 6 years, a longer time than some but much shorter than many others. My dissertation work focuses specifically on the harm that comes out of AI — with a wide net of what counts as ‘harm’: direct racial bias in the systems, risks of surveillance technology, faulty diagnostic or insurance algorithms, environmental impacts, impacts to low-paid laborers, and downstream effects of content moderation or targeted ads on social media. Basically, anything AI can do, AI can cause harm. However, that’s a particularly depressing line of work and I try to avoid being depressed, don’t you?

Personally, the way I deal with the “oh crap our technologies really harm people and this is absolutely miserable” is to a) educate about those harms b) advocate for solutions to those harms and c) liberate & empower people impacted by those harms. So let’s try to keep that line of sight as I guide us through this metaphor about AI as Fast Fashion.

If you’re alive in 2024, you’ve heard about AI. If you’re reading my blog posts, you at least have heard me talk about AI, or work in AI yourself. And let me tell you, man am I tired of hearing about AI! At the end of the day, I love math and computation — what a beautiful thing to capture insights from data and create meaning from what seemed overwhelmingly meaningless. Some days it feels like alchemy, wizardry, magic. Other days it feels like ‘omg what do you mean this doesn’t take a numpy array and it’s the wrong shape and omg i’m gonna lose my mind’. But there is a beauty to doing data science and machine learning on problems I care about.

All of a sudden, the problems I care about seem few and far between. Everyone is doing some form of AI and not stopping to ask ‘why’ or ‘is this a good application of AI’ or ‘what could go wrong?’. Instead, there is a massive push to get AI into production no matter what. Whether you’re a researcher, a business, a nonprofit, anything — AI is front and center these days. The influx of AI this! and AI that! is like a constant background noise in my brain. All the while, I’m asking myself ‘is it helpful? is it liberatory? is it empowering? is it good?’

AI as Fast Fashion

Perhaps just seeing the title of this blog post was enough to hit home the metaphor. AI is like fast fashion: it's cheap, it's fast, it's everywhere, it's exploitative, it's impacting our environment, and it's chock full of mistakes. But just in case you want me to hit home that metaphor a little further, here are some examples.

An Imitation of the Real Thing

Fast fashion designs are typically modeled on high fashion styles, and these days companies are even known for stealing small artists' work. They are a quick and cheap imitation of something that took time and creativity. I'm not particularly into fashion, but I see the parallel here: AI is a copy of human thought, labor, creativity, and ingenuity. This is not always a bad thing; AI can replace menial tasks, make our lives smoother, and help people do things they couldn't do before. The problem is when we go too far. One such example is when the National Eating Disorders Association had to suspend an AI-powered chatbot that gave harmful dieting advice a human working for NEDA never would have. When we cheaply imitate human thought, we lose the expertise of the original.

Exploiting Labor

Fast fashion relies on outsourced labor to countries like India, Bangladesh, Vietnam, China, Indonesia, and Cambodia — with people working for extremely low wages (as low as 50 cents an hour) and on average 11 hours a day. Sweatshop workers are exposed to dangerous chemicals and grueling workdays, with injuries and accidents a frequent occurrence (including a tragic factory collapse which killed over 1000 people).

Meanwhile, we know that AI is often trained by workers paid in pennies, with particular risks to their mental health: such as the Kenyan workers now suing Meta for psychological damages from content moderation work. In order to train AI models, we often need massive amounts of labeled data, with people paid low wages to do these tasks. “So-called AI systems are fueled by millions of underpaid workers around the world, performing repetitive tasks under precarious labor conditions.” This has been called the Ghost Work of AI, which you can read more about here.

Stealing From Artists

Another form of exploitation is stealing from artists, or appropriating creative works. Fast fashion companies actually employ algorithms on social media to identify trending art to copy for their designs, and some artists are suing companies like SHEIN for stealing their art and marketing it as their own. AI art is undergoing a similar uproar: many generative AI images are built with models like Stable Diffusion, which was trained on billions of image-text pairs scraped from the web. This meant art was included in the dataset unbeknownst to the artists themselves. Because there are now companies that profit from AI art, it raises the question of how to compensate artists for the training data they never consented to. More lawsuits abound; you can read more here.

Environmental Impact

The environmental impact of fast fashion is clear: picture the mountains of clothing waste, the carbon emissions from the factories, the toxic chemicals and materials, and the microplastics. You can get more facts about each of these impacts here. While harder to see, training AI also has environmental impacts, due to the computing power and electricity required to train and run these systems. On the carbon footprint of AI, one article notes: “the authors estimated that the carbon footprint of training a single big language model is equal to around 300,000 kg of carbon dioxide emissions. This is of the order of 125 round-trip flights between New York and Beijing.” We may have some success with policy and regulation in limiting the environmental impacts of AI, focusing on more sustainable use. And of course, AI itself may assist in combating climate change.
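As a quick back-of-the-envelope check on that comparison (using only the figures quoted above; the per-flight number is implied by the quote, not separately sourced):

```python
# Sanity check of the quoted comparison, using only the article's own figures.
TRAINING_EMISSIONS_KG = 300_000  # estimated CO2 to train one big language model
FLIGHTS = 125                    # round trips NY-Beijing the article equates it to

# Emissions implied per round-trip flight by dividing one figure by the other
per_flight_kg = TRAINING_EMISSIONS_KG / FLIGHTS
print(f"Implied emissions per round trip: {per_flight_kg:.0f} kg CO2")
```

That works out to roughly 2,400 kg of CO₂ per round trip, which is in the ballpark of published per-passenger estimates for a long-haul flight, so the quoted comparison is at least internally consistent.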

Mistakes in Production

When something is cheaply made, there are bound to be mistakes. For fast fashion, this might be the stitching, the materials, the sizing, or the fit: loose threads holding together thin fabric made to be worn a few times and then discarded. For AI, when it is rushed into production, there will also be mistakes. Some are entertaining, some are deeply harmful. You can start with this list of AI Failures: wrongful imprisonment, gender bias in content moderation algorithms, an AI plagiarism detector gone wrong, or harmful suggestions from chatbots about mental health.

Risky and Out of Control

The more fast fashion we buy, the more will be produced, the faster workers will have to work, and the more people will be harmed. Why do we buy it? Well, for some, it is all we can afford: a dangerous cycle of needing to buy cheap things because we have no money. I see the same pressures in the AI field right now. AI makes things cheaper, then we lay off more people, then people have less money, then we need more AI. These systems are dangerous and out of control, and can only be slowed by a combination of policy, collective action, and a shift in priorities from consumers and businesses.

I Said This Wouldn’t Be Depressing, Right?

Each of the above components is an opportunity. An opportunity for advocacy, education, intervention, and research. A little bit can go a long way. As consumers, which AI products will we pay for? I learned a lot when those AI face-image apps came into popularity, and quickly realized we shouldn't be paying for products built on stolen art until artists receive some kind of percentage (a personal opinion). As voters, we must shift part of our attention towards labor practices and policy changes, no longer accepting that our technical systems are built upon the labor of underpaid workers. As teachers, how do we make sure our students are aware of the environmental impacts, or the impacts to human lives? These case studies can be included in our curricula. As researchers, we need to think of better ways to prevent harm to human beings, and to choose wisely the AI products we create. And all of us, all of us, just need to be a little less wasteful. Do we really need AI for that?

“Not everything that is faced can be changed, but nothing can be changed until it is faced.”

James Baldwin

I imagine a world where we carefully apply AI to the problems we need it for, with multiple contingency plans for when it causes harm to the outlier cases, with a protocol for repair in place. I imagine a world where we don’t constantly need to be consuming content, or producing content, because we are at peace in ourselves. A deep collective breath where we move through the world intentionally, sometimes creating with that magical alchemy of data I spoke about — to really solve problems, help people, and build a better future.

A Collective Call to Slow Down

I don’t claim to know much about economics, though I’m not oblivious to the ideas of market pressure. I help teach a Master’s Machine Learning course at UW and over the years it has become one of my main goals to help students feel prepared to get jobs. I’m not helping anyone if my students end up unemployed. I want them to succeed, to be safe, to be healthy, to be happy, to be secure. I want that for all of us.

But perhaps, with enough of us collectively taking pause, we won’t be churning out so much AI garbage. Ask your team ‘is this product helpful? is it liberating? is it empowering? is it good?’ and perhaps most importantly: ‘who could this harm, and what is our protocol for when it does?’

If you are in the Data Science field, you are a future Data Science Leader of the world (that's what I call all my students). There is no waiting for 'them' or 'someone else' to take the reins. It's us, the call is on us, to slow down and ask the right questions about the technology we create. Instead of throwing every model you can think of at massive amounts of data in a black box, I hope you knit your next system with patience and love. 🧶

