Content

Ethical Impacts of AI | Tech Colonization | Tech Coercion | Who makes the decisions? | So... do we use AI or not? | Addendum

Ethical Impacts of AI

[is this… my moral manifesto???]

We could spend an eternity comparing the levels of evil we commit every day. It’s nearly impossible to truly define the threshold of acceptable harm in our personal lives, and it’s even harder to control in our professional lives.

The formula is: we cause harm to something or someone. What was the justification? Was it worth it? Who or what determines whether it was worth it?

Eating a beef burger has a larger carbon footprint than 500 AI prompts. Regardless, when we make a choice to do either of those actions, what purpose does it serve in comparison to the harm it causes? Do we value the life of a tree that can live thousands of years as much as that of another human?

Regardless, that comparison is a distraction. The physical infrastructure of The Internet and Cloud Storage / Data Centers is the truly disastrous culprit. These disrupt ecosystems, pollute already under-resourced communities, and insidiously transfer costs AND environmental blame from corporations to residents. AI is not the only enemy, it’s simply the latest weapon. We don’t think about purchasing and using tech as purchasing and using knives or guns, but it’s the same. We just don’t see the blood when we use our computers and phones at home.

If the environmental impacts of data centers affect reproductive health, how much do we factor that into our work? In a fantasy world, if the US moved all data centers to Thailand, would we still care if our main constituency was in the US?

And when is enough enough? This applies to both scenarios below:

Tech Colonization

When we think about American history, we think about how yt ppl colonized what we know as Mexico, how yt ppl colonized what we know as the US, how yt ppl colonized what we know as Canada.

Are we in the stage yet where we realize Tech Companies led by technofascist yt men have colonized all major nation states globally? They may have kneeled before the current administration, but certainly the administration is simply a stepping stone for them.

One of the slide decks from NTC had the following information:

colonization | noun

The act of taking land, labor, and resources from others, usually through the force of exploitation.

  • Forces the dominant group's culture, laws, and values as the "right" or "normal" way of life
  • Strips away power, voice, and self-determination from the people being colonized
  • Leaves behind lasting systems: wealth and privilege for some, poverty and exclusion for others

how we reinforce the system

  • Nonprofits often adapt to restrictive systems rather than challenging them
  • Competition for limited funding can discourage collaboration
  • Proposal narratives may sanitize system harm to appear "fundable"
  • Scarcity pressures shape organizational decisions

These dynamics are structural - not individual failures.

“I know as an Indigenous person that my healing is contingent on the oppressor also being a part of that healing process. I can heal on my own. But, so far, we can’t heal as a community at large until we are all engaged in [decolonization].”

— Edgar Villanueva, author of Decolonizing Wealth

Tech Coercion

Some of the common arguments for advocating AI use, with my personal counterarguments:

Argument: We don’t want to be left behind.
Counterargument: I think this is a valid point, but it also ignores the power we have as a consumer collective to RESIST the status quo. Humans are generally resistant to change. In a society that is practically religiously obsessed with Progress™️, it is very challenging to resist what they purport as change for the common good. No one wants to be Debbie Downer, and no one wants to admit that maybe she was right all along.

Argument: We have to become part of the customer base in order to become a stakeholder who can shift the company’s direction and approach to AI ethics and responsibility.
Counterargument: I think this is a completely invalid point. We’re all stakeholders in the tech industry at this point - what good has that done us so far? Did we change the mind of the rightwing CEO of Salesforce? Did we prevent Microsoft from deeply cutting nonprofit discounts?

Argument: We want to prevent the creation of additional data centers.
Counterargument: How can we do that by using technology that creates additional need for more data centers? Granted, it’s largely not our consumer use that creates the need - it’s corporations, governments, militaries, etc. But they’ll certainly shift the blame to consumers.

Argument: We are already under-resourced and this would help us create capacity.
Counterargument: I think this is a valid point, but I don’t think that AI literacy OR the product itself is READY in most cases for consumer use. I think AI could potentially help us create capacity if we were WILLING TO GIVE UP certain things. Ideally, we will wait a bit longer and see how the data privacy, corporate transparency, and functionality situations evolve.

Argument: This is just another new tool to learn and adapt to, like the camera or the car.
Counterargument: But maybe we shouldn’t have even started using cars? They expanded the speed and distance of our individual reach, all while decreasing the viable land to live on, water to drink, air to breathe… The framing is always, “humanity has always had to adapt to changing technologies, and this is no different,” but never, “humanity keeps creating technologies without considering harm reduction to humanity itself and to the health of the ecosystem it requires to survive.”

Argument: The environmental effects are not as bad as many other things we do.
Counterargument: Just because we’re doing bad things, that doesn’t mean we should heap more bad things onto the pile.

Argument: We’ve already been using this tool, why complain now?
Counterargument: We often experience situations where we don’t realize harm is being committed (to others or even to ourselves). Once we later have the consciousness, language, and strength to acknowledge that harm was done, we should say something, and we should change the status quo that allowed that harm to happen in the first place.

Argument: AI will help us solve the problems that AI is creating.
Counterargument: TBD. Yes, I would like to remain optimistic, but instead I remain doubtful, since the previous line of thought was that technology would help us solve the problems it created. Look where that got us.

Another slide deck from NTC had the following information:

What AI is and isn't

AI can help with:
  • Brainstorming
  • Organizing & summarizing
  • Writing & editing
  • Clarifying
  • Repurposing
  • Researching
  • Presentations
  • Fundraising

Without human oversight, AI cannot:
  • Understand your community
  • Replace lived experience
  • Make ethical decisions
  • Build trust
  • Tell real stories
  • Determine qualified information sources

Who makes the decisions?

Yet another slide deck from NTC had the following information:

Two Key Questions

  • What happens if we ignore AI?
    • Personal/professional impact
    • Organizational impact
    • Sector impact
  • What happens if we embrace AI?
    • Environmental impact
    • Living in a capitalist system
    • More of our income/funding to tools

For nonprofit organizations, decisions are typically made by the board and Executive Director(s).

As employees and people working together for a united cause, we can have open discussions about our work culture, our personal beliefs, and the intersections to influence decision making.

The people we work with and the ones we serve can also influence decision making, though they’re not typically invited to the conversation.

The people who are most harmed by these decisions are often not the same as any of the aforementioned people. They are largely people in the global south, though of course many of us feel residual effects to varying degrees.

So… do we choose to use AI or not?

In many cases, we don’t get to choose - it’s forced upon us. The tech industry largely doesn’t care about consent.

In some ways, it’s useful. In some ways, it’s frivolous. In all ways, it’s harmful, but that can be applied to almost every action we take, so we must lower our ethical thresholds to survive.

Finally, this slide deck from NTC had the following information:

Tech + Reproductive Justice

  • Analyze power systems -> how is the way we work / the tech we use encoding existing power structures?
  • Address intersecting oppressions -> are we choosing technology that can be used by everyone?
  • Center the most marginalized -> how is the tech that we use or make impacting oppressed people around the world?
  • Join together across issues and identities -> what technology strategies or choices do other movements and organizations make?

It comes down to:

Addendum

I'm obsessed with the concepts of Calm Tech.

When an escalator breaks down, it functionally becomes stairs. When a WiFi-enabled cat food dispenser breaks down, it functionally becomes a $200 piece of junk (also your pets will starve).

And think about windows: transparent technology that keeps dust and other things outside of your home. It's fascinating.

But then I think about our era of digital technology and industrial machinery. Big Tech™️ markets itself with concepts like, "Save time with automation! Rely on technology to do the work for you! Simplify your life with TECHNOLOGY." They use the bells and whistles of the same product repackaged over and over again to distract us from the reality of tech overcomplicating our lives.

Bells and Whistles Repackaged

When I think about products like Dropbox, Google Drive, SharePoint, etc., they're all the same base product: a file storage and organization system with additional collaboration and sharing functionality. Then they compete (read: coordinate) on price markups with each other by introducing new bells and whistles, but their core products are still the same.

With this type of capitalist "competition" we, as consumers, also find ourselves experiencing perpetual choice paralysis. All the products are the same! Better remain loyal to XYZ brand. But wait, XYZ brand just ran out of money! I guess I'll switch to ABC brand. Oh, but they just got bought by Nestle! Wait, are all brands Nestle??


The Reality of Tech Complexity

Working in IT is hard. Tech shifts and evolves every second. New threats are created with every new technology, piece of code, and exhausted human being who simply can't keep up with this unprecedented rate of cultural change.

I don't blame anyone for not being able to keep up, for refusing to keep up, or for giving up completely. It's ridiculous. We're held hostage by 3-year-old white boys who made up the game and keep changing the rules. Why are we even still playing this stupid game anyway? Oh, right, the stock markets and war crimes or something. #stonks

What can we do?

I don't fucking know! But here's what I believe.

Change starts with people and cultural shift. Kids need to WRECK their parents for maintaining the status quo like massive wankers. New parents need to raise cultural assassins and anarchists who believe in mutual aid, kindness, creativity, critical thinking, harm reduction, and learning/unlearning. We all need to get the fuck off of social media brain rot, build community and joy locally, stop thinking we always know what's best, and examine what we actually value as individuals (and whether or not our actions align with those values). Everyone needs to go to therapy! Therapy needs to be affordable! Does your job actually serve a societal function? Or does it just provide additional gambling money for the devils at the top?