
Can AI Imagine A Female Software Engineer?

A Midjourney Case Study

Caroline Arnold
Published in Code Like A Girl
5 min read · Dec 5, 2023

Male, female, and non-binary icon. Image generated using Midjourney v5.2.

It’s no secret that machine learning algorithms are biased when it comes to representing people who don’t happen to be male, white, and able-bodied. Numerous experiments have shown that facial recognition algorithms struggle to identify people of colour, resume scanners deem women less capable, and predictive policing algorithms tend to reinforce existing mistreatment of marginalized communities.

Generative artificial intelligence (AI) algorithms take user prompts and generate text or images based on that input. Their makers pride themselves on models that can depict almost anything. Generative AI became widely popular in late 2022, when ChatGPT made the technology easily accessible through a browser interface. Other companies have followed suit, and there is now a wide range of accessible generative AI tools. Here I focus on the image generation platform Midjourney.

Imagine a group of software engineers …

In an ongoing project to evaluate gender equality in generative AI, I tried the following prompt on the Midjourney platform:

Software engineers in a code review session, sitting together and providing feedback on each other’s code.



Written by Caroline Arnold

AI Consultant, PhD in Physics. I write about artificial intelligence, data analysis, science, and diversity. https://www.linkedin.com/in/crlnarnold/

Responses (16)


Interesting article! DALL·E 3 combats bias with active prompt rewriting and diversity injections. I'm currently working on a test to diagnose how this works.

--

In my History of Science and Technology class we had a similar idea. Thanks, this was very thought-provoking. Hopefully algorithm bias won't be too big of a problem. 🤞

--

I suspect you would also get interesting results if you tried to generate good pictures of software developers with impairments.

On a side note, we found that the Xbox software for the Kinect camera did not recognize a person sitting in a wheelchair...

--