Is ChatGPT Biased Against Women?

The AI program ChatGPT is back in the spotlight after being accused of gender bias favoring men over women. The accusation arose when the program was asked to generate images of people in high-level positions: in an experiment involving 100 requests, it produced 99 images of men. Conversely, when asked to create an image of a secretary, it chose a man only once, according to a report published by the British newspaper the "Daily Mail."

A study conducted by the personal finance website Finder also found that the program selected a white person every time, even though no race was specified. These results do not reflect reality: one in three companies worldwide is owned by a woman, and 42% of FTSE 100 board members in the UK are women.

Business leaders have warned that AI models are "full of bias" and have called for stricter safeguards to ensure they do not reflect society's prejudices. Current estimates indicate that 70% of companies use automated applicant tracking systems to find and hire talent, and concerns have been raised that if these systems are trained in a similar way to ChatGPT, women and minorities could suffer in the job market.

It is worth noting that OpenAI, the maker of ChatGPT, is not the first tech giant to face criticism for results that appear to perpetuate outdated stereotypes. In recent research, 10 of the most popular free image generators were asked to illustrate a typical person in various high-level jobs. All of the generators, which have logged millions of conversations, are built on OpenAI's underlying DALL-E model but were given unique instructions and knowledge.

Across more than 100 tests, each generator showed an image of a man almost every time; a woman appeared only once, when depicting "a person working in finance." When the image creators were asked to show a secretary, they depicted a woman 9 times out of 10 and a man only once.