
Gender and Artificial Intelligence


Artificial intelligence systems and robots are often assumed to be sexless and genderless, but this is not the case.

In practice, humans encode gender and stereotypes into artificial intelligence systems in much the same way that gender is woven into language and culture.

The data used to train artificial intelligence systems often carries gender bias.

Biased data may cause significant discrepancies in computer predictions and conclusions.

In humans, such differences would be called discrimination.

AIs are only as good as the people who supply the data that machine learning systems capture, and only as ethical as the programmers who build and supervise them.

When individuals exhibit gender prejudice, machines learn to treat it as normal (if not acceptable) human behavior.

Bias can emerge whether algorithms are trained on numbers, text, images, or voice recordings.

Machine learning is the use of statistical models to evaluate and categorize large amounts of data in order to generate predictions.

Deep learning is the use of neural network architectures intended to imitate the human brain.

Classifiers label new data based on patterns learned from previous examples.

Classifiers have a lot of power.

By studying the automobiles visible in Google Street View imagery, they can accurately predict the income levels and political leanings of neighborhoods and cities.
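The idea that a classifier labels new data from patterns learned on previous examples can be sketched with a minimal nearest-centroid classifier. All data, names, and labels below are invented for illustration; real systems use far richer features and models.

```python
# Toy classifier sketch: it labels new data points based on patterns
# (here, per-class averages) learned from labeled training examples.

def train_centroids(examples):
    """Compute the average feature vector (centroid) for each label."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lab: [s / counts[lab] for s in acc] for lab, acc in sums.items()}

def classify(centroids, features):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

# Made-up labeled training data: (feature vector, label).
training = [([1.0, 1.0], "A"), ([1.2, 0.8], "A"),
            ([5.0, 5.0], "B"), ([4.8, 5.2], "B")]
centroids = train_centroids(training)
print(classify(centroids, [1.1, 0.9]))  # lands near class A's centroid
```

The point of the sketch is that the classifier has no notion of fairness: whatever regularities exist in the training examples, biased or not, become the rule it applies to new data.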

The language individuals employ reveals gender prejudice.

This bias may be apparent in the names of items as well as how they are ranked in significance.

Descriptions of men and women are skewed, beginning with how frequently their respective titles are used and whether they are referred to as men and women versus boys and girls.

The analogies and words employed are skewed as well.

Biased AI may influence whether people of particular genders or ethnicities are targeted for certain occupations, whether medical diagnoses are accurate, whether loans are approved, and even how exams are scored.

"Woman" and "girl" are more often associated with the arts than with mathematics in AI systems.
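Researchers quantify such associations by comparing similarities between word vectors in trained embeddings. The sketch below shows only the arithmetic: the tiny hand-made vectors are not real embeddings, just hypothetical values constructed to illustrate how a biased association would show up as a similarity gap.

```python
# Illustrative word-association bias measurement: compare cosine
# similarities between a target word and two attribute words.
# The 3-d vectors below are invented; real embeddings have hundreds
# of dimensions and are learned from large text corpora.
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical vectors encoding a biased association.
vectors = {
    "woman": [0.9, 0.1, 0.2],
    "arts":  [0.8, 0.2, 0.1],
    "math":  [0.1, 0.9, 0.3],
}

bias = cosine(vectors["woman"], vectors["arts"]) - cosine(vectors["woman"], vectors["math"])
print(round(bias, 3))  # positive: "woman" sits closer to "arts" than to "math"
```

A positive gap means the target word is embedded closer to one attribute than the other; aggregated over many word pairs, this is the style of test used to audit embeddings for stereotyped associations.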

Similar biases have been discovered in Google's AI systems for finding employment prospects.

Facebook and Microsoft's algorithms regularly correlate pictures of cooking and shopping with female activity, whereas sports and hunting are associated with masculine activity.

Researchers have also discovered instances where gender prejudices are deliberately built into AI systems.

Men, for example, are shown opportunities to apply for highly paid and sought-after positions on job sites more often than women are.

Female-sounding names for digital assistants on smartphones include Siri, Alexa, and Cortana.

According to Alexa's creator, the name came out of negotiations with Amazon CEO Jeff Bezos, who wanted a virtual assistant with the personality and gender of the Enterprise's shipboard computer from the Star Trek television series, which is voiced by a woman.

Deborah Harrison, head of the Cortana project, says its female voice arose from research showing that people respond better to female voices.

However, when BMW introduced a female voice for its in-car GPS navigation system, it faced an immediate backlash from men who did not want their cars telling them what to do.

Female voices should seem empathic and trustworthy, but not authoritative, according to the company.

Affectiva, a startup that specializes in artificial intelligence, utilizes photographs of six million people's faces as training data to attempt to identify their underlying emotional states.

The startup is now collaborating with automakers to use real-time footage of drivers to assess whether they are tired or angry.

The automobile would advise these drivers to pull over and take a break.

However, the company has found that women seem to "laugh more" than men, which complicates efforts to accurately estimate the emotional states of typical drivers.

The same biases can be found in hardware.

A disproportionate percentage of female robots are created by computer engineers, who are still mostly male.

The NASA Valkyrie robot, designed for future space missions, has breasts.

Jia, a shockingly human-looking robot created at China's University of Science and Technology, has long wavy black hair, pale complexion, and pink lips and cheeks.

She keeps her eyes and head inclined downward when first spoken to, as if in deference.

She wears a slim, form-fitting gold gown.

"Yes, my lord, what can I do for you?" she says as a welcome.

"Don't get too close to me when you're taking a photo," Jia says when asked to snap a picture. "It will make my face look chubby."

In popular culture, there is a strong bias toward female robots.

Fembots in the 1997 film Austin Powers discharged bullets from their breast cups, weaponizing female sexuality.

The majority of robots in music videos are female robots.

Duran Duran's "Electric Barbarella" was the first song accessible for download on the internet.

Björk's video for "All Is Full of Love" gave birth to the archetypal white-sheathed robot now seen in so many places.

Marina and the Diamonds' protest that "I Am Not a Robot" is answered by Hoodie Allen's quick retort that "You Are Not a Robot." In "The Ghost Inside" by Broken Bells, a female robot sacrifices plastic body parts to pay tolls and reclaim paradise.

In Lenny Kravitz's "Black Velveteen," the title character's skin is titanium.

Hatsune Miku and Kagamine Rin are anime-inspired holographic vocaloid singers.

Daft Punk is the notable exception, where robot costumes conceal the genuine identity of the male musicians.

Sexy robots are the principal love interests in films like Metropolis (1927), The Stepford Wives (1975), Blade Runner (1982), Ex Machina (2014), and Her (2013), as well as television programs like Battlestar Galactica and Westworld.

Meanwhile, "killer robots," or deadly autonomous weapons systems, are hypermasculine.

Atlas, Helios, and Titan are examples of rugged military robots developed by the Defense Advanced Research Projects Agency (DARPA).

Achilles, Black Knight, Overlord, and Thor PRO are some of the names given to self-driving automobiles.

The HAL 9000 computer embedded in the spacecraft Discovery in 2001: A Space Odyssey (1968), perhaps the most renowned autonomous vehicle of all time, is masculine and deadly.

In the field of artificial intelligence, there is a clear gender disparity.

The head of the Stanford Artificial Intelligence Lab, Fei-Fei Li, revealed in 2017 that her team was mostly made up of "men in hoodies" (Hempel 2017).

Women make up just approximately 12% of the researchers who speak at major AI conferences (Simonite 2018b).

In computer and information sciences, women earn 19 percent of bachelor's degrees and 22 percent of PhDs (NCES 2018).

Women now earn a smaller proportion of bachelor's degrees in computer science than they did in 1984, when their share peaked at 37 percent (Simonite 2018a).

This is despite the fact that the earliest "computers," as shown in the film Hidden Figures (2016), were women.

There is significant dispute among philosophers over whether un-situated, gender-neutral knowledge can exist in human society.

Users projected gender preferences onto Google's and Apple's unsexed digital assistants even after they were launched.

Centuries of professional knowledge were developed largely by white males and eventually released into digital realms.

Will machines build and apply rules based on impartial information for hundreds of years to come? In other words, does scientific knowledge have a gender? Is it masculine or feminine? Alison Adam, a Science and Technology Studies researcher, is more interested in the gender of the ideas produced than in the gender of the people involved.

Sage, a British corporation, recently hired a "conversation manager" tasked with building a gender-neutral digital assistant, eventually dubbed "Pegg." To guide its programmers, the company has also formalized "five core principles" in an "ethics of code" document.

According to Sage CEO Kriti Sharma, "by 2020, we'll spend more time talking to machines than our own families," thus getting technology right is critical.

Microsoft recently established Aether (AI and Ethics in Engineering and Research), an internal ethics panel for artificial intelligence.

Gender Swap is a project that uses a virtual reality system as a platform for embodiment experience, in which users may perceive themselves in a different body.

Human partners use the immersive Oculus Rift head-mounted display and first-person cameras to create the illusion.

Both users coordinate their motions to generate this illusion.

If one user's movements do not correspond to the other's, the embodiment experience breaks down.

In effect, every move they make together must be agreed upon by both users.

On a regular basis, new causes of algorithmic gender bias are discovered.

In 2018, Joy Buolamwini, an MIT computer science graduate student, uncovered gender and racial bias in how AI systems detected people's faces.

With the help of other researchers, she discovered that the datasets behind facial analysis systems consisted primarily of lighter-skinned subjects (up to 86 percent), as classified by the dermatologist-approved Fitzpatrick skin type scale.

The researchers developed a skin type system based on a rebalanced dataset and used it to compare three gender categorization systems available off the shelf.

They found that darker-skinned women were the most misclassified group in all three commercial systems.
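The methodological point of such audits is disaggregated evaluation: overall accuracy can look fine while one subgroup is badly misclassified. The sketch below shows that arithmetic on invented data; the record counts and subgroup labels are made up for illustration, not the study's actual figures.

```python
# Sketch of disaggregated evaluation: compute error rates overall and
# per subgroup. A low overall error can hide a high subgroup error.

def error_rate(records, group=None):
    """Fraction of misclassified records, optionally within one subgroup."""
    subset = [r for r in records if group is None or r["group"] == group]
    wrong = sum(1 for r in subset if r["predicted"] != r["actual"])
    return wrong / len(subset)

# Each record: subgroup, true label, classifier's prediction (made-up data).
records = (
    [{"group": "lighter-skinned men", "actual": "M", "predicted": "M"}] * 99 +
    [{"group": "lighter-skinned men", "actual": "M", "predicted": "F"}] * 1 +
    [{"group": "darker-skinned women", "actual": "F", "predicted": "F"}] * 13 +
    [{"group": "darker-skinned women", "actual": "F", "predicted": "M"}] * 7
)

print(round(error_rate(records), 3))                          # overall: 0.067
print(error_rate(records, "lighter-skinned men"))             # 0.01
print(error_rate(records, "darker-skinned women"))            # 0.35
```

Because the majority subgroup dominates the dataset, the overall error rate stays under 7 percent even though the minority subgroup is misclassified 35 percent of the time, which is why fairness audits report metrics per subgroup rather than in aggregate.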

Buolamwini founded the Algorithmic Justice League, a group that fights unfairness in decision-making software.

Jai Krishna Ponnappan


See also: 

Algorithmic Bias and Error; Explainable AI.

Further Reading:

Buolamwini, Joy and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research: Conference on Fairness, Accountability, and Transparency 81: 1–15.

Hempel, Jessi. 2017. “Melinda Gates and Fei-Fei Li Want to Liberate AI from ‘Guys With Hoodies.’” Wired, May 4, 2017.

Leavy, Susan. 2018. “Gender Bias in Artificial Intelligence: The Need for Diversity and Gender Theory in Machine Learning.” In GE ’18: Proceedings of the 1st International Workshop on Gender Equality in Software Engineering, 14–16. New York: Association for Computing Machinery.

National Center for Education Statistics (NCES). 2018. Digest of Education Statistics.

Roff, Heather M. 2016. “Gendering a Warbot: Gender, Sex, and the Implications for the Future of War.” International Feminist Journal of Politics 18, no. 1: 1–18.

Simonite, Tom. 2018a. “AI Is the Future—But Where Are the Women?” Wired, August 17, 2018.

Simonite, Tom. 2018b. “AI Researchers Fight Over Four Letters: NIPS.” Wired, October 26, 2018.

Søraa, Roger Andre. 2017. “Mechanical Genders: How Do Humans Gender Robots?” Gender, Technology, and Development 21, no. 1–2: 99–115.

Wosk, Julie. 2015. My Fair Ladies: Female Robots, Androids, and Other Artificial Eves. New Brunswick, NJ: Rutgers University Press.
