Understanding Gender Bias in Algorithms and Marketplace Segmentation


Online algorithms may learn and perpetuate marketplace gender bias. This study investigates how psychographic traits associated with men and women are used for segmentation and targeting in marketing. Results indicate bias in how positive and negative traits are associated with each gender, highlighting the need for ethical considerations in algorithm design and marketing strategies.





Presentation Transcript


  1. Algorithms Propagate Gender Bias in the Marketplace with Consumers' Cooperation. Prof. Shelly Rathee, Prof. Sachin Banker, Prof. Arul Mishra, Prof. Himanshu Mishra

  2. GOOGLE'S PRODUCT RECOMMENDATIONS FOR BIRTHDAY GIFTS. Is it possible that you are recommended different products based on your gender? What products are you recommended when you enter the keyword "birthday gifts"?

  3. WHAT IS MARKETPLACE GENDER BIAS? It arises when psychographic traits are differentially related to men versus women, and marketers use these traits to segment and target their consumers. Positive traits = rational, loyal, innovative, determined. Negative traits = irrational, fickle, lazy, unreliable.

  4. RESEARCH QUESTIONS. 1. Do online algorithms learn marketplace gender bias from online text? 2. Does it influence outcomes for consumers? Would there be any adverse effects on female consumers? (Online text → algorithms learn bias → bias gets reflected in output: recommendations & ads.)

  5. METHODOLOGY. 1. To detect marketplace gender bias in online text: publicly available data (Common Crawl); used natural language processing (word embeddings) to detect implicit associations; used a similarity measure to find out how men vs. women are associated with positive vs. negative traits. 2. To examine biases in algorithm output/outcomes: field experiments conducted on Facebook and Google. Dependent variable: how online algorithms change advertisements and product recommendations. Independent variable: positive and negative traits manipulated in an advertisement.
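
A minimal sketch of how such an implicit-association measure could be computed with word embeddings: average the cosine similarity between a set of gender words and a set of trait words. The word lists, the random 50-dimensional placeholder vectors, and the scoring rule are illustrative assumptions, not the authors' exact procedure or data.

```python
# Sketch: estimating gender-trait associations from word embeddings.
# Assumes a pretrained word -> vector mapping (e.g., trained on Common Crawl);
# the tiny random vectors below are placeholders so the example runs.
import numpy as np

rng = np.random.default_rng(0)
embeddings = {w: rng.normal(size=50) for w in [
    "he", "man", "male", "she", "woman", "female",
    "rational", "loyal", "innovative", "determined",
    "irrational", "fickle", "lazy", "unreliable",
]}

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def mean_similarity(group_words, trait_words):
    # Average pairwise cosine similarity between two word sets.
    return np.mean([cosine(embeddings[g], embeddings[t])
                    for g in group_words for t in trait_words])

male_words = ["he", "man", "male"]
female_words = ["she", "woman", "female"]
positive_traits = ["rational", "loyal", "innovative", "determined"]
negative_traits = ["irrational", "fickle", "lazy", "unreliable"]

print("male-positive:  ", mean_similarity(male_words, positive_traits))
print("female-positive:", mean_similarity(female_words, positive_traits))
print("male-negative:  ", mean_similarity(male_words, negative_traits))
print("female-negative:", mean_similarity(female_words, negative_traits))
```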

  6. RESULTS FOR COMMON CRAWL DATASET. [Chart: difference in semantic similarity of male vs. female words to positive and negative attributes, relative to a 0 baseline.] Men had significantly greater semantic similarity with positive traits (2.485) than women did (2.083; d=.763, p<.001). Women had significantly greater semantic similarity with negative traits (1.879) than men did (1.837; d=.154, p<.001).
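
As a worked illustration of the reported comparisons, the sketch below computes the gap in mean similarity and Cohen's d from per-word similarity scores. The arrays are hypothetical placeholders, not the study's Common Crawl results.

```python
# Illustrative sketch: comparing mean similarity scores and computing Cohen's d.
# The arrays below are hypothetical per-word similarity scores, not the study's data.
import numpy as np

male_pos = np.array([2.4, 2.6, 2.5, 2.45])     # male words vs. positive traits
female_pos = np.array([2.1, 2.0, 2.15, 2.05])  # female words vs. positive traits

def cohens_d(a, b):
    # Standardized mean difference using the pooled standard deviation.
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

print("mean male-positive:  ", male_pos.mean())
print("mean female-positive:", female_pos.mean())
print("Cohen's d:           ", cohens_d(male_pos, female_pos))
```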

  7. Therefore, algorithms learned that women are more likely to be "tempted", "fragile", "irresponsible", "fickle", "dissatisfied", and "conformist" than "rational", "disciplined", "loyal", "trustworthy", "patient", "creative", and "innovative".

  8. RESULTS FROM FIELD EXPERIMENT. IV: positive vs. negative ads. DV: impressions (who was targeted to see the advertisement). [Chart: distribution of impressions by gender. Positive ad: 80% vs. 20%; negative ad: 87% vs. 13%.]
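
A hedged sketch of one way such an impression split could be analyzed: a chi-square test of independence on an ad-type by gender contingency table. The counts below are hypothetical, scaled to the 80/20 and 87/13 splits shown on the slide; the actual impression volumes and the test used in the field experiment are not reported here.

```python
# Sketch: does the gender split of impressions depend on ad type (positive vs. negative)?
# Counts are hypothetical, scaled to the percentages on the slide.
from scipy.stats import chi2_contingency

# Rows: positive ad, negative ad; columns: impressions by the two targeted groups.
table = [[800, 200],
         [870, 130]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```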

  9. CONTRIBUTIONS AND ETHICAL IMPLICATIONS

  10. Shelly Rathee (shelly.rathee@villanova.edu), Sachin Banker (sachin.banker@utah.edu), Arul Mishra (arul.mishra@utah.edu), Himanshu Mishra (himanshu.mishra@utah.edu)
