Summit Attends Event on Algorithmic Bias and STEM Careers

December 11, 2017 Claire Hempel

In early November, members of Summit’s Litigation Analytics Directorate attended an event at the Technology Policy Institute on Algorithmic Bias and Science, Technology, Engineering, and Mathematics (STEM) Careers. During the event, Dr. Catherine Tucker discussed an article she co-authored with Dr. Anja Lambrecht, Algorithmic Bias? An Empirical Study into Apparent Gender-Based Discrimination in the Display of STEM Career Ads.

The article describes a study in which a gender-neutral ad for STEM career information was run on Facebook and produced an unexpected outcome. The results revealed that men saw the ad more often than women across all age groups. In particular, women aged 25-34 were the least likely to see the ad, contrary to the intended outcome. Yet despite seeing the ad fewer times than men in every age category, women clicked on it at a higher rate than their male cohorts.

The researchers suggest that this pattern occurred because the algorithms underlying social media platforms are designed to be cost-optimizing (i.e., to reach the largest audience at the lowest cost). Consequently, if a company markets to men and women in the same campaign, the algorithm will display the ad more often to the group that is cheaper to reach. These findings were substantiated by similar results across three other platforms (Google AdWords, Twitter, and Instagram).
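As a toy illustration of the cost-optimizing behavior described above (not the platforms' actual, proprietary delivery algorithms), consider an optimizer that simply buys the cheapest impressions first. The group names and prices below are invented for illustration; the point is that a single mixed campaign budget skews delivery toward whichever audience costs less.

```python
# Hypothetical sketch: a delivery engine that maximizes impressions per dollar
# will spend a mixed campaign's budget on the cheaper audience first.
def allocate_impressions(budget, cost_per_impression):
    """Greedily buy impressions from the cheapest audience group first."""
    remaining = budget
    impressions = {}
    for group, cost in sorted(cost_per_impression.items(), key=lambda kv: kv[1]):
        shown = int(remaining // cost)  # how many impressions the budget still buys
        impressions[group] = shown
        remaining -= shown * cost
    return impressions

# Illustrative (invented) prices: women cost more to reach because other
# advertisers also bid for them.
prices = {"men": 0.50, "women": 0.90}
print(allocate_impressions(100.0, prices))  # {'men': 200, 'women': 0}
```

The greedy rule is deliberately extreme, but it makes the mechanism visible: no one coded "show fewer ads to women," yet the cost objective alone produces that outcome.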

The researchers used external data on household goods consumption to further substantiate their findings. That analysis determined that women aged 18-34 are more likely to purchase a product after clicking on an ad, so advertisers see a higher return on investment when they show ads to this demographic. On advertising platforms like Facebook, an "auction" presumably happens behind the scenes: if multiple advertisers target women aged 18-34, reaching that audience becomes more expensive.
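The auction dynamic above can be sketched with a simple second-price rule, a common textbook mechanism for ad auctions (the platforms' exact mechanics are proprietary, and all bid values below are invented for illustration): the more advertisers compete for an audience, the higher the price the winner pays.

```python
# Hypothetical second-price auction sketch: the winner pays the
# second-highest bid, so heavier competition raises the clearing price.
def clearing_price(bids):
    """Return the price the winning bidder pays (second-highest bid)."""
    ordered = sorted(bids, reverse=True)
    return ordered[1] if len(ordered) > 1 else ordered[0]

# Illustrative bids: more advertisers compete for women aged 18-34,
# so the second-highest bid -- the price actually paid -- is higher.
bids_women_18_34 = [1.20, 1.10, 0.95, 0.90]
bids_men_18_34 = [0.60, 0.45]
print(clearing_price(bids_women_18_34))  # 1.10
print(clearing_price(bids_men_18_34))    # 0.45
```

Under this assumption, a cost-optimizing campaign facing these two prices would naturally shift delivery toward the men, which is the pattern the study observed.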

Although ad algorithms on social media platforms are not designed to be biased, Dr. Tucker's study shows that their cost-optimizing approach can produce biased results against certain demographics. To avoid this phenomenon, Dr. Tucker advocates that companies run separate campaigns for each gender. However, as Dr. Tucker noted, a similar pattern could play out across other demographics, such as race and income level, where it is harder to detect because many social media platforms do not collect this information and must rely on proxies.

The outcome of this study reaffirms why it is important to think carefully about the data life cycle: data can be misleading and lead to unintended outcomes. In the Litigation space at Summit, this is a critical part of our work because our materials must withstand the scrutiny of the opposition.

The article discussed at the event can be accessed online here.
