Facebook and Pandora’s box: How using Big Data and Artificial Intelligence in advertising resulted in housing discrimination
Abstract
In 2019, the US Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act of 1968. This followed an investigation into the use of ethnically targeted advertising practices on Facebook. To understand Facebook’s targeting methods and the cause of the problematic outcomes, this paper follows the journey of an advertisement through Facebook’s platform. In this way, Facebook’s regulatory missteps can serve as a case study illustrating how Big Data analytics can, when informed by human and machine bias, cross the line into discriminatory practice. This case study underscores how vital it is — in advertising as in other industries — not to treat advanced analytics such as artificial intelligence as black boxes. Indeed, to inform the design and use of advanced analytics, it is essential for companies to consistently develop a comprehensive understanding of their data, in addition to the legal and ethical implications of their operations.
The full article is available to subscribers to the journal.
Author's Biography
Sarah Khatry is a data scientist and writer at DataRobot. She has also worked in long-form journalism, experimental physics and the entertainment industry. Sarah has a BA in English and physics from Dartmouth College.
Citation
Khatry, Sarah (2020, June 1). Facebook and Pandora’s box: How using Big Data and Artificial Intelligence in advertising resulted in housing discrimination. Applied Marketing Analytics: The Peer-Reviewed Journal, 6(1). https://doi.org/10.69554/YFQX8158