In the press

Selected news articles about my research.

Aleksandra Korolova awarded 2024 Sloan Fellowship
The Alfred P. Sloan Foundation has announced that Aleksandra Korolova, an expert in privacy, algorithmic fairness and technology policy, has been selected as a 2024 Sloan Research Fellow in computer science.
https://www.cs.princeton.edu/news/aleksandra-korolova-awarded-2024-sloan-fellowship
Research Says Facebook’s Ad Algorithm Perpetuates Gender Bias
New research from a team at the University of Southern California provides further evidence that Facebook’s advertising system is discriminatory, showing that the algorithm used to target ads reproduced real-world gender disparities when showing job listings, even among equally qualified candidates.
https://theintercept.com/2021/04/09/facebook-algorithm-gender-discrimination/
Facebook’s ad algorithms are still excluding women from seeing jobs
Facebook is withholding certain job ads from women because of their gender, according to the latest audit of its ad service. The audit, conducted by independent researchers at the University of Southern California (USC), reveals that Facebook’s ad-delivery system shows different job ads to women and men even though the jobs require the same qualifications. This is considered sex-based discrimination under US equal employment opportunity law, which bans ad targeting based on protected characteristics. The findings come despite years of advocacy and lawsuits, and after promises from Facebook to overhaul how it delivers ads.
https://www.technologyreview.com/2021/04/09/1022217/facebook-ad-algorithm-sex-discrimination/
Facebook Algorithm Shows Gender Bias in Job Ads, Study Finds
Facebook disproportionately shows certain types of job ads to men and women, researchers have found, calling into question the company’s progress in rooting out bias in its algorithms. The study led by University of Southern California researchers found that Facebook systems were more likely to present job ads to users if their gender identity reflected the concentration of that gender in a particular position or industry. In tests run late last year, ads to recruit delivery drivers for Domino’s Pizza Inc. were disproportionately shown to men, while women were more likely to receive notices recruiting shoppers for grocery-delivery service Instacart Inc.
https://www.wsj.com/articles/facebook-shows-men-and-women-different-job-ads-study-finds-11617969600
Facebook’s ad tools subsidize partisanship, research shows. And campaigns may not even know it.
The technologies Facebook uses to put advertising it deems relevant in front of people may be more responsible for the polarization of American politics than previously understood, a team of researchers has concluded. Their findings, which have not been previously reported, are the first to demonstrate a skew in the delivery of political ads based on the content of those ads and the information Facebook has on users — not on the targeting decisions made by a political candidate or campaign.
https://www.washingtonpost.com/technology/2019/12/10/facebooks-ad-delivery-system-drives-partisanship-even-if-campaigns-dont-want-it-new-research-shows/
How Facebook's Political Ad System Is Designed to Polarize
Amid the tense debate over online political advertising, it may seem strange to worry that Facebook gives campaigns too little control over whom their ads target. Yet that’s the implication of a study released this week by a team of researchers at Northeastern University, the University of Southern California, and the progressive nonprofit Upturn. By moonlighting as political advertisers, they found that Facebook’s algorithms make it harder and more expensive for a campaign to get its message in front of users who don’t already agree with them—even if they’re trying to.
https://www.wired.com/story/facebook-political-ad-system-designed-polarize/
Facebook’s Ad Algorithm Is A Race And Gender Stereotyping Machine, New Study Suggests
How exactly Facebook decides who sees what is one of the great pieces of forbidden knowledge in the information age, hidden away behind nondisclosure agreements, trade secrecy law, and a general culture of opacity. New research from experts at Northeastern University, the University of Southern California, and the public-interest advocacy group Upturn doesn’t reveal how Facebook’s targeting algorithms work, but does show an alarming outcome: They appear to deliver certain ads, including for housing and employment, in a way that aligns with race and gender stereotypes — even when advertisers ask for the ads to be exposed to a broad, inclusive audience.
https://theintercept.com/2019/04/03/facebook-ad-algorithm-race-gender/
Facebook delivers housing, employment ads based on race, gender stereotypes: study
Facebook’s algorithms may deliver advertisements to users based on gender and race stereotypes, according to a new study published Wednesday night. The study, produced by researchers at Northeastern University, the University of Southern California and the digital rights nonprofit Upturn, suggests that Facebook targets ads in ways that can be discriminatory even when the advertisers do not intend to target or exclude certain groups.
https://thehill.com/policy/technology/437399-facebook-delivers-housing-employment-ads-based-on-race-and-gender/
Facebook’s Ad System Might Be Hard-Coded for Discrimination
Civil rights groups, lawmakers, and journalists have long warned Facebook about discrimination on its advertising platform. But their concerns, as well as Facebook’s responses, have focused primarily on ad targeting, the way businesses choose what kind of people they want to see their ads. A new study from researchers at Northeastern University, the University of Southern California, and the nonprofit Upturn finds ad delivery—the Facebook algorithms that decide exactly which users see those ads—may be just as important.
https://www.wired.com/story/facebooks-ad-system-discrimination/
Facebook’s ad system seems to discriminate by race and gender
On March 28th the American government sued Facebook for allowing advertisers to exclude whole categories of people from seeing ads for housing—couples with children, non-Americans, non-Christians, disabled people, Hispanics, and so on. The Department of Housing and Urban Development (HUD) said this violated the Fair Housing Act, which bans discrimination against certain “protected” groups.
https://www.economist.com/business/2019/04/04/facebooks-ad-system-seems-to-discriminate-by-race-and-gender
Turning Off Facebook Location Tracking Doesn't Stop It From Tracking Your Location
Aleksandra Korolova has turned off Facebook’s access to her location in every way that she can. She has turned off location history in the Facebook app and told her iPhone that she “Never” wants the app to get her location. She doesn’t “check-in” to places and doesn’t list her current city on her profile. Despite all this, she constantly sees location-based ads on Facebook. She sees ads targeted at “people who live near Santa Monica” (where she lives) and at “people who live or were recently near Los Angeles” (where she works as an assistant professor at the University of Southern California).
https://gizmodo.com/turning-off-facebook-location-tracking-doesnt-stop-it-f-1831149148
How One of Apple's Key Privacy Safeguards Falls Short
For the past year, Apple has touted a mathematical tool that it describes as a solution to a paradoxical problem: mining user data while simultaneously protecting user privacy. That secret weapon is "differential privacy," a novel field of data science that focuses on carefully adding random noise to an individual user's information before it's uploaded to the cloud. That way, a company such as Apple's total dataset reveals meaningful results without any one person's secrets being spilled.
https://www.wired.com/story/apple-differential-privacy-shortcomings/
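The idea the Wired piece describes (adding random noise on each device before any data leaves it) can be sketched in a few lines. This is a generic local-noise illustration, not Apple's actual deployed mechanism; the Laplace distribution, the epsilon value, and the hypothetical [0, 1]-valued user metric are all assumptions made for the sketch.

```python
import random

def privatize(value: float, epsilon: float, sensitivity: float = 1.0) -> float:
    """Add Laplace noise to a user's value before it is uploaded.

    Smaller epsilon means more noise and stronger privacy.
    """
    scale = sensitivity / epsilon
    # A Laplace(0, scale) sample is the difference of two Exp(1/scale) samples.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return value + noise

random.seed(1)
# Hypothetical private metric per user, e.g. daily app usage scaled to [0, 1].
true_values = [random.random() for _ in range(50_000)]
reports = [privatize(v, epsilon=1.0) for v in true_values]

# Any single report is dominated by noise, but the aggregate mean is accurate.
true_mean = sum(true_values) / len(true_values)
noisy_mean = sum(reports) / len(reports)
print(f"true mean {true_mean:.3f}, estimate from noisy reports {noisy_mean:.3f}")
```

The server only ever sees the noisy reports, yet averaging many of them recovers the population statistic; that is exactly the trade-off the article describes.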
Google's RAPPOR aims to preserve privacy while snaring software stats
Google is applying a surveying technique from the 1960s to a project that aims to collect data about users' computers without compromising their privacy. The project is nicknamed RAPPOR, which stands for Randomized Aggregatable Privacy-Preserving Ordinal Response. Google plans to present a paper on it next week at the ACM Conference on Computer and Communications Security.
http://www.computerworld.com/article/2841954/googles-rappor-aims-to-preserve-privacy-while-snaring-software-stats.html
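The 1960s surveying technique the article mentions is Warner-style randomized response: each respondent flips coins before answering, so any individual answer is deniable while the aggregate rate remains estimable. The sketch below is the textbook fair-coin version, not RAPPOR's actual Bloom-filter encoding, and the 30% true rate is an arbitrary example.

```python
import random

def report(truth: bool) -> bool:
    """Randomized response: flip a coin; on heads answer truthfully,
    on tails answer with a second fair coin flip."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_rate(reports) -> float:
    """Invert the noise: P(report is yes) = 0.5 * true_rate + 0.25."""
    p_yes = sum(reports) / len(reports)
    return 2 * p_yes - 0.5

random.seed(0)
true_rate = 0.30  # fraction of users for whom the sensitive property holds
reports = [report(random.random() < true_rate) for _ in range(100_000)]
print(f"estimated rate: {estimate_rate(reports):.3f}")
```

Any single "yes" can be blamed on the coin, yet the data collector still recovers the population rate to within sampling error.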
Marketers Can Glean Private Data on Facebook
Online advertising offers marketers the chance to aim ads at very specific groups of people (say, golf players in Illinois who make more than $150,000 a year and vacation in Hawaii). But two recent academic papers show some potential pitfalls of such precise tailoring.
https://www.nytimes.com/2010/10/23/technology/23facebook.html