Facebook’s ad algorithms are still excluding women from seeing jobs
MIT Technology Review recently published a story about an audit showing that Facebook's advertising system still withholds certain job ads from women, regardless of their qualifications. Find the article here.
Even when two advertised jobs require the same qualifications, such as a delivery driver for Domino's and one for Instacart, the ads are shown disproportionately to one gender and withheld from the other. In the Domino's and Instacart case, the study found that the Instacart ad was shown to more women than men, while the Domino's ad was shown to more men than women. Similar trends appeared in ads for software engineers at Nvidia and Netflix, with Nvidia's ad shown more to women and Netflix's more to men, and in ads for sales associates, with more women receiving the ad for jewelry sales and more men the ad for car sales. While a case can be made that jewelry and car sales call for somewhat different skill sets, the Domino's and Instacart requirements are identical.

What likely causes this disproportion is the algorithm picking up on existing occupational demographic disparities between genders: more women currently drive for Instacart than men, and vice versa for Domino's. Exactly how Facebook's advertising algorithm comes to reflect these demographic trends is unknown, as Facebook has not disclosed the inner workings of its advertising system.
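The audit's exact methodology is not public, but the kind of delivery skew it reports can be quantified with a standard two-proportion z-test: given how many women and men in comparable audiences were shown an ad, test whether the difference in delivery rates is plausibly due to chance. The sketch below uses entirely hypothetical counts, not figures from the study.

```python
import math

def delivery_skew(women_shown, women_reached, men_shown, men_reached):
    """Two-proportion z-test for gender skew in ad delivery.

    Returns (gap, z, p_value), where gap is the difference in
    delivery rates (women minus men) and p_value is the two-sided
    probability of seeing a gap this large under equal delivery.
    """
    p_w = women_shown / women_reached
    p_m = men_shown / men_reached
    # Pooled rate under the null hypothesis of equal delivery
    pooled = (women_shown + men_shown) / (women_reached + men_reached)
    se = math.sqrt(pooled * (1 - pooled)
                   * (1 / women_reached + 1 / men_reached))
    z = (p_w - p_m) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_w - p_m, z, p_value

# Hypothetical example: an ad shown to 600 of 1,000 women
# but only 400 of 1,000 men in otherwise similar audiences.
gap, z, p = delivery_skew(600, 1000, 400, 1000)
```

With these made-up counts the 20-point gap is far outside what random delivery would produce, which is the sort of statistical evidence an audit of this kind relies on.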
Facebook has previously faced legal scrutiny over its demographic-based ad delivery, since US equal employment opportunity law bans ad targeting based on protected characteristics such as gender or race. The company once allowed job and housing advertisers to exclude audiences based on these characteristics; in March 2019, it disabled demographic audience selection for housing, credit, and job ads. Yet even without advertisers specifying any exclusions, researchers quickly found that certain demographic groups still received different housing ads. This most recent finding would be considered sex-based discrimination under the same laws.
Despite public statements from Facebook suggesting efforts to improve its advertising fairness, the study found no discernible change in the algorithm's behavior since an earlier audit. Until Facebook addresses its algorithmic fairness, an available fix would be to simply shut off algorithmic ad targeting, says Piotr Sapieżyński, an author of the earlier study.
Decision-making algorithms that reflect disparities in historical data, in this case in advertising, are nothing new; the pattern has appeared across artificial intelligence, from predictive policing to bank lending. In many of these situations, mitigating the bias requires a change at the algorithmic level.
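One well-known family of algorithm-level mitigations is reweighing the training data so that a protected attribute (here, gender) becomes statistically independent of the outcome label before a model is trained. The sketch below illustrates that general idea (in the style of Kamiran and Calders' reweighing method); it is not Facebook's actual system, and the tiny dataset is invented for illustration.

```python
from collections import Counter

def reweigh(groups, labels):
    """Compute per-instance weights that make group membership
    independent of the label in the weighted dataset.

    weight(g, y) = P(g) * P(y) / P(g, y)
    Instances of over-represented (group, label) pairs are
    down-weighted; under-represented pairs are up-weighted.
    """
    n = len(groups)
    group_counts = Counter(groups)
    label_counts = Counter(labels)
    joint_counts = Counter(zip(groups, labels))
    return [
        group_counts[g] * label_counts[y] / (n * joint_counts[(g, y)])
        for g, y in zip(groups, labels)
    ]

# Hypothetical toy data: 'f'/'m' are group labels, 1 = positive outcome.
groups = ["f", "f", "f", "m", "m", "m"]
labels = [1, 0, 0, 1, 1, 0]
weights = reweigh(groups, labels)
```

After reweighting, the weighted positive rate is identical across groups, so a learner trained on the weighted data no longer sees the historical imbalance. This is one of several preprocessing approaches; other mitigations constrain the model or post-process its outputs instead.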