Facebook will no longer allow advertisers to exclude people based on their race, ethnicity or national origin

The US government and Facebook parent company Meta have agreed on a settlement to resolve a lawsuit that accused the company of facilitating housing discrimination. According to the Department of Justice (DOJ), Meta let advertisers specify that ads not be shown to people belonging to specific protected groups. You can read the full agreement below.

The government first brought a case against Meta for algorithmic housing discrimination in 2019. The department says this was its first case dealing with algorithmic violations of the Fair Housing Act.

The settlement will have to be approved by a judge before it's truly final. Under its terms, Meta will build a new system to 'address racial and other disparities caused by its use of personalization algorithms in its ad delivery system'.

The new system will replace Meta's Special Ad Audiences tool for housing ads, as well as for credit and employment opportunities. According to the DOJ, the tool and its algorithms let advertisers target people who were similar to a pre-selected group. Meta denies wrongdoing and notes that the agreement doesn't constitute an admission of guilt or a finding of liability.

The new system is meant to ensure that the people who actually see an ad match the audience the advertiser targeted and is eligible to reach. Meta will look at age, gender, and race to measure how far the actual audience deviates from the targeted one.

By the end of December 2022, the company must build the system into its platform and prove to the government that it works as intended.

The company promises to share its progress as it builds the new system. If the government approves it and it's put into place, a third party will 'investigate and verify on an ongoing basis'.


Written by Nuked
