Facebook's algorithm has automatically labeled 740,000 children under 18 from around the world as interested in gambling, and 940,000 more as interested in alcoholic drinks, according to an investigation carried out by the Danish Broadcasting Corporation and The Guardian.
This gives advertisers on the social network an audience of more than 1.6 million minors whose behavior on Facebook indicates an interest in these two topics, and allows them to send those minors specifically targeted advertising messages.
In any case, the social network has made clear in a statement that its rules do not allow "advertisements on Facebook that promote the sale of alcohol or gambling to minors, and we act against these types of activities when we find them." But although Facebook prohibits alcohol and gambling ads aimed at minors, it does allow advertisers to send any other kind of advertising to the young people tagged as interested in these two topics.
The algorithm took into account young people's behavior within the platform
For example, thanks to the algorithm and the market-segmentation tools offered by the social network, the developer of a video game with in-app purchases could identify, and show advertising to, those minors whose behavior on Facebook earned them the label of being interested in gambling.
Conversely, an anti-gambling service, or one devoted to preventing alcohol consumption among the youngest users, would also have a perfectly segmented target audience to whom it could offer its services and help. And of course, anyone trying to skirt Facebook's rules on the matter starts out with a preselected audience at their disposal.
A developer of a video game with in-app purchases has at its disposal a preselected audience of children flagged as liking gambling
The company reviews the advertising shown to its users in an automated way, looking for ads that violate its rules, but it is not always able to detect them before they have begun to circulate.
Facebook's automatic interest-based segmentation has already been the subject of controversy in the past. In May 2018, it was discovered that the social network was showing advertising to users it believed were interested in topics such as homosexuality, Islam or liberalism, even though these are matters explicitly classified as "sensitive information" by the data protection laws of the European Union.
Facebook reviews advertising automatically, but does not always detect the ads that violate its rules
And just over a month ago, another investigation, also carried out by the Danish broadcaster and the British newspaper, revealed that Facebook had algorithmically tagged 65,000 Russian citizens as "interested in treason," which could put them at risk. Facebook later removed that label.
Finally, in March of this year, the United States Department of Housing and Urban Development accused Facebook of violating the Fair Housing Act, arguing that the platform allowed advertisers to restrict housing advertisements in ways that discriminated "illegally on the basis of race, color, national origin, religion, familial status, sex and disability." Facebook said at the time that it would work with the Department and with civil rights groups to improve its ad-targeting systems.
In March, US authorities found that the platform served housing ads that illegally discriminated according to the user's race, sex or disability