Google violated its own policies by promoting Meta in a secret ad campaign targeting teens
The companies targeted ads at 13- to 17-year-olds to draw them to Instagram.
Google and Meta are under pressure after it was revealed that the two companies collaborated on a secret ad campaign targeting teens, violating Google's internal policies on the treatment of minors online.
According to documents obtained by the Financial Times and people familiar with the matter, the campaign launched between February and April of this year in Canada and then expanded to the United States. It displayed YouTube ads designed to attract 13- to 17-year-olds to Instagram, in an effort to counter TikTok's growing dominance among this demographic.
The ads targeted a group labeled "unknown" in Google's advertising system, which allowed the advertisements to reach users under the age of 18. This strategy violated Google's policies, which prohibit personalizing and targeting ads to minors, as well as using demographic data to target ads to this age group.
Google and Meta respond
Google reportedly halted the campaign after the Financial Times contacted the company. A Google spokesperson told Quartz that the campaign was "small" but that, after reviewing the allegations, the company was taking "appropriate action" to prevent future violations.
Meta, for its part, denied that selecting the "unknown" group circumvented the policies. The company said it adhered to its own policies and to Google's, and defended its approach to marketing its apps as places for young people to connect with friends and discover their interests.
A backdrop of concern
The campaign's revelation comes at a critical time for Meta, which has come under increasing scrutiny over its apparent failure to protect teen users on its platforms. The company has been accused of contributing to mental health problems among young people and has been criticized for failing to implement sufficient safety measures against bullying and child exploitation.
In a January hearing before the U.S. Senate, Meta CEO Mark Zuckerberg publicly apologized for these failures and acknowledged challenges in protecting minors online.
Congress's role in regulation
The U.S. Congress has responded to growing concerns about the safety of minors online by passing two key bills. The Children and Teens' Online Privacy Protection Act (COPPA 2.0) prohibits targeted advertising to minors and data collection without their consent. In addition, the Kids Online Safety Act (KOSA) requires technology companies to design platforms that mitigate or prevent harm to users, including cyberbullying, sexual exploitation, and drug use.