
Twitter apologizes for letting ads be micro-targeted using words like ‘transphobic’, ‘white supremacists’, ‘anti-gay’

Ads on Twitter were being micro-targeted at certain users who used/showed interest in neo-Nazi, homophobic words

tech Updated: Jan 17, 2020 19:35 IST
HT Correspondent
Hindustan Times

Following a BBC investigation, Twitter has apologized for “allowing adverts to be micro-targeted at certain users such as neo-Nazis, homophobes and other hate groups”.

The investigation found that it was possible to “target users who had shown an interest in keywords including ‘transphobic’, ‘white-supremacists’ and ‘anti-gay’”. Twitter allowed adverts to be directed at users who had posted about or searched for specific topics.

Anti-hate charities had “raised concerns” that Twitter’s advertising platform “could have been used to spread intolerance”.

What is the problem?

Like other social media companies, Twitter creates user profiles by collecting data on what users post, like and share. Advertisers take advantage of these profiles by using tools to “select their campaign audience” from a list of characteristics, such as parents of teenagers or amateur photographers.

Advertisers can also “control who sees their message by using keywords”, and Twitter gives them “an estimate for how many users are likely to qualify as a result”. For example, the BBC explained, “a car website wanting to reach people using the term ‘petrolhead’ would be told that the potential audience is between 140,000 and 172,000 people”.

Sensitive keywords are supposed to be restricted on Twitter, but the BBC investigation revealed that it was possible to target adverts at users interested in the term ‘neo-Nazi’. “The ad tool had indicated that in the UK, this would target a potential audience of 67,000 to 81,000 people,” the BBC reported.

How did the BBC test this?

The BBC created a generic advert from an anonymous Twitter account saying “Happy New Year”, and then targeted it at three different sets of audiences based on sensitive keywords.

Twitter’s website said that “ads on its platform would be reviewed prior to being launched, and the BBC’s ad initially went into a ‘pending’ state”. However, it was soon approved and ran for a few hours before the BBC pulled it down.

“In that time, 37 users saw the post and two of them clicked on a link attached, which directed them to a news article about memes. Running the ad cost 3.84 pounds,” reported the BBC, adding that “targeting an advert using other problematic keywords seemed to be just as easy to do”.

According to Twitter’s tool, a campaign using the keywords “islamophobes”, “islamaphobia”, “islamophobic” and ‘#islamophobic’ had the potential to reach 92,900 to 114,000 Twitter users. It was also possible to advertise to vulnerable groups, such as an audience of 13-to-24-year-olds using words like “anorexic”, “bulimic”, “anorexia” and “bulimia”.

“Twitter estimated the target audience amounted to 20,000 people. The post was seen by 255 users, and 14 people clicked on the link before we stopped it,” BBC reported.

What did campaigners say?

The anti-extremist charity Hope Not Hate said it “feared that Twitter’s ads could become a propaganda tool for the far-right”.

“I can see this being used to promote engagement and deepen the conviction of individuals who have indicated some or partial agreement with intolerant causes or ideas,” said Patrik Hermansson, its social media researcher.

The eating disorder charity Anorexia and Bulimia Care added that “it believed the ad tool had already been abused”.

“I’ve been talking about my eating disorder on social media for a few years now and been targeted many times with adverts based on dietary supplements, weight loss supplements, spinal corrective surgery,” said Daniel Magson, the organisation’s chairman.

“It’s quite triggering for me, and I’m campaigning to get it stopped through Parliament. So, it’s great news that Twitter has now acted,” Magson added.

What did Twitter say?

Twitter said it had policies in place to avoid the abuse of keyword targeting, but “acknowledged they had not been applied correctly”.

“[Our] preventative measures include banning certain sensitive or discriminatory terms, which we update on a continuous basis,” Twitter said in a statement.

“In this instance, some of these terms were permitted for targeting purposes. This was an error. We’re very sorry this happened and as soon as we were made aware of the issue, we rectified it. We continue to enforce our ads policies, including restricting the promotion of content in a wide range of areas, including inappropriate content targeting minors,” Twitter added.