New H&M tagline sparks outcry over gender violence association

The launch of a collection by Italian designer Giambattista Valli for H&M in Paris on Nov. 7. (Reuters)
Updated 12 November 2019

KUALA LUMPUR: A new collection from fashion giant H&M has unleashed protests from women’s rights campaigners because it includes the slogan “I love GBV,” the initials of the designer but also a widely used acronym for gender-based violence.

H&M said the slogan, emblazoned on hats, a necklace and boxer shorts with a red heart symbolising the word “love,” was an abbreviation of Italian designer Giambattista Valli and any other associations were unintentional.

But women’s rights activists demanded the products be withdrawn, saying it was “crazy” to keep selling them. “This is not an obscure term. It’s very commonly used as shorthand for gender-based violence,” said Heather Barr, co-director of the women’s rights division at global advocacy group Human Rights Watch.

The Swedish retailer launched the clothing line on Nov. 7 in its first collaboration with the Rome-born Valli, known in the fashion world for his ready-to-wear and haute couture pieces.

“We condemn any type of violence, and as a value-driven company, we believe in an inclusive and equal society,” H&M said.


Google CEO calls for regulation of artificial intelligence

Updated 20 January 2020

  • Sundar Pichai’s comments come as lawmakers and governments seriously consider putting limits on how artificial intelligence is used
  • Pichai’s comments suggest the company may be hoping to head off a broad-based crackdown by the EU on the technology

LONDON: Google’s chief executive called Monday for a balanced approach to regulating artificial intelligence, telling a European audience that the technology brings benefits but also “negative consequences.”

Sundar Pichai’s comments come as lawmakers and governments seriously consider putting limits on how artificial intelligence is used.

“There is no question in my mind that artificial intelligence needs to be regulated. The question is how best to approach this,” Pichai said, according to a transcript of his speech at a Brussels-based think tank.

He noted that there’s an important role for governments to play and that as the European Union and the US start drawing up their own approaches to regulation, “international alignment” of any eventual rules will be critical. He did not provide specific proposals.

Pichai spoke on the same day he was scheduled to meet the EU’s powerful competition regulator, Margrethe Vestager.

Vestager has in previous years hit the Silicon Valley giant with multibillion-dollar fines for allegedly abusing its market dominance to choke off competition. After being reappointed for a second term last autumn with expanded powers over digital technology policies, Vestager has now set her sights on artificial intelligence, and is drawing up rules on its ethical use.

Pichai’s comments suggest the company may be hoping to head off a broad-based crackdown by the EU on the technology. Vestager and the EU have been among the more aggressive regulators of big tech firms, an approach US authorities have picked up with investigations into the dominance of companies like Google, Facebook and Amazon.

“Sensible regulation must also take a proportionate approach, balancing potential harms with social opportunities,” he said, adding that it could incorporate existing standards like Europe’s tough General Data Protection Regulation rather than starting from scratch.

While noting the technology’s big benefits, he raised concerns about the potential downsides of artificial intelligence, citing as one example its role in facial recognition technology, which can be used to find missing people but also for “nefarious reasons” that he did not specify.

In 2018, Google pledged not to use AI in applications related to weapons, in surveillance that violates international norms, or in ways that go against human rights.