Tag: bias

09 Oct 2020
Posted in technology

Facebook’s alleged indifference to Indian hate speech linked to policy chief’s political bias

Over the past several weeks, there has been an increasing clamour for Facebook to place its India public policy head, Ankhi Das, on leave as the company continues with an audit of its India operations.

The impetus for the audit was an article published by the Wall Street Journal in mid-August. In that piece, the WSJ reported that Das had resisted taking down inflammatory content that eventually sparked rioting in the capital city of Delhi because it was posted by members of the Hindu nationalist BJP.

The riots left over fifty dead, most of whom were Muslims, and saw many of their homes torched.

“The company’s top public-policy executive in the country, Ankhi Das, opposed applying the hate-speech rules to [T Raja] Singh and at least three other Hindu nationalist individuals and groups flagged internally for promoting or participating in violence,” WSJ reported.

These inflammatory posts were reportedly taken down only months after the riots had occurred, and only when the paper approached the company for a statement.

One of the BJP politicians, Raja Singh, reportedly said that Rohingya Muslim refugees should be “shot”, labelled Indian Muslims traitors, and threatened to destroy their mosques. Singh, who has built a reputation for these kinds of comments, has since denied the allegations and claimed his account was hacked.

The audit was initiated after a group of 54 retired civil servants wrote to Facebook CEO Mark Zuckerberg about the WSJ revelations. The call for an audit was then reiterated in a joint letter to Facebook from global civil rights organisations, including the Southern Poverty Law Center, Muslim Advocates, and other organisations in the UK, US, and New Zealand.

“The audit must be removed entirely from the influence of the India

07 Oct 2020
Posted in technology

UK passport photo checker shows bias against dark-skinned women

An illustration showing photos of three people with different skin tones. The photo of the darkest skinned person has a poor quality score and the photo of the lightest skinned person has a good quality score.

Women with darker skin are more than twice as likely as lighter-skinned men to be told their photos fail UK passport rules when they submit them online, according to a BBC investigation.

One black student said she was wrongly told her mouth looked open in each of five different photos she uploaded to the government website.

This shows how “systemic racism” can spread, Elaine Owusu said.

The Home Office said the tool helped users get their passports more quickly.

“The indicative check [helps] our customers to submit a photo that is right the first time,” said a spokeswoman.

“Over nine million people have used this service and our systems are improving.

“We will continue to develop and evaluate our systems with the objective of making applying for a passport as simple as possible for all.”

Skin colour

The passport application website uses an automated check to detect poor quality photos which do not meet Home Office rules. These include having a neutral expression, a closed mouth and looking straight at the camera.

BBC research found this check to be less accurate on darker-skinned people.

More than 1,000 photographs of politicians from across the world were fed into the online checker.
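
As a rough illustration of how this kind of disparity audit can be tabulated, here is a minimal Python sketch. The `check_photo` function and the group labels are hypothetical placeholders standing in for the real online checker and the categorisation of the photos, not actual Home Office or BBC code.

```python
# Hypothetical sketch: compare how often an automated photo check flags
# photos as poor quality across different groups of applicants.
from collections import defaultdict

def check_photo(path):
    """Placeholder for the automated quality check.
    Returns True if the photo is flagged as poor quality."""
    raise NotImplementedError("stand-in for the real online checker")

def failure_rates(photos):
    """photos: iterable of (path, group) pairs,
    e.g. ('mp_042.jpg', 'dark-skinned women')."""
    flagged = defaultdict(int)
    total = defaultdict(int)
    for path, group in photos:
        total[group] += 1
        if check_photo(path):
            flagged[group] += 1
    return {group: flagged[group] / total[group] for group in total}
```

Comparing the resulting rates group by group is essentially what the figures below show.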

The results indicated:

  • Dark-skinned women are told their photos are poor quality 22% of the time, while the figure for light-skinned women is 14%

  • Dark-skinned men are told their photos are poor quality 15% of the time, while the figure for light-skinned men is 9%

Photos of women with the darkest skin were four times more likely to be graded poor quality than women with

02 Oct 2020
Posted in software

Twitter is changing how it crops photos after reports of racial bias

  • Twitter said it is limiting its reliance on machine learning that helps it decide which part of a photo to crop on its platform.
  • Online users have reported racial bias on the social media firm’s image cropping tool, which automatically focuses on the part of a photo it thinks the viewer will find most interesting.
  • One Twitter user recently highlighted how the face of Senate Majority Leader Mitch McConnell, who is white, was routinely centered in automatic image crops, while that of former President Barack Obama was cut out.

Twitter is making changes to its photo cropping function after an investigation into racial bias in the software, the company said on Thursday.

The announcement comes after users on the platform repeatedly showed that the tool — which uses machine learning to choose which part of an image to crop based on what it thinks is the most interesting — cuts out Black people from photos and centers on white faces instead.

Tony Arcieri, a cryptography engineer, posted a series of tweets in mid-September showing how the platform’s algorithm routinely chose to highlight the face of Senate Majority Leader Mitch McConnell, who is white, instead of former President Barack Obama’s in multiple photos of the two. His posts prompted others to run similar tests with the same result, and led the company to launch an investigation into its systems shortly after.


The social media company implemented its machine-learning-powered image cropping system in 2018. The system “relies on saliency, which predicts where people might look first,” Twitter’s chief design officer, Dantley Davis, and its chief technology officer, Parag Agrawal, wrote in the company blog post on Thursday.
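
For illustration, here is a minimal sketch of the general idea of cropping around the most salient point. The gradient-magnitude "saliency" used here is a crude stand-in for a learned saliency model; none of this reflects Twitter's actual implementation.

```python
# Minimal sketch of saliency-based cropping: estimate a saliency map,
# then centre the crop window on the most salient pixel.
import numpy as np

def saliency_map(gray):
    """Crude proxy for 'where people might look first':
    local gradient magnitude of a greyscale image."""
    gy, gx = np.gradient(gray.astype(float))
    return np.hypot(gx, gy)

def crop_around_most_salient(img, crop_h, crop_w):
    """Return a crop_h x crop_w window centred on the most salient pixel."""
    gray = img.mean(axis=2) if img.ndim == 3 else img
    sal = saliency_map(gray)
    y, x = np.unravel_index(np.argmax(sal), sal.shape)
    top = int(np.clip(y - crop_h // 2, 0, img.shape[0] - crop_h))
    left = int(np.clip(x - crop_w // 2, 0, img.shape[1] - crop_w))
    return img[top:top + crop_h, left:left + crop_w]
```

Twitter's system replaces the crude proxy above with a model trained to predict where people might look first, as described in the blog post.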

They said in the post that Twitter will