Twitter has launched an investigation after users claimed that its image cropping feature favours the faces of white people.
A tool on the social network's mobile app automatically crops pictures that are too big to fit on the screen, selecting which parts of an image should be cut off.
But an experiment by a graduate programmer appeared to show racial bias.
To see what Twitter's algorithm would pick, Tony Arcieri posted a long image featuring headshots of Senate Republican leader Mitch McConnell at the top and former US president Barack Obama at the bottom - separated by white space.
In a second image, Mr Obama's headshot was placed at the top, with Mr McConnell's at the bottom.
Both times, the former president was cropped out altogether.
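Twitter has not published the exact images used, but the structure of the test is straightforward to reproduce: two headshots stacked vertically with a long run of white space between them, so the cropping algorithm is forced to favour one. The sketch below shows one way such a composite could be built using the Pillow library; the filenames and gap size are hypothetical.

```python
# Sketch: assemble a tall test image with two headshots separated by white
# space, in the style of Arcieri's experiment. Filenames are placeholders.
from PIL import Image

def build_test_image(top_path, bottom_path, gap=2000, bg="white"):
    top = Image.open(top_path)
    bottom = Image.open(bottom_path)
    width = max(top.width, bottom.width)
    canvas = Image.new("RGB", (width, top.height + gap + bottom.height), bg)
    canvas.paste(top, ((width - top.width) // 2, 0))
    canvas.paste(bottom, ((width - bottom.width) // 2, top.height + gap))
    return canvas

# Swapping the two inputs produces the second test image.
build_test_image("mcconnell.jpg", "obama.jpg").save("test_1.png")
build_test_image("obama.jpg", "mcconnell.jpg").save("test_2.png")
```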
Following the 'horrible experiment' - which came after an image he posted cropped out a black colleague - Mr Arcieri wrote: 'Twitter is just one example of racism manifesting in machine learning algorithms.'
At the time of writing, his experiment has been retweeted 78,000 times. The original tweet read: 'Trying a horrible experiment... Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia'
Twitter has vowed to look into the issue, but said in a statement: 'Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing.
'It's clear from these examples that we've got more analysis to do. We'll continue to share what we learn, what actions we take, and will open source our analysis so others can review and replicate.'
A Twitter representative also pointed to research from a Carnegie Mellon University scientist who analysed 92 images. In that experiment, the algorithm favoured black faces 52 times.
Back in 2018, the company said the tool was based on a 'neural network' that uses artificial intelligence to predict which part of a photo would be interesting to a user.
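Twitter has not released the model itself, so the sketch below only illustrates the general idea of saliency-driven cropping: score every region of the image for predicted interest and keep the highest-scoring window. The saliency function here is a deliberately crude stand-in, not Twitter's network.

```python
# Illustrative sketch of saliency-driven cropping: a model assigns each pixel
# an "interestingness" score, and the crop window is placed over the region
# with the highest total score. fake_saliency is a stand-in; a real system
# would run a trained neural network here.
import numpy as np

def fake_saliency(image: np.ndarray) -> np.ndarray:
    """Stand-in saliency model: per-pixel scores from average brightness."""
    return image.mean(axis=-1) / 255.0

def crop_by_saliency(image: np.ndarray, crop_h: int, crop_w: int) -> np.ndarray:
    saliency = fake_saliency(image)
    best_score, best_pos = -1.0, (0, 0)
    step = 16  # coarse stride keeps the window search cheap
    for y in range(0, image.shape[0] - crop_h + 1, step):
        for x in range(0, image.shape[1] - crop_w + 1, step):
            score = saliency[y:y + crop_h, x:x + crop_w].sum()
            if score > best_score:
                best_score, best_pos = score, (y, x)
    y, x = best_pos
    return image[y:y + crop_h, x:x + crop_w]
```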
Meredith Whittaker, the co-founder of the AI Now Institute, told the Thomson Reuters Foundation: 'This is another in a long and weary litany of examples that show automated systems encoding racism, misogyny and histories of discrimination.'