
Technology

Why algorithms show violence to boys

A close-up of Cai, a young man wearing a black shirt, looking pensive with his eyes cast downward

Cai says violent and disturbing material appeared on his feeds “out of nowhere”

It was 2022 and Cai, then 16, was scrolling on his phone. He says one of the first videos he saw on his social media feeds was of a cute dog. But then, it all took a turn.

He says “out of nowhere” he was recommended videos of someone being hit by a car, a monologue from an influencer sharing misogynistic views, and clips of violent fights. He found himself asking – why me?

Over in Dublin, Andrew Kaung was working as an analyst on user safety at TikTok, a role he held for 19 months from December 2020 to June 2022.

He says he and a colleague decided to examine what users in the UK were being recommended by the app’s algorithms, including some 16-year-olds. Not long before, he had worked for rival company Meta, which owns Instagram – another of the sites Cai uses.


When Andrew looked at the TikTok content, he was alarmed to find how some teenage boys were being shown posts featuring violence and pornography, and promoting misogynistic views, he tells BBC Panorama. He says, in general, teenage girls were recommended very different content based on their interests.

TikTok and other social media companies use AI tools to remove the vast majority of harmful content and to flag other content for review by human moderators, regardless of how many views it has had. But the AI tools cannot identify everything.

Andrew Kaung says that during his time at TikTok, videos that were not removed or flagged to human moderators by AI – or reported to moderators by other users – would only be reviewed manually again if they reached a certain view threshold.

He says at one point this was set to 10,000 views or more. He feared this meant some younger users were being exposed to harmful videos. Most major social media companies allow people aged 13 or above to sign up.
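The thresholding he describes can be sketched as a simple routing rule. Everything below is a hypothetical illustration – the function, field names and verdict labels are invented, not TikTok's actual system:

```python
# Illustrative sketch of a view-count review threshold (hypothetical,
# not TikTok's real pipeline). Videos the AI neither removes nor flags
# are only queued for manual review once views cross a threshold.

REVIEW_THRESHOLD = 10_000  # the figure Andrew Kaung cites for one period

def route_video(video, ai_verdict):
    """Decide what happens to a video after automated screening."""
    if ai_verdict == "remove":
        return "removed"
    if ai_verdict == "flag" or video.get("user_reported"):
        return "human_review"
    # Unflagged, unreported content keeps circulating until it
    # accumulates enough views to trigger a manual look.
    if video["views"] >= REVIEW_THRESHOLD:
        return "human_review"
    return "keep_serving"

print(route_video({"views": 9_500, "user_reported": False}, "pass"))   # keep_serving
print(route_video({"views": 12_000, "user_reported": False}, "pass"))  # human_review
```

The gap Kaung feared sits in the `keep_serving` branch: a harmful video the AI missed could be seen thousands of times before anyone reviewed it.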


TikTok says 99% of content it removes for violating its rules is taken down by AI or human moderators before it reaches 10,000 views. It also says it undertakes proactive investigations on videos with fewer than this number of views.

Andrew Kaung, sitting facing the camera in a loft-style room, wearing a black T-shirt

Andrew Kaung says he raised concerns that teenage boys were being pushed violent, misogynistic content

When he worked at Meta between 2019 and December 2020, Andrew Kaung says there was a different problem. He says that, while the majority of videos were removed or flagged to moderators by AI tools, the site relied on users to report other videos once they had already seen them.

He says he raised concerns while at both companies, but was met mainly with inaction because, he says, of fears about the amount of work involved or the cost. He says subsequently some improvements were made at TikTok and Meta, but he says younger users, such as Cai, were left at risk in the meantime.

Several former employees from the social media companies have told the BBC Andrew Kaung’s concerns were consistent with their own knowledge and experience.


Algorithms from all the major social media companies have been recommending harmful content to children, even if unintentionally, UK regulator Ofcom tells the BBC.

“Companies have been turning a blind eye and have been treating children as they treat adults,” says Almudena Lara, Ofcom’s online safety policy development director.

‘My friend needed a reality check’

TikTok told the BBC it has “industry-leading” safety settings for teens and employs more than 40,000 people working to keep users safe. It said this year alone it expects to invest “more than $2bn (£1.5bn) on safety”, and of the content it removes for breaking its rules it finds 98% proactively.


Meta, which owns Instagram and Facebook, says it has more than 50 different tools, resources and features to give teens “positive and age-appropriate experiences”.

Cai told the BBC he tried to use one of Instagram’s tools and a similar one on TikTok to say he was not interested in violent or misogynistic content – but he says he continued to be recommended it.

He is interested in UFC – the Ultimate Fighting Championship. He also found himself watching videos from controversial influencers when they were sent his way, but he says he did not want to be recommended this more extreme content.

“You get the picture in your head and you can’t get it out. [It] stains your brain. And so you think about it for the rest of the day,” he says.


Girls he knows who are the same age have been recommended videos about topics such as music and make-up rather than violence, he says.

Cai, now aged 18, looking at his phone as he faces a large window

Cai says one of his friends became drawn into content from a controversial influencer

Meanwhile Cai, now 18, says he is still being pushed violent and misogynistic content on both Instagram and TikTok.

When we scroll through his Instagram Reels, they include an image making light of domestic violence. It shows two characters side by side, one of whom has bruises, with the caption: “My Love Language”. Another shows a person being run over by a lorry.

Cai says he has noticed that videos with millions of likes can be persuasive to other young men his age.


For example, he says one of his friends became drawn into content from a controversial influencer – and started to adopt misogynistic views.

His friend “took it too far”, Cai says. “He started saying things about women. It’s like you have to give your friend a reality check.”

Cai says he has commented on posts to say that he doesn’t like them, and when he has accidentally liked videos, he has tried to undo it, hoping it will reset the algorithms. But he says he has ended up with more videos taking over his feeds.

A close-up image of a teenage boy holding an iPhone with both hands. The phone and its camera lenses dominate the image, while the boy's face is hidden and out of focus

Ofcom says social media companies recommend harmful content to children, even if unintentionally

So, how do TikTok’s algorithms actually work?


According to Andrew Kaung, the algorithms’ fuel is engagement, regardless of whether the engagement is positive or negative. That could explain in part why Cai’s efforts to manipulate the algorithms weren’t working.

The first step for users is to specify some likes and interests when they sign up. Andrew says some of the content initially served up by the algorithms to, say, a 16-year-old, is based on the preferences they give and the preferences of other users of a similar age in a similar location.

According to TikTok, the algorithms are not informed by a user’s gender. But Andrew says the interests teenagers express when they sign up often have the effect of dividing them up along gender lines.

The former TikTok employee says some 16-year-old boys could be exposed to violent content “right away”, because other teenage users with similar preferences have expressed an interest in this type of content – even if that just means spending more time on a video that grabs their attention for that little bit longer.
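That cold-start mechanism – seeding a new user's feed from the behaviour of peers who declared similar interests – can be sketched roughly as follows. This is a hypothetical illustration: the function, profiles and numbers are invented, not TikTok's actual code:

```python
# Hypothetical sketch of interest-based "cold start" seeding: a new
# user's first recommendations come from content that peers with
# overlapping declared interests spent the most time watching.

from collections import Counter

def seed_recommendations(new_user_interests, peer_profiles, top_n=2):
    """Rank content categories by how much watch time peers with
    overlapping interests gave them."""
    scores = Counter()
    for peer in peer_profiles:
        overlap = len(new_user_interests & peer["interests"])
        if overlap == 0:
            continue
        for category, watch_time in peer["watch_time"].items():
            # Longer watch time by similar peers counts as implicit
            # interest, even if no one ever "liked" the content.
            scores[category] += overlap * watch_time
    return [category for category, _ in scores.most_common(top_n)]

peers = [
    {"interests": {"football", "gaming"},
     "watch_time": {"fights": 120, "football": 300}},
    {"interests": {"music", "make-up"},
     "watch_time": {"music": 400, "make-up": 250}},
]
print(seed_recommendations({"football"}, peers))  # ['football', 'fights']
```

In a sketch like this, a boy who only declared an interest in football can still be served fight videos, purely because other users like him lingered on them.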


The interests indicated by many teenage girls in profiles he examined – “pop singers, songs, make-up” – meant they were not recommended this violent content, he says.

He says the algorithms use “reinforcement learning” – a method where AI systems learn by trial and error – training themselves on how users behave towards different videos.

Andrew Kaung says they are designed to maximise engagement by showing you videos they expect you to spend longer watching, comment on, or like – all to keep you coming back for more.
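The point about engagement being counted regardless of sentiment can be illustrated with a toy scoring function. This is an invented sketch, not TikTok's real ranking system; the weights and event names are assumptions:

```python
# Toy illustration (not TikTok's real algorithm) of why negative
# engagement can still boost a video: the score rewards any interaction
# that predicts time on the platform, not whether the user enjoyed it.

ENGAGEMENT_WEIGHTS = {
    "watch_seconds": 1.0,   # dwell time is treated as the core signal
    "like": 5.0,
    "comment": 8.0,         # even a critical comment counts as engagement
    "share": 10.0,
}

def engagement_score(events):
    """Score a video from raw interaction counts."""
    return sum(ENGAGEMENT_WEIGHTS[kind] * count
               for kind, count in events.items())

# A video the user watched for 30 seconds, then commented on to complain:
complained = engagement_score({"watch_seconds": 30, "comment": 1})
# A video the user scrolled past after two seconds:
ignored = engagement_score({"watch_seconds": 2})

print(complained > ignored)  # True
```

In a system weighted like this, Cai's critical comments would raise a video's score rather than lower it – consistent with his experience of being shown more of the same content the more he objected to it.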

The algorithm recommending content to TikTok’s “For You Page”, he says, does not always differentiate between harmful and non-harmful content.


According to Andrew, one of the problems he identified when he worked at TikTok was that the teams involved in training and coding that algorithm did not always know the exact nature of the videos it was recommending.

“They see the number of viewers, the age, the trend, that sort of very abstract data. They wouldn’t necessarily be actually exposed to the content,” the former TikTok analyst tells me.

That was why, in 2022, he and a colleague decided to take a look at what kinds of videos were being recommended to a range of users, including some 16-year-olds.

He says they were concerned about violent and harmful content being served to some teenagers, and proposed to TikTok that it should update its moderation system.


They wanted TikTok to clearly label videos so everyone working there could see why they were harmful – extreme violence, abuse, pornography and so on – and to hire more moderators who specialised in these different areas. Andrew says their suggestions were rejected at that time.

TikTok says it had specialist moderators at the time and, as the platform has grown, it has continued to hire more. It also said it separated out different types of harmful content – into what it calls queues – for moderators.


Panorama: Can We Live Without Our Phones?

What happens when smartphones are taken away from kids for a week? With the help of two families and lots of remote cameras, Panorama finds out. And with calls for smartphones to be banned for children, Marianna Spring speaks to parents, teenagers and social media company insiders to investigate whether the content pushed to their feeds is harming them.

Watch on Monday on BBC One at 20:00 BST (20:30 in Scotland) or on BBC iPlayer (UK only)


‘Asking a tiger not to eat you’

Andrew Kaung says that from the inside of TikTok and Meta it felt really difficult to make the changes he thought were necessary.

“We are asking a private company whose interest is to promote their products to moderate themselves, which is like asking a tiger not to eat you,” he says.

He also says he thinks children’s and teenagers’ lives would be better if they stopped using their smartphones.


But for Cai, banning phones or social media for teenagers is not the solution. His phone is integral to his life – a really important way of chatting to friends, navigating when he is out and about, and paying for stuff.

Instead, he wants the social media companies to listen more to what teenagers don’t want to see. He wants the firms to make the tools that let users indicate their preferences more effective.

“I feel like social media companies don’t respect your opinion, as long as it makes them money,” Cai tells me.

In the UK, a new law will force social media firms to verify children’s ages and stop the sites recommending porn or other harmful content to young people. UK media regulator Ofcom is in charge of enforcing it.


Almudena Lara, Ofcom’s online safety policy development director, says that while harmful content that predominantly affects young women – such as videos promoting eating disorders and self-harm – has rightly been in the spotlight, the algorithmic pathways driving hate and violence to mainly teenage boys and young men have received less attention.

“It tends to be a minority of [children] that get exposed to the most harmful content. But we know, however, that once you are exposed to that harmful content, it becomes unavoidable,” says Ms Lara.

Ofcom says it can fine companies and could bring criminal prosecutions if they do not do enough, but the measures will not come into force until 2025.

TikTok says it uses “innovative technology” and provides “industry-leading” safety and privacy settings for teens, including systems to block content that may not be suitable, and that it does not allow extreme violence or misogyny.


Meta, which owns Instagram and Facebook, says it has more than “50 different tools, resources and features” to give teens “positive and age-appropriate experiences”. According to Meta, it seeks feedback from its own teams, and potential policy changes go through a robust process.


Science & Environment

Drones setting a new standard in ocean rescue technology



Last month, two young paddleboarders found themselves stranded in the ocean, pushed 2,000 feet from the shore by strong winds and currents. Thanks to the deployment of a drone, rescuers kept an eye on them the whole time and safely brought them aboard a rescue boat within minutes.

In North Carolina, the Oak Island Fire Department is one of a few in the country using drone technology for ocean rescues. Firefighter-turned-drone pilot Sean Barry explained the drone’s capabilities as it was demonstrated on a windy day. 

“This drone is capable of flying in all types of weather and environments,” Barry said. 


The drone is equipped with a camera that can switch between modes — including infrared to spot people in distress — and a speaker through which responders can communicate instructions. It can also carry life-preserving equipment.

One such piece of equipment, a flotation device, is activated by a CO2 cartridge when it comes into contact with water. Once triggered, it inflates into a tube approximately 26 inches long, giving distressed swimmers something to hold on to.

In a real-life rescue, after a 911 call from shore, the drone spotted a swimmer in distress. It released two floating tubes, providing the swimmer with buoyancy until help arrived.

Like many coastal communities, Oak Island’s population can swell from about 10,000 to 50,000 during the summer tourist season. Riptides, which are hard to detect on the surface, can happen at any time.


Every year, about 100 people die due to rip currents on U.S. beaches, and more than 80% of beach rescues involve rip currents. If you’re caught in one, rescuers advise not to panic or fight it, but to float or swim parallel to the coastline to get out of the current.

Oak Island Fire Chief Lee Price noted that many people underestimate the force of rip currents.

“People are, ‘Oh, I’m a good swimmer. I’m gonna go out there,’ and then they get in trouble,” Price said.

For Price, the benefit of drones isn’t just faster response times but also keeping rescuers safe. Through the camera and speaker, rescuers can determine whether someone is actually in distress.


Price said many people might not yet be aware of the technology.

“It’s like anything as technology advances, it takes a little bit for everybody to catch up and get used to it,” said Price.

In a demonstration, Barry showed how the drone can bring a safety rope to a swimmer while rescuers prepare to pull the swimmer to shore.

“The speed and accuracy that this gives you … rapid deployment, speed, accuracy, and safety overall,” Price said. “Not just safety for the victim, but safety for our responders.”





Technology

Netflix teases its animated Splinter Cell series


It’s been quite some time since we heard anything about Netflix’s animated adaptation of Splinter Cell — but the streamer has finally provided some details on the show. The reveal comes in the form of a very brief teaser trailer, which shows a little bit of the show, but mostly showcases Liev Schreiber’s gravelly take on lead character Sam Fisher. We also have a proper name now: it’s called Splinter Cell: Deathwatch.


Science & Environment

Horseshoe crabs: Ancient creatures who are a medical marvel



Horseshoe crabs: Ancient creatures who are a medical marvel – CBS News


Correspondent Conor Knighton visits New Jersey beaches along the Delaware Bay to learn about horseshoe crabs – mysterious creatures that predate dinosaurs – whose very blood has proved vital to keeping humans healthy by helping detect bacterial endotoxins. He talks with environmentalists about the decline in the horseshoe crab population, and with researchers who are pushing the pharmaceutical industry to replace its use of horseshoe crab blood in medical testing with a synthetic alternative.



Technology

NYT Strands today — hints, answers and spangram for Friday, September 20 (game #201)

NYT Strands homescreen on a mobile phone screen, on a light blue background

Strands is the NYT’s latest word game after the likes of Wordle, Spelling Bee and Connections – and it’s great fun. It can be difficult, though, so read on for my Strands hints.

Want more word-based fun? Then check out my Wordle today, NYT Connections today and Quordle today pages for hints and answers for those games.


Science & Environment

SpaceX to launch bitcoin entrepreneur and three crewmates on flight around Earth’s poles



A blockchain entrepreneur, a cinematographer, a polar adventurer and a robotics researcher plan to fly around Earth’s poles aboard a SpaceX Crew Dragon capsule by the end of the year, becoming the first humans to observe the ice caps and extreme polar environments from orbit, SpaceX announced Monday.

The historic flight, which will launch from the Kennedy Space Center in Florida, will be commanded by Chun Wang, a wealthy bitcoin pioneer who founded f2pool and stakefish, “which are among the largest Bitcoin mining pools and Ethereum staking providers,” the crew’s website says.

The Fram2 crew, seen during a visit to SpaceX’s Hawthorne, Calif., manufacturing facility. Left to right: Eric Philips, Jannicke Mikkelsen, commander Chun Wang and Rabea Rogge. (Photo: SpaceX)


“Wang aims to use the mission to highlight the crew’s explorational spirit, bring a sense of wonder and curiosity to the larger public and highlight how technology can help push the boundaries of exploration of Earth and through the mission’s research,” SpaceX said on its website.

Wang’s crewmates are Norwegian cinematographer Jannicke Mikkelsen, Australian adventurer Eric Philips and Rabea Rogge, a German robotics researcher. All four have an interest in extreme polar environments and plan to carry out related research and photography from orbit.

The mission, known as “Fram2” in honor of a Norwegian ship used to explore both the Arctic and Antarctic regions, will last three to five days and fly at altitudes between about 265 and 280 miles.


“This looks like a cool & well thought out mission. I wish the @framonauts the best on this epic exploration adventure!” tweeted Jared Isaacman, the billionaire philanthropist who chartered the first private SpaceX mission — Inspiration4 — and who plans to blast off on a second flight — Polaris Dawn — later this month.

The flights “showcase what commercial missions can achieve thanks to @SpaceX’s reusability and NASA’s vision with the commercial crew program,” Isaacman said. “All just small steps towards unlocking the last great frontier.”

Like the Inspiration4 mission before them, Wang and his crewmates will fly in a Crew Dragon equipped with a transparent cupola giving them a picture-window view of Earth below and deep space beyond.

No astronauts or cosmonauts have ever viewed Earth from the vantage point of a polar orbit, one tilted, or inclined, 90 degrees to the equator. Such orbits are favored by spy satellites, weather stations and commercial photo-reconnaissance satellites because they fly over the entire planet as it rotates beneath them.

The high-inclination record for piloted flight was set in the early 1960s by Soviet Vostok spacecraft launched into orbits inclined 65 degrees. The U.S. record was set by a space shuttle mission launched in 1990 that carried out a classified military mission in an orbit tilted 62 degrees with respect to the equator.

The International Space Station never flies beyond 51.6 degrees north and south latitude. NASA planned to launch a space shuttle on a classified military mission around the poles in 1986, but the flight was canceled in the wake of the Challenger disaster.


“The North and South Poles are invisible to astronauts on the International Space Station, as well as to all previous human spaceflight missions except for the Apollo lunar missions but only from far away,” the Fram2 website says. “This new flight trajectory will unlock new possibilities for human spaceflight.”

SpaceX has launched 13 piloted missions carrying 50 astronauts, cosmonauts and private citizens to orbit in nine NASA flights to the space station, three commercial visits to the lab and the Inspiration4 mission chartered by Isaacman.

Isaacman and three crewmates plan to blast off Aug. 26 on another fully commercial flight, this one featuring the first civilian spacewalks. NASA plans to launch its next Crew Dragon flight to the space station around Sept. 24.







Technology

Finally, a screen that goes anywhere


Today we’re launching a totally new, totally different app. Meet Orion.

Orion is a small, fun app that helps you use your iPad as an external HDMI display for any camera, video game console, or even VHS. Just plug in one of the bajillion inexpensive adapters, and Orion handles the rest.

But wait — we’re a camera company. Why an HDMI monitor?

We built this to scratch a few itches. First, in professional cinematography, it’s common to connect an external screen to your camera to get a better view of the action. Orion not only gives you a bigger screen, but you can even share screenshots with your crew with a couple of taps.

We also built this for… pure fun. When traveling with a Nintendo Switch, it’s a delight to play games on a bigger screen, especially alongside friends.

