Instagram, Facebook under fire as Meta whistleblower slams Mark Zuckerberg

Mark Zuckerberg, CEO of tech giant Meta, knows exactly the harm his products are causing teenagers but chooses to ignore it. That is the stunning claim from a former senior engineer.

The allegation was made during a wide-ranging interview as part of the new digital series Meta Exposed: Secrets from Inside the Tech Giant.

Meta is a multinational technology company whose products include Facebook, Instagram, Messenger, Threads, and WhatsApp.

Arturo Bejar, Meta whistleblower. Credit: AAP

For six years until 2015, Arturo Bejar was the product leader at Facebook, responsible for its efforts to keep users safe and supported.

This included a group called Care, which was originally established to develop tools to support teenagers who were being bullied. Between 2019 and 2021, Bejar returned to Meta as a consultant to help with wellbeing issues at Instagram.

There is arguably no one better placed to talk about the methods and motivations of Meta’s executive team.

Meta by the numbers

Revenue: $134 billion (2023)

Zuckerberg wealth: $178 billion

Global users: 3.74 billion

Sextortion deaths: at least 39

US states filing lawsuits against Meta: 41

Revenue from under 17-year-olds: $11 billion (Harvard)

“They’re not transparent about the harm that teens are actually experiencing in their products, and they won’t give the kids, our kids the tools they need to get help when they need it,” Bejar says from his home in California.

Bejar, himself a father, says Meta executives, including CEO Mark Zuckerberg and Instagram boss Adam Mosseri, are well aware of how dangerous their apps, particularly Instagram, are to young users, yet deliberately underreport the prevalence of bullying and harassment.

Mark Zuckerberg, Meta CEO. Credit: AAP

Meta publishes statistics in what it calls its “Transparency Centre”, putting the prevalence of bullying and harassment at four to seven incidents per 10,000 views, but Bejar says the company’s internal data tells a very different story.

“The number of teens that witnessed somebody getting bullied and harassed on Instagram in the last seven days is 27.7 per cent and the number of teens that directly experienced it is around 12 per cent. And when you ask those teens, when this happens and you turn to the company for support, did you feel supported? Fifty per cent say none at all,” Bejar says.

The numbers Bejar quotes are dramatically higher than Meta’s official external data suggests: 27.7 per cent equates to roughly 2,770 incidents per 10,000 views, about 400 times the upper end of Meta’s published figure of seven per 10,000.

His data comes from what is known as BEEF (Bad Experiences and Encounters Framework) — specific research done while Bejar was working at Meta.

Blowing the whistle

In 2023, the former senior Meta staffer testified before the US Congress, blowing the whistle on the social media company. He says he was motivated to expose the truth after witnessing his daughter’s experience on Instagram.

“When she first wanted to go on Instagram, we waited until she was 14. She created a private account and within a short time after that, she started receiving d*** pics and unwanted advances from other teens in schools nearby. I asked her what she did, and she tried reporting it and nothing happened,” he says.

Adam Mosseri, Instagram boss. Credit: AAP

Sheryl Sandberg testifies before the Senate Intelligence Committee hearing on Capitol Hill in Washington in 2022. Credit: JLM/AP

The whistleblower said he flagged his concerns, backed up with data, to Mark Zuckerberg, Adam Mosseri and Facebook’s former chief operating officer Sheryl Sandberg.

“I told them there is a critical gap in the way the company addresses harm.

“Mark Zuckerberg didn’t even reply to my note, which in any other circumstance, given my 30 years of experience in the industry, I would expect an engagement from the CEO, because this is a material harm that people are experiencing in the product, and I would’ve expected that to lead into a conversation that talks about these harms and how to reduce them. Instead, there was no response from Mark,” he recalls.

“I ended up eventually getting an email from Adam Mosseri saying, ‘I’ll be on point on this’. I did have a conversation with him, he said he clearly understood all of the issues that I brought up and that it all made sense.

“I asked if there were any flaws in my data or the logic that I was presenting (but) he agreed with everything. He thanked me for it. My contract ran out a few days later and then they just buried the report and didn’t do anything with the information,” Bejar says.

‘They just want to see no evil, hear no evil and speak no evil’

So why would Zuckerberg and his team want to fudge the real figures? Bejar says it’s simple — liability and reputational damage.

“I think it’s more a matter of, in order to avoid embarrassing the company by the reality of the numbers, in order to avoid liability from being aware of harm and then not doing enough to address it, which they could do if they knew about it,” Bejar says.

“They just want to see no evil, hear no evil and speak no evil while it happens all the time.

“The kinds of measures that I talk about, reducing unwanted content, reducing inappropriate or unwanted contact, would not have, I believe, a negative effect on engagement or other metrics of the company.”

Keeping kids safe – the solution

Bejar says there is a solution to keep young people safe on Instagram: a simple button for reporting unwanted content, which could be developed and rolled out in a few months.

“So, I think you start by not calling it report. I will say that because we knew from research when I used to be working on this directly that teens don’t like the word report. And so, what you want to say is, ‘this is not for me’.

“And so you have a button that says, ‘can you help me with this?’ And they said, ‘what’s going on?’ I said, ‘oh, this message is not for me’. How come? Because it’s a little gross, it’s a little creepy. And the key in what I just said is that if the language in the product matches what the teens are experiencing, then teens will use it.”

We asked Meta a number of questions regarding Bejar’s claims; the company “politely declined” to answer any of them.

Few people have more experience in this space than Arturo Bejar. He is passionate and driven by purpose; he doesn’t want to destroy the apps, he simply wants to improve them. If Mark Zuckerberg truly wants to make his family of apps safe for the families using them, he should direct message Arturo Bejar today.

Arturo Bejar biography

From 2009 to 2015, Arturo was the senior engineering and product leader at Facebook responsible for its efforts to keep users safe and supported, reporting to Mike Schroepfer, the CTO.

Arturo was responsible for “Site Integrity” (stopping attacks and malicious behaviour); “Security Infrastructure” (which engineered resilient systems and worked on compliance); and a group called “Care” (which developed Facebook’s user-facing and internal customer care tools, as well as child safety tools).

Arturo was responsible for the combined effort of engineering, product, user research, data, and design. This included regularly doing strategic product reviews with the Facebook executive team.

Arturo was also the engineering manager for Facebook’s “Product Infrastructure” team, which built key parts of the product engineering frameworks of Facebook and developed React, one of the core technologies of the web today.

From 2019 to 2021, Arturo returned to Facebook to work as a part-time independent consultant and industry expert for the wellbeing team at Instagram.

In 2022, Arturo was a technical adviser for the Facebook Oversight Board.

Before that, Arturo was recruited to Facebook from Yahoo!, where he worked from 1998 to 2009. Arturo was hired as Yahoo!’s first security engineer, eventually becoming the head of Information Security, reporting to the CTO.

Arturo started working for IBM in Mexico City when he was 15, was able to study Mathematics at King’s College London thanks to the support of Steve Wozniak, and first started working on security and social systems in Silicon Valley in 1994 as part of a startup called Electric Communities.

(Biography supplied as the written testimony of Arturo Bejar before the Subcommittee on Privacy, Technology, and the Law, November 7, 2023)

More information

Meta Exposed: Secrets from Inside the Tech Giant – watch here or listen here.

Watch 7NEWS Spotlight: Sextortion on 7plus.

If you or someone you know has been the victim of sextortion in Australia, visit: https://www.accce.gov.au/sextortionhelp

If you need help in a crisis, call Lifeline on 13 11 14. For further information about depression contact beyondblue on 1300 22 4636 or talk to your GP, local health professional or someone you trust.
