When researchers tried to analyse whether Facebook was causing harm, the data was manipulated in a move straight from the tobacco playbook

By Editorial Board | Published October 7, 2024 | 7 Min Read

For nearly a decade, researchers have been gathering evidence that the social media platform Facebook disproportionately amplifies low-quality content and misinformation.

So it was something of a surprise when, in 2023, the journal Science published a study which found Facebook’s algorithms were not major drivers of misinformation during the 2020 United States election.

This study was funded by Facebook’s parent company, Meta. Several Meta employees were also part of the authorship team. It attracted extensive media coverage. It was also celebrated by Meta’s president of global affairs, Nick Clegg, who said it showed the company’s algorithms have “no detectable impact on polarisation, political attitudes or beliefs”.

But the findings have recently been thrown into doubt by a team of researchers led by Chhandak Bagchi from the University of Massachusetts Amherst. In an eLetter also published in Science, they argue the results were likely due to Facebook tinkering with the algorithm while the study was being carried out.

In a response eLetter, the authors of the original study acknowledge their results “might have been different” if Facebook had changed its algorithm in a different way. But they insist their results still hold true.

The whole debacle highlights the problems caused by big tech funding and facilitating research into its own products. It also highlights the crucial need for greater independent oversight of social media platforms.

Merchants of doubt

Big tech has started investing heavily in academic research into its products. It has also been investing heavily in universities more generally.

For example, Meta and its chief Mark Zuckerberg have collectively donated hundreds of millions of dollars to more than 100 colleges and universities across the United States.

This is similar to what big tobacco once did.

In the mid-1950s, cigarette companies launched a coordinated campaign to manufacture doubt about the growing body of evidence linking smoking with a number of serious health issues, such as cancer. It was not about falsifying or manipulating research outright, but about selectively funding studies and drawing attention to inconclusive results.

This helped foster a narrative that there was no definitive proof smoking causes cancer. In turn, it enabled tobacco companies to maintain a public image of responsibility and “goodwill” well into the 1990s.

A positive spin

The Meta-funded study published in Science in 2023 claimed Facebook’s news feed algorithm reduced user exposure to untrustworthy news content. The authors said “Meta did not have the right to prepublication approval”, but acknowledged that the Facebook Open Research and Transparency team “provided substantial support in executing the overall project”.

The study used an experimental design in which participants – Facebook users – were randomly allocated to a control group or a treatment group.

The control group continued to use Facebook’s algorithmic news feed, while the treatment group was given a news feed with content presented in reverse chronological order. The study sought to compare the effects of these two types of news feed on users’ exposure to potentially false and misleading information from untrustworthy news sources.
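To make the design concrete, the sketch below (in Python, purely illustrative – this is not the study’s actual code, and the post fields, domain list and function names are assumptions) shows how a randomised feed experiment of this kind could be set up: users are randomly assigned to the control condition (algorithmic ranking) or the treatment condition (reverse chronological ranking), and the share of feed items from untrustworthy sources is then compared across the two groups.

```python
# Illustrative sketch only – not the study's actual code.
# Post fields, the domain list and the ranking signal are hypothetical.
import random
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author_domain: str
    published_at: datetime
    relevance_score: float  # stand-in for the platform's internal ranking signal

UNTRUSTWORTHY_DOMAINS = {"untrustworthy-news.example"}  # hypothetical source list

def assign_group(user_id: int, treatment_share: float = 0.5) -> str:
    """Randomly allocate a user to the 'control' or 'treatment' condition."""
    rng = random.Random(user_id)  # seeded so a user's assignment is stable
    return "treatment" if rng.random() < treatment_share else "control"

def build_feed(posts: list[Post], group: str) -> list[Post]:
    """Control keeps the algorithmic ranking; treatment is reverse chronological."""
    if group == "treatment":
        return sorted(posts, key=lambda p: p.published_at, reverse=True)
    return sorted(posts, key=lambda p: p.relevance_score, reverse=True)

def untrustworthy_share(feed: list[Post]) -> float:
    """Fraction of feed items coming from untrustworthy sources."""
    if not feed:
        return 0.0
    flagged = sum(p.author_domain in UNTRUSTWORTHY_DOMAINS for p in feed)
    return flagged / len(feed)
```

The crucial point, as the next paragraphs explain, is that the control condition depends entirely on the platform’s own ranking signal: if the platform changes how that signal is computed while the experiment is running, the baseline being compared against changes with it.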

The experiment was robust and well designed. But during the short time it was conducted, Meta changed its news feed algorithm to boost more reliable news content. In doing so, it changed the control condition of the experiment.

The reduction in exposure to misinformation reported in the original study was likely due to these algorithmic changes. But the changes were temporary: a few months later, in March 2021, Meta reverted the news feed algorithm back to the original.

In a statement to Science about the controversy, Meta said it had made the changes clear to researchers at the time, and that it stands by Clegg’s statements about the findings in the paper.

Unprecedented power

In downplaying the role of algorithmic content curation in issues such as misinformation and political polarisation, the study became a beacon for sowing doubt and uncertainty about the harmful influence of social media algorithms.

To be clear, I am not suggesting the researchers who conducted the original 2023 study misled the public. The real problem is that social media companies not only control researchers’ access to data, but can also manipulate their systems in ways that affect the findings of the studies they fund.

What’s more, social media companies have the power to promote certain studies on the very platform those studies are about. In turn, this helps shape public opinion. It can create a situation where scepticism and doubt about the impacts of algorithms become normalised – or where people simply start to tune out.

This kind of power is unprecedented. Even big tobacco could not control the public’s perception of itself so directly.

All of this underscores why platforms should be mandated to provide both large-scale data access and real-time updates about changes to their algorithmic systems.

When platforms control access to the “product”, they also control the science around its impacts. Ultimately, these self-funded research models allow platforms to put profit before people – and divert attention away from the need for more transparency and independent oversight.

Timothy Graham, Associate Professor in Digital Media, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.
