
Bogus Ideas Have Superspreaders, Too


This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

If the Rock encouraged his 58 million Facebook followers to vandalize a fast-food restaurant, Facebook’s policies would treat it the same as if your neighbor blasted this to his 25 friends. President Trump’s tweets can subject people to relentless harassment, but Twitter applies the same (or even looser) rules to his account as to ours.

This past week (and forever), internet companies have been trying to figure out how to handle posts that can encourage violence, contribute to social division and harassment, or spread false information about elections or other high-stakes topics.

When online companies make these decisions, they largely consider the substance of the message, divorced from the messenger, to decide whether a post is harmful and should be deleted or hidden.

But whether they intend it or not, celebrities, politicians and others with large online followings can be superspreaders — not of the coronavirus but of dangerous or false information. And I wonder whether these prominent people need to be held to stricter rules.

When bogus information moves from fringe corners of the internet into mainstream discussions, it’s usually because prominent people helped it get there. Last year, a creepy online hoax called the “Momo challenge” went big after Kim Kardashian posted about it on Instagram. Physicians with many internet followers helped fan a false conspiracy about the origins of the coronavirus.

It would be helpful to break the chain of transmission for these bogus information superspreaders. I admit, this alone won’t fill the internet with happy rainbows, and I’m not sure how this would work practically. But here are a few ideas:

What if, once you reached a half-million followers or subscribers, sharing something that fact checkers deemed a hoax, or posting something that brushed close to the internet companies’ existing rules against hate speech, earned you a strike? (YouTube has a system like this.)

If you collect enough strikes, the punishment could be lower distribution in Facebook’s feed, for example, or you could be blocked from retweets.

These influential people might still be free to post whatever they want online, but fewer people would see it. Yes, that would go for political figures like Mr. Trump. (People who study misinformation say that you can say what you want online, but the internet companies don’t have to spread your message to the world.)
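Purely to make the idea concrete, here is a toy sketch of how such a strike system might look in code. Every detail in it — the follower cutoff, the strike limit, the penalty — is a hypothetical illustration of the proposal above, not a description of any platform’s actual system.

```python
from dataclasses import dataclass

FOLLOWER_THRESHOLD = 500_000   # hypothetical cutoff for "superspreader" accounts
STRIKE_LIMIT = 3               # hypothetical number of strikes before penalties kick in


@dataclass
class Account:
    name: str
    followers: int
    strikes: int = 0


def record_flagged_post(account: Account) -> None:
    """Add a strike, but only for accounts above the follower threshold."""
    if account.followers >= FOLLOWER_THRESHOLD:
        account.strikes += 1


def distribution_penalty(account: Account) -> str:
    """Decide how widely the account's posts are spread."""
    if account.strikes >= STRIKE_LIMIT:
        return "reduced reach: posts stay up but are shown to fewer people"
    return "normal reach"


celebrity = Account("big_account", followers=58_000_000)
for _ in range(3):                      # three posts flagged by fact checkers
    record_flagged_post(celebrity)
print(distribution_penalty(celebrity))  # -> reduced reach
```

The point of the sketch is the asymmetry: small accounts never accumulate strikes, and even big accounts keep their posts up. Only their reach changes.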

A more radical idea is that once people reach the top tier of follower counts or subscribers on Facebook, Twitter and YouTube, any material they try to post would be quarantined and screened before it hits the internet.

I know. This makes me uneasy, too. There is some precedent for this, though. YouTube has a “preferred” tier of videos that human reviewers screen before deeming them safe to carry advertisers’ commercial messages.

In fact, the internet companies tend to have stricter rules for their business partners than for the rest of us. If a yogi wants to make money from her Instagram account, material that might be typically permitted — vulgar gestures, for example — could exclude her from revenue opportunities.

There’s an awakening that internet companies’ decisions and designs can make online life nastier than it should be. There is no magic wand to fix this. What I’m asking is whether, to slow the virus of nastiness and baloney, we need to consider that some people have more power to spread it than others.

If you don’t already get this newsletter in your inbox, please sign up here.


Here’s a funny (but not funny) thing about Facebook: Over and over, when the company is confronted by people who say that it’s doing something off base, Facebook shouts that it is correct and principled and will never budge.

And then over and over, Facebook budges.

This happened when Facebook was confronted with suspicions that Russia-backed trolls were abusing the site to stoke divisions among Americans, when there were revelations about a political firm improperly harvesting Facebook user data, and when Indians were unhappy about Facebook’s prefabricated internet.

Each time the company lashed out, denied the accusation or stuck to its guns. And each time, the company was belatedly forced to admit its mistakes.

This has happened so many times that I made a list a couple of years ago.

And it hasn’t stopped. After weeks of making principled speeches about its hands-off approach to inflammatory posts by Mr. Trump, Facebook agreed with some of its employees and others who said posts like that don’t deserve a wide berth.

You can see signs of that Facebook hubris, too, in how it initially responded to advertisers that wanted the company to do more to tackle nastiness on the site’s online hangouts.

It’s natural for a company to defend itself, but Facebook has a bad habit of lashing out, and only belatedly retreating, when it should be listening. Facebook would create a lot more trust if it took criticism seriously from the start.


  • The reach of China’s surveillance machine: New research shows that Chinese hackers built software to infect and stalk cellphones of the country’s largely Muslim Uighur population even when they traveled outside China. Uighurs long suspected they were being monitored, but my colleagues Paul Mozur and Nicole Perlroth write that groups connected to China’s government were deploying invasive surveillance software for far longer and in more places than anyone believed.

  • “We need to make our tech last longer.” My colleague Brian X. Chen found a great repair guy to fix his busted iPhone camera. And he has advice on finding in-person help and other ways to keep your electronics running longer, which is kinder to your wallet and our planet.

  • We are being watched: In San Diego, sensors attached to streetlights were pitched as a way to track traffic patterns. But law enforcement also regularly accesses the streetlight camera data in investigations, including for possible evidence of vandalism connected to protests against biased policing, according to the investigative news outlet Voice of San Diego.

Nothing says summer like a bulldog eating a watermelon?


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at [email protected].

If you don’t already get this newsletter in your inbox, please sign up here.


