Pensacola, Florida
Monday, December 17, 2018

Outtakes—Fake News Crisis

By Rick Outzen

The Fake News Crisis isn’t about how newspapers report the news and hold our government accountable. The crisis is that we are entering an era in which the technology exists to make it appear that something has happened, whether or not it actually did, and public discourse can be easily undermined.

In an interview with BuzzFeed, Aviv Ovadya, chief technologist at the University of Michigan’s Center for Social Media Responsibility, pointed out that the incentives governing the web’s biggest social media platforms are calibrated to reward information that is often misleading, polarizing or both. The platforms want traffic and don’t care whether the users are real or fake. Facebook, Twitter and Google prioritize clicks, shares, ads and money over the quality of information.

The 2016 election showed us how vulnerable the algorithmically optimized world of social media is to propaganda and misinformation. Entities tied to foreign governments manipulated public discourse on the candidates, race and a variety of hot-button issues.

Ovadya warned that in the future, easy-to-use and eventually seamless technological tools may allow dangerous players on the web to manipulate perception and falsify reality. The public will have a harder time believing what it reads, sees and hears.

Advanced technology can create the belief that an event occurred when it never happened. Footage and audio of Donald Trump or North Korean dictator Kim Jong Un can be manipulated to create a clip of one of them declaring nuclear war.

Ovadya said, “It doesn’t have to be perfect—just good enough to make the enemy think something happened that it provokes a knee-jerk and reckless response of retaliation.”

Technology can also be used to manipulate public discourse by jamming government switchboards and inboxes with algorithmically generated pleas. That’s exactly what happened last year when the Federal Communications Commission asked for public comments on its proposed changes to net neutrality protections.

A Pew Research Center study found that only 6 percent of the nearly 22 million public comments on the net neutrality rules were unique. Automated comment-filing and bot programs delivered millions of comments on both sides of the issue, and on several occasions the center found tens of thousands of comments filed at the exact same second.
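The telltale patterns the study describes, identical comment texts and same-second filing spikes, are straightforward to surface in principle. A minimal sketch follows; the sample data and thresholds are invented for illustration, not drawn from the actual FCC docket:

```python
from collections import Counter

# Hypothetical sample of (timestamp, comment_text) pairs, standing in
# for the kind of public-comment data analyzed in the Pew study.
comments = [
    ("2017-07-19T14:02:11", "I support net neutrality."),
    ("2017-07-19T14:02:11", "I support net neutrality."),
    ("2017-07-19T14:02:11", "I support net neutrality."),
    ("2017-07-19T14:05:43", "Repeal Title II regulation."),
    ("2017-07-19T14:05:43", "Repeal Title II regulation."),
    ("2017-07-19T16:30:09", "Here is my own, individually written view."),
]

# Count how many comments arrived within each one-second window.
per_second = Counter(ts for ts, _ in comments)

# Count how often each exact comment text repeats.
per_text = Counter(text for _, text in comments)

# Flag second-level spikes and duplicated texts as likely automation
# (the thresholds of 3 and 2 are arbitrary choices for this sketch).
suspect_seconds = {ts for ts, n in per_second.items() if n >= 3}
duplicate_texts = {text for text, n in per_text.items() if n >= 2}

# Share of comments whose text appears exactly once, akin to the
# "unique comments" figure Pew reported.
unique_share = sum(1 for n in per_text.values() if n == 1) / len(comments)
print(suspect_seconds, duplicate_texts, round(unique_share, 2))
```

On this toy sample, only one of six comments is unique; real analyses would of course also need to account for form letters that legitimate advocacy campaigns encourage supporters to submit.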

The flood of misinformation has begun to breed apathy toward news, leaving people less informed about what is happening. The ability of the press to hold our governments accountable has started to erode.

Facebook, Twitter and other platforms must be more vigilant about weeding out bots from their systems and should create certification systems for the news they carry. Such changes will take time, but they are necessary to preserve our republic.