August 27, 2014


Following every national crisis, the Internet serves as a community bulletin board where anyone feels free to tack on his inane beliefs. Regarding Michael Brown, millions of opinions on the teenager’s death are popping up on social media sites like Facebook, Twitter, and Reddit. In all this proselytizing, armchair commentators are discovering something new: not all social networks are playing along.

According to the Washington Post, Facebook is using its algorithm to control the flow of news. While social media platforms like Twitter are broadcasting information in real time, Mark Zuckerberg’s brainchild is filtering “upsetting” content, including the rash of violence in Ferguson. Think of it as a trigger warning to protect delicate sensibilities. Or more accurately: information suppression.

Some people are surprised by this deliberate manipulation of news. As a social network, Facebook is supposed to spread information at breakneck speed. But a new marketing ploy is disrupting the organic trading of stories. Facebook is purposefully hiding negative content. The social juggernaut is boosting positive news stories to provide a happier, more carefree environment.


Facebook’s “€œfeed”€ manipulation is just one among the increasing number of pitfalls in the digital era. For all the benefits of peer-to-peer instant communication, someone is attempting to alter the flow of information to their own benefit. The Internet was supposed to provide a revolutionary means to spread news and ideas. No question it succeeded; but what are we really learning from it?

Recently, two videos depicting gruesome murders received a massive amount of attention. One video is of the questionable police shooting of a mentally ill man in St. Louis. The other is the beheading of journalist James Foley by an Islamic State jihadist. Both show human beings losing their lives to their fellow man. Both show unnecessary death. And both videos are heated topics of conversation. It’s almost as if we’ve grown accustomed to the virtualization of tragedy; such macabre imagery no longer horrifies us. Is this the enlightened freedom the digital age was supposed to usher in?

Viewing horrendous death with a passive shrug is a by-product of the virtual immediacy we now rely on. There’s an addictive aspect to stimulation that man’s nature often falls prey to. The seemingly infinite depths of the Internet have become a kind of Siren song, luring the weak-minded into a stream of constant imagery and satisfaction. Pornography, for instance, taps into brain chemistry in a way that demands increasing amounts of gratification. One naughty picture leads to another, until ethical decision-making is rendered foggy and obscure. As Catholic bishop Paul Loverde writes, porn remaps the brain, and “it becomes very difficult for one to ‘reset’ to a sense of normality in the future.” Grim videos of murder have the same psychological effect as lurid sex tapes.

This is just one way in which the Internet serves as an emotional roller coaster. Another comes in the form of custom-fit lifestyles. We aren’t just giving away our personal values to the World Wide Web; we are transferring our very conception of self to cyberspace. Writing in The Week, Michael Brendan Dougherty avers that as digital “networks learn from our personal habits,” we in turn demand “automation and intelligence.” That means our smartphones recognizing our daily commute. It means our personal information shooting out to various third parties without explicit consent. It means social networks exploiting and controlling favorable news before we see it. In the end our cognition becomes predicated on what exists in the digital realm, instead of the other way around.

The ride-sharing service Lyft recently launched “Lyft Line,” a program to help people carpool using the same personal driver. Lyft advertises the new service as an affordable way to travel. What it doesn’t acknowledge is that the service is based on a computer program that memorizes personal commutes. User customization is celebrated, without recognizing the negative repercussion: daily life becomes an automated piece of data. The small device we carry in our pocket acts as a homing mechanism. All movement is catalogued; all actions are recorded and put through a complex formula.

And it doesn’t stop there. Earlier this summer, Apple announced “HomeKit,” a kind of “smart” program that totally integrates your house with the Internet. The idea is that home appliances will “talk” to you via your cell phone, and act according to your wishes. Conservative writer David Walbert says that Apple is trying to bank on “our desire to rationalize everything, intellectualize it, control it.” The ability to control one’s abode from a smartphone screen sure sounds like the height of civilization. But did anyone ever stop for a second to think: why should man be put in control of everything? Or why should private space become one with the Internet?
