Our online identities might have their limits

For the first time in my distinguished Facebook career, I’m taking the time to review some of my ad preferences on the site as I write this post. This isn’t something many of us seem to think about. It took extraordinary events to even get me to look around.

After the results of the recent presidential election came in, the condemnation of fake news was swift. If only for a moment, as everyone searched for an explanation of the Trump phenomenon, eyes turned to two Internet giants: Facebook and Google.

A large share of the American population was going through the angry part of the grieving process after the loss of Hillary Clinton. At first, Facebook didn’t want to accept responsibility for possibly swaying the election. Google responded only after post-election coverage fixated on the problem. But as the news cycle turned to the actions of Trump’s administration, the role of Facebook and Google faded into the background. Recently, as Poynter indicated, researchers from Stanford and NYU found that the influence of fake news “was overstated.”

Yet I still can’t get over the fact that this experience got the American public and media asking, collectively, how information gets disseminated. All of a sudden, in the midst of our blissful Information Age honeymoon, we caught a glimpse of the dark side of social media and search engines. We didn’t like what we saw, and we panicked. Now, Google and Facebook, like spurned lovers, are trying to change for fear of losing us.

As they readjust, we have become acutely aware of the methods these companies use to foster our dependence. Seriously…, a BBC podcast, referred to this as “positive reinforcement” in an episode published last month. It makes sense: every ad or article you click is recorded as something you might want to hear about again in the future. The algorithm that organizes your Facebook News Feed into the ambiguously titled “Top Stories” depends on you feeding and rewarding it.
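To make that feedback loop concrete, here is a toy sketch of my own (an illustration of the idea, not Facebook’s actual system): every click adds weight to the topics attached to whatever was clicked, and the resulting profile ends up favoring whatever you already engage with most.

```python
# Toy model of click-driven "positive reinforcement" (illustrative only).
from collections import defaultdict

interest_weights = defaultdict(float)

def record_click(item_topics, boost=1.0):
    """Each click nudges the weights of that item's topics upward."""
    for topic in item_topics:
        interest_weights[topic] += boost

# Every ad or article I click teaches the profile what to show me next.
record_click(["politics", "journalism"])
record_click(["politics"])
record_click(["soccer"])

# The profile now "prefers" whatever I already engage with most.
print(sorted(interest_weights.items(), key=lambda kv: -kv[1]))
# [('politics', 2.0), ('journalism', 1.0), ('soccer', 1.0)]
```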

So after more than ten years of clicking through Facebook, my reputation should precede me on their website. Let’s take a look at a small sample of what they have on me in the ad preferences section:

[Screenshot: a sample of my Facebook ad preference interests. Image by John Hernandez via Facebook]

There are a lot of accurate interests included here, but also some misses that appear unrelated to anything I’ve ever done on Facebook. Agriculture? Construction? Esurance? None of that sounds familiar.

Our complex identities have been reduced to a seemingly random list of interests, and these interests drive the algorithm serving us information and ads on Facebook. The limitations are partly self-imposed, shaped by the friends we add and the interests we highlight, but the algorithm takes them a step further, so that we tend to see only the items we already agree with.
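Sticking with the same toy model (again, a hypothetical sketch, not the real News Feed ranking), here is what that narrowing looks like when posts are scored purely by how well they overlap with the interests already on file:

```python
# Toy "Top Stories"-style ranking: unfamiliar topics sink out of view.
def score(post_topics, interest_weights):
    return sum(interest_weights.get(topic, 0.0) for topic in post_topics)

interest_weights = {"politics": 2.0, "journalism": 1.0}

candidates = [
    {"title": "Election analysis", "topics": ["politics"]},
    {"title": "New tractor models", "topics": ["agriculture"]},
    {"title": "Friend's op-ed on local media", "topics": ["journalism"]},
]

for post in sorted(candidates,
                   key=lambda p: score(p["topics"], interest_weights),
                   reverse=True):
    print(post["title"], score(post["topics"], interest_weights))
# Election analysis 2.0
# Friend's op-ed on local media 1.0
# New tractor models 0.0
```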

If the Internet ever feels like a comfy place to get away from it all, then you’ve experienced that phenomenon. But now that we’re witnessing the divide between opposing political sides and questioning the very fabric of what constitutes “facts,” we may finally be sensing the adverse effects of the echo chamber we’ve created. If the people you tend to disagree with seem absurd, strange or monstrous, then you might have social media to blame.

This is referred to as “false-consensus bias,” as Sean Blanda points out in his Medium post “The ‘Other Side’ Is Not Dumb.” Many of us have been conditioned to favor our own side, since our beliefs are typically coddled, not tested, by the media pillars of the Internet we depend on. Talk about a slippery slope.

That’s not to say that social media or Google’s search algorithm is something to eschew going forward. The process of finding a balance, as evidenced by forthcoming posts on this blog, is a work in progress. Even if we wanted to upend the status quo, we’re not going to anytime soon. Our relationship to providers of information is ingrained and symbiotic. There’s no escaping it.

More importantly, whether you’re an emerging media professional, like me, or simply someone looking for a chance at civic participation, we all stand to benefit by tweaking our role in this relationship and asserting ourselves as our own curators of information.

If the algorithms aim to please, they should adjust accordingly. After all, Google and Facebook could have done nothing after the “fake news” spat. Their moves on that front indicate we have some power in this relationship. Large media and tech companies, not just Facebook and Google, are starting to recognize their emerging role as “gatekeepers” and we should take advantage of this opportunity to guide them.

Consider getting started by looking at your Facebook ad preferences (if you’re logged in, you should be able to access the preferences at this link). See how accurate they are, and then see if that bothers you. You might even consider switching your News Feed to chronological order, “Most Recent,” instead of “Top Stories” to see if you prefer it, though Facebook tends to switch it back after a while. The option is available on the left-hand side of the page on the desktop site.

[Screenshot: the News Feed sort option in the left-hand menu. Image by John Hernandez via Facebook]

Hopefully, you’ll begin to appreciate the power of websites, like Google or Facebook, to distance us from information as much as they connect us. When we recognize the duality of their function, we have a little more freedom to see what else is out there.
