Scary impacts of Facebook’s predictive algorithms

SME operators, and indeed any users of Facebook, should be aware of the way Facebook is effectively influencing, rather than merely distributing, our thoughts and opinions.

There are few national issues controversial enough to get everyone talking, posting, commenting, blogging and writing. One of these, of course, is the same-sex marriage postal survey. Now, the point of this article isn't to sway anyone one way or another, but to show that Facebook influences you, and the way you perceive the world, more than you realise.

Like many of you, I thought I knew exactly how much Facebook knows about me. It only really takes a quick glance at the ads on my timeline to find out my stereotypical life stage and what, according to this world, I should want.

In other words, if I say I am in a relationship or engaged, I will start to see a lot of Facebook advertisements from wedding vendors (a bit useless if, say, I live in Australia and am in a same-sex relationship). If I tell Facebook I am married, then I am shown advertisements for property developments, baby stores and IVF clinics.

Knowing our demographics and quantitative data is very different to knowing our behaviours. Announcing that we are in a relationship doesn’t mean we are directly on the way to the chapel, for a gazillion reasons.

What I can tell you for a fact, though, is that wedding companies relying on our relationship status alone to attract new customers are building on very shaky ground.

Facebook has introduced some unique algorithms to discover, and even reinforce, our behaviour. Discovering our behaviour is scary enough, but reinforcing it is influencing it.

That this lets advertisers target their demographic very cleverly is just part of the equation. Governments and large organisations can also influence our worlds in ways we would not approve of, if only we knew.

How do we know it happens? Well, let’s take the same-sex marriage vote.

Some of my friends and associates posted some very passionate pleas in favour of the 'yes' vote. Their essays appeared on their timelines, and to their surprise, it seemed that all of their Facebook "friends" agreed with them: lots of likes and approving comments on their posts.

Pleasantly surprised that all of their Facebook friends were in total agreement, they also felt a little short-changed, as they had hoped to start a discussion and change even one mind among those who intended to vote the other way, or not vote at all!

What they didn't realise, however, was that Facebook's algorithm works very well to reinforce their own position, and to hide the posts of those in opposition.

This works on other controversial topics too, like last year's US elections, and even on the mundane (whether a new restaurant is worth visiting). It means that you are mostly shown the Facebook "friends" who agree with you, thereby reinforcing your opinions and attitudes.
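To make the mechanism concrete, here is a deliberately simplified sketch of how ranking posts by predicted agreement can filter a feed. This is not Facebook's actual code, which is proprietary; the stance scores and function names are invented for illustration only.

```python
# Toy illustration of a "filter bubble": rank friends' posts by how
# closely their stance matches yours, then show only the top few.
# Stances are invented scores in [-1, 1]: -1 = strongly 'no',
# +1 = strongly 'yes'. This is a sketch, not Facebook's algorithm.

def rank_feed(my_stance, posts, feed_size=3):
    """Return the posts whose stance is closest to mine.

    posts: list of (author, stance) tuples.
    """
    # Posts that agree with the reader are assumed to get higher
    # predicted engagement, so they sort to the top of the feed.
    scored = sorted(posts, key=lambda post: abs(post[1] - my_stance))
    return scored[:feed_size]

posts = [
    ("Alex", 0.9), ("Sam", 0.8), ("Jo", 0.7),   # agree with a 'yes' voter
    ("Kim", -0.6), ("Pat", -0.9),               # disagree
]

# A strong 'yes' voter sees only agreement; dissenting friends are hidden.
feed = rank_feed(my_stance=1.0, posts=posts)
print([author for author, _ in feed])  # ['Alex', 'Sam', 'Jo']
```

The point of the sketch is that no one decided to hide Kim and Pat; simply optimising for predicted engagement is enough to make dissent invisible.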

So how does Facebook know your position on various issues? Well, in a number of ways.

Facebook has access to a number of behaviours that show exactly who we are, what we do, and what we believe in. What we do on other websites is shared with Facebook, along with our browsing history.

These are just small examples of how Facebook monitors your behaviour, and you don't even need to be logged into Facebook for it to collect these data points.

It is an interesting thought – that Facebook influences our opinions by reinforcing them.

Not only is Facebook creating a world that agrees with our own position in it, but it also gives us fewer opportunities to encounter, and engage with, opposing thoughts and values.

In an open letter earlier this year, Mark Zuckerberg, the founder and "boss" of Facebook, said that Facebook "stands for bringing us closer together and building a global community".

That may be true, but exactly how resilient are we going to be, in this artificial world where everyone, apparently, shares our world view?

Katja Forbes is a business owner and educator in design thinking and interaction at the University of Sydney.

 
