Do algorithms erode our ability to think?
<div class="copy">
<p>Have you ever watched a video or movie because YouTube or Netflix recommended it to you?</p>
<p>Or added a friend on Facebook from the list of “people you may know”?</p>
<p>And how does Twitter decide which tweets to show you at the top of your feed?</p>
<p>These platforms are driven by algorithms, which rank and recommend content for us based on our data.</p>
<p>As Woodrow Hartzog, a professor of law and computer science at Northeastern University, Boston, <a rel="noreferrer noopener" href="https://www.abc.net.au/news/science/2018-04-30/how-the-internet-tricks-you-out-of-privacy-deceptive-design/9676708" target="_blank">explains</a>: “If you want to know when social media companies are trying to manipulate you into disclosing information or engaging more, the answer is always.”</p>
<p>So if we are making decisions based on what’s shown to us by these algorithms, what does that mean for our ability to make decisions freely?</p>
<h3>What we see is tailored for us</h3>
<p>An algorithm is a digital recipe: a list of rules for achieving an outcome, using a set of ingredients.</p>
<p>Usually, for tech companies, that outcome is to make money by convincing us to buy something or keeping us scrolling in order to show us more advertisements.</p>
<p>The ingredients used are the data we provide through our actions online – knowingly or otherwise.</p>
<p>Every time you like a post, watch a video, or buy something, you provide data that can be used to make predictions about your next move.</p>
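<p>To make the recipe metaphor concrete, here’s a deliberately simplified Python sketch of this kind of algorithm: it ranks hypothetical videos by how much their topics overlap with what you’ve already engaged with. The titles, topics and counts are invented; real recommender systems are vastly more complex, but the ingredients-and-rules structure is the same.</p>
<pre><code># A toy "recipe": rank candidate videos by engagement with their topics.
# All data here are hypothetical.
watched = {"cooking": 5, "chess": 2}          # topic -> times you engaged

candidates = {
    "knife skills 101": {"cooking"},
    "opening traps": {"chess"},
    "cooking for chess players": {"cooking", "chess"},
}

def score(topics):
    # Ingredients: your engagement data. Rule: add up engagement per topic.
    return sum(watched.get(t, 0) for t in topics)

ranked = sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)
print(ranked)  # ['cooking for chess players', 'knife skills 101', 'opening traps']
</code></pre>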
<p>These algorithms can influence us, even if we’re not aware of it. As the New York Times’ <a rel="noreferrer noopener" href="https://www.nytimes.com/2020/04/22/podcasts/rabbit-hole-prologue.html" target="_blank">Rabbit Hole podcast</a> explores, YouTube’s recommendation algorithms can drive viewers to <a rel="noreferrer noopener" href="https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth" target="_blank">increasingly extreme content</a>, potentially leading to online radicalisation.</p>
<p>Facebook’s News Feed algorithm ranks content to keep us engaged on the platform.</p>
<p>It can produce a phenomenon called “<a rel="noreferrer noopener" href="https://www.pnas.org/content/111/24/8788/tab-article-info" target="_blank">emotional contagion</a>”, in which seeing positive posts leads us to write positive posts ourselves, and seeing negative posts makes us more likely to craft negative posts. (The study behind this finding was <a rel="noreferrer noopener" href="https://www.pnas.org/content/111/29/10779.1" target="_blank">controversial</a>, partly because the effects it measured were small.)</p>
<p>Also, so-called “<a rel="noreferrer noopener" href="https://www.abc.net.au/news/science/2018-04-30/how-the-internet-tricks-you-out-of-privacy-deceptive-design/9676708" target="_blank">dark patterns</a>” are designed to trick us into sharing more, or <a rel="noreferrer noopener" href="https://econsultancy.com/three-dark-patterns-ux-big-brands-and-why-they-should-be-avoided/" target="_blank">spending more</a> on websites like Amazon.</p>
<p>These are tricks of website design such as hiding the unsubscribe button, or showing how many people are buying the product you’re looking at <em>right now</em>.</p>
<p>They subconsciously nudge you towards actions the site would like you to take.</p>
<h3>You are being profiled</h3>
<p>Cambridge Analytica, the company involved in the largest known Facebook data leak to date, claimed to be able to <a rel="noreferrer noopener" href="https://www.newyorker.com/news/news-desk/cambridge-analytica-and-the-perils-of-psychographics" target="_blank">profile your psychology</a> based on your “likes”.</p>
<p>These profiles could then be used to target you with political advertising.</p>
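<p>We don’t know exactly how Cambridge Analytica’s models worked, but the claimed idea can be caricatured in a few lines of Python: each “like” nudges a set of personality-trait scores up or down. The page names and weights below are entirely invented for illustration.</p>
<pre><code># A purely illustrative caricature of "profiling from likes".
# Pages and weights are invented; real models were proprietary.
likes = ["SkydivingFans", "PhilosophyMemes"]

trait_weights = {  # hypothetical page -> trait nudges
    "SkydivingFans":   {"openness": +0.25, "neuroticism": -0.25},
    "PhilosophyMemes": {"openness": +0.5},
}

profile = {}
for page in likes:
    for trait, w in trait_weights.get(page, {}).items():
        profile[trait] = profile.get(trait, 0.0) + w

print(profile)  # {'openness': 0.75, 'neuroticism': -0.25}
</code></pre>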
<p>“Cookies” are small pieces of data which track us across websites.</p>
<p>They are records of actions you’ve taken online (such as links clicked and pages visited) that are stored in the browser.</p>
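<p>For example, Python’s standard library can parse the kind of cookie header a site might send your browser; the names and values below are made up, but the format is real:</p>
<pre><code># Parsing a (hypothetical) tracking cookie with Python's standard library.
from http.cookies import SimpleCookie

raw_header = "visitor_id=ab12cd34; campaign=spring_sale"
cookie = SimpleCookie()
cookie.load(raw_header)

for name, morsel in cookie.items():
    print(name, "=", morsel.value)
# visitor_id = ab12cd34
# campaign = spring_sale
</code></pre>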
<p>When cookie data are combined with data from multiple sources, including large-scale hacks, this is known as “<a rel="noreferrer noopener" href="https://www.abc.net.au/news/science/2019-12-03/data-enrichment-industry-privacy-breach-people-data-labs/11751786" target="_blank">data enrichment</a>”.</p>
<p>It can link our personal data like email addresses to other information such as our education level.</p>
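<p>The joining step is simple to sketch with fabricated records: two datasets that share nothing but an email address can be merged into a single, richer profile.</p>
<pre><code># A minimal sketch of "data enrichment": joining records from separate
# sources on a shared key (here, an email address). All records are fake.
breach_records = [
    {"email": "alice@example.com", "password_hash": "9f2c..."},
]
marketing_records = [
    {"email": "alice@example.com", "education": "postgraduate", "city": "Sydney"},
]

enriched = {}
for source in (breach_records, marketing_records):
    for record in source:
        enriched.setdefault(record["email"], {}).update(record)

print(enriched["alice@example.com"])
# {'email': 'alice@example.com', 'password_hash': '9f2c...',
#  'education': 'postgraduate', 'city': 'Sydney'}
</code></pre>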
<p>These data are regularly used by tech companies like Amazon, Facebook, and others to build profiles of us and predict our future behaviour.</p>
<h3>You are being predicted</h3>
<p>So, how much of your behaviour can be predicted by algorithms based on your data?</p>
<p>Our research, <a rel="noreferrer noopener" href="https://www.nature.com/articles/s41562-018-0510-5" target="_blank">published in <em>Nature Human Behaviour</em> last year</a>, explored this question by looking at how much information about you is contained in the posts your friends make on social media.</p>
<p>Using data from Twitter, we estimated how predictable people’s tweets were, using only the data from their friends.</p>
<p>We found data from eight or nine friends was enough to predict someone’s tweets just as well as if we had downloaded them directly (well over 50% accuracy).</p>
<p>Indeed, 95% of the potential predictive accuracy that a machine learning algorithm might achieve is obtainable <em>just</em> from friends’ data.</p>
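<p>A toy version of the general idea, not the method from our paper, is easy to sketch: build a simple word-frequency model from friends’ (hypothetical) tweets, then ask how surprised that model is by a user’s own words. The less surprised it is, the more predictable the user.</p>
<pre><code># A toy illustration of the idea that friends' posts carry information
# about what you will write. All tweets here are hypothetical.
import math
from collections import Counter

friends_tweets = [
    "the election results are in",
    "watching the election coverage tonight",
    "coffee first, election news later",
]
user_tweet = "the election news tonight"

counts = Counter(w for t in friends_tweets for w in t.split())
total = sum(counts.values())

def prob(word):
    # Smoothed probability so unseen words get a small, nonzero chance.
    return (counts[word] + 1) / (total + len(counts) + 1)

# Cross-entropy (bits per word) of the user's tweet under the friends-only
# model; lower means friends' data make the user more predictable.
words = user_tweet.split()
bits = -sum(math.log2(prob(w)) for w in words) / len(words)
print(f"{bits:.2f} bits per word")
</code></pre>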
<p>Our results mean that even if you #DeleteFacebook (which trended after the <a rel="noreferrer noopener" href="https://www.sbs.com.au/news/deletefacebook-calls-grow-after-cambridge-analytica-data-scandal" target="_blank">Cambridge Analytica scandal in 2018</a>), you can still be profiled, due to the social ties that remain.</p>
<p>And that’s before we consider the things about Facebook that make it so <a rel="noreferrer noopener" href="https://theconversation.com/why-its-so-hard-to-deletefacebook-constant-psychological-boosts-keep-you-hooked-92976" target="_blank">difficult to delete</a> anyway.</p>
<p>We also found it’s possible to build profiles of <em>non-users</em> — so-called “<a rel="noreferrer noopener" href="https://www.nature.com/articles/s41562-018-0513-2" target="_blank">shadow profiles</a>” — based on their contacts who are on the platform.</p>
<p>Even if you have never used Facebook, if your friends do, a shadow profile of you could still be built.</p>
<p>On social media platforms like Facebook and Twitter, privacy is no longer tied to the individual, but to the network as a whole.</p>
<h3>No more free will? Not quite</h3>
<p>But all hope is not lost. If you do delete your account, the information contained in your social ties with friends grows stale over time.</p>
<p>We found predictability gradually declines to a low level, so your privacy and anonymity will eventually return.</p>
<p>While it may seem like algorithms are eroding our ability to think for ourselves, it’s not necessarily the case.</p>
<p>The evidence on the effectiveness of psychological profiling to influence voters <a rel="noreferrer noopener" href="https://www.nytimes.com/2017/03/06/us/politics/cambridge-analytica.html" target="_blank">is thin</a>.</p>
<p>Most importantly, when it comes to things like spreading (mis)information, people play just as big a role as algorithms.</p>
<p>On Facebook, the extent of your exposure to diverse points of view is more closely related <a rel="noreferrer noopener" href="https://science.sciencemag.org/content/348/6239/1130" target="_blank">to your social groupings</a> than to the way News Feed presents you with content.</p>
<p>And on Twitter, while “fake news” may spread faster than facts, it is <a rel="noreferrer noopener" href="https://science.sciencemag.org/content/359/6380/1146" target="_blank">primarily people who spread it</a>, rather than bots.</p>
<p>Of course, the influence runs both ways: content creators also exploit platforms’ algorithms to promote their content, on <a rel="noreferrer noopener" href="https://theconversation.com/dont-just-blame-youtubes-algorithms-for-radicalisation-humans-also-play-a-part-125494" target="_blank">YouTube</a>, <a rel="noreferrer noopener" href="https://theconversation.com/dont-just-blame-echo-chambers-conspiracy-theorists-actively-seek-out-their-online-communities-127119" target="_blank">Reddit</a> and other platforms.</p>
<p><em>This article was originally published on <a rel="noopener" href="https://cosmosmagazine.com/people/behaviour/are-algorithms-eroding-our-ability-to-think/" target="_blank">cosmosmagazine.com</a> and was written by The Conversation.</em></p>
</div>