Social Media is Severing Society: A Review of “The Social Dilemma”
By: Caroline Jung
Something led you here to read this. First off, you have online access. You also probably have some sort of social media. Bingo. You’re my target audience and data mining has just done its job successfully. What’s data mining? Wikipedia defines it as “a process of discovering patterns in large data sets involving methods at the intersection of machine learning, statistics, and database systems.” Basically, it’s the “algorithm” that runs our social media and online content.
In technology’s early days, data mining was an incredibly useful tool. Now it’s everywhere, and it’s impossible to escape. It’s used extensively in social media, and The Social Dilemma, a Netflix documentary released in 2020, explores the unintended consequences of social media on humanity and society. It features interviews with tech experts giving dire warnings about their own technological creations; that’s alarming in itself. So what are they concerned about? How does this affect us?
Social media apps are businesses prioritizing profit. Their customers are not us, the users, but advertisers. Most of the time, these apps don’t consider whether their product is harmful to users; instead, they build models to capture users’ maximum attention so they can profit from advertisers. They’re getting better at this, too: new types of notifications, new features, and constant trend-chasing. It’s not easy to separate myself from these apps, because how am I supposed to fight against teams of engineers who have built apps specifically to grab and hold my attention? Many of us are already aware of the heightened levels of anxiety, depression, and suicide linked to the unrealistic standard of life on these apps. But I don’t think many of us realize that data mining does far more than produce those heightened levels and targeted ads. We are being mined at a broad and deep scale, and because of it, social media is a major factor in the rise of conspiracy theories, white supremacy, terrorism, misinformation, polarization, and radicalization. In short, social media is ripping apart the very trust that holds societies together, without our knowing.
Data mining controls our explore page, our for-you page, and our feed. The “algorithm,” as we call it, is responsible for keeping our attention on the app, so it only shows us posts it thinks we might like based on our previous actions. Every action is tracked. The “algorithm” does not check whether a post is true or harmful; it doesn’t track what the post encourages, only whether that post falls into a category we might like. It serves us a drug, and while it’s up to us whether we take it, the “algorithm” keeps pushing more until we’re addicted; from then on, it shows us only what we want to see, to maximize the time we spend on the app.
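The mechanism described here, ranking purely by predicted engagement while never checking truth or harm, can be sketched in a few lines of Python. This is a toy illustration under my own assumptions (a hypothetical `Post` type with a `topic` and a truth flag; a ranker that scores by past topic engagement), not any platform’s actual code. The telling detail is that the ranking function never reads the `is_true` field at all.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Post:
    topic: str      # e.g. "news", "conspiracy" (hypothetical labels)
    is_true: bool   # the ranker below never reads this field

def rank_feed(candidates, interaction_history):
    """Order candidate posts by how often the user previously engaged
    with each post's topic. Truthfulness plays no part in the ranking."""
    topic_counts = Counter(p.topic for p in interaction_history)
    return sorted(candidates, key=lambda p: topic_counts[p.topic], reverse=True)

# A user who has mostly engaged with conspiracy content...
history = [Post("conspiracy", False), Post("conspiracy", False), Post("news", True)]
candidates = [Post("news", True), Post("conspiracy", False)]

# ...gets the false conspiracy post ranked first, simply because it
# matches past engagement.
feed = rank_feed(candidates, history)
```

The sketch also shows why the loop is self-reinforcing: every post served from the top of `feed` that the user engages with adds another entry to `history`, which tilts the next ranking even further toward the same topic.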
This has been happening for a while, but it is especially crucial now, in a time of increased polarization. Take the Black Lives Matter movement. If a user looks at a BLM post and interacts with it, their feed will show more BLM posts and fewer posts opposing the movement. Similarly, opposers of BLM will interact with anti-BLM posts, decreasing their exposure to pro-BLM content. Both sides will wonder, “How come the other side is still unaware of this? How stupid are they to have never seen so many logical arguments backing up my side?” They will grow skeptical of each other while their feeds continue to show posts from completely different ends of the spectrum. Fewer productive discussions will happen between the opposing sides, because mutual skepticism will precede every conversation. We have already seen peaceful protests turn into riots, cities under curfew, unwarranted arrests, burning cities, and the use of tear gas and rubber bullets. We are losing trust in each other as a society because we can’t fathom why the other side seems so irrational, without understanding that their feed looks different from ours.
People will say to do your research, but what they may not know is that research, too, is shaped by data mining. For example, Google’s autocomplete is based on previous searches and location, so the suggestions in the autocomplete dropdown differ from person to person. If someone types “COVID-19 is” into the search bar, the autocomplete will look different for each user: it might be filled with “a serious issue” or “a hoax.” Before any research even begins, there is already bias.
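The autocomplete point can be made concrete with another toy sketch. Everything here is hypothetical (the canned suggestion list, scoring by exact matches against past queries); real autocomplete systems are far more sophisticated, but the bias mechanism is the same: identical prefixes produce different orderings for users with different histories.

```python
def autocomplete(prefix, search_history, suggestions):
    """Rank canned completions by how often each full query appeared
    in this user's past searches. Same prefix, different histories,
    different top suggestion."""
    def score(suggestion):
        full_query = prefix + " " + suggestion
        return sum(1 for q in search_history if q == full_query)
    return sorted(suggestions, key=score, reverse=True)

# Two hypothetical users with opposite search histories.
user_a = ["COVID-19 is a hoax"] * 3 + ["COVID-19 is a serious issue"]
user_b = ["COVID-19 is a serious issue"] * 3
options = ["a serious issue", "a hoax"]

top_a = autocomplete("COVID-19 is", user_a, options)[0]  # "a hoax"
top_b = autocomplete("COVID-19 is", user_b, options)[0]  # "a serious issue"
```

The bias exists before the user clicks a single result: the dropdown itself has already been filtered through their past behavior.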
To be clear, data mining is helpful in certain situations. It customizes content to our preferences, and it was beneficial in technology’s early history. But right now, the negative consequences far outweigh the benefits, and it is an urgent crisis. Our society is being ripped apart by our own hands, on behalf of social media.
We might think we are on the right side of the debate, but if our feed only shows us what we want to see, are our opinions legitimate? How much of our thinking is shaped by what we’ve seen on social media and by data mining? We need to question this. Right now is the critical point, not the year 3000, when we imagine technology and robots replacing humans. Tristan Harris, co-founder of the Center for Humane Technology, explains: “we’re all looking for the moment when technology would overwhelm human strengths and intelligence. When is it gonna cross the singularity, replace our jobs, be smarter than humans? But there’s this much earlier moment when technology exceeds and overwhelms human weaknesses. This point being crossed is at the root of addiction, polarization, radicalization, outrage-ification, vanity-ification, the entire thing. This is overpowering human nature, and this is checkmate on humanity.”
I would argue that we are already at the point where technology is overwhelming human weaknesses. This is what the tech experts are telling us now: that we are at a point where technology controls our thoughts, our emotions, and our actions without our even knowing; that data mining is ripping societies apart; that we no longer see a purpose in hosting discussions with the opposing side; that this is an existential crisis. We are witnessing a real-life Matrix transformation, and our societies are crumbling. Technology is already starting to replace what makes us human: our ability to form our own opinions, emotions, and actions. It has been the third party behind every action. We cannot allow ourselves to be blindly controlled by it. We cannot allow our creations to sever human society.
Interviewees in The Social Dilemma:
Tristan Harris - former Google design ethicist, co-founder of the Center for Humane Technology
Aza Raskin - co-founder of the Center for Humane Technology
Justin Rosenstein - co-founder of Asana and Facebook’s like button
Shoshana Zuboff - Harvard Business School professor, author
Tim Kendall - former Pinterest president
Rashida Richardson - AI Now Institute director of policy research
Renee DiResta - Yonder director of research
Anna Lembke - director of Stanford University Addiction Medicine Fellowship program
Jaron Lanier - virtual reality pioneer, founder of VPL Research Inc.