How is social media shaping humanity? That is the question Jeff Orlowski tries to answer in an article he wrote for Newsweek.
According to Orlowski, a speciation of thought is under way on social media. He writes, “there’s a logical, apolitical explanation for our society’s current discord: Personalized machine-learning algorithms are pushing us into our own isolated digital ecosystems to the point that our world views are becoming fundamentally incompatible, and so are we.”
He breaks this process of speciation down into three stages: separation, adaptation, and division:
- First…highly personalized social media and search algorithms silo our worldviews by surveilling our every online move and serving us content completely unique to us that they predict will keep us scrolling. This tends to be things that provoke fear, outrage and loyalty to one’s own group. These algorithms surround all of us—all countries, parties, classes, and identities—all the time, separating what we see and ultimately believe.
- Second…the algorithms first adapt to us, learning what we like and what works to keep us engaged. This is how machine learning operates: with our every post, emoji and moment spent staring at a certain piece of content, the algorithm learns more about us and then refines its recommendations to ensure greater engagement. Our thoughts in turn adapt to our increasingly extreme and individualized information landscape, moving in the direction we’re pushed. (A toy sketch of this feedback loop follows the list.)
- Finally…People radicalized by personalized algorithms likewise become less interested in, tolerant of, or even aware of others’ views. As we hole up on our own information islands, alternate perspectives become unreal and uncomfortable to us…Our hesitancy to have conversations that could bridge our divides perpetuates them. Our world views thus clash toward incompatibility.
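To make the adaptation stage concrete, here is a minimal, purely illustrative sketch in Python. It is not Orlowski’s model or any platform’s actual recommender; the category names, engagement probabilities, and boost factor are assumptions chosen only to show how re-weighting a feed toward whatever a user engages with can, over time, narrow what that user sees.

```python
import random
from collections import Counter

# Hypothetical toy model of an engagement-driven feedback loop.
# All categories and probabilities below are illustrative assumptions,
# not any platform's real system.

CATEGORIES = ["outrage", "fear", "in-group", "neutral", "opposing-view"]

# Assumed user tendency: provocative, identity-affirming content is
# engaged with more often than neutral or opposing views.
ENGAGEMENT_PROB = {
    "outrage": 0.7,
    "fear": 0.6,
    "in-group": 0.65,
    "neutral": 0.3,
    "opposing-view": 0.1,
}


def simulate_feed(rounds: int = 2000, seed: int = 0) -> Counter:
    """Recommend items, observe engagement, and re-weight toward whatever engaged."""
    rng = random.Random(seed)
    weights = {c: 1.0 for c in CATEGORIES}  # start with a balanced feed
    shown = Counter()

    for _ in range(rounds):
        # Sample the next recommendation in proportion to the learned weights.
        category = rng.choices(CATEGORIES, weights=[weights[c] for c in CATEGORIES])[0]
        shown[category] += 1

        # "Learn" from the engagement signal: boost whatever the user engaged with.
        if rng.random() < ENGAGEMENT_PROB[category]:
            weights[category] *= 1.05

    return shown


if __name__ == "__main__":
    exposure = simulate_feed()
    for category, count in exposure.most_common():
        print(f"{category:>14}: {count:4d} impressions")
```

Run over a few thousand rounds, exposure concentrates on the categories the simulated user engages with most, while opposing views all but disappear from the feed, which is the separation-and-adaptation dynamic the article describes.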
Editor’s Note: This article is important because it explains how social media, initially created to connect people, has become a tool for dehumanization. It also illustrates how algorithms can go wrong: machine learning tends to feed into our own biases, which in turn strengthens our preconceived notions about the world.
Social media, by prioritizing profit over society, has lost its way. It has become a tool of separation and division.