From Symbolism to Algorithmic Conditioning.
Could social media be the new Mass Psychological Manipulation?
Anyone who was alive in the 2000s and coherent enough to understand what was happening noticed the constant subliminal visuals in the music videos of artists like Lady Gaga, Rihanna, Beyoncé, Katy Perry and even Kanye West: the all-seeing eyes, pyramids, checkerboard patterns, rituals, split personalities and apocalyptic scenarios, courtesy of directors like Francis Lawrence, Anthony Mandler, Floria Sigismondi and Chris Applebaum (yes, they have names!). This era of entertainment was released into a post-9/11 false flag world; fear and nihilism were pumped through the news and the entertainment industry to help shift us into an age of continuous wars, economic crises, mass surveillance, ego worship and fatalistic, apocalyptic thinking. There are plenty of videos that can be examined and broken down to root out the obvious and the not-so-obvious messages (which I’ll save for other posts), but today I’m more interested in what we are currently moving into.
As the subliminal shifted into overt symbolism (below) and then began to die down, mainly thanks to people becoming aware of it and pointing it out, and to it being pushed too hard in the 2010s, sometimes in a “Whaaaat? We were just joking, guys!” way, a more insidious form of coercion, influence and control has begun to rear its ugly head: behaviour shaping, hyper-fragmentation and emotional targeting via the algorithms programmed into social media. From the 2010s to now, the mid-2020s, more and more of us have noticed our news feeds not only steering us towards products, but cultivating AI-curated endless streams of content that influence our behaviours and emotions. If we scroll through our ‘for you’ type social media feeds, we begin to notice something: we feel stuck in emotional loops, overwhelmed with information, even exhausted and confused. We even gave it a term: ‘doomscrolling’!
So what exactly is causing this? Well, whether it’s intentional or not (as a “conspiracy theorist” I personally lean towards intentional), the combination of fragmented short-form content, an engagement-driven algorithm, AI content, and the identity/belief reaffirming and nudging that happens on social media creates an outcome that is almost indistinguishable from psychological warfare. Our belief systems, emotions, motives, reasoning and behaviours are influenced over time. Even when we attempt to curate our own feeds, the algorithm overrides our preferences, and it does this in a variety of ways. By monitoring our likes, shares and comments, predicting what we might do next, and ranking the content we’ve interacted with, it can score our emotional reactions, assign us a demographic and compare us to similar users, test content to see what sticks across a global population, and optimize advertising so that we are exposed to as much of it as we can tolerate while online. At first this might not seem so bad; after all, we get the service for free, so monitoring and ad optimization are a given. However, most users’ goals are to socialize and/or do business, both of which become difficult when we are bombarded with ads and targeted with user-retention tactics designed to keep us scrolling.
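To make that concrete, here’s a rough sketch in Python of how such an engagement-scoring loop could work. To be clear, the weights, field names (`similar_user_ctr`, `is_ad`) and the ad boost are all my own invention for illustration; the real formulas are proprietary and far more complex.

```python
# Hypothetical sketch of an engagement-scoring pipeline. All weights and
# field names are invented for illustration; real platforms keep theirs secret.

def engagement_score(post, user_history):
    """Score a post for one user from tracked interaction signals."""
    # Explicit signals: likes, shares and comments the user has left on
    # similar content (shares take more effort, so they weigh more).
    explicit = (
        1.0 * user_history.get("likes", 0)
        + 3.0 * user_history.get("shares", 0)
        + 2.0 * user_history.get("comments", 0)
    )
    # Lookalike signal: how users in the same demographic bucket responded
    # to this exact post (a stand-in for "compare us to similar users").
    lookalike = post.get("similar_user_ctr", 0.0)  # click-through rate, 0..1
    # Advertisers pay for the same slots, so ads get a separate boost.
    ad_boost = 5.0 if post.get("is_ad") else 0.0
    return explicit + 10.0 * lookalike + ad_boost

def rank_feed(posts, user_history):
    """Return posts ordered by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_score(p, user_history), reverse=True)
```

Notice that in a scheme like this, an ad with mediocre organic appeal can still outrank ordinary posts purely because of its paid boost, which is exactly the "as much as tolerable" dynamic described above.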
Unfortunately it only gets worse from here.
Although it can be argued that this type of algorithm was created purely to drive business for large corporations (which is bad enough) and to cater to smaller creators, the results we are experiencing on a mass scale can’t be denied.
The prioritization of content that is trending and highly emotional or impactful creates massive spikes in our dopamine; we are taken on a roller coaster of feelings, fear then elation from one post to the next, and we are divided even further through tribalism. This happens because the most engaging posts are pushed harder and higher up in our news feeds: wars, feel-good posts, celebrity gossip, highly sexualised content and gotcha moments from left- and right-wing podcasters, all in the same few swipes. Societal divide is fuelled, alongside addiction, anxiety and hate. When we are shown content that might be related to our views, we are not just put in an echo chamber; we are also nudged further and further to one side. Oh, you watched that political commentator who popped up? Then maybe you’ll like this political commentator who goes a little harder, who leans a little further into it. As our feeds test us on emotional triggers, what often happens is that if content that makes us angry gets more of a reaction from us, we’ll see more of what makes us angry, and we get stuck in a never-ending loop of rage. We react more, think less, and eventually feel the need to become defensive in the comments. Ever gone to a comments section where people are just mindlessly arguing, with little factual information being presented and a whole lot of ad hominem taking place? Now you know how it gets to that point.
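That rage loop is essentially a feedback loop, and a few lines of code show how quickly it converges. This is a toy model I made up to illustrate the dynamic, not anything taken from a real platform: the feed tracks the share of each content category it serves, and after every session it shifts that mix towards whatever got the strongest reactions.

```python
# Toy feedback loop: if angry content gets the strongest reaction, the feed
# serves more of it. Category names and the update rule are invented.

def update_feed_mix(mix, reactions, learning_rate=0.3):
    """Shift the share of each content category toward what got reacted to."""
    total = sum(reactions.values()) or 1
    new_mix = {}
    for category, share in mix.items():
        observed = reactions.get(category, 0) / total  # reaction share
        new_mix[category] = share + learning_rate * (observed - share)
    # Renormalise so the shares still sum to 1.
    norm = sum(new_mix.values())
    return {c: s / norm for c, s in new_mix.items()}

# Start with a mostly benign mix; the user reacts mainly to rage bait.
mix = {"rage_bait": 0.2, "feel_good": 0.4, "gossip": 0.4}
for _ in range(10):
    mix = update_feed_mix(mix, {"rage_bait": 8, "feel_good": 1, "gossip": 1})
```

After only ten simulated sessions, rage bait goes from a fifth of the feed to the dominant category, without the user ever asking for it.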
When we are grouped demographically, by age, gender, location, interests, activity and so on, we are identified as a kind of tribe and fed content that other members of our tribe enjoy. I personally have alcohol and fast food ads in my feed, even though I am sober and rarely eat out, because I have been identified as a millennial mother; similarly, I am fed hip hop news because I am black. The algorithm takes note of what others in the millennial-mother group and the black group consume and makes a match; I am often (almost daily) sent a reel from another person in my demographic that I have already seen. The algorithms also have a kind of fail-safe where the recommender systems occasionally show us unfamiliar content, in order to test how we respond or react to it. The problem is, if we hover over or watch it to see what it is, a new pathway is created for further suggestions. Supposedly this is used to break patterns and avoid creating an echo chamber, but it can also be used to push an agenda. I was once presented with a political ideology that I didn’t agree with, and watched in order to gauge what was being said. It opened the floodgates, and I was regularly fed content from that ideology, not because I agreed with it, liked it or commented on it, but simply because I watched it. What ends up being built is a weird hybrid profile: one part encapsulating our current behaviour, one part modelling our potential behaviour. And what often happens is that we are nudged into that potential self, bit by bit.
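That “fail-safe” is known in recommender-system literature as the explore/exploit trade-off (it’s what the shaped.ai source below covers). A minimal sketch of the classic epsilon-greedy version, with invented function names and an illustrative 10% explore rate, shows why one watch is enough to open the floodgates:

```python
import random

# Sketch of the explore/exploit pattern: mostly serve what the profile
# predicts ("exploit"), occasionally inject something unfamiliar ("explore"),
# and fold whatever gets watched back into the profile. The epsilon value
# and function names are illustrative, not from any real platform.

def pick_topic(profile, all_topics, epsilon=0.1, rng=random):
    """Epsilon-greedy: explore a random topic with probability epsilon."""
    if rng.random() < epsilon:
        return rng.choice(all_topics)     # explore: test unfamiliar content
    return max(profile, key=profile.get)  # exploit: strongest known interest

def record_watch(profile, topic, weight=1.0):
    """A single watch is enough to open a pathway for further suggestions."""
    profile[topic] = profile.get(topic, 0.0) + weight
    return profile
```

Note that `record_watch` has no notion of *why* you watched: watching to disagree and watching because you’re interested update the profile identically, which is exactly the floodgates problem described above.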
Even if we don’t like something, talk about it out loud or seek it out, the algorithm can notice what we hover over and what we spend too long looking at or watching; when we are most emotionally reactive and what content makes us that way; who we resemble behaviourally and interest-wise; and what our friends and our cyber doubles are doing and enjoying. Even if it might not be intentional, attempts are made to influence us and slot us into the herd.
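Hovering and lingering are what the industry calls implicit signals. Here is a guess at how dwell time might be converted into an interest score even when we never like or comment; the 30% threshold and the scaling are assumptions of mine, purely to show the mechanism:

```python
# Illustrative only: turning hover/dwell time into an interest signal with
# no like, share or comment involved. The 30% threshold is invented.

def implicit_signal(dwell_seconds, post_length_seconds):
    """Infer interest from how long a user lingered, scaled to post length."""
    if post_length_seconds <= 0:
        return 0.0
    ratio = min(dwell_seconds / post_length_seconds, 1.0)
    # Lingering past ~30% of a post reads as interest, even without a like.
    return ratio if ratio >= 0.3 else 0.0
```

A quick scroll past registers nothing, but pausing on two thirds of a clip quietly scores almost as high as watching it all, which is how a feed learns about us from behaviour we never consciously expressed.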
So while Zuckerberg and friends might say, or even intend, that the goal is to connect people and help businesses thrive, what happens is quite different. The micro- and macro-targeting, the behaviour nudging and the emotional influencing work to divide us further and make us more reactive, making it harder for the everyday person to connect with others and for the small business owner to connect with their audience. We are not told with full transparency what is being tested, or how (unless we go searching for what little information is out there); we can’t opt out without setting aside entire platforms; and we are affected on both an individual and a mass level. Now that AI is being integrated, there is even less of a human touch, creating these little custom-designed personal realities that have the power to influence our emotions, our behaviour and even our thoughts. When governments and self-proclaimed elites are involved in this process, we even see agendas being pushed, like during the 2020 plandemic, when stay-home messages popped up and facts were flagged as misinformation, or during election periods, when there are notices making sure we vote.
What may have started as a tool to connect, and then to monetize our attention for business, has evolved into a global psychological experiment that might just rival the reach of the entertainment industry.
Sources:
https://www.shaped.ai/blog/explore-vs-exploit?utm_source=chatgpt.com
https://ieeexplore.ieee.org/document/5360225
https://www.pnas.org/doi/10.1073/pnas.1320040111
https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles
https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html
https://www.nytimes.com/2018/12/27/world/facebook-moderators.html
https://www.theguardian.com/technology/article/2024/jul/21/we-unleashed-facebook-and-instagrams-algorithms-on-blank-accounts-they-served-up-sexism-and-misogyny?utm_source=chatgpt.com
https://knowledge.uchicago.edu/record/12679?utm_source=chatgpt.com&v=pdf