These are the “values” of the algorithm. In simple terms, the algorithm decides what shows up in your feed based on what you’ve interacted with before. It doesn’t have an agenda beyond keeping you on the platform. For some people, it starts with cooking recipes that turn into hostess tips and then videos about what your salt and pepper shakers say about you. For others, it’s “day-in-my-life” videos from an investment banker that turn into podcast clips detailing the mindset of the richest business leaders, and then aesthetic B-roll of rooftop bars and yachts. The algorithm doesn’t care what box you fit into, as long as you fit into one: information becomes identity. It doesn’t care what it feeds you, as long as it can turn you into a consumer and call it “community.” The currency is your attention, and the more time you spend inside an algorithmic feed, the more you lose touch with your own life, hobbies, ideas, and physical community.
One of the most harmful psychological effects of the algorithm is not only its ability to reshape what we believe, but also how well-informed we perceive ourselves to be. Because algorithms deliver frictionless, curated information, they don’t require our minds to engage deeply or invest energy in understanding. We feel well-informed without having done the cognitive work that real understanding requires: Reading beyond headlines, tolerating ambiguity, sitting with material that isn’t immediately engaging, or holding unresolved tension. The same dynamic that lowers users’ sense of responsibility to actively seek, verify, or complicate information also makes them more likely to engage with their feeds in negative emotional ways; the two effects rise together. When users internalize the feed as a sufficient representation of reality, they stop approaching content with distance and start responding to it as if they’re moving through an authentic social and informational ecosystem.
In 2019, a study on the media’s role in emotional states following the Boston Marathon bombing found that, in the week following the attack, individuals who consumed six or more hours of daily media coverage reported higher levels of acute stress than those who were actually at or near the scene. Unlike physical reality, which has limits (the event ends, the body returns to baseline), the online environment allowed repeated, unbounded exposure to emotionally charged information – exposure that led consumers into a trauma response mirroring PTSD without their ever having experienced physical danger.
Because social media is Gen Z’s main news source, the study’s findings reflect how this generation commonly consumes and processes information. We are left to metabolize extremely high levels of emotional activation without proper emotional hygiene (tools for processing, contextualizing, or regulating what we’re absorbing). This drives the psyche to seek resolution: A “why” that requires little mental effort to accept. People look for narratives that can organize their fear, anger, or grief into something coherent, and radical, conspiratorial political frameworks provide that simplified “why.” These frameworks are often surfaced by the same algorithm that traumatized consumers in the first place.
While planning our chapter’s event with Breaking the Silence speaker Tal Sagi, a Hillel staffer asked what I was doing for security. I hadn’t planned an event that needed security before, so the question caught me off guard. She explained that even though Tal speaks openly against the occupation, for some students, the fact that she had served in the IDF would be enough to dismiss her, or protest the event entirely.
Tal is someone who has taken on real personal and social risk to speak about what she saw. Someone who shares her testimony publicly, who has built her life around confronting the system she was once part of. Who engages with Palestinians, who shares in their lives in ways beyond those mediated by politics. None of that seemed to matter. One fact, IDF service, was enough to override everything else. That was the part that stayed with me. Not the disagreement, but how quickly complexity collapsed into something simple. How a life shaped by contradiction could be flattened into a single category, and how little room there was for anything that didn’t fit cleanly. In the end, I coordinated with campus security, and the event went on without incident. But the conversation never left me.
Social media is flattening the narratives we encounter, coloring in the gray areas in bold black and white. Many condemn spaces that hold the grief of both Palestinians and Israelis as neutral sanitization, intended to strip the issue of its humanity or political stakes. In Instagram-soaked language, “both-sidesing” the issue is the dismissive, cutting condemnation reserved for spaces that resist reducing the conflict to a simple binary.
When you’re used to feeds that reward outrage and moral certainty, anything that slows the moment down, anything that asks people to sit with complexity instead of picking a side to perform, can feel flat, or even suspicious (the newest antisemitic trope is that “bothsidesers” or “normalizers” are Mossad agents). Nuance gets coded as “safe,” “institutional,” or the worst insult of all: normal (literally, “normies”). As though I must not care enough if I’m not buying into radicalization. But refusing the pull toward narrowness is not apathy; it’s discipline. It takes far more emotional strength to hold grief that isn’t yours alone, to make room for pain that complicates your own, to stay open when everything around you is rewarding closure. The algorithm makes it easy to collapse people into symbols, statistics, and enemies. It makes it easy to feel righteous while becoming less human in the process. Choosing not to go there, choosing to keep seeing the humanity on both sides of this conflict, even and especially when it’s inconvenient, when it costs you social capital online, is not passive. It’s active resistance.
There is nothing revolutionary about allowing oneself to get pulled into certainty, into dehumanization, into algorithmically reinforced outrage, when that is the path of least resistance. What’s truly revolutionary is committing oneself to an open and critical mind, refusing to let grief be ranked or rationed. There is nothing neutral about insisting on humanity when the powers that be intend to strip it away. If anything, it’s the most countercultural stance available right now: To believe that justice and peace require us to expand our empathy, not shrink it, and to hold that line even when it would be so much easier, and more socially palatable, to let the algorithm think for us.
Resisting the algorithm requires building spaces that operate by different values. That means being willing to learn, prioritizing listening over performance, and creating environments where people can encounter narratives that are gray in a world of black and white. At Hunter College, J Street U has done exactly that: Hosting speakers who expose students to firsthand testimony from the occupied Palestinian territories, building programs designed to foster nuanced conversations about the conflict, and encouraging critical engagement with how it’s represented on social media and in the news. We’re creating rooms for the kind of thinking that the algorithm structurally undermines. This resistance may look unremarkable from the outside: making room for the uncomfortable, questioning what feels immediately convincing, refusing to let suffering be reduced to a slogan. But these acts of intellectual and emotional restraint are precisely what keep our humanity intact in an age designed to erode it.