
Reflection activity: Persuasive technology's harms

Are social media companies amplifying the right kinds of content?

Reflection activity

🤔 CONSIDER:

YouTube’s recommendation algorithm, which determines roughly 70% of what billions of people watch, has found that a great way to keep people watching is to suggest content that is more extreme, more negative, or more conspiratorial. Keywords like “destroys” and “hates” show up more often in the videos the algorithm recommends.
  1. What do you notice about the kind of content that YouTube’s recommendation algorithm finds most engaging?
  2. How might amplifying this content change how people see the world?
We will go more in-depth on the harms of persuasive technology in the "Seeing the consequences" unit.
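The dynamic described above can be sketched in a few lines of code. This is a toy illustration under an assumed simplification, not YouTube's actual system: if a recommender ranks purely by predicted watch time, the most provocative titles win, regardless of accuracy or tone. The video list, field names, and numbers here are all made up for the example.

```python
# Toy model: rank candidate videos by predicted watch time alone.
# An optimizer for engagement has no notion of accuracy or civility,
# so the most extreme titles rise to the top.

videos = [
    {"title": "Calm explainer on climate data",      "predicted_watch_minutes": 4.2},
    {"title": "Scientist DESTROYS climate skeptics", "predicted_watch_minutes": 9.7},
    {"title": "Why everyone HATES this new law",     "predicted_watch_minutes": 8.1},
]

def recommend(candidates, k=2):
    """Return the k videos with the highest predicted watch time."""
    return sorted(candidates,
                  key=lambda v: v["predicted_watch_minutes"],
                  reverse=True)[:k]

for video in recommend(videos):
    print(video["title"])
```

Because the only signal is "keeps people watching," the calm explainer never makes the cut; the "DESTROYS" and "HATES" videos do.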

Want to join the conversation?

  • jlblackmon7160
The AI and its algorithm don't recommend the stuff you like, but rather whatever keeps you occupied, and they will do whatever it takes to get you hooked on anything at all.
    (4 votes)
  • Jared Galltan Mendoza
    1. What I notice about the kind of content that YouTube's recommendation algorithm finds most engaging is videos covering particular people, such as allegations against them or why they were so great.

    2. Amplifying this content changes how people see the world because it can be misinformation: viewers may develop a disgust for the person covered in the video and might even harass them without looking any further into the situation. And even if it isn't misinformation, they will probably do the same thing anyway.
    (3 votes)
  • KevinR
    1. I notice that the most engaging thing in YouTube's recommendation algorithm is people doing crazy things that end up being dumb.
    2. The content that's being shown could change people by making them try it out for themselves, which could result in them getting hurt or getting in trouble.
    (2 votes)
  • mnicin
    1) Since I watch videos on a lot of subjects (educational, entertainment, history, etc.), I get pushed more history content, because I'm used to watching it more than the others I mentioned. The algorithm and AI know I want these videos because I linger on the title and thumbnail a bit longer (a few seconds), click on them far more often, watch them in their entirety (though I sometimes tap forward if a video drags on or includes what seems to me unnecessary info), talk and discuss about them, and so on.

    But let's say I was watching history: do I mean from a regular, neutral standpoint, or from a political/ideological perspective? The more you watch those videos and the deeper you go down the rabbit hole, the more partisan they become. For example, you watch a video about American political culture, but there are two different videos, one from a conservative perspective and one from a liberal perspective. While both get the basic facts right, each later devolves into a personal essay or analysis of those events and skews parts of the subject. It's complicated, really.

    ____

    2) The AI and its algorithm don't recommend the stuff you like, but rather whatever keeps you occupied, and they will do whatever it takes to get you hooked on anything at all. You have to consider that while we all receive news about events around the world, everyone gets DIFFERENT answers. For example, you might receive news and info that is favorable toward a certain person, so the people criticizing that person seem idiotic; someone else receives news and info that is negative toward that person and thinks the people defending them are moronic and evil. Neither perspective is true, yet people fall for them because they don't bother to connect with others IN PERSON rather than online (online and real-life personalities are VERY DIFFERENT and SEPARATE from one another; it depends on which one dominates the other). You can't stop people from having different perspectives about certain things, but you have to draw a line between what is fact and what is just a perspective replacing it and claiming to be THE only fact that ever existed.

    All of this is done on purpose so that tech companies can have more control over a population that will believe every single propaganda and clickbait post (which might be happening right now), and to make money (it's really obvious, duh).
    (1 vote)
  • Isabella A
    1. What I notice about the kind of content that YouTube's algorithm shows us is that it shows us what we want to see.
    2. Idk
    (1 vote)
  • MAY (brystol)
    I feel like YouTube is full of people trying to make other people feel good about themselves, and the people who post their lives are not keeping things to themselves. You see a YouTube couple and think it's such a good duo, but that relationship is now being forced to broadcast its life to people they don't know.
    (1 vote)
  • La'juan King-Robinson
    The AI algorithm doesn't show you stuff that can help you; it shows you other things, like how to file taxes, get a free phone, or get EBT.
    (1 vote)
  • Na KiyaB
    1. YouTube's algorithm suggests things similar to what you like watching.
    2. Record videos well ahead of time. Consistency wins on YouTube.
    (1 vote)