The author uses this extreme example to blame YouTube’s attention-hungry algorithm for creating radicals of all types.
If I watch videos of people eating bacon, the algorithm will recommend more videos of people eating bacon. Surprise!
You can apply the same logic to any algorithmically driven system that makes recommendations.
If you watch a lot of Disney movies on Netflix, guess what movies Netflix will be recommending for you?
The “filter bubble” ensures that red people see red content; only a small slice of people (call them purple, say 1%) also see blue content, because they happen to have blue friends.
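The filter-bubble mechanism is simple enough to sketch in a few lines. Here's a toy model (purely illustrative, not YouTube's or Netflix's actual system) of a recommender that just serves more of whatever category a user has already watched:

```python
from collections import Counter

# Toy model of a "more of the same" recommender. All names and data
# here are hypothetical, invented for illustration.
def recommend(watch_history, catalog, n=3):
    """Recommend n catalog items from the category the user watched most."""
    favorite = Counter(item["category"] for item in watch_history).most_common(1)[0][0]
    return [item for item in catalog if item["category"] == favorite][:n]

catalog = [
    {"title": "Red Rally Highlights", "category": "red"},
    {"title": "Red Opinion Hour",     "category": "red"},
    {"title": "Red News Recap",       "category": "red"},
    {"title": "Blue Town Hall",       "category": "blue"},
    {"title": "Blue Debate Night",    "category": "blue"},
]

# A "red" viewer's history contains only red content...
history = [{"title": "Red Rally Highlights", "category": "red"}]
picks = recommend(history, catalog)

# ...so every recommendation comes back red, and the bubble seals itself.
print([p["category"] for p in picks])
```

Nothing in the loop ever surfaces blue content to a red viewer; without a purple friend injecting something from outside, the feedback loop only tightens.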
If people see more of what they are already inclined to believe, can social media still shift their opinions, ever so slightly?
I believe it still can, in a powerful and subconscious way.
Read the article here.