"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube's algorithm recommends right-wing, extremist videos to users — even if they haven't interacted with that content before — ...
YouTube’s recommendation engine failed on February 17, 2026, stranding hundreds of thousands of users on blank, unresponsive ...
A new study conducted by the Computational Social Science Lab (CSSLab) at the University of Pennsylvania sheds light on a pressing question: Does YouTube's algorithm radicalize young Americans?
Over the years, YouTube's suggestion algorithm has become quite complex. I've noticed that it can extrapolate my tastes very well from my watch history, continuously tempting me to consume more ...