Highest Rated Comments


EnderSword · 74 karma

Do you have plans to hold more tournaments like the WCS this year, and if so, would you be able to more closely match Riot's efforts in funding, advertising and supporting the eSports scene?

EnderSword · 54 karma

YouTube literally calls them 'Recommended' videos, and they'll even tell you 'This is recommended due to your interest in...' or 'People who watch Norm MacDonald also watch...John Mulaney'

While it's a computer doing it, the intent is still the same: it's suggesting something to you because it thinks there's a higher probability you'll engage with what it's suggesting.

But I will say that part is a little different... it's not recommending something you'll 'like'; it's only recommending something you'll watch.
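
To make that distinction concrete, here's a minimal sketch. Everything in it is hypothetical (the field names, the probabilities, the candidate titles); the point is only that the ranker sorts by predicted watch probability and never looks at any notion of 'liking':

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    p_watch: float  # model's estimate that the user will click and watch
    p_like: float   # model's estimate that the user would actually enjoy it

def recommend(candidates: list[Candidate], k: int = 3) -> list[Candidate]:
    # The sort key is engagement probability only; p_like never enters the score.
    return sorted(candidates, key=lambda c: c.p_watch, reverse=True)[:k]

feed = [
    Candidate("Stand-up special", p_watch=0.35, p_like=0.80),
    Candidate("Outrage compilation", p_watch=0.60, p_like=0.20),  # hate-watch bait
    Candidate("Cooking tutorial", p_watch=0.10, p_like=0.90),
]

for c in recommend(feed, k=2):
    print(f"{c.title}: p_watch={c.p_watch:.2f}, p_like={c.p_like:.2f}")
```

The outrage compilation tops the feed despite the lowest predicted enjoyment, because enjoyment simply isn't in the objective.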

EnderSword · 32 karma

Conspiracy theories often come down to like 'how many people had to cooperate?'

In the Epstein case, like... 5, maybe? And they maybe didn't even need to 'do' anything...just let him do it. So it just comes across as so plausible.

EnderSword · 17 karma

It's not recommending something you'll like...it's recommending something you'll 'engage' with.

So many people 'hate watch' things... when you reported something, you engaged with it and took part, so it's going to show you more of her, figuring you'll keep engaging and keep disliking her.
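
A toy illustration of that point (the event names are invented, not any real platform's schema): in an engagement-trained system, a report or an angry comment produces the same positive training label as a like.

```python
# Invented event names; the point is only the shape of the labeling.
ENGAGED = {"watched", "liked", "commented", "shared", "disliked", "reported"}
IGNORED = {"impression_only", "scrolled_past"}

def engagement_label(event: str) -> int:
    """1 = interacted in any way, 0 = ignored. Sentiment is invisible here."""
    return 1 if event in ENGAGED else 0

print(engagement_label("reported"))       # 1 -- counts the same as a like
print(engagement_label("scrolled_past"))  # 0 -- the only way to 'vote no'
```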

The fridge example is, of course, the system not knowing you've bought a fridge yet. In many cases those large purchases aren't made right away when someone starts looking, so the ads do keep showing for a while.

If it 'knows' the loop is closed, it'll stop. I was looking up gaming laptops in July; I didn't buy one right away, so those ads followed me for a few weeks. I finally chose one and ordered it on Amazon, and the ads stopped.

I don't think in all these cases they're so much 'dumb' as that they have blind spots in their data, and sometimes their purpose is not the purpose you think.
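
Here's a rough sketch of that retargeting logic as described above, with made-up field names and a made-up 30-day window: ads run while the user looks in-market and stop as soon as a conversion is visible. The fridge 'blind spot' is just the case where converted never flips to True.

```python
from datetime import date, timedelta

WINDOW = timedelta(days=30)  # hypothetical "still shopping" lookback

def should_show_ad(last_browse: date, converted: bool, today: date) -> bool:
    if converted:
        return False  # purchase observed: the loop is closed, stop immediately
    return today - last_browse <= WINDOW  # otherwise follow the user for a while

# Browsing laptops in July, no purchase yet: ads keep following.
print(should_show_ad(date(2019, 7, 10), converted=False, today=date(2019, 7, 25)))  # True
# Ordered on Amazon, conversion visible to the network: ads stop.
print(should_show_ad(date(2019, 7, 10), converted=True, today=date(2019, 8, 2)))    # False
```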

EnderSword · 6 karma

I keep noticing this too. I'm from Toronto and I've watched several Jordan Peterson things; he's certainly more right-wing than me and says a few crazy things when he leaves his actual field of study, but he's mostly a normal, rational guy.

But the moment you click on him, you get flooded with anti-feminism, right-wing, conspiracy-theory, anti-SJW stuff.

My thinking, based on knowledge of similar algorithms, is that it's sort of playing a risk/reward game... the algorithm probably does know that most people will not continue watching that stuff and will switch topics... but the person who does accept the suggested path will watch 1500 videos and leave comments on 500 of them.

Since we're essentially rewarding it based on its ability to make someone watch a lot, it is naturally going to recommend things that may appeal to an obsessive audience.
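
A toy expected-value calculation (the probabilities are invented; the 1500-videos figure is from above) showing why that trade-off can favor the rabbit hole:

```python
def expected_watch(p_accept: float, videos_if_accepted: int) -> float:
    # Expected future videos watched if this path is recommended.
    return p_accept * videos_if_accepted

mainstream  = expected_watch(p_accept=0.60, videos_if_accepted=10)    # 6.0
rabbit_hole = expected_watch(p_accept=0.02, videos_if_accepted=1500)  # 30.0

# Rewarded on total watch time, the algorithm picks the rabbit hole even though
# 98% of users bounce off it: the obsessive 2% dominate the expectation.
print(f"mainstream: {mainstream}, rabbit hole: {rabbit_hole}")
```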