I'm not going to keep the scalps of any Nazi I kill while defending my home and loved ones.
I'll just use pen and paper to keep track.
(I'm not bothered by your comment at all, but am attempting to humorously "yes, and" with it.
I am attempting a humorous misdirect where the reader thinks I'm disgusted by the threat to kill Nazis, but then I'm actually just offended by inefficient, messy ways of keeping track of any killed Nazis.)
This is not a story of the algorithm predicting what you like. It's showing that if you expose a human to the same content over and over again, it can change their way of thinking until they like the thing they're exposed to. Even more so if they don't know how it works, and the person thinks, "everyone is into slime; I need to be into it too to fit in." It's very powerful if you want to manipulate a populace. It's algorithm-induced Stockholm Syndrome.
It's showing that if you expose a human to the same content over and over again, it can change their way of thinking until they like the thing they're exposed to.
I’m certain my YouTube feed is trying to radicalize me into some kind of culture warrior. It’s really annoying.
I deleted all of my watch history to try and reset it and it just got way worse real quick.
I watched one stupid video, and now all I see are angry YouTubers upset that people don't think exactly like they do and enjoy things they don't. Then they convince themselves they're more enlightened than everyone else because they make this content and ban anyone who makes fun of them, all while claiming to be "free speech advocates," of course.
YouTube got bad so fast it’s left my head spinning.
Have you tried clicking the 3 dots on these outrage videos and selecting "don't recommend channel" or a mix of that and "not interested?" I started to see a bunch of right wing political trash in my feed a while back since a lot of my watched videos could be considered adjacent (cars/trucks/offroading/home improvement/dash cam vids/etc) to what these people like and I haven't really had this issue again.
I have done this. I have told it, over and over, not to show me stuff from those channels and topics. The best it seems to last is about two months. When I tried deleting my history and turning it off, it got SO MUCH WORSE. Even making a new account was orders of magnitude worse. As sad as it is, I actually get a better result this way... I have long been at the point where I do not click on things I am not familiar with unless a trusted source suggests them. So I just don't look at recommendations anymore. I just look for the indicator of new stuff from my subs, or look at things I specifically search for.
Yup, last week I clicked on a YT video about a certain game and shut it off about a minute in after realising it was just another rage-baiting angry YouTuber lamenting that the game is too woke. Now all I get are recommendations of angry anti-woke YouTube videos bashing the game I actually enjoy.
I started with a clean profile: I never log in to YT so it's just using a local cookie you can always clear to start over.
Anyways, I just searched a few sciencey things to feed the algorithm, and now I'm getting loads of crazy fake "science" and conspiracies, and the rest is all extremist right-wing bullshit.
The only reference I have for this was someone who I knew who rubbed said slime on herself for YouTube when she was 17 to build a following for when she turned 18 and started camming.
And that is why I open videos about topics I am only mildly interested in, or from controversial channels, exclusively in incognito mode, even though I actually pay for ad-free YouTube Premium.
In my defense, that is the only streaming service I pay for.
TFW you forget to wear protection when clicking on a weird video and you permanently scar your algorithm. You try to heal it, but days or weeks later, you are showing your boss a video on marine grade industrial sealant and Chappell Roan Pink Pony Club shows up in your recommended videos and you have to lie and say you have no idea what it is. When he is gone, you play it again.
clicking on a weird video and you permanently scar your algorithm.
It's trivial to delete individual videos from your watch history, even moreso if you just saw it. Doing so makes it as if you never clicked on it in the first place.
As recent advances in AI have shown, humans are really quite predictable when you throw enough data and compute at the problem. At some point the algorithm will be sophisticated enough that it'll be able to get to know you better than you know yourself, and will be able to provide you with things you had no idea were what you really wanted.
Yes, but recent advances have rubbed it in our faces in ways that are much harder to deny. Humans haven't become fundamentally more or less predictable over time; the advances have just shown how predictable we always were.
Yes, I heard/saw/read some years back that this is exactly what Amazon does. They know who you are, what stage of life you're at, and what you want before you do.
I had this exact experience with music algorithm recommendations:
The algorithm analyzed all the songs I asked it to play, and concluded (correctly) that I might enjoy listening to the Beatles. (True story.)
(Now a bit of sarcasm:) I look forward to future insights, in other art forms, such as perhaps the writings of Shakespeare or the paintings of Leonardo Da Vinci.
Yeah, doubtful. I think it finds something you will engage with and pushes it over and over again until people get normalized to it.
I think it's more like cold reading from a psychic. It's going to use generalized data about the big identifiers for you, like age and gender, and, as you respond, adjust its answers based on what you gave it.
That's not new or magical in any way. And it can be really wrong about the broad stuff if you don't fit the generic groups it associates with you.
It really just feels like a sales pitch for the middle class to buy more stuff.
Humans aren't static. You don't actually have these secret hidden likes AI can discover, instead, you grow to like the stuff that becomes familiar. You're being trained.
Problem is that none of the algorithms actually care about showing you things you like.
Ads try to sell you things that you wouldn't otherwise buy. Occasionally they may just inform you about a good product you simply didn't know about, but there's more money behind manipulating you into buying bad products just because they've got a brand on them.
And content recommendation algorithms don't care about you either. They care about keeping you on the platform for longer, to look at more ads.
To some degree, that may mean showing you things you like. But it also means showing you things that aggravate you, that shock you. And the latter is considered more effective at keeping users engaged.