
YouTube Lets Users Override Recommendations After Criticism (bloomberg.com) 91

YouTube said it will let users override automated recommendations after criticism over how the online video service suggests and filters toxic clips. From a report: "Although we try our best to suggest videos you'll enjoy, we don't always get it right, so we are giving you more controls for when we don't," Essam El-Dardiry, a product manager at YouTube, wrote in a blog post on Wednesday. Users will now be able to tell YouTube to stop suggesting videos from a particular channel by tapping the three-dot menu next to a video on the homepage or in the Up Next list, then choosing "Don't recommend channel." After that, viewers should no longer see videos from that channel, El-Dardiry said. The move comes after Susan Wojcicki and other YouTube executives were criticized for being either unable or unwilling to act on internal warnings about extreme and misleading videos because they were too focused on increasing viewing time and other measures of engagement.
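Mechanically, the promised behavior amounts to keeping a per-user block list and filtering candidate recommendations against it. The sketch below is a hypothetical illustration only, not YouTube's actual implementation; every name in it is made up:

```python
# Hypothetical illustration only -- NOT YouTube's actual code; all names are made up.
# Models "Don't recommend channel" as a per-user block list applied as a final
# filter over whatever candidates the ranking system produced.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Video:
    video_id: str
    channel_id: str
    title: str


@dataclass
class UserPrefs:
    blocked_channels: set = field(default_factory=set)

    def dont_recommend_channel(self, channel_id: str) -> None:
        """Record the user's 'Don't recommend channel' choice from the three-dot menu."""
        self.blocked_channels.add(channel_id)


def filter_recommendations(candidates, prefs):
    """Drop every candidate whose channel the user has blocked."""
    return [v for v in candidates if v.channel_id not in prefs.blocked_channels]


prefs = UserPrefs()
prefs.dont_recommend_channel("UC_example_blocked")

candidates = [
    Video("a1", "UC_example_blocked", "Clip from the blocked channel"),
    Video("b2", "UC_example_other", "Clip from some other channel"),
]
print([v.title for v in filter_recommendations(candidates, prefs)])
# -> ['Clip from some other channel']
```

The real system presumably also feeds the signal back into ranking, but a hard filter is the simplest way to picture the guarantee that videos from a blocked channel no longer appear.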
  • by AndyTheCoderMan ( 1557467 ) on Wednesday June 26, 2019 @03:24PM (#58829950)
    They already have that feature in the "tell us why" prompt that appears when you click to remove a recommendation. What we need is to see what they decided not to recommend, and to tell them that we do want to see that kind of content.
    • by EvilSS ( 557649 ) on Wednesday June 26, 2019 @03:51PM (#58830148)
      They have that on some of their ads too. Pretty sure it's been a placebo feature, though; it never seemed to work for either YouTube videos or ads for me.
    • by lgw ( 121541 )

      Oh, wow, I never knew that was there! Thanks. Somehow watching a lot of science-related videos keeps putting flat earth stuff in my recommendation list, which was funny for about a day.

    • by AmiMoJo ( 196126 )

      I wish YouTube would offer more variety with recommendations. It's less about discovery and more about giving you more of the same.

      • by Mashiki ( 184564 )

        I wish YouTube would offer more variety with recommendations. It's less about discovery and more about giving you more of the same.

        Recommendations are based on your history. So if you're watching a pile of progressive junk, you get more progressive junk; same if you're watching Republican or conservative stuff. And if I'm listening to Infected Mushroom, I get more psytrance recommendations too. The problem is that while this works very well for music and 'raw' information, it doesn't work well for anything related to politics. And most people are likely to ignore political recommendations because YouTube decided to start being an arbiter of truth.
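The "more of the same" dynamic described above can be pictured as weighted sampling over watch-history categories. This is only a toy model with invented data, not how YouTube's recommender actually works:

```python
# Toy sketch of the "more of the same" effect -- purely illustrative,
# nothing like YouTube's real recommender; categories and titles are made up.
import random
from collections import Counter


def recommend(watch_history, catalog, k=5):
    """Pick k videos, weighting each category by how often it shows up in the history."""
    seen = Counter(watch_history)                     # e.g. {'psytrance': 9, 'science': 1}
    categories = list(catalog)
    picked = random.choices(
        categories,
        weights=[seen[c] + 1 for c in categories],    # +1 so unseen categories stay possible
        k=k,
    )
    return [random.choice(catalog[c]) for c in picked]


history = ["psytrance"] * 9 + ["science"]
catalog = {
    "psytrance": ["Infected Mushroom set", "Another psytrance mix"],
    "science": ["Physics lecture", "Lab demo"],
    "politics": ["Cable news clip"],
}
print(recommend(history, catalog))   # output skews heavily toward psytrance
```

A history that is 90% psytrance produces a recommendation list that is overwhelmingly psytrance, and the loop only tightens as those recommendations get watched.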

  • Will filter out CNN, Fox, MSNBC, The Hill, and all major studios the moment this becomes available.

    • by Tailhook ( 98486 )

      Maybe this means we get small creators again

      No, what it will mean is that whatever you end up not filtering out will feel the ban hammer next.

  • I want the algorithm to show me certain videos I like from a particular creator. For example, I like Pewdiepie's Pew News videos but not his meme reviews.
  • Toxic Clips (Score:4, Insightful)

    by nwaack ( 3482871 ) on Wednesday June 26, 2019 @03:34PM (#58830008)
    While I'm happy they're giving us more control over our recommendations, I don't really see how this has much to do with toxic clips. Like the first post said, I'd rather see which "toxic clips" they decided NOT to show me (i.e. the ones our YouTube overlords have decided are wrongthink) so I can tell them which of those I'd actually like to see.
    • Re:Toxic Clips (Score:5, Insightful)

      by techsoldaten ( 309296 ) on Wednesday June 26, 2019 @03:38PM (#58830042) Journal

      Yeah, this new feature doesn't seem to have much to do with the problem - Google trying to influence elections.

      Where this probably leads is a new way of brigading certain channels and ensuring they don't appear in recommendations at all because "users don't want them."

      • Everyone is trying to influence elections. It's called political donations. Why should we be upset at Google for clearly being biased when every single major news network or media company has been since time immemorial?

        It was stupid when people were trying to claim that it was the Russians trying to influence our elections and it's just as stupid when people are complaining about Google doing it.
        • by lgw ( 121541 )

          Everyone is trying to influence elections. It's called political donations. Why should we be upset at Google for clearly being biased when every single major news network or media company has been since time immemorial.

          One clear reason. Google benefits from being a neutral platform under Section 230 of the CDA. Once they start trying to influence elections, they are obviously no longer neutral. That means they should be liable for what they choose to publish on their platform.

          Toxic videos should be met with lawsuits against Google for publishing that toxic crap. Any libel or defamation in any video, sue Google!

          • by DRJlaw ( 946416 )

            Once they start trying to influence elections, they are obviously no longer neutral. That means they should be liable for what they choose to publish on their platform.

            But they're not [slate.com].

            Any libel or defamation in any video, sue Google!

            And lose.

            • by lgw ( 121541 )

              Keep telling yourself that. That's the solution to finding yourself in a hole: just keep digging.

              • by DRJlaw ( 946416 )

                Keep telling yourself that

                Will do. Keep telling yourself that 2018 was a fluke.

                That's the solution to finding yourself in a hole: just keep digging.

                Spoken by someone who's never watched an excavator dig a foundation. Hint: dig sideways.

                • by lgw ( 121541 )

                  The Dems' victory in 2020 is absolutely as assured as Hillary's was in 2016. The polls already show it!

        • Re:Toxic Clips (Score:5, Informative)

          by oh_my_080980980 ( 773867 ) on Wednesday June 26, 2019 @04:28PM (#58830428)
          Google claims they don't do it. That's the problem: people assume what you get from Google are unfiltered results, and Google pushes that myth.
      • From townhall.com

        Jen Gennai is the head of “Responsible Innovation” for Google, a sector that monitors and evaluates the responsible implementation of Artificial Intelligence (AI) technologies. In the video, Gennai says Google has been working diligently to “prevent” the results of the 2016 election from repeating in 2020:

        “We all got screwed over in 2016, again it wasn’t just us, it was, the people got screwed over, the news media got screwed over, like, everybody got screwed over so we’re rapidly been like, what happened there and how do we prevent it from happening again.”

        “We’re also training our algorithms, like, if 2016 happened again, would we have, would the outcome be different?”

        “Elizabeth Warren is saying we should break up Google,” she said, referring to the Massachusetts Democrat’s wish to break up big tech. The senator wants to impose new rules on tech companies with $25 billion or more in annual ad revenue, forcing Amazon and Google to dramatically reduce their hold on online commerce.

        “And like, I love her but she’s very misguided, like that will not make it better it will make it worse, because all these smaller companies who don’t have the same resources that we do will be charged with preventing the next Trump situation, it’s like a small company cannot do that,” Gennai added.

      • Yeah, this new feature doesn't seem to have much to do with the problem - Google trying to influence elections.

        I thought the problem was that Google's recommendation engine is a one-way ratchet to extremist videos. Pick any topic, start following recommendations, and within an hour or two you're down some rabbit hole you never would have believed could exist.

  • Like, this video sucks. I want to block this user's content but there's no way to do it directly from this page.

    Instead I have to look through the list of recommended-content thumbnails for more from this user, expand the three-dot menu, and then select it from there.

    Seriously, what is so difficult about providing easy access to a feature where users are most likely to want it and use it?

  • by nitehawk214 ( 222219 ) on Wednesday June 26, 2019 @03:53PM (#58830162)

    So no more of that "anime memes" suggestion or whatever weeb bullshit everyone seemed to get.

  • by Elfboy ( 144703 ) on Wednesday June 26, 2019 @04:13PM (#58830296)

    Can I now stop all those freaking Cobra Kai ads every 3rd video??!?

  • But you're still editing content and acting as intellectual gatekeepers.

    That's not your job, and I truly hope the Justice Department shatters you to hell.

  • This has been a feature for a bare minimum of 3 years. What the hell?
  • If you use uBlock Origin, you can individually click on the recommendations that pop up near the end of the video and just block them. There are about fifteen elements to block, but after you do that all the annoying recommendations stop (they are annoying because they block the end of the video). A couple of example filters follow below.
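For reference, the same end-of-video overlays can usually be hidden with a couple of static cosmetic filters added under uBlock Origin's "My filters" tab, instead of picking the roughly fifteen elements one by one. The class names below are common examples only and may change as YouTube updates its player:

```
! Hide the end-of-video recommendation overlays.
! Selectors are examples only; YouTube's class names change over time.
www.youtube.com##.ytp-ce-element
www.youtube.com##.videowall-endscreen
```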
