• @General_Effort@lemmy.world
    101 · 4 months ago

    [French media] said the investigation was focused on a lack of moderators on Telegram, and that police considered that this situation allowed criminal activity to go on undeterred on the messaging app.

    Europe defending its citizens against the tech giants, I’m sure.

    • @RedditWanderer@lemmy.world
      68 · edited · 4 months ago

      There’s a lot of really, really dark shit on Telegram, that’s for sure, and it’s not like Signal, where they’re just a provider. They do have control over the content.

          • southsamurai
            7 · 4 months ago

            You’re young. It really was a thing. It never stayed up long, and they found ways to make takedowns essentially instantaneous, but there was a time when it was easy to find very unpleasant things on Facebook, whether you wanted to or not. Gore in particular was easy to run across at one point. With CP, it was more offers to sell it.

            They fixed it, and it isn’t like that now, but it was a problem in the first year or two.

            • sunzu2
              7 · 4 months ago

              And there are still informal networks of pedos and other pests operating on these platforms to this day.

            • @RedditWanderer@lemmy.world
              -13 · 4 months ago

              Haha, young? I wish. But go on making stuff up.

              So now it’s not that it’s readily available, it’s that it was in the beginning. So everyone is allowed to let CP go in the first years of their platform? Is that what you’re going with? Eww.

        • @Kecessa@sh.itjust.works
          6 · 4 months ago

          So you don’t see the difference between platforms that actually have measures in place to try and prevent it and platforms that intentionally don’t?

          Man, Lemmings must be even dumber than Redditors or something

    • chiisana
      26 · 4 months ago

      Safe-harbour-equivalent rules should apply, no? That is, a platform should not be held liable as long as it does not permit illegal activity, offers a proper reporting mechanism, and has documented workflows to investigate and act on reported activity.

      It feels like a slippery slope to arrest people on suspicion (until proven otherwise) of a lack of moderation.

      • @rottingleaf@lemmy.world
        5 · 4 months ago

        Telegram does moderation of political content they don’t like.

        Also Telegram does have means to control whatever they want.

        And sometimes they also hide certain content from select regions.

        Thus, if they make such decisions, then apparently CP and the like are in their interest. Maybe to collect blackmail material for some intelligence services (Durov went to France from Baku; Azerbaijan is friendly with Israel, and Mossad has even been suspected of being connected to the Epstein operation), maybe just for profit.

        • chiisana
          4 · 4 months ago

          I don’t know how they manage their platform (I don’t use it, so it’s irrelevant to me personally), but was this proven anywhere in a court of law?