• @OldWoodFrame@lemm.ee
    8 points · 3 months ago

    We don’t dislike government censorship of CSAM. It’s all a spectrum based on the legitimacy of the government order and the legitimacy of the tech billionaire’s refusal to comply.

    • @sugar_in_your_tea@sh.itjust.works
      3 points · 3 months ago

      Honestly, while I think CSAM is disgusting, I am kind of against government censorship of it. Some laws go so far as to ban anything resembling CSAM, including imagery that looks like it but doesn’t actually involve a real child. The problem is the abuse required to create it, but if that abuse didn’t happen, there is no victim, and it should therefore be completely legal.

      The same goes for free speech more broadly. The speech itself should never be illegal, but it should be usable as evidence of another crime. A threat of violence is the crime, and that should be prosecuted, but that shouldn’t mean the government can force the host to censor the speech; that should be at the host’s discretion. What the government can do is subpoena information relevant to the investigation, but IMO it shouldn’t compel any entity to remove content.

      That said, Brazilian law isn’t the same as US law, and X and SpaceX should respect the laws of all of the countries in which they operate.

      • @Cryophilia@lemmy.world
        3 points · 3 months ago

        That’s…actually a pretty reasonable take. Fuck Musk, but you’ve convinced me that government censorship is just a bad thing in general, and that should apply to Musk as much as anyone else.

        I do think there’s a counterargument to be made: the resources involved in setting up fake accounts to spread bullshit are trivial compared to the resources required to track down and prosecute account owners for crimes, so in a practical sense banning accounts is possibly the only thing one can do (especially if the account owners are foreign). If you give lies the same freedom as truth, you tend to end up with 10 lies for every truth.

        • @Omniraptor@lemm.ee
          1 point · edited · 3 months ago

          OP’s take is not reasonable IMO: if you think threats are harmful enough to prosecute, they should also be harmful enough to censor.

          Maybe a softer form of censorship, such as hiding them behind a content warning and a “user was vanned for this post” label rather than outright removal, but you can’t just do nothing.

          • @Cryophilia@lemmy.world
            2 points · 3 months ago

            Prosecution implies a trial before punishment. Censorship is immediate punishment based solely on the judgment of the authorities. That’s not a minor difference.

            • @sugar_in_your_tea@sh.itjust.works
              1 point · 3 months ago

              Exactly. If a judge rules that an individual is no longer allowed on social media, then I absolutely understand banning the account and removing their posts. However, until justice has been served, it’s 100% the platform’s call, and I think platforms should err on the side of allowing speech.

              • @Cryophilia@lemmy.world
                2 points · 3 months ago

                I realize I’m jumping back and forth between sides here, but that’s because it’s a complex problem and I haven’t made up my mind. That said, to return to the previous point: if you need a court order to ban every spammer and troll, you’ll drown in spam and propaganda. The legal system can’t keep up.

                • @sugar_in_your_tea@sh.itjust.works
                  3 points · 3 months ago

                  I’m not saying companies should need a court order to remove content, only that they should be obligated to remove it only by court order, and ideally they’d lean toward keeping content rather than removing it. I think it’s generally better for platforms to let users hide content they don’t want to see instead of removing it outright for everyone. One person’s independent journalism is another person’s propaganda, and I generally don’t trust big tech companies with agendas to decide between the two.

    • @flashgnash@lemm.ee
      1 point · 3 months ago

      I’m willing to bet the people the government wanted banned were not in fact posting CSAM. I’m pretty sure even X would ban them of its own volition pretty quickly if they were doing that.

      • @OldWoodFrame@lemm.ee
        2 points · 3 months ago

        They weren’t; it was just the example at the furthest end of the spectrum. But your framing of “if it was REALLY bad, Twitter would ban it” cannot be the solution. We have legitimate governments tasked with governing based on the will of the people; it’s not better to just let Elon Musk or Mark Zuckerberg decide the law.

        • @flashgnash@lemm.ee
          1 point · 3 months ago

          They would ban it if it was really bad, because it’s illegal for that stuff to exist and they would face much more serious issues as a company if they didn’t remove it. They’re not doing it out of the goodness of their hearts.

          It’s also not a good look, PR-wise, for a company to be hosting that stuff in general, and PR is determined entirely by the general population’s reaction to their actions, not by a small group of individuals in powerful positions.