Mozilla recently removed every version of uBlock Origin Lite from their add-on store except for the oldest version.

Mozilla says a manual review flagged these issues:

Consent, specifically Nonexistent: For add-ons that collect or transmit user data, the user must be informed…

Your add-on contains minified, concatenated or otherwise machine-generated code. You need to provide the original sources…

uBlock Origin’s developer gorhill refutes this, with linked evidence:

Contrary to what these emails suggest, the source code files highlighted in the email:

  • Have nothing to do with data collection, there is no such thing anywhere in uBOL
  • There is no minified code in uBOL, and certainly none in the supposed faulty files

Even for people who did not use this add-on, the removal could have a chilling effect on uBlock Origin itself.

Incidentally, all the files reported as having issues are exactly the same files being used in uBO for years, and have been used in uBOL as well for over a year with no modification. Given this, it’s worrisome what could happen to uBO in the future.

And gorhill notes uBO Lite had a purpose on Firefox, especially on mobile devices:

[T]here were people who preferred the Lite approach of uBOL, which was designed from the ground up to be an efficient suspendable extension, thus a good match for Firefox for Android.

New releases of uBO Lite no longer include a Firefox version; the last release that did coincides with gorhill’s message. The Firefox add-on page for uBO Lite is also gone.

  • @abbenm@lemmy.ml

I’ve noticed the same thing you have, but I suspect it has a different explanation. I think it’s more an echo chamber thing. People have said variations of this for a while now in HN comment threads, on Reddit, and here. And there’s a snowball effect from more people saying it.

But there’s been a throughline of bizarrely apathetic, insubstantial, low-effort comments. That’s the one thing that has tied them together, which is why I think they are echo-chambery. Just for one example: one guy had never read a 990 before (a standard nonprofit form), read Mozilla’s, thought it was a conspiracy, and wrote an anti-Mozilla blog post. Then someone linked to that on Lemmy and said it was shady finances. Tons of upvotes.

    But I’m convinced that no one reads through these links, including the people posting them. Because it takes two seconds to realize they are nonsense. But it doesn’t stop them from getting upvoted.

    So my theory is echo chamber.

    • zkfcfbzr

      I think it’s probably a combination of both. There’s an astroturfing campaign going on somewhere, just not on Lemmy, which is overall too small and insignificant to target. But astroturfing works - it creates the echo chambers you’re talking about, it creates apathy. Most people just read headlines, not even the comments. You read a bad story about Mozilla once a week and you’ll start to internalize it - eventually your opinion of Mozilla will drop, justified or not, to the point where you’re willing to believe even the more heinous theories about it.

      So you end up with a lot of people who’ve been fed a lot of misleading half-truths and even some outright lies, who are now getting angry enough about the situation they think is going on to start actively posting anti-Mozilla posts and comments on their own.

      • @abbenm@lemmy.ml

        Right - I think either way there’s a snowballing effect. Astroturfing, at least as far as I can tell, can be notable for at least trying to make coherent arguments. Echo chambers I would say are characterized by fuzzy thinking, and I’ve seen more of the latter here (especially in this thread).

        That said, sometimes the goal of astroturfing isn’t to make a point but to degrade conversations with noise and nonsense, extrapolations and digressions. In light of that, I suppose that too could explain some of what we’re seeing.

    • AwkwardLookMonkeyPuppet

You just described why astroturfing and social engineering are so effective. Most people don’t check. So someone can post straight-up nonsense and still influence millions of people’s opinions.