• Juki
    15 · 1 year ago

    How much of that is a workaround to feed client-rendered webpages into LLMs and bypass robots.txt, etc.?

    • @Buddahriffic@lemmy.world
      3 · 1 year ago

      I mean, if you want to go around what the site wants you to do, you can just ignore robots.txt. Or use it to find the juicy stuff the site would rather you didn’t see.
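
      That second point works because robots.txt is purely advisory: a crawler can ignore it, and its Disallow lines amount to a list of paths the site would prefer stayed out of search results. A minimal sketch (the robots.txt body here is hypothetical example data):

      ```python
      # Sketch: extracting Disallow entries from a robots.txt body.
      # robots.txt is advisory only -- a scraper can ignore it, which also means
      # its Disallow lines double as a map of paths the site would rather hide.

      ROBOTS_TXT = """\
      User-agent: *
      Disallow: /admin/
      Disallow: /private/reports/
      Allow: /public/
      """

      def disallowed_paths(robots_txt: str) -> list[str]:
          """Return the paths listed under Disallow directives."""
          paths = []
          for line in robots_txt.splitlines():
              line = line.split("#", 1)[0].strip()  # drop trailing comments
              if line.lower().startswith("disallow:"):
                  path = line.split(":", 1)[1].strip()
                  if path:  # an empty Disallow means "allow everything"
                      paths.append(path)
          return paths

      print(disallowed_paths(ROBOTS_TXT))  # ['/admin/', '/private/reports/']
      ```

      A polite crawler would feed the same file to Python's stdlib `urllib.robotparser` and honor it; nothing forces either behavior.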