• interdimensionalmeme@lemmy.ml
    26 days ago

    Yeah, but there wouldn’t be scrapers if the robots file just pointed to a dump file.

    Then the scraper could just spot-check a few dozen random pages, verify the dump is actually up to date and complete, and then it’d know it doesn’t need to waste any time there and can move on.
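    The spot-check idea could be sketched roughly like this (a hypothetical illustration, not any real crawler’s code — `spot_check` and the dict-backed `fetch_live` callable are made up for the example):

    ```python
    import random

    def spot_check(dump: dict, fetch_live, sample_size: int = 30) -> bool:
        """Check whether a published site dump is current by fetching a
        random sample of its pages live and comparing the contents."""
        sample = random.sample(sorted(dump), min(sample_size, len(dump)))
        return all(dump[url] == fetch_live(url) for url in sample)

    # Simulated example: the live site matches the dump, so the
    # scraper can trust the dump and skip crawling the site.
    dump = {"/a": "page a", "/b": "page b", "/c": "page c"}
    print(spot_check(dump, dump.get, sample_size=3))
    ```

    If any sampled page differs, the dump is stale and the scraper falls back to crawling normally.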