Assuming disallow-all, and some research on robots.txt in Geminispace (Was: Re: robots.txt for Gemini formalised)
Björn Wärmedal
bjorn.warmedal at gmail.com
Thu Nov 26 07:32:01 GMT 2020
> > Google's "cached" pages system is essentially an archive under a different name. Is it truly a leap of logic if even a court of law comes to the same decision?
>
> Yes, the court clearly made a leap in logic. Courts don't always follow logic,
> because it's not efficient to do so.
Courts don't always follow logic, but they often follow precedent.
> Anyways, if I find any site archiving any of the stuff from my server, I'll be
> looking into DMCA takedowns, because I don't tolerate utter disrespect for users'
> content like that. It's disgusting.
And a DMCA takedown notice is a legal measure, which needs a legal
footing. It would get that footing from robots.txt files, as
established in precedent.
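
For reference, the "disallow-all" assumption in the subject line would
amount to a server answering requests for /robots.txt with something
like the sketch below. This assumes the classic robots.txt syntax
carries over to Geminispace; the exact user-agent tokens crawlers honour
there may differ.

  # Hypothetical disallow-all robots.txt: no crawler may fetch anything.
  User-agent: *
  Disallow: /

A crawler or archiver that respects this file would not fetch or store
any content from the capsule, which is what gives the file weight as
evidence of the author's intent.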