robots.txt for Gemini formalised
Solderpunk
solderpunk at posteo.net
Tue Nov 24 19:03:28 GMT 2020
On Tue Nov 24, 2020 at 3:07 PM CET, Drew DeVault wrote:
> Web portals are users, plain and simple. Anyone who blocks a web portal
> is blocking legitimate users who are engaging in legitimate activity.
> This is a dick move and I won't stand up for anyone who does it.
This has actually long been a bit of a contentious point in the
Gopherverse, and we have inherited some of that controversy, if I
remember much earlier discussions accurately. There are some people
(a vocal minority? I'm not sure) who feel that public web proxies
exposing their Gopherhole/capsule to the entire browser-using world are
negating the agency they exercised in very deliberately putting some
content up only on Gopher/Gemini and not on the web. Web proxies force
them to be visible in (and linkable from) a space that they are actively
trying not to participate in.
While I am aware of the ultimate futility of trying to control where
publicly served online content ends up, I have some sympathy for this
perspective (perhaps even more so now that we have very nice tools like
your own Kineto by which people who *do* want their content to be
accessible from a browser can achieve this easily). When the first web
portals for Gemini turned up, some people expressed interest in being
able to opt out, to keep their Gemini-only content truly Gemini-only,
and at least one of those early web portals (portal.mozz.us) agreed to
respect those wishes. The webproxy user agent I put into the first
robots.txt draft is actually just codifying what portal.mozz.us has
already been doing for many months. I did not expect its inclusion to
be so controversial. I *did* try to word it carefully so that personal
webproxies which, e.g., run on a user's local machine and are not
publicly accessible need not abide by robots.txt, as those are really
just roundabout Gemini clients.
Cheers,
Solderpunk