External SHA hashes instead? (was: Re: Content length, EOF -- ways to resolve whether we received everything)

Arav K. nothien at uber.space
Sat Oct 31 09:25:09 GMT 2020


On Fri Oct 30, 2020 at 12:34 PM UTC, Nathan Galt wrote:
> If you’re worried whether the occasional big(gish) file transferred
> correctly and don’t want to stand up an HTTP server, have you
> considered publishing SHA-256 or -512 hashes, like one does for
> Linux-distribution .iso files?

The issue with this is twofold:
 * Clients generally don't want the hashes of all files, only the hash
   of the file they are trying to download.  It is better to provide
   just that hash, which can be done per file (either as a MIME type
   parameter or via a dedicated endpoint that takes the file name as
   input).
 * The server would have to keep the hash list constantly up to date
   (or keep it in memory and still keep updating it).  This is all the
   more difficult for servers running CGI scripts, as their output can
   vary between requests, and harder still if servers change their
   output based on client certificates.  It is better to serve the hash
   along with the content itself, as then everything that determines
   the content is known at the time of hashing (e.g. the client
   certificate used to request it, and the state of any databases or
   CGI output that affect the response); a sketch follows below.
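
To make the second point concrete, here is a rough Python sketch of
what serving the hash alongside the content could look like: the
response body is generated first (so any CGI- or client-cert-dependent
output is already fixed), hashed, and the digest attached as a MIME
type parameter in the "20" success header.  The "sha256" parameter
name here is only an illustration, not an established convention.

    import hashlib

    def build_response(body: bytes, mime: str = "text/gemini") -> bytes:
        # Hash the exact bytes this client will receive and attach the
        # digest as a (hypothetical) "sha256" MIME parameter.
        digest = hashlib.sha256(body).hexdigest()
        header = f"20 {mime}; sha256={digest}\r\n"
        return header.encode("utf-8") + body

    if __name__ == "__main__":
        # Pretend this body came from a CGI script and varies with the
        # requesting client cert; the hash covers exactly these bytes.
        body = "# Hello\nGenerated for this request.\n".encode("utf-8")
        print(build_response(body).decode("utf-8", errors="replace"))

The client can then recompute the hash over the bytes it received and
compare, with no separate hash list to maintain on the server.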

~aravk | ~nothien

