External SHA hashes instead?
Jason McBrayer
jmcbray at carcosa.net
Mon Nov 2 14:37:11 GMT 2020
"Arav K." <nothien at uber.space> writes:
> On Fri Oct 30, 2020 at 12:34 PM UTC, Nathan Galt wrote:
>> If you’re worried whether the occasional big(gish) file transferred
>> correctly and don't want to stand up an HTTP server, have you
>> considered publishing SHA-256 or -512 hashes, like one does for
>> Linux-distribution .iso files?
>
> The issue with this is twofold:
> * Clients generally don't want the hashes of all files, only of the
>   one they are trying to download. It is better to provide only this
>   hash, which can be done for individual files (either as a MIME-type
>   parameter or from a special access point which takes the file name
>   as input).
> * The server would have to constantly update the hash list (or keep it
>   in memory and still keep updating it). This is all the more
>   difficult for servers running CGI scripts, as their output can vary.
>   Further complications arise if servers change their output based on
>   client certs. Better to serve the hash along with the content
>   itself, as then all the details about the content are known (e.g. the
>   client cert used to request the content, and the state of databases
>   and CGI output which affect the response).
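
(To make the quoted "serve the hash along with the content" idea concrete,
here is a minimal sketch of a CGI-style Gemini responder in Python; the
sha256 MIME parameter, the file name, and the media type are illustrative
assumptions, not anything defined by the Gemini spec.)

#!/usr/bin/env python3
# Sketch only: serve a file over Gemini CGI and advertise its SHA-256
# digest as a MIME-type parameter. The "sha256" parameter name and the
# file path are hypothetical, not part of any specification.
import hashlib
import sys

PATH = "episode-01.ogg"  # hypothetical large file

with open(PATH, "rb") as f:
    body = f.read()

digest = hashlib.sha256(body).hexdigest()

# A Gemini success response is "20 <meta>\r\n" followed by the body.
sys.stdout.write(f"20 audio/ogg; sha256={digest}\r\n")
sys.stdout.flush()
sys.stdout.buffer.write(body)
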
I believe the suggestion was 1) to do this only for known large files
(like the audio files on konpeito.media) and 2) to provide hashes and/or
signatures for the purpose of manual, not automatic, validation.
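
(For the manual case, client-side verification would be roughly the
following sketch; both file names are hypothetical, and the published
.sha256 file is assumed to hold the hex digest sha256sum-style.)

#!/usr/bin/env python3
# Sketch only: manually check a downloaded file against a published
# SHA-256 hash, as one does with Linux-distribution .iso files.
# Both file names below are hypothetical.
import hashlib

DOWNLOADED = "episode-01.ogg"
PUBLISHED = "episode-01.ogg.sha256"  # assumed to contain the hex digest

h = hashlib.sha256()
with open(DOWNLOADED, "rb") as f:
    # Hash in chunks so large files don't have to fit in memory.
    for chunk in iter(lambda: f.read(1 << 16), b""):
        h.update(chunk)

with open(PUBLISHED) as f:
    expected = f.read().split()[0]  # first field, sha256sum-style

print("OK" if h.hexdigest() == expected else "MISMATCH")
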
--
+-----------------------------------------------------------+
| Jason F. McBrayer                  jmcbray at carcosa.net |
| A flower falls, even though we love it; and a weed grows, |
| even though we do not love it.                   -- Dogen |