Question
The Dropbox REST API's metadata call has a parameter named "hash": https://www.dropbox.com/developers/reference/api#metadata
Can I calculate this hash locally, without calling any remote REST API function?
I need to know this value to reduce upload bandwidth.
Answer 1:
https://www.dropbox.com/developers/reference/content-hash explains how Dropbox computes their file hashes. A Python implementation of this is below:
import hashlib

# Dropbox hashes files in 4 MB blocks.
DROPBOX_HASH_CHUNK_SIZE = 4 * 1024 * 1024

def compute_dropbox_hash(filename):
    with open(filename, 'rb') as f:
        block_hashes = b''
        while True:
            chunk = f.read(DROPBOX_HASH_CHUNK_SIZE)
            if not chunk:
                break
            # Hash each 4 MB block individually...
            block_hashes += hashlib.sha256(chunk).digest()
    # ...then hash the concatenation of all the block hashes.
    return hashlib.sha256(block_hashes).hexdigest()
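You could then compare this local value against the content hash the v2 API reports for the uploaded file. The snippet below is a minimal sketch assuming the official dropbox Python SDK, where files_get_metadata returns a FileMetadata carrying a content_hash field; the access token and paths are placeholders:

import dropbox

# 'ACCESS_TOKEN' and both paths below are placeholders for illustration.
dbx = dropbox.Dropbox('ACCESS_TOKEN')
local_hash = compute_dropbox_hash('/path/to/local/file')
remote_hash = dbx.files_get_metadata('/remote/path/to/file').content_hash
if local_hash == remote_hash:
    print('File already up to date, skipping the upload')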
Answer 2:
The "hash" parameter on the metadata call isn't actually the hash of the file, but a hash of the metadata. It's purpose is to save you having to re-download the metadata in your request if it hasn't changed by supplying it during the metadata request. It is not intended to be used as a file hash.
Unfortunately I don't see any way via the Dropbox API to get a hash of the file itself. I think your best bet for reducing your upload bandwidth would be to keep track of the hash's of your files locally and detect if they have changed when determining whether to upload them. Depending on your system you also likely want to keep track of the "rev" (revision) value returned on the metadata request so you can tell whether the version on Dropbox itself has changed.
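A minimal sketch of that bookkeeping, assuming a simple local JSON state file and a plain SHA-256 per file; the state-file name, helper names, and the upload_file callback are hypothetical and not part of the Dropbox API:

import hashlib
import json
import os

STATE_FILE = 'sync_state.json'  # hypothetical local bookkeeping file

def local_sha256(path):
    """Plain SHA-256 of the whole file, used only for local change detection."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b''):
            h.update(chunk)
    return h.hexdigest()

def load_state():
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {}

def save_state(state):
    with open(STATE_FILE, 'w') as f:
        json.dump(state, f, indent=2)

def upload_if_changed(path, upload_file):
    """Upload only when the local hash differs from the last recorded one.

    upload_file(path) is a placeholder for your actual upload call; here it is
    assumed to return the new "rev" reported by Dropbox after the upload.
    """
    state = load_state()
    digest = local_sha256(path)
    if state.get(path, {}).get('sha256') == digest:
        return  # unchanged locally, skip the upload
    rev = upload_file(path)
    state[path] = {'sha256': digest, 'rev': rev}
    save_state(state)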
Answer 3:
This won't directly answer your question, but is meant more as a workaround: the Dropbox SDK ships a simple updown.py example that uses file size and modification time to check whether a local file is up to date.
An abbreviated example taken from updown.py:
import datetime
import os
import time

import dropbox

dbx = dropbox.Dropbox(api_token)
...
# returns a dictionary of name: FileMetadata
listing = list_folder(dbx, folder, subfolder)
# name is the name of the file
md = listing[name]
# fullname is the path of the local file
mtime = os.path.getmtime(fullname)
mtime_dt = datetime.datetime(*time.gmtime(mtime)[:6])
size = os.path.getsize(fullname)
if (isinstance(md, dropbox.files.FileMetadata) and
        mtime_dt == md.client_modified and size == md.size):
    print(name, 'is already synced [stats match]')
Answer 4:
As far as I know, no, you can't. The only way is to use the Dropbox API, which is explained here.
Answer 5:
Node.js / JavaScript version
The following code calculates the hash of a file the same way Dropbox does, and returns it as a hexadecimal string.
const fs = require('fs');
const crypto = require('crypto');

function calcDropboxHash(filePath) {
    return new Promise((resolve, reject) => {
        // Dropbox hashes files in 4 MB blocks; setting highWaterMark makes the
        // stream deliver the file in exactly those block sizes.
        const chunkSize = 4 * 1024 * 1024;
        const readStream = fs.createReadStream(filePath, { highWaterMark: chunkSize });
        const chunkHashes = [];
        readStream.on('data', data => {
            // Hash each 4 MB block individually...
            chunkHashes.push(crypto.createHash('sha256').update(data).digest());
        });
        readStream.on('error', err => {
            reject(err);
        });
        readStream.on('end', () => {
            // ...then hash the concatenation of all the block hashes.
            resolve(crypto.createHash('sha256')
                .update(Buffer.concat(chunkHashes)).digest('hex'));
        });
    });
}
Source: https://stackoverflow.com/questions/13008040/locally-calculate-dropbox-hash-of-files