Upload progress indicators for fetch?

徘徊边缘 submitted on 2019-11-26 17:25:35

Streams are starting to land in the web platform (https://jakearchibald.com/2016/streams-ftw/) but it's still early days.

Soon you'll be able to provide a stream as the body of a request, but the open question is whether the consumption of that stream relates to bytes uploaded.

In particular, redirects can result in data being retransmitted to the new location, but streams cannot "restart". We can fix this by turning the body into a callback which can be called multiple times, but we need to be sure that exposing the number of redirects isn't a security leak, since it'd be the first time on the platform JS could detect that.

Some are questioning whether it even makes sense to link stream consumption to bytes uploaded.

Long story short: this isn't possible yet, but in future this will be handled either by streams, or some kind of higher-level callback passed into fetch().
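
For what it's worth, streaming request bodies have since started to ship (Chromium requires the duplex: "half" option and an HTTP/2+ connection). Below is a minimal sketch under those assumptions; uploadWithProgress and its parameters are hypothetical names, and the count reflects bytes the browser has pulled from the body, not bytes the server has received.

// Sketch only: assumes a browser with streaming request body support
// (e.g. recent Chromium over HTTP/2, with the `duplex: "half"` option).
function uploadWithProgress(url, blob, onProgress) {
  let sent = 0;
  const body = blob.stream().pipeThrough(new TransformStream({
    transform(chunk, controller) {
      sent += chunk.byteLength;        // bytes pulled from the stream so far
      onProgress(sent / blob.size);    // rough progress fraction
      controller.enqueue(chunk);       // pass the chunk on to the request
    }
  }));
  return fetch(url, { method: "POST", body, duplex: "half" });
}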

My solution is to use axios, which supports this pretty well (in the browser its onUploadProgress callback is backed by XMLHttpRequest's upload progress events, which fetch does not expose):

axios.request({
  method: "post",
  url: "/aaa",
  data: myData,
  onUploadProgress: (p) => {
    console.log(p);
    // this.setState({
    //   fileprogress: p.loaded / p.total
    // })
  }
}).then(data => {
  // this.setState({
  //   fileprogress: 1.0,
  // })
})

I have an example of using this in React on GitHub.
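
Not the GitHub example itself, but a minimal sketch along the same lines, using a React function component and the same "/aaa" endpoint as above (the component and state names here are just for illustration):

// Minimal sketch (not the author's GitHub example): a function component
// driving a <progress> element from axios's onUploadProgress callback.
import axios from "axios";
import { useState } from "react";

export function UploadForm() {
  const [fileprogress, setFileprogress] = useState(0);

  const onChange = (event) => {
    const data = new FormData();
    data.append("file", event.target.files[0]);
    axios.post("/aaa", data, {
      onUploadProgress: (p) => setFileprogress(p.loaded / p.total),
    }).then(() => setFileprogress(1.0));
  };

  return (
    <div>
      <input type="file" onChange={onChange} />
      <progress value={fileprogress} max="1" />
    </div>
  );
}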

Bergi

I don't think it's possible. The draft states:

it is currently lacking [in comparison to XHR] when it comes to request progression


(old answer:)
The first example in the Fetch API chapter gives some insight into how to do this:

If you want to receive the body data progressively:

function consume(reader) {
  var total = 0
  return new Promise((resolve, reject) => {
    function pump() {
      reader.read().then(({done, value}) => {
        if (done) {
          resolve()
          return
        }
        total += value.byteLength
        log(`received ${value.byteLength} bytes (${total} bytes in total)`)
        pump()
      }).catch(reject)
    }
    pump()
  })
}

fetch("/music/pk/altes-kamuffel.flac")
  .then(res => consume(res.body.getReader()))
  .then(() => log("consumed the entire body without keeping the whole thing in memory!"))
  .catch(e => log("something went wrong: " + e))

Apart from their use of the Promise constructor antipattern, you can see that response.body is a Stream from which you can read chunk by chunk using a Reader, and you can fire an event or do whatever you like (e.g. log the progress) for each of them.
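
For illustration, the same loop can be written without the Promise constructor by chaining the reader's own promises (a sketch, not part of the spec example):

// Same idea as the spec's consume(), but returning the chained read promises
// directly instead of wrapping them in a new Promise.
function consume(reader) {
  let total = 0;
  return reader.read().then(function process({ done, value }) {
    if (done) return;
    total += value.byteLength;
    console.log(`received ${value.byteLength} bytes (${total} bytes in total)`);
    return reader.read().then(process);
  });
}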

However, the Streams spec doesn't appear to be quite finished, and I have no idea whether this already works in any fetch implementation.

Shishir Arora

Since none of the answers solve the problem completely:

Just for implementation's sake, you can measure the upload speed by first sending a small chunk of known size, and then estimate the total upload time as content-length / upload-speed. You can use that estimate to drive the indicator.
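
A rough sketch of that idea (the "/probe" endpoint and the 64 KiB probe size are arbitrary choices for illustration):

// Time a small probe upload, derive bytes per second, then estimate how long
// the full payload will take; a timer can tick a progress bar against this.
async function estimateUploadSeconds(fullSizeBytes) {
  const probe = new Blob([new Uint8Array(64 * 1024)]);   // 64 KiB of zeros
  const start = performance.now();
  await fetch("/probe", { method: "POST", body: probe });
  const elapsedSeconds = (performance.now() - start) / 1000;
  const bytesPerSecond = probe.size / elapsedSeconds;
  return fullSizeBytes / bytesPerSecond;
}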

A possible workaround would be to utilize the new Request() constructor, then check the Request.bodyUsed Boolean attribute

The bodyUsed attribute’s getter must return true if disturbed, and false otherwise.

to determine whether the stream is disturbed

An object implementing the Body mixin is said to be disturbed if body is non-null and its stream is disturbed.

Return the fetch() Promise from within .then() chained to a recursive .read() call of a ReadableStream once Request.bodyUsed is equal to true.

Note that the approach does not read the bytes of Request.body as they are streamed to the endpoint. Also, the upload could complete well before any response is returned in full to the browser.

const [input, progress, label] = [
  document.querySelector("input")
  , document.querySelector("progress")
  , document.querySelector("label")
];

const url = "/path/to/server/";

input.onmousedown = () => {
  label.innerHTML = "";
  progress.value = "0"
};

input.onchange = (event) => {

  const file = event.target.files[0];
  const filename = file.name;
  progress.max = file.size;

  const request = new Request(url, {
    method: "POST",
    body: file,
    cache: "no-store"
  });

  const upload = settings => fetch(settings);

  const uploadProgress = new ReadableStream({
    start(controller) {
      console.log("starting upload, request.bodyUsed:", request.bodyUsed);
      controller.enqueue(request.bodyUsed);
    },
    pull(controller) {
      controller.enqueue(request.bodyUsed);
      console.log("pull, request.bodyUsed:", request.bodyUsed);
      if (request.bodyUsed) {
        // close only after the final enqueue; enqueueing after close() throws
        controller.close();
      }
    },
    cancel(reason) {
      console.log(reason);
    }
  });

  const [fileUpload, reader] = [
    upload(request)
    .catch(e => {
      reader.cancel();
      throw e
    })
    , uploadProgress.getReader()
  ];

  const processUploadRequest = ({value, done}) => {
    if (value || done) {
      console.log("upload complete, request.bodyUsed:", request.bodyUsed);
      // set `progress.value` to `progress.max` here 
      // if not awaiting server response
      // progress.value = progress.max;
      return reader.closed.then(() => fileUpload);
    }
    console.log("upload progress:", value);
    progress.value = +progress.value + 1;
    return reader.read().then(result => processUploadRequest(result));
  };

  reader.read().then(({value, done}) => processUploadRequest({value,done}))
  .then(response => response.text())
  .then(text => {
    console.log("response:", text);
    progress.value = progress.max;
    input.value = "";
  })
  .catch(err => console.log("upload error:", err));

}

The following snippet, from the article linked below, tracks download progress by reading the response body:

const response = await fetch(url);
const total = Number(response.headers.get('content-length'));

const reader = response.body.getReader();
let bytesReceived = 0;
while (true) {
    const result = await reader.read();
    if (result.done) {
        console.log('Fetch complete');
        break;
    }
    bytesReceived += result.value.length;
    console.log('Received', bytesReceived, 'bytes of data so far');
}

thanks to this link: https://jakearchibald.com/2016/streams-ftw/

Leon Gilyadov
const req = await fetch('./foo.json');
const total = Number(req.headers.get('content-length'));
let loaded = 0;
// Note: the response body stream itself is async-iterable in recent browsers
// (and in Node/Deno); a reader obtained via getReader() is not.
for await (const { length } of req.body) {
  loaded += length;
  const progress = ((loaded / total) * 100).toFixed(2); // two digits after the decimal point
  console.log(`${progress}%`); // or yourDiv.textContent = `${progress}%`;
}

The key part is the ReadableStream obtained from obj_response.body.

Sample:

let parse = result => {
  console.log(result);
  // ...
  return result.value ? true : false; // keep reading while there is data
};

fetch('').then(response => {
  const reader = response.body.getReader();
  const pump = () => reader.read().then(parse).then(more => more ? pump() : undefined);
  return pump();
});

You can test it by running it on a huge page, e.g. https://html.spec.whatwg.org/ or https://html.spec.whatwg.org/print.pdf. Press Ctrl+Shift+J to open the console and paste the code in.

(Tested on Chrome.)
