Load Tensorflow js model from local file system in javascript

Submitted by 白昼怎懂夜的黑 on 2019-11-28 11:43:21

I know you're trying to load your model in a browser but if anybody lands here that's trying to do it in Node, here's how:

const tf = require("@tensorflow/tfjs");
const tfn = require("@tensorflow/tfjs-node");

(async () => {
  // tfjs-node's fileSystem handler reads the model from disk without going through fetch
  const handler = tfn.io.fileSystem("./path/to/your/model.json");
  const model = await tf.loadModel(handler); // tf.loadLayersModel in tfjs >= 1.0
})();

loadModel uses fetch under the hood, and fetch cannot access local files directly; it is meant to retrieve files served by a server (more on this here). To load a local file in the browser, there are two approaches: asking the user to upload the file with

<input type="file"/>

Or serving the file from a server.

In these two scenarios, tf.js provides a way to load the model.

  1. Load the model by asking the user to upload the file

html

<input type="file" id="upload-json"/>
<input type="file" id="upload-weights"/>

js

const uploadJSONInput = document.getElementById('upload-json');
const uploadWeightsInput = document.getElementById('upload-weights');
const model = await tf.loadModel(tf.io.browserFiles(
  [uploadJSONInput.files[0], uploadWeightsInput.files[0]]));
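Since the user may select the files in any order (or through a single multi-file input), it can help to pick the topology JSON and the weight shards out of the selection by name before calling the loader. A minimal sketch — the `pickModelFiles` helper and the multi-file input are assumptions for illustration, not part of the tf.js API:

```javascript
// Hypothetical helper: given a FileList (or array) from <input type="file" multiple/>,
// split it into the model topology (.json) and the weight shards (.bin).
function pickModelFiles(files) {
  const list = Array.from(files);
  const json = list.find(function (f) { return f.name.endsWith('.json'); });
  const weights = list.filter(function (f) { return f.name.endsWith('.bin'); });
  if (!json) {
    throw new Error('No model.json selected');
  }
  return { json: json, weights: weights };
}

// Usage in the browser (assumes <input type="file" id="upload" multiple/>):
// const picked = pickModelFiles(document.getElementById('upload').files);
// const model = await tf.loadModel(
//   tf.io.browserFiles([picked.json].concat(picked.weights)));
```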
  2. Serving the local files using a server

To do so, one can use the following npm module http-server to serve the directory containing both the weight and the model. It can be installed with the following command:

 npm install http-server -g

Inside the directory, one can run the following command to launch the server:

http-server -c1 --cors .

Now the model can be loaded:

 // load model in js script
 (async () => {
   ...
   const model = await tf.loadFrozenModel('http://localhost:8080/model.pb', 'http://localhost:8080/weights.json')
 })()
const tf = require('@tensorflow/tfjs');
const tfnode = require('@tensorflow/tfjs-node');

async function loadModel() {
    const handler = tfnode.io.fileSystem('tfjs_model/model.json');
    const model = await tf.loadLayersModel(handler);
    console.log("Model loaded");
    return model;
}


loadModel();

This worked for me in node. Thanks to jafaircl.

You could try:

const model = await tf.models.modelFromJSON(myModelJSON)

Here it is in the tensorflow.org docs

Check out our documentation for loading models: https://js.tensorflow.org/api/latest/#Models-Loading

You can use tf.loadModel, which takes a string URL to your model definition; the model needs to be served over HTTP. This means you need to start an HTTP server to serve those files (the browser will not let you make a request to your filesystem because of CORS).

This package can do that for you: npmjs.com/package/http-server

You could use an insecure Chrome instance:

C:\Program Files (x86)\Google\Chrome\Application>chrome.exe --disable-web-security --disable-gpu --user-data-dir=C:/Temp

Then you could add this script to redefine the fetch function:

async function fetch(url) {
  return new Promise(function (resolve, reject) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
      resolve(new Response(xhr.responseText, { status: 200 }));
    };
    xhr.onerror = function () {
      reject(new TypeError('Local request failed'));
    };
    xhr.open('GET', url);
    xhr.send(null);
  });
}

After that, be sure that you use the right model loader (see my comment about the loader issue).

BUT your weights will be incorrect - as I understand it, there are some encoding problems.
