tensorflow.js

How to load tensorflow-js weights from express using tf.loadLayersModel()?

ε祈祈猫儿з submitted on 2021-02-05 12:26:15

Question: I get the error "RangeError: attempting to construct out-of-bounds TypedArray on ArrayBuffer" when I try to load a tf-js model into React. I'm using Express.js to send the json + bin files to React so that I can run inference in the browser itself. Here's the relevant Express.js code (the json and bin files are all in the same folder): app.use( "/api/pokeml/classify", express.static(path.join(__dirname, "classifier_models/original/model.json")) ) Here's how I'm loading this in React: import * as …
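This RangeError typically means the weight fetch returned something other than the binary shards. express.static expects a directory as its root, and mounting it on model.json itself leaves the .bin requests unserved. A likely fix (an assumption about the asker's layout, not a confirmed answer) is to serve the folder instead, e.g. express.static(path.join(__dirname, "classifier_models/original")), and point tf.loadLayersModel at .../model.json. Serving the whole folder matters because the shard paths in weightsManifest are resolved relative to the model.json URL:

```javascript
// tf.loadLayersModel is given the model.json URL; each weight shard
// listed in weightsManifest is then fetched relative to that URL.
// (Port and shard file name below are assumptions for illustration.)
const modelUrl = "http://localhost:3000/api/pokeml/classify/model.json";
const shardUrl = new URL("group1-shard1of1.bin", modelUrl).href;
console.log(shardUrl);
// → http://localhost:3000/api/pokeml/classify/group1-shard1of1.bin
// If only model.json is served, this request 404s and the loader ends
// up building a TypedArray from an HTML error page, which is what
// surfaces as the out-of-bounds RangeError.
```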

this.util.TextEncoder is not a constructor only in electron app (works in chrome)

本秂侑毒 submitted on 2021-02-05 09:11:36

Question: I am creating a body-segmentation app using the TensorFlow BodyPix model. It works fine in the browser. I am using webpack to import its modules (see below): import * as wasm from "@tensorflow/tfjs-backend-wasm"; import * as tf from "@tensorflow/tfjs-core"; import * as bodyPix from "@tensorflow-models/body-pix"; wasm.setWasmPaths("./wasm/"); tf.setBackend("wasm").then(() => { //some simple vanilla js code }); //some more vanilla js code... It works fine in Chrome and gives the expected output …

How can I convert a tensor into bounding box coordinates and labels in Javascript?

a 夏天 submitted on 2021-01-29 07:13:47

Question: I've trained a .pb object-detection model in Python using Colab and converted it to the model.json format using the TensorFlow.js converter. I need to load this model inside the browser (no Node.js!) and run inference there. This is my complete code so far (images are added to the HTML using PHP): <!DOCTYPE html> <html lang="en"> <head> <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Document</title> <link rel="stylesheet" href="../style.css"> </head> …
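Once the detection tensors are pulled into plain arrays (e.g. via await boxesTensor.array() and scoresTensor.array()), converting them to pixel-space bounding boxes with labels is ordinary JavaScript. The sketch below assumes the common [ymin, xmin, ymax, xmax] normalized ordering and uses a hypothetical label map; both must be matched to the actual exported model:

```javascript
// Hypothetical label map; replace with the model's own class ids.
const LABELS = { 1: "cat", 2: "dog" };

// Convert normalized detection boxes to pixel-space boxes with labels.
// Assumption: boxes are [ymin, xmin, ymax, xmax] in [0, 1].
function decodeDetections(boxes, scores, classes, imgW, imgH, minScore = 0.5) {
  const out = [];
  for (let i = 0; i < scores.length; i++) {
    if (scores[i] < minScore) continue;
    const [ymin, xmin, ymax, xmax] = boxes[i];
    out.push({
      label: LABELS[classes[i]] || `class_${classes[i]}`,
      score: scores[i],
      x: xmin * imgW,
      y: ymin * imgH,
      width: (xmax - xmin) * imgW,
      height: (ymax - ymin) * imgH,
    });
  }
  return out;
}

// One detection at 90% confidence in a 640x480 image.
const dets = decodeDetections([[0.1, 0.2, 0.5, 0.6]], [0.9], [1], 640, 480);
console.log(dets[0].label); // → "cat"
```

The resulting x/y/width/height values can be drawn directly onto a canvas overlay positioned over the image.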

WASM backend for tensorflowjs throws “Unhandled Rejection (RuntimeError): index out of bounds” error in Reactjs

倖福魔咒の submitted on 2021-01-28 21:08:00

Question: I am trying to set up a WASM back end for the BlazeFace face-detection model in a React app. Although the vanilla-JS demo can run for hours without any error, in React it throws "Unhandled Rejection (RuntimeError): index out of bounds" after the webcam has been open for more than 3-5 minutes. The entire app crashes with this error. Judging from the error log below, it may be related to the disposeData() or disposeTensor() functions, which, at my guess, are related to garbage collection. …
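A frequent cause of this pattern (fine for a while, then a crash inside dispose code) is intermediate tensors leaking on every video frame until WASM memory is exhausted. The usual advice is to wrap the per-frame work in tf.tidy(), dispose input tensors explicitly, and call the model's dispose method when the React component unmounts. As a pure-JS toy model of what tf.tidy's scoping does (an illustration of the idea, not tfjs internals):

```javascript
// Toy sketch of tf.tidy-style scope tracking: everything allocated
// inside the callback is disposed unless it is the returned value.
const scopes = [];

function track(tensor) {
  if (scopes.length > 0) scopes[scopes.length - 1].push(tensor);
  return tensor;
}

function tidy(fn) {
  scopes.push([]);
  const result = fn();
  const scope = scopes.pop();
  for (const t of scope) {
    if (t !== result) t.disposed = true; // stand-in for t.dispose()
  }
  return result;
}

// Per-frame usage: intermediates are cleaned up, the result survives.
const kept = tidy(() => {
  const a = track({ name: "resized", disposed: false });
  const b = track({ name: "normalized", disposed: false });
  return track({ name: "predictions", disposed: false, from: [a, b] });
});
console.log(kept.disposed);         // → false
console.log(kept.from[0].disposed); // → true
```

In the real app, checking tf.memory().numTensors once per frame is a quick way to confirm whether the count grows without bound.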

custom layer multiple input issue (Uncaught TypeError: Cannot read property 'dtype' of undefined)

瘦欲@ submitted on 2021-01-28 19:29:35

Question: I am trying to write a custom layer (a lambda-layer replacement). The layer infers fine without a model, but wrapped in a model it runs to a certain point, then crashes. Printing the received inputs inside the layer itself works, right up to the crash. The issue on GitHub is: issue <!-- Load TensorFlow.js --> <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js"> </script> <script> /****************************************************************************** * tensorflow.js lambda …
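One thing worth checking for this error: with multiple inputs, a tf.layers custom layer's call(inputs, kwargs) receives an array of tensors, and the layer must also implement computeOutputShape so the wrapping model can propagate shapes between layers; returning or forwarding undefined is what later surfaces as "Cannot read property 'dtype' of undefined". A small defensive sketch of the input handling (schematic, not the asker's layer):

```javascript
// Inside a custom layer, `call` may receive a single tensor or an
// array of tensors; normalizing first avoids passing `undefined`
// (and hence a missing `dtype`) to downstream ops.
function normalizeCallInputs(inputs) {
  const arr = Array.isArray(inputs) ? inputs : [inputs];
  if (arr.some((t) => t == null)) {
    throw new Error("custom layer received an undefined input tensor");
  }
  return arr;
}

// Stand-in for a tensor; only the dtype field matters here.
const t = { dtype: "float32" };
console.log(normalizeCallInputs(t).length);      // → 1
console.log(normalizeCallInputs([t, t]).length); // → 2
```

Failing fast with a named error at the layer boundary makes the crash point obvious, instead of letting undefined travel into the model's internals.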

How to merge shard bin files into one

ぐ巨炮叔叔 submitted on 2021-01-28 18:22:41

Question: I just trained a Google AutoML Vision model and exported it as TensorFlow.js to use in React Native. The problem I am facing: Google splits the weights into 6 different shard*.bin files. Because it is a graph model, I can't use the tensorflowjs_converter to leverage the --weight_shard_size_bytes setting. I did not find any other way to import it into my React Native app than by using the bundleResourceIO function, which needs a single bin file. The model should be bundled locally. Does anyone …

How to convert from Tensorflow.js (.json) model into Tensorflow (SavedModel) or Tensorflow Lite (.tflite) model?

我怕爱的太早我们不能终老 submitted on 2021-01-28 12:16:22

Question: I have downloaded a pre-trained PoseNet model for TensorFlow.js (tfjs) from Google, so it's a json file. However, I want to use it on Android, so I need a .tflite model. Although someone has 'ported' a similar model from tfjs to tflite here, I have no idea which model (there are many variants of PoseNet) they converted. I want to do the steps myself. Also, I don't want to run some arbitrary code someone uploaded into a file on Stack Overflow. Caution: Be careful with untrusted code—TensorFlow …
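For reference, the commonly cited conversion pipeline runs through Keras rather than directly to tflite. This only applies to tfjs layers models (graph-model exports cannot be converted back this way), and the exact flags below are a sketch that should be checked against the installed tensorflowjs and TensorFlow versions:

```shell
# Assumption: model.json is a tfjs *layers* model, not a graph model.
pip install tensorflowjs

# 1. tfjs layers model -> Keras HDF5
tensorflowjs_converter \
  --input_format=tfjs_layers_model \
  --output_format=keras \
  model.json posenet.h5

# 2. Keras HDF5 -> TFLite
tflite_convert \
  --keras_model_file=posenet.h5 \
  --output_file=posenet.tflite
```

If the downloaded PoseNet file turns out to be a graph model, the cleaner route is to fetch the matching variant from its original source (e.g. TF Hub) in SavedModel form and convert that with tflite_convert instead.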