Sunday, April 19, 2020

Raspberry Pi TensorFlow.js Facemesh







The facemesh package infers approximate 3D facial surface geometry from an image or video stream, requiring only a single camera input without the need for a depth sensor. This geometry locates features such as the eyes, nose, and lips within the face, including details such as lip contours and the facial silhouette. This information can be used for downstream tasks such as expression classification (but not for identification). Refer to our model card for details on how the model performs across different datasets. This package is also available through MediaPipe.
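A condensed usage sketch, adapted from the package README (it assumes @tensorflow-models/facemesh is installed, a TensorFlow.js backend is already loaded on the page, and a <video> element is available):

const facemesh = require('@tensorflow-models/facemesh');

async function run() {
  // Load the MediaPipe facemesh model.
  const model = await facemesh.load();

  // Pass a video, image, or canvas element; an array of detected faces is returned.
  const predictions = await model.estimateFaces(document.querySelector('video'));

  if (predictions.length > 0) {
    // Each prediction contains faceInViewConfidence, boundingBox, mesh and
    // scaledMesh (468 facial landmarks as [x, y, z] points).
    console.log(predictions[0].scaledMesh);
  }
}

run();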

Performance characteristics

Facemesh is a lightweight package containing only ~3MB of weights, making it ideally suited for real-time inference on a variety of mobile devices. TensorFlow.js also provides several backends to choose from, including WebGL and WebAssembly (WASM) with XNNPACK for devices with lower-end GPUs. The table below shows how the package performs across a few different devices and TensorFlow.js backends:

[Table: package performance across different devices and TensorFlow.js backends]
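For devices with weaker GPUs, the WASM backend can be selected before loading the model. A minimal sketch, assuming the optional @tensorflow/tfjs-backend-wasm package has been added alongside @tensorflow/tfjs-core:

const tf = require('@tensorflow/tfjs-core');
require('@tensorflow/tfjs-backend-wasm'); // registers the 'wasm' backend

async function selectBackend() {
  // Switch from the default backend (usually WebGL) to WebAssembly + XNNPACK.
  await tf.setBackend('wasm');
  await tf.ready();
  console.log('Active backend:', tf.getBackend());
}

selectBackend();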


Demo Source Code
https://github.com/tensorflow/tfjs-models/tree/master/facemesh




Installation on Raspberry Pi (see the video on YouTube)

Install NodeJS and npm

curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -

sudo apt-get install nodejs


Install Yarn

curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -

echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list

sudo apt-get update && sudo apt-get install yarn


Check Version
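For example, the following commands print the installed versions:

node -v

npm -v

yarn --version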



Run the demo code (a USB webcam is required)

Clone the repository:
git clone https://github.com/tensorflow/tfjs-models.git


Go into the facemesh folder:
cd tfjs-models/facemesh
Install dependencies:
yarn
Publish facemesh locally:
yarn build && yarn yalc publish
cd into the demo folder and install dependencies:
cd demo
yarn
Link the local facemesh to the demos:
yarn yalc link @tensorflow-models/facemesh
Start the dev demo server:
yarn watch
Then open a web browser and go to localhost:1234.

A USB webcam must be connected for the demo to capture video.







Test on Android Web Browser




Reference

Face and hand tracking in the browser with MediaPipe and TensorFlow.js
https://blog.tensorflow.org/2020/03/face-and-hand-tracking-in-browser-with-mediapipe-and-tensorflowjs.html

MediaPipe
https://github.com/google/mediapipe

Online Demo
