How to pass media stream from angular code to Facemesh.js


How can I replace this camera function and pass in my own MediaStream from another Angular module? Source: https://google.github.io/mediapipe/solutions/face_mesh

const camera = new Camera(videoElement, {
  onFrame: async () => {
    await faceMesh.send({image: videoElement});
  },
  width: 1280,
  height: 720
});
camera.start();

I am unable to pass a MediaStream to faceMesh.send() if I try to remove the camera function.

CodePudding user response:

MediaPipe FaceMesh can read from an HTMLVideoElement (behind the scenes it's just a copy-frame-to-canvas followed by reading the pixels from the canvas into a tensor), so you don't have to deal with that.

But your MediaStream will not give you per-frame events (there is no such thing in the JS specs); that is exactly the value the Camera wrapper adds.

Not going to write the code for you, but the workflow you need is:

  1. do faceMesh.send
  2. listen for results
  3. once you have the results, do the next faceMesh.send (a rough sketch of this loop follows the list)
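
A minimal sketch of that loop, assuming a FaceMesh instance has already been created and that the MediaStream is handed over from your other Angular module (the runFaceMesh helper and the off-screen video element are only illustrative, not part of the MediaPipe API):

import { FaceMesh } from '@mediapipe/face_mesh';

async function runFaceMesh(faceMesh: FaceMesh, stream: MediaStream) {
  // FaceMesh reads frames from an HTMLVideoElement, so feed the stream
  // into a (possibly off-screen) video element first.
  const videoElement = document.createElement('video');
  videoElement.srcObject = stream;
  videoElement.muted = true;
  await videoElement.play();

  const sendFrame = () => faceMesh.send({ image: videoElement });

  // Chain the loop: only send the next frame once the results for the
  // previous one have arrived. This replaces the Camera helper's onFrame.
  faceMesh.onResults((results) => {
    // ...use results.multiFaceLandmarks here...
    requestAnimationFrame(sendFrame);
  });

  // Kick things off with the first frame.
  await sendFrame();
}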

CodePudding user response:

To pass a media stream from Angular code to Facemesh.js, you will need to use the WebRTC getUserMedia() method to access the user's camera. Here is an example of how to do this:

  1. First, import the necessary modules in your Angular component:
import { Component } from '@angular/core';
import { FaceMesh } from '@mediapipe/face_mesh';
  2. In the component's class, declare a property to hold the media stream and initialize it using the getUserMedia() method:
export class MyComponent {
  stream: MediaStream;

  constructor() {
    navigator.mediaDevices.getUserMedia({ video: true, audio: false })
      .then((stream) => {
        this.stream = stream;
      })
      .catch((err) => {
        // Handle error
      });
  }
}
  3. Once you have the media stream, attach it to a <video> element (for example one obtained with @ViewChild) and create a FaceMesh instance:
const faceMesh = new FaceMesh({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`
});
videoElement.srcObject = this.stream;
  4. You can then register an onResults callback and feed video frames to faceMesh.send() to detect faces in the stream. For example:
faceMesh.onResults((results) => {
  // Use results.multiFaceLandmarks to draw the face mesh over the video
});
faceMesh.send({ image: videoElement });

Note: This is just a basic example of how to pass a media stream from Angular to Facemesh.js. You may need to add additional code to handle errors and other details.
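
Putting the steps together, a rough end-to-end sketch might look like the following (the component selector, inline template, and CDN locateFile URL are assumptions; adapt them to your own setup):

import { Component, ElementRef, OnInit, ViewChild } from '@angular/core';
import { FaceMesh } from '@mediapipe/face_mesh';

@Component({
  selector: 'app-face-mesh',
  template: '<video #video autoplay muted playsinline></video>',
})
export class FaceMeshComponent implements OnInit {
  @ViewChild('video', { static: true }) videoRef!: ElementRef<HTMLVideoElement>;

  async ngOnInit() {
    const video = this.videoRef.nativeElement;

    // 1. Get a MediaStream here, or accept one from another module/service.
    const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: false });
    video.srcObject = stream;
    await video.play();

    // 2. Create the FaceMesh instance; locateFile tells it where to load its assets from.
    const faceMesh = new FaceMesh({
      locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`,
    });

    // 3. Handle results, then request the next frame.
    faceMesh.onResults((results) => {
      // ...draw results.multiFaceLandmarks onto a canvas here...
      requestAnimationFrame(() => faceMesh.send({ image: video }));
    });

    // 4. Send the first frame to start the loop.
    await faceMesh.send({ image: video });
  }
}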
