It would be really awesome if MediaPipe added a simple way to integrate with game engines such as Unity or Unreal. The easiest way would probably be to provide ONNX versions of their networks for use with Unity's Barracuda library.
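Roughly the kind of conversion step I mean (a sketch only, not something MediaPipe ships): take one of the released .tflite models and export ONNX via tf2onnx, then load the result with Barracuda. The file names and opset here are placeholders, and custom MediaPipe ops may well not convert cleanly.

```python
# Hypothetical TFLite -> ONNX export using tf2onnx; paths are placeholders.
import tf2onnx

model_proto, _ = tf2onnx.convert.from_tflite(
    "pose_landmark.tflite",            # hypothetical: one of the released models
    opset=13,                          # assumed opset; adjust to what Barracuda supports
    output_path="pose_landmark.onnx",  # output consumed by Unity/Barracuda
)
print("exported graph with", len(model_proto.graph.node), "nodes")
```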
From the blog post:
MediaPipe Holistic consists of a new pipeline with optimized pose, face and hand components that each run in real-time, with minimum memory transfer between their inference backends, and added support for interchangeability of the three components, depending on the quality/speed tradeoffs. When including all three components, MediaPipe Holistic provides a unified topology for a groundbreaking 540+ keypoints (33 pose, 21 per-hand and 468 facial landmarks) and achieves near real-time performance on mobile devices. MediaPipe Holistic is being released as part of MediaPipe and is available on-device for mobile (Android, iOS) and desktop. We are also introducing MediaPipe’s new ready-to-use APIs for research (Python) and web (JavaScript) to ease access to the technology.
[...]
Its blended approach enables remote gesture interfaces, as well as full-body AR, sports analytics, and sign language recognition.
There is a web demo you can try which is kind of neat, though without instructions, I'm not sure exactly what it does.
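For anyone curious about the Python API mentioned in the post, a minimal sketch of what using the Holistic solution looks like (based on the mediapipe Python package; the input image path is just an example):

```python
# Minimal sketch: run MediaPipe Holistic on a single image via the Python API.
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    image = cv2.imread("frame.jpg")                       # example input frame
    results = holistic.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))  # expects RGB

    if results.pose_landmarks:
        # 33 pose landmarks, each with normalized x, y, z and visibility
        nose = results.pose_landmarks.landmark[0]
        print("nose:", nose.x, nose.y, nose.z)
    # face_landmarks, left_hand_landmarks and right_hand_landmarks are also
    # available on `results` when detected.
```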