Simple deep head pose inference, ncnn version

ncnn-deep-head-pose

A simple, high-performance, resource-optimized inference implementation of deep head pose estimation using ncnn. This project is based on the deep-head-pose project by Nataniel Ruiz; the details are described in his CVPR Workshop paper. RetinaFace is used for the face detection step.

Workflow

Re-build ncnn for your target platform.

  • The official ncnn documentation explains in detail how to build and use ncnn on any platform.
  • If you use my Docker environment, the ncnn library is already built inside it at /home/ncnn_original/build, which contains the ncnn static library and the tools for converting and quantizing ncnn models.

Convert models to ncnn format.

  • The original deep-head-pose project uses the PyTorch framework, so the PyTorch model has to be converted to an ncnn model.
  • The ncnn wiki describes this guide in detail here. After converting the PyTorch model to ONNX format, use the ncnn build tools to convert ONNX to ncnn. Inside my Docker environment they are ready to use in /home/ncnn_original/build/tools/onnx.
  • Note that Netron can visualize the network, which makes it easy to identify the input and output node names that ncnn requires.

Build and run the test

Run environment

  • git clone https://github.com/docongminh/ncnn-deep-head-pose
  • cd ncnn-deep-head-pose
  • Enter the environment: docker exec -it deep-head-pose bash
  • cd to the mounted source: cd /source
  • cd to the ncnn build library: cd /home/ncnn_original

Build with CMake

  • In the project root inside Docker: mkdir -p build && cd build

  • Configure and build: cmake .. && make

  • Run the test: ./main

  • Examples:

    cr7

    m10
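For reference, deep-head-pose models do not regress angles directly; each head outputs 66 classification bins, and the continuous angle is the expected bin index scaled to degrees. A small sketch of that post-processing (bin width of 3 degrees and offset of 99, as in the original Hopenet formulation) looks like this:

```python
# Convert one head's 66-bin logits into a continuous angle in degrees,
# following the Hopenet expectation trick: softmax over bins, then
# expected index * 3 - 99.
import numpy as np

def bins_to_degrees(logits, bin_width=3.0, offset=99.0):
    # numerically stable softmax over the bins
    e = np.exp(logits - logits.max())
    probs = e / e.sum()
    # expected bin index -> continuous angle in degrees
    idx = np.arange(len(logits))
    return float((probs * idx).sum() * bin_width - offset)
```

The same function is applied independently to the yaw, pitch, and roll outputs; in the C++ demo the equivalent arithmetic runs on the ncnn output blobs.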

Notes during project development

References

Owner
Đỗ Công Minh
telegram: @minhdocs
Similar Resources

Forward - A library for high performance deep learning inference on NVIDIA GPUs. (Dec 17, 2022)

TFCC is a C++ deep learning inference framework. (Dec 23, 2022)

KSAI Lite is a deep learning inference framework by Kingsoft, based on TensorFlow Lite. It is lightweight, flexible, high-performance, and easy to extend, targeting mobile, embedded, and server platforms, and is already used in Kingsoft Office's internal business. (Dec 27, 2022)

DNN+NeuroSim - a benchmark framework of compute-in-memory based accelerators for deep neural networks (inference engine focused), developed by Prof. Shimeng Yu's group at the Georgia Institute of Technology. (Nov 24, 2022)

PPLNN, short for "PPLNN is a Primitive Library for Neural Network", is a high-performance deep-learning inference engine for efficient AI inferencing. (Dec 29, 2022)

TensorRT is a C++ library for high performance inference on NVIDIA GPUs and deep learning accelerators; the repository contains the Open Source Software (OSS) components of NVIDIA TensorRT. (Jan 4, 2023)

MACE (Mobile AI Compute Engine) is a deep learning inference framework optimized for mobile heterogeneous computing on Android, iOS, Linux and Windows devices. (Jan 3, 2023)

A C++ implementation of Yolov5 and Deepsort on Jetson Xavier NX and Jetson Nano, used to track human heads. (Aug 25, 2022)

DeepDetect (https://www.deepdetect.com/) is an open-source deep learning API and server written in C++11, with support for Caffe, Caffe2, PyTorch, TensorRT, Dlib, NCNN, Tensorflow, XGBoost and TSNE. (Dec 30, 2022)

Pose-tensorflow - Human pose estimation with the TensorFlow framework, implementing the human body pose estimation algorithm presented in DeeperCut. (Dec 29, 2022)

A lightweight 2D pose model that can be deployed on Linux/Windows/Android, supports CPU/GPU inference acceleration, and runs in real time on ordinary mobile phones. (Jan 3, 2023)

ncnn_nanodet_hand - Android hand detection and pose estimation with ncnn: hand detection uses a NanoDet-m model, and hand pose uses a GhostNet backbone trained on the CMU dataset, modeled after PFLD. (Jan 3, 2023)

ncnn_Android_MoveNet - Android single-person MoveNet pose estimation by ncnn; an ncnn Android demo for MoveNet that depends on the ncnn library. (Dec 31, 2022)

RealSR-NCNN-Android is a simple Android application based on RealSR-NCNN & Real-ESRGAN, a practical algorithm for general image restoration. (Jan 3, 2023)

NCNN+Int8+YOLOv4 quantized modeling and real-time inference. (Dec 6, 2022)

ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. (Jan 5, 2023)

A sample ncnn Android project; it depends on the ncnn library and OpenCV. (Jan 6, 2023)

GFPGAN-ncnn - a naive ncnn implementation of GFPGAN, which aims at developing practical algorithms for real-world face restoration. (Dec 10, 2022)