Awesome Open Source
Search results for: human-pose-estimation × tensorrt
- Hyperpose (⭐ 1,237): Library for Fast and Flexible Human Pose Estimation.
- Trt_pose (⭐ 763): Real-time pose estimation accelerated with NVIDIA TensorRT.
- Deepstream_pose_estimation (⭐ 210): A sample DeepStream application demonstrating a human pose estimation pipeline.
- Centernet Tensorrt (⭐ 143): A C++ implementation of CenterNet using TensorRT and CUDA.
- Lightglue Onnx (⭐ 141): ONNX-compatible LightGlue: Local Feature Matching at Light Speed. Supports TensorRT and OpenVINO.
- Deepstream Yolo Pose (⭐ 93): NVIDIA DeepStream SDK 6.3 / 6.2 / 6.1.1 / 6.1 / 6.0.1 / 6.0 application for YOLO-Pose models.
- Tensorrt Examples (⭐ 71): TensorRT examples (TensorRT, Jetson Nano, Python, C++).
- Isaac_ros_pose_estimation (⭐ 70): Deep-learned, hardware-accelerated 3D object pose estimation.
- Easy_vitpose (⭐ 67): Easy and fast 2D human and animal multi-pose estimation using SOTA ViTPose [Y. Xu et al., 2022]. Real-time performance and multiple skeletons supported.
- Tensorrt Openpose (⭐ 36): TensorRT C++ implementation of OpenPose.
- Deepstream Yolo Pose (⭐ 29): Uses the DeepStream Python API to extract the model output tensor and customize YOLO-Pose post-processing.
- Trt_pose_hand (⭐ 24): Real-time hand pose estimation and gesture classification using TensorRT.
- Dedode Onnx Tensorrt (⭐ 17): ONNX-compatible DeDoDe 🎶 Detect, Don't Describe - Describe, Don't Detect, for local feature matching. Supports TensorRT 🚀.
- Ros2_trt_pose (⭐ 8): ROS 2 package for trt_pose: real-time human pose estimation on the NVIDIA Jetson platform.
- Vitpose Pytorch (⭐ 7): ViTPose without MMCV dependencies.
- Real Time 3d Multi Person Pose Estimation Demo On Jetson Tx2 (⭐ 5): Real-time 3D multi-person pose estimation demo on Jetson TX2 with TensorRT.
- Lite Hrnet Trt (⭐ 5): Runs Lite-HRNet pose estimation with TensorRT and C++.
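Several of the projects above (e.g. the YOLO-Pose DeepStream entries and the trt_pose family) share a common post-processing step: decoding per-joint heatmaps from the network output into pixel keypoints. A minimal NumPy sketch of that decode is shown below; the array layout, function name, and threshold default are illustrative assumptions, not the API of any listed project:

```python
import numpy as np

def decode_heatmaps(heatmaps, threshold=0.1):
    """Decode a (num_joints, H, W) heatmap array into (x, y, score) keypoints.

    Assumed layout (illustrative): one heatmap per joint, with the peak
    value at the joint location. Joints whose peak score falls below
    `threshold` are reported as None.
    """
    keypoints = []
    for hm in heatmaps:
        # Locate the per-joint peak and read its confidence score.
        y, x = np.unravel_index(np.argmax(hm), hm.shape)
        score = float(hm[y, x])
        keypoints.append((int(x), int(y), score) if score >= threshold else None)
    return keypoints

# Toy example: two 4x4 heatmaps with known peaks.
hms = np.zeros((2, 4, 4), dtype=np.float32)
hms[0, 1, 2] = 0.9   # joint 0 peaks at (x=2, y=1), well above threshold
hms[1, 3, 0] = 0.05  # joint 1 is below threshold, so it decodes to None
print(decode_heatmaps(hms))
```

Real pipelines typically refine the integer argmax with sub-pixel interpolation and map coordinates back from heatmap resolution to the input image, but the peak-picking core is the same.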
Copyright 2018-2024 Awesome Open Source. All rights reserved.