Awesome Open Source
Search results for onnx inference engine
13 search results found
Tengine (⭐ 4,452): A lightweight, high-performance, modular inference engine for embedded devices.
Forward (⭐ 491): A library for high-performance deep learning inference on NVIDIA GPUs.
Libonnx (⭐ 451): A lightweight, portable, pure C99 ONNX inference engine for embedded devices, with hardware acceleration support.
Mivisionx (⭐ 179): MIVisionX is a comprehensive set of computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit. AMD MIVisionX also delivers a highly optimized open-source implementation of the Khronos OpenVX™ and OpenVX™ Extensions.
Cnn Inference Engine Quick View (⭐ 142): A quick view of high-performance convolutional neural network (CNN) inference engines on mobile devices.
Nn Inference Template (⭐ 63): A neural network inference template for real-time critical audio environments, presented at ADC23.
Planer (⭐ 45): Powerful Light Artificial NEuRon inference framework for CNNs.
Mpsx (⭐ 39): A GPU tensor framework with support for running ONNX models.
Cheetahinfer (⭐ 33): A C++ inference SDK based on TensorRT.
Dbface On Openvino (⭐ 23): Describes how to run DBFace, a real-time single-shot face detection model, on Intel OpenVINO.
Openvino Ep Enabled Onnxruntime (⭐ 7): Describes how to enable the OpenVINO Execution Provider for ONNX Runtime.
Mivisionx Inference Tutorial (⭐ 6): MIVisionX is a comprehensive set of computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit.
Openvino Onnx Importer Api (⭐ 5): Demonstrates how to use the ONNX importer API in the Intel OpenVINO toolkit. This API allows users to load an ONNX model and run inference with the OpenVINO Inference Engine.
Copyright 2018-2024 Awesome Open Source. All rights reserved.