Awesome Open Source
Search results for c plus plus inference engine
34 search results found
Tengine (⭐ 4,452): a lite, high-performance, modular inference engine for embedded devices.
Ctranslate2 (⭐ 2,437): fast inference engine for Transformer models.
Kuiperinfer (⭐ 1,706): build a high-performance deep learning inference library from scratch, step by step; supports inference for models such as UNet, YOLOv5, and ResNet.
Nitro (⭐ 1,115): a fast, lightweight, embeddable inference engine to supercharge your apps with local AI. OpenAI-compatible API.
Feathercnn (⭐ 1,052): FeatherCNN is a high-performance inference engine for convolutional neural networks.
Adlik (⭐ 688): a toolkit for accelerating deep learning inference.
Msnhnet (⭐ 666): 🔥 a mini PyTorch inference framework inspired by Darknet (YOLOv3, YOLOv4, YOLOv5, UNet, ...).
Forward (⭐ 491): a library for high-performance deep learning inference on NVIDIA GPUs.
Compute Engine (⭐ 232): highly optimized inference engine for binarized neural networks.
Search Legend (⭐ 181): docs for a search system and AI infrastructure.
Tinytensor (⭐ 108): TinyTensor is a tool for running already-trained neural network (NN) models for inference tasks such as image classification and semantic segmentation.
Daisykit (⭐ 91): DaisyKit is an easy AI toolkit with face mask detection, pose detection, background matting, barcode detection, face recognition, and more, with NCNN, OpenCV, and Python wrappers.
Openvino_contrib (⭐ 91): repository for OpenVINO's extra modules.
Torsten (⭐ 50): a library of C++ functions that support applications of Stan in pharmacometrics.
Pytorch Inference (⭐ 50): OpenCL inference engine for PyTorch.
Ure (⭐ 50): Unified Rule Engine, a graph-rewriting system for the AtomSpace, used as a reasoning engine for OpenCog.
Fast Llama (⭐ 46): runs LLaMA at extremely high speed.
Ros_openvino (⭐ 39): a ROS package to wrap the OpenVINO Inference Engine and get it working with Myriad and GPU devices.
Cheetahinfer (⭐ 33): a C++ inference SDK based on TensorRT.
Chainer Trt (⭐ 31): Chainer x TensorRT.
Refactorgraph (⭐ 25): a layered, decoupled deep learning inference engine.
R2inference (⭐ 21): RidgeRun Inference Framework.
Simpleinfer (⭐ 17): a simple neural network inference framework.
Pomagma (⭐ 15): an inference engine for extensional untyped λ-calculus.
Armednn (⭐ 13): a cross-platform modular neural network inference library, small and efficient.
Latte (⭐ 12): Latte is a convolutional neural network (CNN) inference engine written in C++ that uses AVX to vectorize operations. The engine runs on Windows 10, Linux, and macOS Sierra.
Yolooclinference (⭐ 9): an extremely lightweight tiny-YOLO inference engine targeted at OpenCL hardware.
3d_neurosim_v1.0 (⭐ 9): benchmark framework of 3D integrated CIM accelerators for popular DNN inference; supports both monolithic and heterogeneous 3D integration.
Easyocr Cpp (⭐ 9): custom C++ implementation of deep-learning-based OCR.
Openvino Ep Enabled Onnxruntime (⭐ 7): describes how to enable the OpenVINO Execution Provider for ONNX Runtime.
Mivisionx Inference Tutorial (⭐ 6): MIVisionX is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit.
Mantaray (⭐ 6): Lightspeed C++ neural network (UE) inference library for chess.
Openvino Onnx Importer Api (⭐ 5): demonstrates how to use the ONNX importer API in the Intel OpenVINO toolkit; this API allows users to load an ONNX model and run inference with the OpenVINO Inference Engine.
Copyright 2018-2024 Awesome Open Source. All rights reserved.