Nccl

Sample code showing how to call NCCL collective operation functions in a multi-GPU environment: simple examples of the broadcast, reduce, allGather, reduceScatter, and sendRecv operations.
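As a rough illustration of the pattern such examples follow, below is a minimal sketch (not taken from this repository) of the standard single-process, multi-device NCCL usage: one communicator per visible GPU created with ncclCommInitAll, and a broadcast from GPU 0 to all others issued inside a ncclGroupStart/ncclGroupEnd block. Buffer names, sizes, and the abort-on-error macros are illustrative assumptions; building it typically means linking against cudart and nccl (e.g. nvcc example.c -lnccl), with include/library paths depending on the local installation.

```c
/* Assumed minimal sketch of single-process, multi-GPU NCCL broadcast.
 * Not the repository's code; buffer names and sizes are illustrative. */
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>
#include <nccl.h>

#define CUDACHECK(cmd) do {                                        \
    cudaError_t e = (cmd);                                         \
    if (e != cudaSuccess) {                                        \
        fprintf(stderr, "CUDA error %s:%d '%s'\n",                 \
                __FILE__, __LINE__, cudaGetErrorString(e));        \
        exit(EXIT_FAILURE);                                        \
    }                                                              \
} while (0)

#define NCCLCHECK(cmd) do {                                        \
    ncclResult_t r = (cmd);                                        \
    if (r != ncclSuccess) {                                        \
        fprintf(stderr, "NCCL error %s:%d '%s'\n",                 \
                __FILE__, __LINE__, ncclGetErrorString(r));        \
        exit(EXIT_FAILURE);                                        \
    }                                                              \
} while (0)

int main(void) {
    int nDev = 0;
    CUDACHECK(cudaGetDeviceCount(&nDev));

    const size_t count = 1 << 20;                 /* elements per GPU buffer */
    float **buf = malloc(nDev * sizeof(float *));
    cudaStream_t *streams = malloc(nDev * sizeof(cudaStream_t));
    ncclComm_t *comms = malloc(nDev * sizeof(ncclComm_t));
    int *devs = malloc(nDev * sizeof(int));

    /* Allocate one buffer and one stream per device. */
    for (int i = 0; i < nDev; ++i) {
        devs[i] = i;
        CUDACHECK(cudaSetDevice(i));
        CUDACHECK(cudaMalloc((void **)&buf[i], count * sizeof(float)));
        CUDACHECK(cudaMemset(buf[i], i == 0 ? 1 : 0, count * sizeof(float)));
        CUDACHECK(cudaStreamCreate(&streams[i]));
    }

    /* One communicator per device, all created by this single process. */
    NCCLCHECK(ncclCommInitAll(comms, nDev, devs));

    /* Broadcast GPU 0's buffer to every other GPU. Group calls let a single
     * thread issue the collective for all communicators without deadlock. */
    NCCLCHECK(ncclGroupStart());
    for (int i = 0; i < nDev; ++i)
        NCCLCHECK(ncclBroadcast(buf[i], buf[i], count, ncclFloat,
                                /*root=*/0, comms[i], streams[i]));
    NCCLCHECK(ncclGroupEnd());

    /* Wait for completion, then clean up. */
    for (int i = 0; i < nDev; ++i) {
        CUDACHECK(cudaSetDevice(i));
        CUDACHECK(cudaStreamSynchronize(streams[i]));
        CUDACHECK(cudaFree(buf[i]));
        CUDACHECK(cudaStreamDestroy(streams[i]));
        ncclCommDestroy(comms[i]);
    }
    free(buf); free(streams); free(comms); free(devs);
    printf("Broadcast across %d GPU(s) completed\n", nDev);
    return 0;
}
```

The other collectives named above (reduce, allGather, reduceScatter, sendRecv) follow the same structure; only the NCCL call inside the group block changes.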
Alternatives To Nccl
| Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| Vulkan_minimal_compute | 538 | 6 years ago | | | 3 | mit | C++ | Minimal Example of Using Vulkan for Compute Operations. Only ~400LOC. |
| Amgx | 420 | 5 months ago | | | 79 | bsd-3-clause | Cuda | Distributed multigrid linear solver library on GPU |
| Selene | 349 | 7 months ago | 18 | November 22, 2021 | 24 | bsd-3-clause-clear | Jupyter Notebook | A framework for training sequence-level deep learning networks |
| Audio_adversarial_examples | 230 | 2 years ago | | | 9 | bsd-2-clause | Python | Targeted Adversarial Examples on Speech-to-Text systems |
| Nnforge | 176 | 5 years ago | | | 1 | | C++ | Convolutional neural networks C++ framework with CPU and GPU (CUDA) backends |
| Opencl Examples | 80 | 3 years ago | | | 1 | | Objective-C++ | Simple OpenCL examples for exploiting GPU computing |
| Vulkan_ray_tracing_minimal_abstraction | 58 | a year ago | | | | gpl-3.0 | C++ | A minimal implementation of Vulkan ray tracing. |
| Aparapi Examples | 47 | 3 years ago | 17 | July 12, 2021 | 2 | apache-2.0 | Java | A framework for executing native Java code on the GPU. |
| Keras_multi_gpu | 38 | 7 years ago | | | 2 | | Python | Multi-GPU training for Keras |
| Foldscuda.jl | 36 | 2 years ago | | | 6 | mit | Julia | Data-parallelism on CUDA using Transducers.jl and for loops (FLoops.jl) |