rbd-benchmarks

This repository contains the benchmark suite associated with the paper "Benchmarking and Workload Analysis of Robot Dynamics Algorithms", submitted to IROS 2019.

@inproceedings{neuman2019benchmarking,
  title={Benchmarking and workload analysis of robot dynamics algorithms},
  author={Neuman, Sabrina M and Koolen, Twan and Drean, Jules and Miller, Jason E and Devadas, Srinivas},
  booktitle={2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2019},
  organization={IEEE}
}

Repository layout

  • description contains robot models
  • experiments contains scripts to run experiments
  • input_data_gen contains scripts to generate input data
  • libs contains the rbd libraries
  • testbenches contains the testbench code used by the experiments

Optional/Generated Directories:

  • csv contains generated input data in csv format
  • results is a directory to store experiment results


Supported platforms

rbd-benchmarks is currently Linux-only and uses Travis CI for continuous integration. Travis is currently set up to ensure that the code compiles on Ubuntu Xenial Xerus (16.04). rbd-benchmarks has also been user-tested on Ubuntu Bionic Beaver (18.04). Other platforms are untested and therefore unlikely to work out of the box.


Dependencies

Please see our Travis configuration file for an up-to-date list of dependencies. Note that the version of Maxima (used by RobCoGen) available through apt on Ubuntu 16.04 is not sufficiently up to date; the version that ships with Ubuntu 18.04 is.

In addition, you will need Julia 1.1, both to run the benchmarks for RigidBodyDynamics.jl and to generate the robot description files and input test data. Binaries for Julia can be obtained from the Julia website (julialang.org). After decompressing the archive, either:

  • set the JULIA environment variable to point to the julia executable: export JULIA=/path/to/julia/julia
  • add the directory containing the julia executable to the PATH: export PATH=/path/to/julia/:$PATH
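For example (the install path below is a placeholder; substitute the directory you actually extracted the Julia archive into):

```shell
# Option 1: point the JULIA variable at the executable (placeholder path).
export JULIA=/path/to/julia/julia

# Option 2: put the directory on PATH so `julia` resolves from there.
export PATH=/path/to/julia:$PATH

# The first PATH entry is now the Julia directory.
echo "$PATH" | cut -d: -f1
```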

Finally, the C/C++ compiler to use may be specified by setting the CC and CXX environment variables. By default, the system C/C++ compiler will be used, which may be too old on Ubuntu 16.04 (symptom: linking errors while running make in libs/rbdjl).
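For instance, to select a newer toolchain on Ubuntu 16.04 (the compiler names here are examples; use whichever newer GCC or Clang is installed on your system):

```shell
# Use a newer GCC for the library builds (gcc-7/g++-7 are example names).
export CC=gcc-7
export CXX=g++-7
# Subsequent `make` invocations in libs/ pick these up from the environment.
```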

Submodule initialization

After cloning the repository, please run

git submodule update --init --recursive

to initialize Git submodules.

Generating model description files

To generate URDF and KINDSL files according to the process described in the paper, run

cd description
make
cd ..

If this errors, please ensure that submodules have been initialized (previous step).

Building libraries

To download and install the rigid body dynamics libraries under test (locally), run

cd libs
make # or make -j 4 to build the libraries in parallel with four make workers
cd ..

Generating random input data

To generate the csv directory containing .csv files with library-specific random input data (with sample i for each library representing the same physical state / torque input), run

cd input_data_gen
make
cd ..
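As a quick sanity check after generation (this helper is not part of the suite; it only assumes one sample per CSV row), you can confirm that each library's generated file contains the same number of samples:

```python
import csv
import glob
import os

def sample_counts(csv_dir):
    """Return {filename: row count} for every .csv file in csv_dir."""
    counts = {}
    for path in sorted(glob.glob(os.path.join(csv_dir, "*.csv"))):
        with open(path, newline="") as f:
            counts[os.path.basename(path)] = sum(1 for _ in csv.reader(f))
    return counts

if __name__ == "__main__":
    counts = sample_counts("csv")
    print(counts)
    # All libraries should have been given the same number of samples.
    assert len(set(counts.values())) <= 1, "libraries disagree on sample count"
```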

Installing likwid

This step is optional, and only required for the CPU instruction mix results.

To install the likwid hardware performance counter tools, run

cd likwid
make -j 4
sudo make install
cd ..

Note that likwid will be installed locally, but sudo is required to chown some of the installed files to root, since they need access to the model-specific register (MSR) device files.
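To check whether the MSR device files are present on your machine (these paths are standard on Linux; the msr kernel module must be loaded for them to exist):

```shell
# List the per-core MSR device files that likwid's tools read.
ls -l /dev/cpu/*/msr 2>/dev/null \
  || echo "MSR device files not found; try: sudo modprobe msr"
```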

Hardware performance counters differ per CPU architecture. Currently, only the Kaby Lake CPU architecture has been tested. Running the likwid experiments on other CPU architectures is likely to result in a program crash.
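The predefined event groups that likwid supports on your CPU can be listed with `likwid-perfctr -a` (a standard likwid command; shown here with a fallback in case likwid is not yet installed or not on PATH):

```shell
# Print the performance groups available for this CPU architecture.
likwid-perfctr -a 2>/dev/null || echo "likwid-perfctr not found on PATH"
```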

Running experiments

To run timing experiments (average time) for all libraries, run


To record run times for each sample in a .csv file for all libraries, run


To run likwid performance counter experiments, run
