Project Name | Stars | Last Commit | License | Language | Description
---|---|---|---|---|---
Recbole | 2,534 | 2 days ago | mit | Python | A unified, comprehensive and efficient recommendation library
Transfer Learning Library | 2,364 | 7 days ago | mit | Python | Transfer Learning Library for Domain Adaptation, Task Adaptation, and Domain Generalization
Domainbed | 1,004 | 7 days ago | mit | Python | DomainBed is a suite to test domain generalization algorithms
Pmlb | 715 | a month ago | mit | Python | PMLB: A large, curated repository of benchmark datasets for evaluating supervised machine learning algorithms.
Apriori | 703 | 7 months ago | mit | Python | Python Implementation of Apriori Algorithm for finding Frequent sets and Association Rules
Machinelearning | 684 | 3 years ago | | Python | Machine learning resources, including algorithm, paper, dataset, example and so on.
Deepnude_official | 677 | 4 years ago | gpl-3.0 | Python |
Zr Obp | 494 | 4 months ago | apache-2.0 | Python | Open Bandit Pipeline: a python library for bandit algorithms and off-policy evaluation
Nussl | 402 | 2 years ago | mit | Python | A flexible source separation library in Python
Carefree Learn | 390 | 4 days ago | mit | Python | Deep Learning ❤️ PyTorch
NOTE: FedJAX is not an officially supported Google product. FedJAX is still in the early stages and the API will likely continue to change.
FedJAX is a JAX-based open source library for Federated Learning simulations that emphasizes ease-of-use in research. With its simple primitives for implementing federated learning algorithms, prepackaged datasets, models and algorithms, and fast simulation speed, FedJAX aims to make developing and evaluating federated algorithms faster and easier for researchers. FedJAX works on accelerators (GPU and TPU) without much additional effort. Additional details and benchmarks can be found in our paper.
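To make the idea concrete, here is a minimal sketch of the kind of algorithm FedJAX is built to simulate: one round of federated averaging (FedAvg), where each client takes a local training step and the server averages the resulting models. This is plain NumPy, not the FedJAX API; the names `client_data` and `local_sgd_step` and the learning rate are illustrative choices, not anything defined by the library.

```python
import numpy as np

# Each client holds its own (x, y) data for a 1-D linear model y = w * x.
client_data = {
    'a': (np.array([1.0, 2.0, 3.0]), np.array([2.0, 4.0, 6.0])),
    'b': (np.array([4.0]), np.array([12.0])),
}

def local_sgd_step(w, x, y, lr=0.01):
    """One local gradient step on the client's mean squared error."""
    grad = np.mean(2.0 * (w * x - y) * x)
    return w - lr * grad

# Server broadcasts the global model; each client trains locally.
w_global = 0.5
client_models = {cid: local_sgd_step(w_global, x, y)
                 for cid, (x, y) in client_data.items()}

# Server averages the client models, weighted by client example counts.
counts = {cid: len(x) for cid, (x, _) in client_data.items()}
w_global = (sum(counts[c] * client_models[c] for c in client_data)
            / sum(counts.values()))
print(w_global)  # ≈ 0.805
```

FedJAX provides the federated-data containers, prepackaged models, and algorithm primitives so that loops like this can be written once and run efficiently on accelerators.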
You will need a moderately recent version of Python. Please check the PyPI page for the up-to-date version requirement.
First, install JAX. For a CPU-only version:
```
pip install --upgrade pip
pip install --upgrade jax jaxlib  # CPU-only version
```
For other devices (e.g. GPU), follow the installation instructions in the JAX documentation.
Then, install FedJAX from PyPI:
```
pip install fedjax
```
Or, to install the latest development version of FedJAX directly from GitHub:

```
pip install --upgrade git+https://github.com/google/fedjax.git
```
Below is a simple example to verify FedJAX is installed correctly.
```python
import fedjax
import jax
import jax.numpy as jnp
import numpy as np

# {'client_id': client_dataset}.
fd = fedjax.InMemoryFederatedData({
    'a': {
        'x': np.array([1.0, 2.0, 3.0]),
        'y': np.array([2.0, 4.0, 6.0]),
    },
    'b': {
        'x': np.array([4.0]),
        'y': np.array([12.0]),
    },
})
# Initial model parameters.
params = jnp.array(0.5)
# Mean squared error.
mse_loss = lambda params, batch: jnp.mean(
    (jnp.dot(batch['x'], params) - batch['y'])**2)
# Loss for clients 'a' and 'b'.
print(f"client a loss = {mse_loss(params, fd.get_client('a').all_examples())}")
print(f"client b loss = {mse_loss(params, fd.get_client('b').all_examples())}")
```
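If you want to sanity-check the numbers the snippet prints without installing FedJAX, the same arithmetic can be reproduced in plain NumPy (this replicates only the math of the example above, not the FedJAX API):

```python
import numpy as np

# Same data and initial parameter as the FedJAX example above.
params = 0.5
clients = {
    'a': {'x': np.array([1.0, 2.0, 3.0]), 'y': np.array([2.0, 4.0, 6.0])},
    'b': {'x': np.array([4.0]), 'y': np.array([12.0])},
}

def mse_loss(params, batch):
    # Mean squared error of the linear prediction params * x against y.
    return np.mean((batch['x'] * params - batch['y'])**2)

print(mse_loss(params, clients['a']))  # 10.5
print(mse_loss(params, clients['b']))  # 100.0
```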
The following tutorial notebooks provide an introduction to FedJAX:
You can also take a look at some of our working examples:
To cite this repository:
```
@article{fedjax2021,
  title={{F}ed{JAX}: Federated learning simulation with {JAX}},
  author={Jae Hun Ro and Ananda Theertha Suresh and Ke Wu},
  journal={arXiv preprint arXiv:2108.02117},
  year={2021}
}
```