Project Name | Description | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language
---|---|---|---|---|---|---|---|---
Pysc2 | StarCraft II Learning Environment | 7,745 | a month ago | 8 | September 27, 2019 | 47 | apache-2.0 | Python
Coach | Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state-of-the-art Reinforcement Learning algorithms | 2,248 | 6 months ago | 13 | October 10, 2019 | 90 | apache-2.0 | Python
Torchcraft | Connecting Torch to StarCraft | 1,346 | 2 years ago | | | 33 | other | C++
Smac | SMAC: The StarCraft Multi-Agent Challenge | 828 | 2 months ago | | | 12 | mit | Python
Pysc2 Examples | StarCraft II - pysc2 Deep Reinforcement Learning Examples | 721 | 2 years ago | | | 23 | apache-2.0 | Python
Awesome Game Ai | Awesome Game AI materials of Multi-Agent Reinforcement Learning | 522 | 3 months ago | | | | mit |
Reaver | Reaver: Modular Deep Reinforcement Learning Framework. Focused on StarCraft II. Supports Gym, Atari, and MuJoCo. | 516 | 3 years ago | 14 | February 24, 2020 | 10 | mit | Python
Gym Starcraft | StarCraft environment for OpenAI Gym, based on Facebook's TorchCraft. (In progress) | 505 | 6 years ago | | | 4 | | Python
Pymarl2 | Fine-tuned MARL algorithms on SMAC (100% win rates on most scenarios) | 425 | 2 months ago | | | 3 | apache-2.0 | Python
Ic3net | Code for ICLR 2019 paper: Learning when to Communicate at Scale in Multiagent Cooperative and Competitive Tasks | 174 | 7 months ago | | | 9 | mit | Python
This example program is built on DeepMind's PySC2 and OpenAI's baselines.

The easiest way to get PySC2 is to use pip:

```shell
$ pip install git+https://github.com/deepmind/pysc2
```

You also have to install the baselines library:

```shell
$ pip install git+https://github.com/openai/baselines
```
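After both installs, a quick stdlib-only sanity check can confirm the packages are importable. The helper name `check_deps` is mine for illustration, not part of either package:

```python
import importlib.util

# Return the subset of module names that cannot be found on this
# Python's import path (an empty list means everything is installed).
def check_deps(mods=("pysc2", "baselines")):
    return [m for m in mods if importlib.util.find_spec(m) is None]

if __name__ == "__main__":
    missing = check_deps()
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All dependencies found.")
```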
You also need StarCraft II itself; you can purchase the full game, but even the free Starter Edition will work:

http://us.battle.net/sc2/en/legacy-of-the-void/

On Linux, follow Blizzard's documentation to get the game. By default, PySC2 expects the game to live in `~/StarCraftII/`.

Download the ladder maps and the mini games and extract them into your `~/StarCraftII/Maps/` directory.
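To confirm the game files ended up where PySC2 looks for them, here is a small stdlib-only check. The helper `missing_sc2_dirs` and the directory list are my assumptions based on the layout described above, not a PySC2 API:

```python
from pathlib import Path

# Check that the StarCraft II root contains the folders set up in the
# steps above; returns the names that are missing.
def missing_sc2_dirs(sc2_root="~/StarCraftII", required=("Maps",)):
    root = Path(sc2_root).expanduser()
    return [name for name in required if not (root / name).is_dir()]

if __name__ == "__main__":
    missing = missing_sc2_dirs()
    if missing:
        print("Missing under ~/StarCraftII/:", ", ".join(missing))
    else:
        print("StarCraft II directory layout looks OK.")
```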
Train an agent on the CollectMineralShards mini-game with A2C:

```shell
$ python train_mineral_shards.py --algorithm=a2c
```

Watch a trained agent play:

```shell
$ python enjoy_mineral_shards.py
```

Train with DQN (deepq) using prioritized replay and a dueling network:

```shell
$ python train_mineral_shards.py --algorithm=deepq --prioritized=True --dueling=True --timesteps=2000000 --exploration_fraction=0.2
```

Train with A2C using two learning agents and two scripted agents:

```shell
$ python train_mineral_shards.py --algorithm=a2c --num_agents=2 --num_scripts=2 --timesteps=2000000
```
Parameter | Description | Default | Type
---|---|---|---
map | Gym environment (mini-game map) name | CollectMineralShards | string
log | Logging type: tensorboard or stdout | tensorboard | string
algorithm | Training algorithm, currently deepq or a2c | a2c | string
timesteps | Total training timesteps | 2000000 | int
exploration_fraction | Exploration fraction for DQN | 0.5 | float
prioritized | Whether to use prioritized replay for DQN | False | boolean
dueling | Whether to use a dueling network for DQN | False | boolean
lr | Learning rate (if 0, a random value between 1e-5 and 1e-3 is used) | 0.0005 | float
num_agents | Number of agents for A2C | 4 | int
num_scripts | Number of scripted agents for A2C | 4 | int
nsteps | Number of steps per policy update | 20 | int
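The flag table above can be mirrored with a short `argparse` sketch. This is an illustration of the documented parameters, not the actual parser in train_mineral_shards.py; note that flags passed as `--prioritized=True` need an explicit string-to-bool conversion, because argparse's `type=bool` treats any non-empty string (including "False") as True:

```python
import argparse

def str2bool(v):
    # Parse "--flag=True" / "--flag=False" style command-line values.
    return str(v).lower() in ("true", "1", "yes")

def build_parser():
    p = argparse.ArgumentParser(description="Mineral-shards training flags (sketch)")
    p.add_argument("--map", default="CollectMineralShards", help="mini-game map name")
    p.add_argument("--log", default="tensorboard", choices=["tensorboard", "stdout"])
    p.add_argument("--algorithm", default="a2c", choices=["deepq", "a2c"])
    p.add_argument("--timesteps", type=int, default=2000000)
    p.add_argument("--exploration_fraction", type=float, default=0.5)
    p.add_argument("--prioritized", type=str2bool, default=False)
    p.add_argument("--dueling", type=str2bool, default=False)
    p.add_argument("--lr", type=float, default=0.0005)
    p.add_argument("--num_agents", type=int, default=4)
    p.add_argument("--num_scripts", type=int, default=4)
    p.add_argument("--nsteps", type=int, default=20)
    return p
```

For example, `build_parser().parse_args(["--algorithm=deepq", "--prioritized=True"])` yields an args object with `algorithm == "deepq"`, `prioritized == True`, and all other fields at the defaults from the table.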