Pysc2 Examples

StarCraft II - pysc2 Deep Reinforcement Learning Examples

StarCraft II Reinforcement Learning Examples

This example program was built on:

  • pysc2 (DeepMind)
  • baselines (OpenAI)
  • s2client-proto (Blizzard)
  • TensorFlow 1.3 (Google)

Current examples


  • CollectMineralShards with Deep Q Network
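The DQN example stores transitions in an experience replay buffer and trains on random minibatches sampled from it. A minimal sketch of that idea (class and method names here are illustrative; the actual implementation comes from OpenAI baselines):

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-capacity FIFO store of (obs, action, reward, next_obs, done)."""

    def __init__(self, capacity):
        self._storage = deque(maxlen=capacity)

    def add(self, transition):
        # Oldest transitions are evicted automatically once full.
        self._storage.append(transition)

    def sample(self, batch_size):
        # Uniform random minibatch; breaks the correlation between
        # consecutive environment steps.
        return random.sample(list(self._storage), batch_size)

    def __len__(self):
        return len(self._storage)
```

Sampling uniformly (rather than replaying the most recent steps) is what makes DQN's gradient updates approximately i.i.d.; the `--prioritized` flag below swaps this uniform sampling for prioritized replay.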


Quick Start Guide

1. Get PySC2


The easiest way to get PySC2 is to use pip:

$ pip install git+

You also need to install the baselines library:

$ pip install git+

2. Install StarCraft II

Mac / Win

You have to purchase and install StarCraft II; the free Starter Edition also works.

Linux Packages

Follow Blizzard's documentation to get the Linux version. By default, PySC2 expects the game to live in ~/StarCraftII/.

3. Download Maps

Download the ladder maps and the mini games and extract them into your ~/StarCraftII/Maps/ directory.

4. Train it!

$ python --algorithm=a2c

5. Enjoy it!

$ python

4-1. Train it with DQN

$ python --algorithm=deepq --prioritized=True --dueling=True --timesteps=2000000 --exploration_fraction=0.2
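The --exploration_fraction flag controls how quickly DQN's epsilon-greedy exploration rate is annealed. A sketch of the linear schedule used by baselines' deepq (the function name and epsilon endpoints here are illustrative assumptions, not the repo's actual code):

```python
def linear_epsilon(step, total_timesteps=2_000_000,
                   exploration_fraction=0.2,
                   eps_start=1.0, eps_final=0.02):
    # Anneal epsilon linearly over the first
    # exploration_fraction * total_timesteps steps, then hold it
    # at eps_final for the rest of training.
    anneal_steps = exploration_fraction * total_timesteps
    fraction = min(step / anneal_steps, 1.0)
    return eps_start + fraction * (eps_final - eps_start)
```

With --timesteps=2000000 and --exploration_fraction=0.2, exploration decays over the first 400,000 steps; a larger fraction keeps the agent exploring longer before it starts exploiting its learned Q-values.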

4-2. Train it with A2C (A3C)

$ python --algorithm=a2c --num_agents=2 --num_scripts=2 --timesteps=2000000
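A2C updates the policy every --nsteps steps using n-step discounted returns, bootstrapped from the critic's value estimate of the last state. A sketch of that computation (a hypothetical helper, not the repo's code):

```python
def nstep_returns(rewards, bootstrap_value, gamma=0.99):
    # Walk backwards from the bootstrapped value of the final state,
    # so each step's return includes the discounted rewards of all
    # the steps that follow it within the rollout.
    returns = []
    ret = bootstrap_value
    for r in reversed(rewards):
        ret = r + gamma * ret
        returns.append(ret)
    returns.reverse()
    return returns
```

Each of the num_agents environments contributes one nsteps-long rollout per update, and these returns serve as the targets for the critic and the advantage baseline for the actor.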
| Parameter | Description | Default | Type |
| --- | --- | --- | --- |
| map | Mini-game map (Gym-style environment) | CollectMineralShards | string |
| log | Logging type: tensorboard, stdout | tensorboard | string |
| algorithm | Currently supports two algorithms: deepq, a2c | a2c | string |
| timesteps | Total training steps | 2000000 | int |
| exploration_fraction | Exploration fraction | 0.5 | float |
| prioritized | Whether to use prioritized replay for DQN | False | boolean |
| dueling | Whether to use a dueling network for DQN | False | boolean |
| lr | Learning rate (if 0, sampled randomly between 1e-5 and 1e-3) | 0.0005 | float |
| num_agents | Number of agents for A2C | 4 | int |
| num_scripts | Number of scripted agents for A2C | 4 | int |
| nsteps | Number of steps per policy update | 20 | int |
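Per the table, passing --lr=0 makes the trainer pick a random learning rate between 1e-5 and 1e-3. One common way to draw such a value is log-uniformly, sampling the exponent rather than the value itself (a sketch under that assumption; the repo's actual sampling scheme may differ):

```python
import random

def pick_learning_rate(lr=0):
    if lr == 0:
        # Sample the exponent uniformly so candidate rates are spread
        # evenly across orders of magnitude (log-uniform in [1e-5, 1e-3]).
        return 10 ** random.uniform(-5, -3)
    return lr
```

Sampling in log space matters because learning rates are compared multiplicatively: a uniform draw over [1e-5, 1e-3] would land in [1e-4, 1e-3] about 90% of the time.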