Pysc2 Examples

StarCraft II - pysc2 Deep Reinforcement Learning Examples

StarCraft II Reinforcement Learning Examples

These example programs were built on the following (a minimal environment sketch follows the list):

  • pysc2 (DeepMind) [https://github.com/deepmind/pysc2]
  • baselines (OpenAI) [https://github.com/openai/baselines]
  • s2client-proto (Blizzard) [https://github.com/Blizzard/s2client-proto]
  • TensorFlow 1.3 (Google) [https://github.com/tensorflow/tensorflow]
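
To make these pieces concrete, here is a minimal sketch of opening the CollectMineralShards mini-game with pysc2 and stepping it with no-op actions. It assumes the pysc2 2.x interface (older releases take screen_size_px/minimap_size_px arguments instead), and the training scripts in this repo wrap the environment in their own way.

import sys

from absl import flags
from pysc2.env import sc2_env
from pysc2.lib import actions, features

FLAGS = flags.FLAGS
FLAGS(sys.argv)  # pysc2 reads absl flags, so they must be parsed first

with sc2_env.SC2Env(
        map_name="CollectMineralShards",
        players=[sc2_env.Agent(sc2_env.Race.terran)],
        agent_interface_format=features.AgentInterfaceFormat(
            feature_dimensions=features.Dimensions(screen=64, minimap=64)),
        step_mul=8,          # game frames per agent step
        visualize=False) as env:
    timesteps = env.reset()
    for _ in range(100):
        # A trained DQN/A2C policy would choose actions here; we just no-op.
        timesteps = env.step(
            [actions.FunctionCall(actions.FUNCTIONS.no_op.id, [])])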

Current examples

Mini-games

  • CollectMineralShards with Deep Q Network

(Demo animation: CollectMineralShards mini-game)

Quick Start Guide

1. Get PySC2

PyPI

The easiest way to get PySC2 is to use pip:

$ pip install git+https://github.com/deepmind/pysc2
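
Once the game and maps are installed (steps 2-3 below), you can sanity-check the pysc2 install by running its built-in random agent on the mini-game map:

$ python -m pysc2.bin.agent --map CollectMineralShards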

You also have to install the baselines library:

$ pip install git+https://github.com/openai/baselines

2. Install StarCraft II

Mac / Win

You have to install StarCraft II; either the purchased full game or the free Starter Edition will work.

http://us.battle.net/sc2/en/legacy-of-the-void/

Linux Packages

Follow Blizzard's documentation to get the Linux version. By default, PySC2 expects the game to live in ~/StarCraftII/.
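
If the game lives somewhere else, pysc2 also honours the SC2PATH environment variable, so you can point it at your install location:

$ export SC2PATH=/path/to/StarCraftII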

3. Download Maps

Download the ladder maps and the mini-games and extract them into your StarCraftII/Maps/ directory.
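
After extraction, the layout should look roughly like this (the mini-game archive unpacks into its own mini_games subfolder):

~/StarCraftII/Maps/mini_games/CollectMineralShards.SC2Map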

4. Train it!

$ python train_mineral_shards.py --algorithm=a2c

5. Enjoy it!

$ python enjoy_mineral_shards.py

4-1. Train it with DQN

$ python train_mineral_shards.py --algorithm=deepq --prioritized=True --dueling=True --timesteps=2000000 --exploration_fraction=0.2
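
For orientation, here is a rough sketch of how these DQN flags map onto the 2017-era OpenAI baselines deepq API (the era matching the TensorFlow 1.3 dependency above). Newer baselines releases renamed several of these arguments, and make_sc2_env is a hypothetical stand-in for this repo's own pysc2 environment wrapper.

from baselines import deepq

env = make_sc2_env("CollectMineralShards")  # hypothetical Gym-style pysc2 wrapper

# Dueling CNN-to-MLP Q-function over the 64x64 screen features (--dueling).
q_func = deepq.models.cnn_to_mlp(
    convs=[(32, 8, 4), (64, 4, 2), (64, 3, 1)],
    hiddens=[256],
    dueling=True)

deepq.learn(
    env,
    q_func=q_func,
    lr=5e-4,
    max_timesteps=2000000,        # --timesteps
    buffer_size=100000,
    exploration_fraction=0.2,     # --exploration_fraction
    exploration_final_eps=0.01,
    prioritized_replay=True)      # --prioritized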

4-2. Train it with A2C (A3C)

$ python train_mineral_shards.py --algorithm=a2c --num_agents=2 --num_scripts=2 --timesteps=2000000

Flags for train_mineral_shards.py:

Flag                  | Description                                          | Default              | Type
map                   | Mini-game map (Gym-style environment) to train on    | CollectMineralShards | string
log                   | Logging type: tensorboard or stdout                  | tensorboard          | string
algorithm             | Training algorithm: deepq or a2c                     | a2c                  | string
timesteps             | Total training steps                                 | 2000000              | int
exploration_fraction  | Exploration fraction (DQN)                           | 0.5                  | float
prioritized           | Whether to use prioritized replay (DQN)              | False                | boolean
dueling               | Whether to use a dueling network (DQN)               | False                | boolean
lr                    | Learning rate (if 0, a random value in 1e-5 ~ 1e-3)  | 0.0005               | float
num_agents            | Number of agents (A2C)                               | 4                    | int
num_scripts           | Number of scripted agents (A2C)                      | 4                    | int
nsteps                | Number of steps per policy update (A2C)              | 20                   | int
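
As a further illustration, this is a minimal skeleton of how a flag set like the one above could be declared and dispatched with absl.flags; it is a sketch only, and the real train_mineral_shards.py may define its flags differently.

from absl import app, flags

FLAGS = flags.FLAGS
flags.DEFINE_string("map", "CollectMineralShards", "Mini-game map to train on.")
flags.DEFINE_string("log", "tensorboard", "Logging type: tensorboard or stdout.")
flags.DEFINE_string("algorithm", "a2c", "Training algorithm: deepq or a2c.")
flags.DEFINE_integer("timesteps", 2000000, "Total training steps.")
flags.DEFINE_float("exploration_fraction", 0.5, "Exploration fraction (DQN).")
flags.DEFINE_boolean("prioritized", False, "Use prioritized replay (DQN).")
flags.DEFINE_boolean("dueling", False, "Use a dueling network (DQN).")
flags.DEFINE_float("lr", 0.0005, "Learning rate (0 = pick randomly in 1e-5..1e-3).")
flags.DEFINE_integer("num_agents", 4, "Number of agents (A2C).")
flags.DEFINE_integer("num_scripts", 4, "Number of scripted agents (A2C).")
flags.DEFINE_integer("nsteps", 20, "Steps per policy update (A2C).")

def main(argv):
    del argv  # unused
    print("Training %s on %s for %d timesteps"
          % (FLAGS.algorithm, FLAGS.map, FLAGS.timesteps))
    # The real script would dispatch to its DQN or A2C training loop here.

if __name__ == "__main__":
    app.run(main)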