This API is used by USAspending.gov to obtain all federal spending data, which is open source and provided to the public as part of the DATA Act.
Ensure the following dependencies are installed and working prior to continuing:
`Docker`, which will handle the other application dependencies.
`Bash` or another Unix shell equivalent
Using Docker is recommended since it provides a clean environment. Setting up your own local environment requires some technical abilities and experience with modern software tools.
`PostgreSQL` version 10.x (with a dedicated `data_store_api` database)
Now, navigate to the base file directory where you will store the USAspending repositories:
```
$ mkdir -p usaspending && cd usaspending
$ git clone https://github.com/fedspendingtransparency/usaspending-api.git
$ cd usaspending-api
```
There are three documented options for setting up a local database to run the API:
Create a local PostgreSQL database called `data_store_api` and either create a new username and password for the database or use the defaults. For help, consult the PostgreSQL documentation on creating databases and roles.
Make sure to grant the user you created for the `data_store_api` database superuser permissions, or some scripts will not work:
```
postgres=# ALTER ROLE <<role/user you created>> WITH SUPERUSER;
```
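If you are creating the role and database from scratch, a minimal sketch might look like the following. The role name and password here match the local-dev defaults used in the `DATABASE_URL` example later in this document; substitute your own credentials as needed.

```
# Sketch: create a superuser role and the data_store_api database.
# The usaspending/usaspender credentials are illustrative local-dev defaults.
$ psql -U postgres -c "CREATE ROLE usaspending WITH LOGIN PASSWORD 'usaspender' SUPERUSER;"
$ psql -U postgres -c "CREATE DATABASE data_store_api OWNER usaspending;"
```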
See below for basic setup instructions. For help with Docker Compose, see https://docs.docker.com/compose/.
None of these commands will rebuild a Docker image! Use `--build` if you make changes to the code or want to rebuild the image before running the container.
If you run a local database, set `POSTGRES_HOST` in the `.env` file so the containers can reach your host database. `POSTGRES_PORT` should be changed if it isn't `5432`.
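As a sketch, the relevant `.env` overrides might look like this. On Docker Desktop, `host.docker.internal` resolves to the host machine from inside a container; both values shown are assumptions to adjust for your setup:

```
# Illustrative .env overrides for a locally-run Postgres (adjust as needed)
POSTGRES_HOST=host.docker.internal
POSTGRES_PORT=5432
```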
`docker-compose up usaspending-db` will create and run a Postgres database.
`docker-compose run --rm usaspending-manage python3 -u manage.py migrate` will run Django migrations: https://docs.djangoproject.com/en/2.2/topics/migrations/.
`docker-compose run --rm usaspending-manage python3 -u manage.py load_reference_data` will load essential reference data (agencies, program activity codes, CFDA program data, country codes, and others).
`docker-compose run --rm usaspending-manage python3 -u manage.py matview_runner --dependencies` will provision the materialized views which are required by certain API endpoints.
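Taken together, a first-time database bootstrap might look like the following sequence (the `-d` flag simply runs the database container in the background):

```
# One possible first-time setup sequence, combining the commands above
docker-compose up -d usaspending-db
docker-compose run --rm usaspending-manage python3 -u manage.py migrate
docker-compose run --rm usaspending-manage python3 -u manage.py load_reference_data
docker-compose run --rm usaspending-manage python3 -u manage.py matview_runner --dependencies
```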
`docker-compose.yaml` contains the shell commands necessary to set up the database manually, if you prefer a more custom environment.
For further instructions on how to download, set up, and use the database with a subset of our data, see the USAspending database download documentation.
Some of the API endpoints reach into Elasticsearch for data.
`docker-compose up usaspending-es` will create and start a single-node Elasticsearch cluster, using the `ES_CLUSTER_DIR` specified in the `.env` configuration file. We recommend using a folder outside of the usaspending-api project directory so it does not get copied to other containers.
The cluster should be reachable at http://localhost:9200 ("You Know, for Search").
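A quick way to confirm the cluster is up, assuming `curl` is available on your machine (both endpoints are standard Elasticsearch APIs):

```
# Root endpoint returns cluster metadata; _cluster/health reports status
$ curl http://localhost:9200
$ curl "http://localhost:9200/_cluster/health?pretty"
```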
Optionally, to see log output, use `docker-compose logs usaspending-es` (these logs are stored by Docker even if you don't use this).
`docker-compose up usaspending-api` will start the API. You can update environment variables in `settings.py` (buckets, elasticsearch, local paths) and they will be mounted and used when you run this.
The application will now be available at http://localhost:8000.
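As a quick sanity check (assuming the port above), you can hit the API with `curl`:

```
# A response with HTTP headers indicates the Django app is serving requests
$ curl -I http://localhost:8000
```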
Note: if the code was run outside of Docker, compiled Python files can trip up the Docker environment. A useful command for clearing out these files on your host is:
```
find . | grep -E "(__pycache__|\.pyc|\.pyo$)" | xargs rm -rf
```
In your local development environment, available API endpoints may be found at http://localhost:8000/docs/endpoints.
Deployed production API endpoints and docs are found by following links at https://api.usaspending.gov.
Note: it is possible to run ad-hoc commands out of a Docker container once you get the hang of it; see the comments in the Dockerfile.
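For instance, following the same `usaspending-manage` pattern used above, one way to open an interactive Django shell in a one-off container (`shell` is a standard Django management command):

```
# Ad-hoc example: run a Django shell inside a throwaway container
docker-compose run --rm usaspending-manage python3 -u manage.py shell
```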
For details on loading reference data, DATA Act Broker submissions, and current USAspending data into the API, see loading_data.md.
For details on how our data loaders modify incoming data, see data_reformatting.md.
To run all tests in the Docker services, run:

```
docker-compose run --rm usaspending-test
```
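Depending on how the `usaspending-test` service is defined in `docker-compose.yaml`, you may be able to append pytest arguments to the command; a hypothetical invocation (the test path is illustrative, not prescribed by this document):

```
# Hypothetical: run a subset of tests if the service forwards arguments
docker-compose run --rm usaspending-test pytest -rs usaspending_api/common/tests
```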
To run tests locally and not in the Docker services, you need the local setup described below: a Python virtual environment with application dependencies installed, and environment variables pointing at your local services.
Once these are satisfied, run:

```
(usaspending-api) $ pytest
```
Create and activate the virtual environment using `venv`, and ensure the right version of Python 3.7.x is being used (the latest RHEL package available for `python36u` as of this writing):
```
$ pyenv install 3.7.2
$ pyenv local 3.7.2
$ python -m venv .venv/usaspending-api
$ source .venv/usaspending-api/bin/activate
```
Your prompt should then look as below, showing you are in the virtual environment named `usaspending-api` (to exit the virtual environment, simply type `deactivate` at the prompt):
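```
(usaspending-api) $
```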
Install application dependencies:

```
(usaspending-api) $ pip install -r requirements/requirements.txt
```
Create a `.envrc` file in the repo root, which will be ignored by git. Change credentials and ports as needed for your local dev environment:
```
export DATABASE_URL=postgres://usaspending:usaspender@localhost:5432/data_store_api
export ES_HOSTNAME=http://localhost:9200
export DATA_BROKER_DATABASE_URL=postgres://admin:root@localhost:5435/data_broker
```
If `direnv` does not pick this up after saving the file, type:

```
$ direnv allow
```
Alternatively, you could skip using `direnv` and just export these variables in your shell environment.
Some automated integration tests run against a Broker database. If the dependencies to run such integration tests are not satisfied, those tests will bail out and be marked as Skipped.
(You can see messages about those skipped tests by adding the `-rs` flag to pytest, like: `pytest -rs`.)
To satisfy these dependencies and include execution of these tests, do the following:
`Docker` installed and running on your machine
`Broker` source code checked out alongside this repo at `../data-act-broker-backend`
`DATA_BROKER_DATABASE_URL` environment variable set, and pointing to a live PostgreSQL server (no database required)
Build the `Broker` backend Docker image by running:
```
(usaspending-api) $ docker build -t dataact-broker-backend ../data-act-broker-backend
```
NOTE: Broker source code should be re-fetched and the image rebuilt to ensure the latest integration is tested.
Re-running the test suite using `pytest -rs` with these dependencies satisfied should yield no more skips of the broker integration tests.
To submit fixes or enhancements, or to suggest changes, see CONTRIBUTING.md.