Ben North (GitHub / blog), March 2018 (repository root)
To install the tools into the current Python environment:

pip install .
Among the components of Amazon Web Services are the following two parts of their 'serverless' approach:

AWS Lambda, which runs individual functions on demand in the cloud; and
AWS Step Functions, which coordinates such functions as a state machine.
While Lambda functions can be written in many languages, to write a Step Function you describe the logic as a state machine in JSON. This seems cumbersome when compared to our normal way of describing how to control interlinked computations, which is to write some Python (or C#, or Java, or...).
Based on this observation, the tools presented here are a 'plausibility argument' for the idea that you could write your top-level logic as a Python program and have it compiled into a Step Function state machine. (I haven't developed this far enough to call it a 'proof of concept'.)
One of the desired properties of the system is that the source program should be more or less 'normal Python'. It should be possible to use it in two ways:

run it directly, as a normal Python program, under the standard interpreter; or
compile it into a Step Function state machine, which invokes the original small functions as Lambda calls.
The ability to run your logic as a normal Python program allows local development and testing.
Although I think the tools here do show that the idea has promise, there would be plenty still to do to make them useful for production purposes. I am very unlikely to have time in the near future to develop this any further, but the source is all here (under GPL) if anybody wants to build on it.
The 'Python to Step Function compiler' tool, pysfn.tools.compile, reads in a file of Python code and emits JSON corresponding to the control flow of a specified 'entry point' function in that code. The resulting JSON is used for the creation of an AWS Step Function.
Various supplied Python functions allow the programmer to express
intent in terms of retry characteristics, parallel invocations, error
handling, etc. Nonetheless the code is valid normal Python and
executes with (mostly) equivalent semantics to those the resulting
Step Function will have.
The 'Python to Step Function wrapper compiler' tool, pysfn.tools.gen_lambda, constructs a zip-file containing the original Python code together with a small wrapper. The zip-file is suitable for uploading as an AWS Lambda function. This gives the top-level Step Function access to what were callees in the original Python code.
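The wrapper itself is not reproduced here, but the state-machine JSON shown later in this document records a call descriptor under "call_descr" (a function name plus argument names) alongside variable values under "locals", so a dispatcher along roughly these lines is plausible. This is a hypothetical sketch written for illustration, not the actual handler.py; the real wrapper would resolve functions from the bundled user module rather than from a dict.

```python
# Hypothetical sketch of the kind of dispatcher handler.py might contain.
# The generated state machine records which user function to call under
# "call_descr" and keeps variable values under "locals"; the wrapper
# resolves the named function and invokes it with the extracted arguments.

def n_chars(text):
    """Stand-in for one of the user's original small functions."""
    return len(text)

# In the real wrapper this lookup would plausibly be getattr() on the
# bundled user module (inner/analyse_text.py); a dict keeps this sketch
# self-contained.
USER_FUNCTIONS = {"n_chars": n_chars}

def dispatch(event, context):
    descr = event["call_descr"]
    fn = USER_FUNCTIONS[descr["function"]]
    args = [event["locals"][name] for name in descr["arg_names"]]
    return fn(*args)

example_event = {
    "call_descr": {"function": "n_chars", "arg_names": ["text"]},
    "locals": {"text": "a short example"},
}
print(dispatch(example_event, None))  # → 15
```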
In what follows I have omitted details such as the creation of IAM users, creation of roles, etc. See Amazon's documentation on these points.
The original Python source consists of a main driver function, with a collection of small functions used by the main function. It is very simple, performing a few computations on an input string, but serves the purpose of illustrating the compilation process. It has a suite of unit tests:
pip install .[dev] # install development dependencies
pytest tests/test_analyse_text.py
Output:
# ... ======== 10 passed in 0.02 seconds ======== ...
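For concreteness, a module of the same flavour as examples/analyse_text.py might look like the following. This is a hypothetical approximation written for this walkthrough; the real file differs in its details, but the shape is the same: a main driver built from small helper functions, each a candidate for becoming a Lambda invocation in the compiled Step Function.

```python
# Hypothetical approximation of examples/analyse_text.py: a main driver
# ("get_summary") calling a handful of small helper functions.

VOWELS = "aeiou"

def n_chars(text):
    return len(text)

def n_vowels(text):
    return sum(1 for ch in text.lower() if ch in VOWELS)

def n_spaces(text):
    return text.count(" ")

def first_char(text):
    return text[0]

def get_summary(text):
    return ("text starts with {}, has {} chars, {} vowels, and {} spaces"
            .format(first_char(text), n_chars(text), n_vowels(text), n_spaces(text)))

print(get_summary("a short example"))
# → text starts with a, has 15 chars, 5 vowels, and 2 spaces
```

Because this is ordinary Python, it can be developed and unit-tested locally before any compilation to a Step Function happens.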
Next, build the Lambda zip-file and inspect its contents:

python -m pysfn.tools.gen_lambda examples/analyse_text.py lambda-function.zip
unzip -l lambda-function.zip
Output:
Archive: lambda-function.zip
Length Date Time Name
--------- ---------- ----- ----
302 1980-01-01 00:00 handler.py
1981 2018-03-25 20:01 inner/analyse_text.py
452 2018-03-23 22:32 pysfn.py
--------- -------
2735 3 files
Now upload lambda-function.zip as a new Lambda function with the Python 3.6 runtime, specify handler.dispatch as its entry point, and note its ARN for use in the next step.
Next, compile the Step Function description, substituting the ARN noted above for LAMBDA-FUN-ARN:

python -m pysfn.tools.compile examples/analyse_text.py LAMBDA-FUN-ARN > examples/stepfun.json
cat examples/stepfun.json
Output (the full output is 196 lines):
{
"States": {
"n0": {
"Type": "Pass",
"Result": {
"function": "get_summary",
"arg_names": [
"text"
]
},
"ResultPath": "$.call_descr",
"Next": "n1"
},
[...]
"n19": {
"Type": "Succeed",
"InputPath": "$.locals.result"
}
},
"StartAt": "n0"
}
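To make the data flow concrete: in the Amazon States Language, a Pass state with a Result and a ResultPath copies its input and injects the Result at the given path before handing the document to the next state. In plain Python, the effect of the "n0" state above amounts to something like this sketch:

```python
# Sketch of the effect of the "n0" Pass state: inject its Result at
# ResultPath "$.call_descr", leaving the rest of the document intact.
state_input = {"locals": {"text": "a short example"}}

result = {"function": "get_summary", "arg_names": ["text"]}

state_output = dict(state_input)      # Pass states do not discard input
state_output["call_descr"] = result   # effect of ResultPath "$.call_descr"

print(state_output)
```

The enriched document, carrying both the variable values ("locals") and the call descriptor, is what the dispatching Lambda receives at each step.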
Now copy-and-paste this as the JSON for a new Step Function.
You should now be able to perform an execution of this Step Function with, for example, the input
{
"locals": {
"text": "a short example"
}
}
to get the output
{
"output": "text starts with a, has 15 chars, 5 vowels, and 2 spaces"
}
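Rather than using the console, an execution can also be started from the command line with the AWS CLI. STATE-MACHINE-ARN below is a placeholder for the ARN reported when you created the Step Function:

```shell
# Start an execution of the compiled state machine with the example input.
aws stepfunctions start-execution \
    --state-machine-arn STATE-MACHINE-ARN \
    --input '{"locals": {"text": "a short example"}}'
```

The start-execution call returns an execution ARN, which can then be passed to aws stepfunctions describe-execution to poll for the result.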
This document: Copyright 2018 Ben North; licensed under CC BY-SA 4.0. See the file COPYING for full licensing details.