This is a Kafka producer that generates mock clickstream data for an imaginary e-commerce site. The scenario: a user logs into the site, lands on the home page, browses the product catalog, lands on individual product pages, and may add each product to the cart, continuing until the user either creates an order and completes checkout or abandons the session.
The producer generates the events of a user session in sequence, but runs multiple threads to simulate multiple users hitting the site concurrently; the number of threads can be specified via a parameter. It uses a schema registry and generates Avro-encoded events, supporting both the AWS Glue Schema Registry and a third-party schema registry.
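As a rough sketch of that simulation loop (the names below are illustrative, not the actual RunProducer/Events code):

```java
import java.util.Random;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical sketch: each task plays one user session, emitting page
// events until the session checks out or is abandoned; a fixed pool of
// threads simulates many users hitting the site concurrently.
public class SessionSketch {
    static final Random RANDOM = new Random();

    static void playSession(int userId) {
        emit(userId, "login");
        emit(userId, "home_page");
        while (RANDOM.nextDouble() < 0.7) {                      // keep browsing the catalog
            emit(userId, "product_page");
            if (RANDOM.nextBoolean()) emit(userId, "add_to_cart");
        }
        emit(userId, RANDOM.nextBoolean() ? "order_checkout" : "session_abandoned");
    }

    static void emit(int userId, String event) {
        System.out.printf("user=%d event=%s%n", userId, event); // stand-in for a Kafka send
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);  // --numThreads
        for (int u = 1; u <= 4; u++) {
            final int id = u;
            pool.submit(() -> playSession(id));
        }
        pool.shutdown();
    }
}
```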
For the AWS Glue Schema Registry, the producer accepts parameters to use a specific registry, a pre-created schema name, a schema description, a compatibility mode (to check compatibility for schema evolution), and whether to turn on auto-registration of schemas. If these parameters are not specified but the AWS Glue Schema Registry is selected, the default schema registry is used; some of the parameters may be required depending on which others are set. The AWS Glue Schema Registry provides open-source serde libraries for serialization and deserialization, which use the AWS default credentials chain for credentials (by default) and the region to construct an endpoint. For further information, see the AWS Glue Schema Registry documentation.
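As a sketch of how the Glue Schema Registry serde plugs into a Kafka producer (the region, registry, and schema values below are placeholders; the constants come from the open-source aws-glue-schema-registry library):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistryKafkaSerializer;
import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
import software.amazon.awssdk.services.glue.model.Compatibility;
import software.amazon.awssdk.services.glue.model.DataFormat;

// Sketch: configure the producer's value serializer to look up / register
// Avro schemas in the Glue Schema Registry. Credentials come from the AWS
// default credentials chain; the region determines the endpoint.
public class GlueSerdeConfigSketch {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "<bootstrap-brokers>"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  GlueSchemaRegistryKafkaSerializer.class.getName());
        props.put(AWSSchemaRegistryConstants.AWS_REGION, "us-east-1");             // placeholder
        props.put(AWSSchemaRegistryConstants.DATA_FORMAT, DataFormat.AVRO.toString());
        props.put(AWSSchemaRegistryConstants.REGISTRY_NAME, "my-registry");        // omit to use the default registry
        props.put(AWSSchemaRegistryConstants.SCHEMA_NAME, "ClickstreamEvent");     // pre-created schema (placeholder)
        props.put(AWSSchemaRegistryConstants.COMPATIBILITY_SETTING, Compatibility.BACKWARD);
        props.put(AWSSchemaRegistryConstants.SCHEMA_AUTO_REGISTRATION_SETTING, true);
        return props;
    }
}
```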
For a third-party schema registry, the location of the schema registry needs to be specified in a producer.properties_msk file, as in the example below.
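Assuming a Confluent-compatible registry and serializer (which read the standard schema.registry.url setting), the relevant line would look like this, with a placeholder host:

```
schema.registry.url=http://my-schema-registry-host:8081
```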
For each event generated, the producer assigns a global sequence number that is unique across the threads and strictly increasing. The producer uses the UserId as the partition key for the events sent to Apache Kafka, so the events for the same user always go to the same Kafka partition, which allows stateful, in-order processing of each user's events. The global sequence numbers themselves are spread across multiple partitions, however, so the highest global sequence number received by the consumer at any point can be used to gauge how far the consumer is behind the producer, or whether it has caught up.
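A minimal sketch of that keying scheme (the value layout is illustrative; the real producer sends Avro records):

```java
import java.util.concurrent.atomic.AtomicLong;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// One AtomicLong shared across all producer threads yields globally
// unique, increasing sequence numbers, while keying each record on the
// UserId pins a user's events to a single partition so they can be
// processed statefully and in order downstream.
public class KeyedSendSketch {
    private static final AtomicLong GLOBAL_SEQ = new AtomicLong();

    static void send(KafkaProducer<String, String> producer, String topic,
                     String userId, String event) {
        long seq = GLOBAL_SEQ.incrementAndGet();
        producer.send(new ProducerRecord<>(topic, userId, "seq=" + seq + " " + event));
    }
}
```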
This producer supports TLS in-transit encryption, TLS mutual authentication, SASL/SCRAM authentication, and AWS IAM authentication with Amazon MSK; see the relevant parameters below for enabling them.
This producer can be used to generate other types of events by modifying the RunProducer and Events classes.
This producer depends on another library to get secrets from AWS Secrets Manager for SASL/SCRAM authentication with Amazon MSK. The library needs to be installed before building the jar file for the producer:
git clone https://github.com/aws-samples/sasl-scram-secrets-manager-client-for-msk.git
cd sasl-scram-secrets-manager-client-for-msk
mvn clean install -f pom.xml
Then, from the producer's project root, build the producer jar:
mvn clean package -f pom.xml
***--help (or -h)***: Print the list of parameters.
***--numberOfUsers (or -nou)***: Specify the number of user sessions to generate. Default Integer.MAX_VALUE.
***--topic (or -t)***: Apache Kafka topic to send events to. Default ExampleTopic.
***--propertiesFilePath (or -pfp)***: Location of the producer properties file which contains information about the Apache Kafka bootstrap brokers and the location of the Confluent Schema Registry. Default /tmp/kafka/producer.properties.
***--numThreads (or -nt)***: Number of threads to run in parallel. Default 2.
***--runFor (or -rf)***: Number of seconds to run the producer for.
***--noDelay (or -nd)***: There is a built-in delay between events in a user session to mimic a real user; this flag turns the delay off so events are produced as fast as possible.
***--sslEnable (or -ssl)***: Enable TLS communication between this application and Amazon MSK Apache Kafka brokers for in-transit encryption.
***--mTLSEnable (or -mtls)***: Enable TLS communication between this application and Amazon MSK Apache Kafka brokers for in-transit encryption with TLS mutual authentication. If this parameter is specified, TLS is also enabled. The producer reads SSL_TRUSTSTORE_LOCATION_CONFIG, SSL_KEYSTORE_LOCATION_CONFIG, SSL_KEYSTORE_PASSWORD_CONFIG and SSL_KEY_PASSWORD_CONFIG from the specified properties file; those properties need to be set there (see the properties sketch after this parameter list).
***--saslscramEnable (or -sse)***: Enable SASL/SCRAM authentication between this application and Amazon MSK with in-transit encryption. If this parameter is specified, --saslscramUser (or -ssu) also needs to be specified. This parameter cannot be combined with --mTLSEnable (or -mtls) or --sslEnable (or -ssl).
***--iamEnable (or -iam)***: Enable AWS IAM authentication between this application and Amazon MSK with in-transit encryption. This parameter cannot be combined with --mTLSEnable (or -mtls), --sslEnable (or -ssl), or --saslscramEnable (or -sse). For IAM authentication and authorization to work, attach an authorization policy to the IAM role associated with the EC2 instance profile.
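For reference, the Kafka client settings behind the SSL_* constants named above would look roughly like this in the properties file (paths and passwords are placeholders):

```
security.protocol=SSL
ssl.truststore.location=/tmp/kafka.client.truststore.jks
ssl.keystore.location=/tmp/kafka.client.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
```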
Example invocations with mTLS, SASL/SCRAM, and IAM authentication, respectively:
java -jar KafkaClickstreamClient-1.0-SNAPSHOT.jar -t ExampleTopic -pfp /tmp/kafka/producer.properties_msk -nt 8 -rf 300 -mtls
java -jar KafkaClickstreamClient-1.0-SNAPSHOT.jar -t ExampleTopic -pfp /tmp/kafka/producer.properties_msk -nt 8 -rf 300 -sse -ssu nancy
java -jar KafkaClickstreamClient-1.0-SNAPSHOT.jar -t ExampleTopic -pfp /tmp/kafka/producer.properties_msk -nt 8 -rf 300 -iam