Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---
Iglu Central | 106 | 12 hours ago | | | 179 | apache-2.0 | Shell | Contains all JSON Schemas, Avros and Thrifts for Iglu Central
Xsd2thrift | 69 | 3 years ago | | | 13 | other | Java | This tool allows for converting XML Schema files (.xsd) to Thrift (.thrift) and Protocol Buffers (.proto).
Probor | 39 | 5 years ago | 8 | May 18, 2018 | 2 | | Rust | A protocol on top of CBOR that provides protobuf-like functionality
Jsonschema2go | 35 | 4 years ago | | May 26, 2021 | 1 | other | Go | Code generator for JSON schema
Parquet Flinktacular | 26 | 6 years ago | | | 2 | apache-2.0 | Java | How to use Parquet in Flink
Casyn | 17 | 9 years ago | 44 | October 05, 2015 | | | Clojure | Clojure client for Cassandra using Thrift AsyncClient. For a better/more robust client using CQL3 see https://github.com/mpenet/alia
Aptos | 16 | 6 years ago | | | 7 | mit | Python | :sunny: Avro, Protobuf, Thrift on Swagger
Concrete | 14 | 2 months ago | | | 1 | other | Thrift | Thrift definitions, making HLT data specifications concrete
Idl_storage_guidelines | 10 | 6 years ago | | | 1 | apache-2.0 | | This document attempts to capture useful patterns and warn about subtle gotchas when it comes to designing and evolving schemas for long-term serialized data. It is not intended as a guide for how to best represent a particular dataset or process.
Schemakeeper | 6 | 9 months ago | 3 | August 01, 2021 | 24 | apache-2.0 | Java | Schemakeeper - yet another schema registry for Avro, Thrift and Protobuf schemas
Schemakeeper - yet another schema registry for Avro, Thrift and Protobuf schemas. It provides a RESTful interface for storing and retrieving Subjects and Schemas metadata.
To use Oracle as your schema storage backend, you need to install the ojdbc jar manually. You can do this using Maven (or any other tool):
mvn install:install-file -Dfile=path/to/your/ojdbc8.jar -DgroupId=com.oracle -DartifactId=ojdbc8 -Dversion=19.3.0.0 -Dpackaging=jar
- schemakeeper-common - common classes for SerDe's and API
- schemakeeper-client - http client implementation
- schemakeeper-avro - Avro SerDe
- schemakeeper-thrift - Thrift SerDe
- schemakeeper-protobuf - Protobuf SerDe
- schemakeeper-kafka-common - common classes for Kafka SerDe
- schemakeeper-kafka-avro - Avro SerDe for Kafka
- schemakeeper-kafka-thrift - Thrift SerDe for Kafka
- schemakeeper-kafka-protobuf - Protobuf SerDe for Kafka

Every Schemakeeper module is published at Maven Central:
Maven:
<dependency>
    <groupId>com.nryanov.schemakeeper</groupId>
    <artifactId>schemakeeper-${module.name}</artifactId>
    <version>${module.version}</version>
</dependency>
Gradle:
compile 'com.nryanov.schemakeeper:${module.name}:${module.version}'
sbt:
"com.nryanov.schemakeeper" %% "<schemakeeper-module>" % "[version]"
Server build:
sbt server/stage
# executable should be in server/target/universal/stage/bin/*
Local Docker image:
sbt server/docker:publishLocal
Run tests:
sbt test
docker pull nryanov/schemakeeper:{version}
docker run --name={container_name} -p 9081:9081 -p 9990:9990 -d nryanov/schemakeeper:{version}
The server uses Lightbend HOCON for configuration. By default, the server listens on port 9081 for REST requests. H2 is used as the default storage backend with the following settings:
schemakeeper {
server {
port = 9081
host = 0.0.0.0
}
storage {
username = ""
password = ""
driver = "org.h2.Driver"
schema = "public"
url = "jdbc:h2:mem:schemakeeper;DB_CLOSE_DELAY=-1"
}
}
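For example, to point the server at an external database instead of the in-memory H2 default, you can override the storage section. The snippet below is a minimal sketch assuming a PostgreSQL backend; the host, database name, and credentials are placeholders, and the PostgreSQL JDBC driver is assumed to be on the classpath:
schemakeeper {
  server {
    port = 9081
    host = 0.0.0.0
  }
  storage {
    username = "schemakeeper"        # placeholder credentials
    password = "change-me"
    driver = "org.postgresql.Driver" # assumes a PostgreSQL backend
    schema = "public"
    url = "jdbc:postgresql://localhost:5432/schemakeeper"
  }
}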
CORS settings can be configured under the schemakeeper.server.cors section:
schemakeeper {
server {
cors {
allowedCredentials = "<true/false>"
anyOrigin = "<true/false>"
anyMethod = "<true/false>"
maxAge = "<number>"
allowsOrigins = "<ORIGIN>"
allowsMethods = "<Comma-separated methods>"
allowsHeaders = "<Comma-separated headers>"
exposedHeaders = "<Comma-separated headers>"
}
}
}
If you start the server from the jar, you can pass a custom configuration file using the -Dconfig.file Java option:
java -jar -Dconfig.file=<PATH TO application.conf> schemakeeper-server.jar
To configure the Docker image you can use environment variables:
CORS settings:
Under the hood, Unirest is used as the HTTP client. You can configure it:
Map<String, Object> properties = new HashMap<>();
properties.put(ClientConfig.CLIENT_MAX_CONNECTIONS, 10);
properties.put(ClientConfig.CLIENT_CONNECTIONS_PER_ROUTE, 5);
properties.put(ClientConfig.CLIENT_SOCKET_TIMEOUT, 5000);
properties.put(ClientConfig.CLIENT_CONNECT_TIMEOUT, 5000);
// Optionally, you can configure your http client to use a proxy
properties.put(ClientConfig.CLIENT_PROXY_HOST, "proxyHost");
properties.put(ClientConfig.CLIENT_PROXY_PORT, port);
properties.put(ClientConfig.CLIENT_PROXY_USERNAME, "username");
properties.put(ClientConfig.CLIENT_PROXY_PASSWORD, "password");
Map<String, Object> properties = new HashMap<>();
properties.put(SerDeConfig.SCHEMAKEEPER_URL_CONFIG, "url");
AvroSerDeConfig config = new AvroSerDeConfig(properties);
AvroSerializer serializer = new AvroSerializer(config);
AvroDeserializer deserializer = new AvroDeserializer(config);
byte[] b = serializer.serialize("subject", data);
Object r = deserializer.deserialize(b);
The Avro SerDe is also compatible with Thrift- and Protobuf-serialized data: it is possible to serialize data using the Thrift/Protobuf serializer and then deserialize it using AvroDeserializer.
With Kafka
Use KafkaAvroSerializer.class and KafkaAvroDeserializer.class. The required property is SerDeConfig.SCHEMAKEEPER_URL_CONFIG; other SerDe settings are optional.
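As an illustration, a producer could be wired up as in the sketch below. This is a minimal sketch, not the project's documented setup: the broker address, topic name, and Schemakeeper server URL are assumptions, and the import paths for KafkaAvroSerializer and SerDeConfig are not shown because they come from the schemakeeper-kafka-avro module and are not documented above.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
// KafkaAvroSerializer and SerDeConfig come from the schemakeeper-kafka-avro module

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
// The only required SerDe property is the Schemakeeper server URL
props.put(SerDeConfig.SCHEMAKEEPER_URL_CONFIG, "http://localhost:9081");

KafkaProducer<String, Object> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("example-topic", "key", data)); // data: your Avro record, as in the examples above
producer.close();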
Before using Thrift SerDe, you should generate Thrift classes.
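For example, classes can be generated with the Apache Thrift compiler (the .thrift file name here is just a placeholder):
thrift --gen java your_schema.thrift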
Map<String, Object> properties = new HashMap<>();
properties.put(SerDeConfig.SCHEMAKEEPER_URL_CONFIG, "url");
ThriftSerDeConfig config = new ThriftSerDeConfig(properties);
ThriftSerializer serializer = new ThriftSerializer(config);
ThriftDeserializer deserializer = new ThriftDeserializer(config);
byte[] b = serializer.serialize("subject", data);
Object r = deserializer.deserialize(b);
With Kafka
Use KafkaThriftSerializer.class and KafkaThriftDeserializer.class. The required property is SerDeConfig.SCHEMAKEEPER_URL_CONFIG; other SerDe settings are optional.
Before using Protobuf SerDe, you should generate Protobuf classes.
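For example, classes can be generated with the Protocol Buffers compiler (the file and output paths are placeholders):
protoc --java_out=src/main/java your_schema.proto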
Map<String, Object> properties = new HashMap<>();
properties.put(SerDeConfig.SCHEMAKEEPER_URL_CONFIG, "url");
ProtobufSerDeConfig config = new ProtobufSerDeConfig(properties);
ProtobufSerializer serializer = new ProtobufSerializer(config);
ProtobufDeserializer deserializer = new ProtobufDeserializer(config);
byte[] b = serializer.serialize("subject", data);
Object r = deserializer.deserialize(b);
With Kafka
Use KafkaProtobufSerializer.class and KafkaProtobufDeserializer.class. The required property is SerDeConfig.SCHEMAKEEPER_URL_CONFIG; other SerDe settings are optional.
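For the consuming side, a sketch of a consumer configuration is shown below. As with the producer sketch above, the broker address, group id, topic, and Schemakeeper URL are assumptions, and the import paths for KafkaProtobufDeserializer and SerDeConfig (from the schemakeeper-kafka-protobuf module) are not documented above.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
// KafkaProtobufDeserializer and SerDeConfig come from the schemakeeper-kafka-protobuf module

Properties props = new Properties();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaProtobufDeserializer.class);
props.put(SerDeConfig.SCHEMAKEEPER_URL_CONFIG, "http://localhost:9081");

KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("example-topic"));
ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(1));
for (ConsumerRecord<String, Object> record : records) {
    System.out.println(record.value()); // deserialized Protobuf message
}
consumer.close();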
Swagger documentation is available at /docs.
GET /v2/subjects
Get list of registered subjects
Response:
Status codes:
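For instance, assuming the server is running locally on the default port from the configuration above, the subject list can be fetched with curl:
curl http://localhost:9081/v2/subjects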
GET /v2/subjects/{subject_name}
Get subject metadata by name
Response:
Status codes:
PUT /v2/subjects/{subject_name}
Update subject settings
Body:
{
"compatibilityType": "Compatibility type",
"isLocked": "true/false"
}
Response:
Status codes:
GET /v2/subjects/{subject_name}/versions
Get list of subject's schema versions
Response:
Status codes:
GET /v2/subjects/{subject_name}/schemas
Get list of subject's schemas metadata
Response:
Status codes:
GET /v2/subjects/{subject_name}/versions/{version}
Get subject's schema metadata by version
Response:
Status codes:
GET /v2/schemas/{id}
Get schema by id
Response:
Status codes:
POST /v2/subjects/{subject_name}/schemas/id
Body:
{
"schemaText": "AVRO SCHEMA STRING",
"schemaType": "IDENTIFIER Of SCHEMA TYPE [avro, thrift or protobuf]"
}
Check if the schema is registered and connected to the current subject, and return its id
Response:
Status codes:
DELETE /v2/subjects/{subject_name}
Delete subject metadata
Response:
Status codes:
DELETE /v2/subjects/{subject_name}/versions/{version_number}
Delete subject schema by version
Response:
Status codes:
POST /v2/subjects/{subject_name}/compatibility/schemas
Body:
{
"schemaText": "AVRO SCHEMA STRING",
"schemaType": "IDENTIFIER Of SCHEMA TYPE [avro, thrift or protobuf]"
}
Check schema compatibility
Response:
Status codes:
POST /v2/schemas
Body:
{
"schemaText": "AVRO SCHEMA STRING",
"schemaType": "IDENTIFIER Of SCHEMA TYPE [avro, thrift or protobuf]"
}
Register new schema
Response:
Status codes:
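As an example, a new Avro schema could be registered with curl as sketched below. The URL assumes the default local port from the configuration above, the JSON Content-Type header is an assumption, and the schema body is a trivial placeholder:
curl -X POST http://localhost:9081/v2/schemas \
  -H "Content-Type: application/json" \
  -d '{"schemaText": "{\"type\": \"string\"}", "schemaType": "avro"}'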
POST /v2/subjects/{subject_name}/schemas
Body:
{
"schemaText": "AVRO SCHEMA STRING",
"schemaType": "IDENTIFIER Of SCHEMA TYPE [avro, thrift or protobuf]",
"compatibilityType": "SUBJECT COMPATIBILITY TYPE"
}
Register a new subject (if it does not exist) and a new schema (if it does not exist), and connect them to each other
Response:
Status codes:
POST /v2/subjects
Body:
{
"subject": "subject name",
"compatibilityType": "compatibility type name",
"isLocked": "false or true"
}
Register new subject
Response:
Status codes:
POST /v2/subjects/{subject_name}/schemas/{schema_id}
Connect an existing schema to the subject as its next version
Response:
Status codes: