Awesome Open Source
Search results for "etl data integration"
29 search results found
Data integration platform for ELT pipelines from APIs, databases & files to warehouses & lakes.
An orchestration platform for the development, production, and observation of data assets.
🧙 The modern replacement for Airflow. Build, run, and manage data pipelines for integrating and transforming data.
The open-source, high-performance data integration platform built for developers.
Privacy- and security-focused Segment alternative, written in Golang and React.
Kestra is an infinitely scalable orchestration and scheduling platform for creating, running, scheduling, and monitoring millions of complex pipelines.
Apache DevLake is an open-source dev data platform to ingest, analyze, and visualize the fragmented data from DevOps tools, extracting insights for engineering excellence, developer experience, and community growth.
A lightweight, opinionated ETL framework, halfway between plain scripts and Apache Airflow.
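Entries like this one sit at the "plain scripts" end of the spectrum the description mentions. As a hedged illustration only (none of these function names come from the listed framework), a minimal extract-transform-load script in plain Python might look like:

```python
import sqlite3

# Hypothetical sketch of a "plain script" ETL job: extract rows from a
# stand-in source, transform them, and load them into SQLite. All names
# here are made up for illustration, not taken from any listed project.

def extract():
    # Stand-in for pulling rows from an API, database, or file.
    return [{"name": "alice", "score": "10"}, {"name": "bob", "score": "7"}]

def transform(rows):
    # Normalize casing and types before loading.
    return [(r["name"].title(), int(r["score"])) for r in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS scores (name TEXT, score INTEGER)")
    conn.executemany("INSERT INTO scores VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT name, score FROM scores").fetchall())
```

Frameworks in this niche typically add scheduling, logging, and dependency tracking on top of this shape without requiring a full Airflow deployment.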
Conduit streams data between data stores. Kafka Connect replacement. No JVM required.
Use SQL to build ELT pipelines on a data lakehouse.
Recap tracks and transforms schemas across your whole application.
Categorical Query Language IDE
Mara Example Project 2
An example mini data warehouse for Python project stats; a template for new projects.
mito ETL tool
Dataplane is an Airflow-inspired data platform with additional data-mesh capability to automate, schedule, and design data pipelines and workflows. Dataplane is written in Golang with a React front end.
Powerful RDF Knowledge Graph Generation with [R2]RML Mappings
🏎 The Python library enabling access to tools and data sources in minutes, with Naas low-code formulas.
A data-centric integration platform.
Mara ETL Tools
Utilities for creating ETL pipelines with mara
Configurable data bridge for permanent ETL jobs
Singer Working Group
Working group for the ongoing development and iteration of the Singer Spec, the de facto protocol for open-source data connectors. Please use "Issues" to create discussion items, or use "Discussions" for general questions.
An intuitive and flexible RDF pipeline solution designed to simplify and automate ETL processes for efficient data management.
A declarative, SQL-like DSL for data integration tasks.
Prism is the easiest way to develop, orchestrate, and execute data pipelines in Python.
A service that provides clustered storage of metadata for data integration purposes.
Simple Python Bibliometrics
The easiest way to get your bibliometrics data ready for analysis.
PDI AS400 Plugin
An extension job for Pentaho Data Integration.
PDI Domino Plugin
Database, job, and step extensions for Pentaho Data Integration.
Copyright 2018-2023 Awesome Open Source. All rights reserved.