Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---
LinkedDataHub | 424 | 5 days ago | 35 | June 30, 2022 | 38 | apache-2.0 | XSLT | The low-code Knowledge Graph application platform.
Processor | 54 | 5 months ago | 32 | May 23, 2022 | 4 | apache-2.0 | Java | Ontology-driven Linked Data processor and server for SPARQL backends.
Csv2rdf | 50 | 6 days ago | 5 | May 23, 2022 | 2 | apache-2.0 | Java | Streaming, transforming, SPARQL-based CSV to RDF converter.
Ltlod | 8 | 10 months ago | | | | | XSLT | Lithuanian Linked Open Data
Platypus Kb Lucene | 2 | 2 years ago | | | | gpl-2.0 | Java | Lucene-based service for the knowledge base used by Platypus
Cidme Cli | 1 | 3 years ago | | | | mit | | Node-based CLI for CIDME (ALPHA! INCOMPLETE!)
LinkedDataHub (LDH) is open source software you can use to manage data, create visualizations and build apps on RDF Knowledge Graphs.
What's new in LinkedDataHub v3? Watch this video for a feature overview:
We started the project with the intention of using it for Linked Data publishing, but gradually realized that we had built a multi-purpose, data-driven platform.
We are building LinkedDataHub primarily for:
What makes LinkedDataHub unique is its completely data-driven architecture: applications and documents are defined as data, managed using a single generic HTTP API, and presented using declarative technologies. A default application structure and user interface are provided, but they can be completely overridden and customized. Unless custom server-side processing is required, no imperative code such as Java or JavaScript needs to be involved at all.
Follow the Get started guide to LinkedDataHub. The setup and basic configuration sections below should get you up and running.
LinkedDataHub is also available as a free AWS Marketplace product!
It takes a few clicks and filling out a form to install the product into your own AWS account. No manual setup or configuration necessary!
- `bash` shell 4.x. It should be included by default on Linux; on Windows you can install the Windows Subsystem for Linux.
- `keytool` available on `$PATH`. It comes with the JDK.
- `openssl` available on `$PATH`
- `uuidgen` available on `$PATH`
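The prerequisite checks above can be scripted. A minimal sketch (the tool list mirrors the requirements above; `check_prereqs` is a hypothetical helper, not part of LinkedDataHub):

```shell
# Sketch: report any prerequisite tools missing from $PATH.
check_prereqs() {
  local missing=0
  for tool in bash keytool openssl uuidgen; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "missing: $tool" >&2
      missing=1
    fi
  done
  return "$missing"
}
```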
Create an `.env` file and fill out the missing values (you can use `.env_sample` as a template). For example:

```
COMPOSE_CONVERT_WINDOWS_PATHS=1
COMPOSE_PROJECT_NAME=linkeddatahub

PROTOCOL=https
HTTP_PORT=81
HTTPS_PORT=4443
HOST=localhost
ABS_PATH=/

[email protected]
OWNER_GIVEN_NAME=John
OWNER_FAMILY_NAME=Doe
OWNER_ORG_UNIT=My unit
OWNER_ORGANIZATION=My org
OWNER_LOCALITY=Copenhagen
OWNER_STATE_OR_PROVINCE=Denmark
OWNER_COUNTRY_NAME=DK
```
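As a quick sanity check before running the setup script, you can verify that the keys you need are present in `.env`. A sketch (`check_env_file` is a hypothetical helper, and the example key list is an assumption based on the sample above, not LinkedDataHub's authoritative list):

```shell
# Sketch: check that an env file defines every key in a list.
check_env_file() {
  local file="$1"; shift
  local status=0
  for key in "$@"; do
    if ! grep -q "^${key}=" "$file"; then
      echo "missing: $key" >&2
      status=1
    fi
  done
  return "$status"
}

# Example (illustrative key list):
# check_env_file .env PROTOCOL HOST HTTPS_PORT ABS_PATH
```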
Run the setup script, replacing `$owner_cert_pwd` and `$secretary_cert_pwd` with your own passwords:

```
./scripts/setup.sh .env ssl $owner_cert_pwd $secretary_cert_pwd 3650
```
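Once the script finishes, you can optionally sanity-check the generated owner certificate with `openssl` (`inspect_owner_cert` is a hypothetical helper; the `ssl/owner/keystore.p12` path matches the keystore referenced later in this guide):

```shell
# Sketch: print the subject and expiry date of a PKCS#12 keystore's certificate.
inspect_owner_cert() {
  local keystore="$1" password="$2"
  openssl pkcs12 -in "$keystore" -passin pass:"$password" -nokeys \
    | openssl x509 -noout -subject -enddate
}

# Usage: inspect_owner_cert ssl/owner/keystore.p12 "$owner_cert_pwd"
```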
The script will create an `ssl` sub-folder where the SSL certificates and/or public keys will be placed.

Launch LinkedDataHub by running:

```
docker-compose up --build
```
It will build LinkedDataHub's Docker image, start its container, and mount the following sub-folders:

- `data`, where the triplestore(s) will persist RDF data
- `uploads`, where LDH stores content-hashed file uploads
The first startup should take around half a minute as datasets are loaded into the triplestores. After a successful startup, the last line of the Docker log should read something like:

```
linkeddatahub_1 | 09-Feb-2021 14:18:10.536 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [32609] milliseconds
```
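If you script the launch, you may want to block until the server responds rather than watching the log. A hedged sketch using `curl` (`wait_for_ldh` is a hypothetical helper; the URL and timeout in the usage line are examples):

```shell
# Sketch: poll a URL until the server responds (any HTTP status) or the timeout elapses.
wait_for_ldh() {
  local url="$1" timeout="${2:-120}" elapsed=0
  until curl -k -s -o /dev/null "$url"; do
    sleep 5
    elapsed=$((elapsed + 5))
    if [ "$elapsed" -ge "$timeout" ]; then
      echo "timed out waiting for $url" >&2
      return 1
    fi
  done
}

# Usage: wait_for_ldh https://localhost:4443/ 180
```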
Import `ssl/owner/keystore.p12` into a web browser of your choice (the password is the `$owner_cert_pwd` value supplied to `setup.sh`):

- Chrome: `Settings > Advanced > Manage Certificates > Import...`
- Firefox: `Options > Privacy > Security > View Certificates... > Import...`
Open LinkedDataHub in your browser (by default at `https://localhost:4443/`). You will likely see `Your connection is not private` in Chrome or `Warning: Potential Security Risk Ahead` in Firefox due to the self-signed server certificate. Ignore the warning: click `Advanced` and `Proceed` or `Accept the Risk` to continue.
Alternatively, in Chrome you can open `chrome://flags/#allow-insecure-localhost`, switch `Allow invalid certificates for resources loaded from localhost` to `Enabled`, and restart Chrome.

Note: the `.env_sample` and `.env` files might be invisible in the macOS Finder, which hides filenames starting with a dot. You should be able to create them using the Terminal, however.

Note: your user needs to be a member of the `docker` group. Add it using

```
sudo usermod -aG docker ${USER}
```

and re-login with your user. An alternative, but not recommended, is to run

```
sudo docker-compose up
```
A common case is changing the base URI from the default `https://localhost:4443/` to your own.

Let's use `https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/` as an example. We need to split the URI into components and set them in the `.env` file using the following parameters:
```
PROTOCOL=https
HTTP_PORT=80
HTTPS_PORT=443
HOST=ec2-54-235-229-141.compute-1.amazonaws.com
ABS_PATH=/linkeddatahub/
```

`ABS_PATH` is required, even if it's just `/`.
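The split can be mechanized with bash parameter expansion. A small sketch (`split_base_uri` is a hypothetical helper with no input validation; it assumes the base URI ends in a slash, and the ports still have to be set by hand):

```shell
# Sketch: derive PROTOCOL, HOST and ABS_PATH from a base URI.
split_base_uri() {
  local uri="$1"
  local protocol="${uri%%://*}"   # everything before ://
  local rest="${uri#*://}"        # host[:port]/path
  local hostport="${rest%%/*}"    # host[:port]
  local host="${hostport%%:*}"    # strip an explicit port, if any
  local abs_path="/${rest#*/}"    # path with leading and trailing slash
  printf 'PROTOCOL=%s\nHOST=%s\nABS_PATH=%s\n' "$protocol" "$host" "$abs_path"
}
```

For `https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/` this yields `PROTOCOL=https`, `HOST=ec2-54-235-229-141.compute-1.amazonaws.com` and `ABS_PATH=/linkeddatahub/`.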
Dataspaces are configured in `config/system-varnish.trig`. Relative URIs will be resolved against the base URI configured in the `.env` file.

⚠️ Do not use blank nodes to identify applications or services. We recommend using the `urn:` URI scheme, since LinkedDataHub application resources are not accessible under their own dataspace.
LinkedDataHub supports a range of configuration options that can be passed as environment parameters in `docker-compose.yml`. The most common ones are:

- `CATALINA_OPTS`
- `SELF_SIGNED_CERT`: `true` if the server certificate is self-signed
- `SIGN_UP_CERT_VALIDITY`
- `IMPORT_KEEPALIVE`
- `MAX_CONTENT_LENGTH`
- `MAIL_SMTP_HOST`
- `MAIL_SMTP_PORT`
- `GOOGLE_CLIENT_ID`
- `GOOGLE_CLIENT_SECRET`
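As an illustration, such overrides might look like this in `docker-compose.yml` (a sketch only: the service name `linkeddatahub` and all values here are assumptions; check your own compose file for the actual service name):

```yaml
# Hypothetical fragment: environment overrides for the LinkedDataHub service.
services:
  linkeddatahub:
    environment:
      - SELF_SIGNED_CERT=true         # server certificate is self-signed
      - MAIL_SMTP_HOST=smtp.example.org
      - MAIL_SMTP_PORT=25
```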
If you need to start fresh and wipe the existing setup (e.g. after configuring a new base URI), you can do so using

```
sudo rm -rf data uploads && docker-compose down -v
```

⚠️ This will remove the persisted data and files as well as the Docker volumes.
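Because the reset is destructive, wrapping it behind an explicit confirmation prompt can be a good habit. A sketch (`reset_ldh` is a hypothetical helper, not part of LinkedDataHub):

```shell
# Sketch: only wipe data/uploads and Docker volumes after an explicit "yes".
reset_ldh() {
  local answer
  read -r -p "This deletes all data, uploads and Docker volumes. Type 'yes' to continue: " answer
  if [ "$answer" != "yes" ]; then
    echo "aborted" >&2
    return 1
  fi
  sudo rm -rf data uploads && docker-compose down -v
}
```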
LinkedDataHub CLI wraps the HTTP API in a set of shell scripts with convenient parameters. The scripts can be used for testing, automation, scheduled execution, and the like. Performing actions with the CLI is usually much quicker than using the user interface, and easier to reproduce.

The scripts can be found in the `scripts` subfolder.
⚠️ The CLI scripts internally use Jena's CLI commands. Set up the Jena environment before running the scripts.

The `JENA_HOME` environment variable is used by all of the command-line tools to configure the class path automatically for you. On Linux / Mac you can set it up as follows:

```
export JENA_HOME=the directory you downloaded Jena to
export PATH="$PATH:$JENA_HOME/bin"
```
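A quick way to verify the Jena setup before invoking the CLI scripts (a sketch; `check_jena` is a hypothetical helper, and `riot` is one of the CLI tools that ship in `$JENA_HOME/bin`):

```shell
# Sketch: fail early if JENA_HOME is unset or the Jena CLI tools are not on $PATH.
check_jena() {
  if [ -z "${JENA_HOME:-}" ]; then
    echo "JENA_HOME is not set" >&2
    return 1
  fi
  if ! command -v riot >/dev/null 2>&1; then
    echo "Jena CLI tools not found: add \$JENA_HOME/bin to PATH" >&2
    return 1
  fi
  echo "Jena CLI found: $(command -v riot)"
}
```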
These demo applications can be installed into a LinkedDataHub instance using the provided CLI scripts.

⚠️ Before running app installation scripts that use LinkedDataHub's CLI scripts, set the `SCRIPT_ROOT` environment variable to the `scripts` subfolder of your LinkedDataHub fork or clone. For example:

```
export SCRIPT_ROOT="/c/Users/namedgraph/WebRoot/AtomGraph/LinkedDataHub/scripts"
```
LinkedDataHub includes an HTTP test suite. The server implementation is also covered by the Processor test suite.
Please report issues if you've encountered a bug or have a feature request.
Commercial consulting, development, and support are available from AtomGraph.