Dev8d Linked Data

some experiments with linked data available from the dev8d conference
Readme

This little project is a linked-data experiment with data from the dev8d Semantic MediaWiki http://wiki.2010.dev8d.org/. For any of this to work you'll need to have the Python module rdflib installed.

http://rdflib.net
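
For orientation, here is a minimal sketch of the rdflib usage the scripts build on. The export URL is only illustrative (the wiki is long gone, so treat it as a placeholder rather than something the scripts are known to fetch):

  # Minimal rdflib sketch: fetch an RDF document and iterate its triples.
  # The URL below stands in for a Semantic MediaWiki RDF export.
  from rdflib import Graph

  g = Graph()
  g.parse("http://wiki.2010.dev8d.org/wiki/Special:ExportRDF/Main_Page", format="xml")
  for subject, predicate, obj in g:
      print(subject, predicate, obj)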

Description:

The crawl.py script will crawl RDF for users and their affiliations on the dev8d Semantic MediaWiki. It then grabs the RDF for the dev8d programme. After that it will look up each user's Twitter profile on http://semantictweet.com using the Twitter id that was found in the dev8d wiki. Just start it up like so and it'll persist the triples to an on-disk BerkeleyDB-backed triplestore:

./crawl.py
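
For a rough idea of the persistence side, here is a sketch using the BerkeleyDB-backed store that rdflib ships (registered as "Sleepycat" in older releases, "BerkeleyDB" in newer ones). The store directory and source URL are assumptions, not taken from crawl.py:

  # Sketch: parse RDF into a graph persisted in an on-disk BerkeleyDB store.
  from rdflib import Graph

  g = Graph(store="Sleepycat", identifier="dev8d")   # "BerkeleyDB" on newer rdflib
  g.open("store", create=True)                       # directory for the BerkeleyDB files
  g.parse("http://wiki.2010.dev8d.org/wiki/Special:ExportRDF/Main_Page", format="xml")
  g.commit()
  g.close()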

You should be able to rerun crawl.py and it will add, update and remove assertions as they are changed on the dev8d wiki.
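
One way to get that add/update/remove behaviour (a sketch, not necessarily how crawl.py implements it) is to re-fetch into a fresh in-memory graph and apply the difference against the persisted triples:

  # Sketch: sync the persistent graph with a freshly crawled copy.
  # In practice you would restrict the diff to triples that originally
  # came from this particular source.
  from rdflib import Graph

  def sync(persistent, url):
      fresh = Graph()
      fresh.parse(url, format="xml")
      for triple in persistent - fresh:   # assertions that disappeared from the wiki
          persistent.remove(triple)
      for triple in fresh - persistent:   # new or changed assertions
          persistent.add(triple)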

The dump.py script will dump all the triples as RDF/XML to stdout:

./dump.py > dump.rdf
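
Under the hood that amounts to opening the same on-disk store and serializing it, roughly like this (store name and path are the same assumptions as above):

  # Sketch: serialize everything in the store as RDF/XML on stdout.
  import sys
  from rdflib import Graph

  g = Graph(store="Sleepycat", identifier="dev8d")
  g.open("store")
  data = g.serialize(format="xml")
  # older rdflib returns bytes here, newer returns str
  sys.stdout.write(data if isinstance(data, str) else data.decode("utf-8"))
  g.close()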

The dump_foaf.py script will dump out social network information for dev8d attendees as RDF/XML to stdout. This is basically a subset of the full dump that only includes assertions about foaf:Person resources:

./dump_foaf.py > foaf.rdf
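
A sketch of how such a subset can be pulled out with rdflib, copying every triple whose subject is typed foaf:Person (same store assumptions as above):

  # Sketch: copy all assertions about foaf:Person resources into a new graph.
  from rdflib import Graph, Namespace
  from rdflib.namespace import RDF

  FOAF = Namespace("http://xmlns.com/foaf/0.1/")

  full = Graph(store="Sleepycat", identifier="dev8d")
  full.open("store")

  people = Graph()
  for person in full.subjects(RDF.type, FOAF.Person):
      for predicate, obj in full.predicate_objects(person):
          people.add((person, predicate, obj))

  print(people.serialize(format="xml"))   # a str on current rdflib
  full.close()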

Finally, the planet_config.py script will use the homepage information pulled from the dev8d wiki to locate each user's homepage, and then try to autodiscover the feed URL for their blog. The resulting information is then written to stdout as a Planet Venus configuration file for blog aggregation:

./planet_config.py > planet.ini
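
The moving parts there are feed autodiscovery and INI output. The following is a sketch under heavy assumptions: the attendee list, the <link rel="alternate"> scanning and the [Planet] options are all illustrative, and the real configuration format is documented on the Planet Venus site linked below:

  # Sketch: autodiscover a feed URL from a homepage and emit an INI config.
  import configparser
  import sys
  from html.parser import HTMLParser
  from urllib.parse import urljoin
  from urllib.request import urlopen

  class FeedLinkFinder(HTMLParser):
      """Collect <link rel="alternate"> feed URLs from an HTML page."""
      def __init__(self):
          super().__init__()
          self.feeds = []
      def handle_starttag(self, tag, attrs):
          a = dict(attrs)
          if tag == "link" and a.get("rel") == "alternate" \
                  and "xml" in (a.get("type") or "") and a.get("href"):
              self.feeds.append(a["href"])

  def discover_feed(homepage):
      finder = FeedLinkFinder()
      finder.feed(urlopen(homepage).read().decode("utf-8", "replace"))
      return urljoin(homepage, finder.feeds[0]) if finder.feeds else None

  config = configparser.ConfigParser()
  config["Planet"] = {"name": "Planet Dev8D", "output_dir": "output"}
  for name, homepage in [("Ed Summers", "http://inkdroid.org/")]:  # hypothetical attendee list
      feed = discover_feed(homepage)
      if feed:
          config[feed] = {"name": name}
  config.write(sys.stdout)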

You can see an example running at:

http://inkdroid.org/planet-dev8d

More about Planet Venus can be found at:

http://intertwingly.net/code/venus/

TODO:

  • danbri suggested using the OpenSocial API
  • semantictweet.com only lists the first 100 friends (what to do?)
  • maybe pull in descriptions of events from the wiki?
  • maybe persist syndicated feed URLs to the store so they don't have to be looked up again every time planet_config.py is run?

Author:

Ed Summers [email protected]
