| Project Name | Description | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Galaxy | Data intensive science for everyone. | 1,105 | 2 | 12 | | 17 hours ago | 13 | November 02, 2021 | 2,086 | other | Python |
| Scrna Seq_notes | A list of scRNA-seq analysis tools | 389 | | | | 18 days ago | | | | mit | R |
| Juicer | A One-Click System for Analyzing Loop-Resolution Hi-C Experiments | 313 | | | | 4 months ago | | | 40 | mit | Shell |
| Bedops | :microscope: BEDOPS: high-performance genomic feature operations | 244 | | | | 10 months ago | 1 | March 03, 2021 | 17 | other | C |
| Sarek | Analysis pipeline to detect germline or somatic variants (pre-processing, variant calling and annotation) from WGS / targeted sequencing | 240 | | | | a day ago | | | 124 | mit | Nextflow |
| Flowcraft | FlowCraft: a component-based pipeline composer for omics analysis using Nextflow. :whale::package: | 233 | | | | 3 years ago | 18 | March 02, 2019 | 39 | gpl-3.0 | Python |
| Aws Batch Genomics | Sets up and runs a genome sequencing analysis workflow using AWS Batch and AWS Step Functions. | 181 | | | | 3 years ago | | | 5 | apache-2.0 | Python |
| Viral Ngs | Viral genomics analysis pipelines | 129 | | | | 3 years ago | | | 60 | other | Python |
| Sarek | Detect germline or somatic variants from normal or tumour/normal whole-genome or targeted sequencing | 126 | | | | 3 years ago | | | | mit | Nextflow |
| Cloud Pipeline | Cloud agnostic genomics analysis, scientific computation and storage platform | 123 | | | | a day ago | | | 804 | apache-2.0 | Java |
This repository contains examples for the [Google Genomics Pipelines API](https://cloud.google.com/genomics/reference/rest/v1alpha2/pipelines).
> **Note:** This is an Alpha release of the Google Genomics API. This feature might be changed in backward-incompatible ways and is not recommended for production use. It is not subject to any SLA or deprecation policy.
The API provides an easy way to create, run, and monitor command-line tools on Google Compute Engine running in a Docker container. You can use it like you would a job scheduler.
The most common use case is to run an off-the-shelf tool or custom script that reads and writes files, typically files stored in Google Cloud Storage, and often to run it independently over hundreds or thousands of files.
The typical flow for a pipeline is to copy input files from Cloud Storage onto a Compute Engine VM, run the tool inside its Docker container, and copy the outputs back to Cloud Storage. You can submit batch operations from your laptop and have them run in the cloud, either packaging the tool into a Docker image yourself or using an existing Docker image.
Install the Python client library with `pip install --upgrade google-api-python-client`. For more detail, see https://cloud.google.com/genomics/v1/libraries.
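As an illustration, the flow described above can be sketched with the Python client library. The project ID, bucket paths, Docker image, and pipeline name below are all placeholders, and the request shape follows the v1alpha2 `pipelines.run` reference; since this is an Alpha API, treat this as a minimal sketch rather than a tested, production-ready example.

```python
# Sketch: build and submit a v1alpha2 pipelines.run request.
# All project, bucket, and image names are placeholders (assumptions).

def make_run_request(project_id, input_bam, output_bai, log_dir):
    """Build a pipelines.run request body that indexes one BAM file
    with samtools inside a Docker container on a Compute Engine VM."""
    return {
        'ephemeralPipeline': {
            'projectId': project_id,
            'name': 'samtools-index',
            'resources': {
                'minimumCpuCores': 1,
                'minimumRamGb': 1,
                'disks': [{
                    'name': 'datadisk',
                    'autoDelete': True,
                    'sizeGb': 100,
                    'mountPoint': '/mnt/data',
                }],
            },
            'docker': {
                # Placeholder image; any image containing the tool works.
                'imageName': 'gcr.io/YOUR-PROJECT/samtools',
                'cmd': 'samtools index /mnt/data/input.bam /mnt/data/output.bam.bai',
            },
            # Input files are copied from Cloud Storage onto the data disk
            # before the command runs; outputs are copied back afterwards.
            'inputParameters': [{
                'name': 'inputFile',
                'localCopy': {'path': 'input.bam', 'disk': 'datadisk'},
            }],
            'outputParameters': [{
                'name': 'outputFile',
                'localCopy': {'path': 'output.bam.bai', 'disk': 'datadisk'},
            }],
        },
        'pipelineArgs': {
            'projectId': project_id,
            'inputs': {'inputFile': input_bam},
            'outputs': {'outputFile': output_bai},
            'logging': {'gcsPath': log_dir},
        },
    }


if __name__ == '__main__':
    # Requires the client library installed as shown above and
    # application-default credentials configured.
    from googleapiclient import discovery

    service = discovery.build('genomics', 'v1alpha2')
    body = make_run_request(
        'YOUR-PROJECT',
        'gs://YOUR-BUCKET/input.bam',
        'gs://YOUR-BUCKET/output/input.bam.bai',
        'gs://YOUR-BUCKET/logs',
    )
    operation = service.pipelines().run(body=body).execute()
    # The returned long-running operation can be polled for completion.
    status = service.operations().get(name=operation['name']).execute()
    print(operation['name'], 'done:', status.get('done', False))
```

The pure `make_run_request` helper keeps the request body separate from the network calls, so the same body can be generated per input file when fanning out over many files.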