Project Name | Description | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|---|---|---|
Dltk | Deep Learning Toolkit for Medical Image Analysis | 1,293 | | | | 2 years ago | | | 11 | apache-2.0 | Python |
Nipype | Workflows and interfaces for neuroimaging packages | 704 | | | | 20 days ago | 11 | September 21, 2022 | 421 | other | Python |
Pyradigm | Research data management in biomedical and machine learning applications | 24 | | 6 | 4 | 8 months ago | 21 | July 22, 2020 | 16 | mit | Python |
Braph 2 | BRAPH 2.0 is a comprehensive software package for the analysis and visualization of brain connectivity data, offering flexible customization, rich visualization capabilities, and a platform for collaboration in neuroscience research. | 23 | | | | 9 days ago | | | 41 | other | MATLAB |
School Brainhack.github.io | Website of the BrainHack School | 19 | | | | 3 months ago | | | 42 | mit | Jupyter Notebook |
Bpt | The Brain Predictability toolbox (BPt) is a Python-based machine learning library designed primarily for tabular and neuroimaging data, but it can easily be generalized further. | 9 | | | | a year ago | 21 | May 20, 2022 | | mit | Python |
Juspyce | JuSpyce - a toolbox for flexible assessment of spatial associations between brain maps | 9 | | | | 2 months ago | | | | other | Jupyter Notebook |
Psy3018.github.io | Course notes for PSY3018 - Methods in cognitive neuroscience | 8 | | | | 3 months ago | | | 45 | cc-by-4.0 | Jupyter Notebook |
Nhw2017 | | 3 | | | | 6 years ago | | | | apache-2.0 | CSS |
Current neuroimaging software packages offer users an incredible opportunity to analyze data using a variety of different algorithms. However, this has resulted in a heterogeneous collection of specialized applications without transparent interoperability or a uniform operating interface.
Nipype, an open-source, community-developed initiative under the umbrella of NiPy, is a Python project that provides a uniform interface to existing neuroimaging software and facilitates interaction between these packages within a single workflow. Nipype provides an environment that encourages interactive exploration of algorithms from different packages (e.g., SPM, FSL, FreeSurfer, AFNI, Slicer, ANTS), eases the design of workflows within and between packages, and reduces the learning curve necessary to use different packages. Nipype is creating a collaborative platform for neuroimaging software development in a high-level language and addressing limitations of existing pipeline systems.
Nipype allows you to:

* easily interact with tools from different software packages
* combine processing steps from different software packages
* develop new workflows faster by reusing common steps from old ones
* process data faster by running it in parallel on many cores/machines
* make your research easily reproducible
* share your processing workflows with the community
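As a concrete illustration of the uniform interface, here is a minimal two-node workflow sketch that skull-strips an image with FSL's BET and then smooths the result. It assumes FSL is installed and that an image named `input.nii.gz` exists in the working directory; the node names, working directory, and the 4 mm smoothing kernel are arbitrary choices for this example.

```python
# Minimal sketch of a two-node Nipype workflow: skull-strip with FSL BET,
# then apply isotropic smoothing. Assumes FSL is installed and that
# "input.nii.gz" exists; node names and parameters are illustrative only.
from nipype import Node, Workflow
from nipype.interfaces.fsl import BET, IsotropicSmooth

skullstrip = Node(BET(in_file="input.nii.gz", mask=True), name="skullstrip")
smooth = Node(IsotropicSmooth(fwhm=4), name="smooth")

wf = Workflow(name="preproc", base_dir="work")
# Feed BET's stripped output image into the smoothing node's input.
wf.connect(skullstrip, "out_file", smooth, "in_file")
wf.run()
```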
Please see the doc/README.txt document for information on our documentation. Information specific to Nipype is located here: http://nipy.org/nipype
Python 2.7 reached its end-of-life in January 2020, which means it is no longer maintained by the Python developers. Many projects removed support ahead of that deadline, making it increasingly untenable to support Python 2, even if we wanted to.
The final series with Python 2.7 support is 1.3.x. If you have a package that uses Python 2 and are unable or unwilling to upgrade to Python 3, you should use the following dependency specification for Nipype: `nipype<1.4`
Bug fixes will be accepted against the `maint/1.3.x` branch.
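For example, a project that still has to run on Python 2.7 could pin that constraint in its packaging metadata. This is only a sketch: the package name below is hypothetical, and the same `nipype<1.4` specifier works equally well in a requirements.txt file.

```python
# Hypothetical setup.py for a project that must stay on Python 2.7:
# pin Nipype to the last series that supports Python 2.7 (1.3.x).
from setuptools import setup

setup(
    name="my_py2_analysis",            # hypothetical package name
    version="0.1.0",
    install_requires=["nipype<1.4"],   # final Nipype series with Python 2.7 support
)
```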
If you have a problem, or would like to ask a question about how to do something in Nipype, please post a question on NeuroStars.org with a `nipype` tag. NeuroStars.org is a platform similar to StackOverflow but dedicated to neuroinformatics.
To participate in Nipype development-related discussions, please use the following mailing list: http://mail.python.org/mailman/listinfo/neuroimaging
Please add [nipype] to the subject line when posting on the mailing list.
You can also hang out with the Nipype developers in their Gitter channel or in the BrainHack Slack channel.
If you'd like to contribute to the project, please read our guidelines. Please also read through our code of conduct.