| Project Name | Stars | Most Recent Commit | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|
| Awesome Domain Adaptation | 4,211 | a month ago | 1 | mit | | A collection of AWESOME things about domain adaptation |
| Segloss | 2,870 | 9 months ago | 1 | apache-2.0 | Python | A collection of loss functions for medical image segmentation |
| Deep Learning For Tracking And Detection | 2,085 | 5 months ago | 5 | | Shell | Collection of papers, datasets, code and other resources for object tracking and detection using deep learning |
| 3d Pointcloud | 1,374 | 5 days ago | 2 | | Python | Papers and datasets about point clouds |
| Torchseg | 1,106 | 3 years ago | 33 | mit | Python | Fast, modular reference implementation and easy training of semantic segmentation algorithms in PyTorch |
| Panet | 1,054 | 4 years ago | 46 | mit | Python | PANet for instance segmentation and object detection |
| Unet Family | 1,029 | 3 years ago | 4 | | Python | Papers and implementations of UNet-related models |
| Semanticsegmentation_dl | 983 | 3 years ago | 8 | | Jupyter Notebook | Resources for semantic segmentation based on deep learning models |
| Cv_paperdaily | 673 | a year ago | | | | CV paper notes |
| All About The Gan | 602 | 4 years ago | 1 | mit | Python | All About the GANs (Generative Adversarial Networks): summarized lists for GAN |
This repository contains the code and supplementary materials required to train and evaluate the model described in the paper *Text Segmentation as a Supervised Learning Task*.
The wiki-727K and wiki-50 datasets are available at:
https://www.dropbox.com/sh/k3jh0fjbyr0gw0a/AADzAd9SDTrBnvs1qLCJY5cza?dl=0
word2vec:
Fill in the relevant paths in configgenerator.py and execute the script (the Git repository includes the Choi dataset).
Create a conda environment and install the dependencies:

```
conda create -n textseg python=2.7 numpy scipy gensim ipython
source activate textseg
pip install http://download.pytorch.org/whl/cu80/torch-0.3.0-cp27-cp27mu-linux_x86_64.whl
pip install tqdm pathlib2 segeval tensorboard_logger flask flask_wtf nltk
pip install pandas xlrd xlsxwriter termcolor
```
List the training options with:

```
python run.py --help
```

Example:

```
python run.py --cuda --model max_sentence_embedding --wiki
```
To evaluate a trained model, list the options with:

```
python test_accuracy.py --help
```

Example:

```
python test_accuracy.py --cuda --model <path_to_model> --wiki
```
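Text segmentation accuracy is conventionally reported with the Pk metric: a window of size k slides over the document, and the score is the fraction of windows on which the reference and hypothesis disagree about whether the window's two ends fall in the same segment (lower is better). Below is a minimal, illustrative Pk sketch; it is not the repository's implementation (the install line above includes the segeval package, which provides standard implementations of such metrics), and the function name and boundary encoding are assumptions:

```python
def pk(ref, hyp, k=None):
    """Illustrative Pk score for two boundary sequences (lower is better).

    ref, hyp: sequences of 0/1, where 1 marks a segment boundary
    after the corresponding sentence gap.
    """
    assert len(ref) == len(hyp)
    n_sentences = len(ref) + 1
    if k is None:
        # conventional choice: half the mean reference segment length
        k = max(1, int(round(n_sentences / (2.0 * (sum(ref) + 1)))))
    windows = len(ref) - k + 1
    disagreements = sum(
        (sum(ref[i:i + k]) > 0) != (sum(hyp[i:i + k]) > 0)
        for i in range(windows)
    )
    return disagreements / float(windows)

# A perfect hypothesis scores 0; disagreements push the score toward 1.
print(pk([1, 0, 0, 1, 0], [1, 0, 0, 1, 0]))  # 0.0
print(pk([1, 0, 0, 0, 0], [0, 0, 0, 0, 1]))  # 0.5
```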
To generate a new Wikipedia dataset from a dump:

```
python wiki_processor.py --input <input> --temp <temp_files_folder> --output <output_folder> --train <ratio> --test <ratio>
```

`--input` is the full path to the Wikipedia dump, `--temp` is the path to the temporary-files folder, and `--output` is the path to the newly generated Wikipedia dataset.
A Wikipedia dump can be downloaded from the following URL:
https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2
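The `--train` and `--test` ratios determine how the processed articles are partitioned, with the remainder left for a development set. The sketch below shows one way such a ratio split could work; it is illustrative only, not wiki_processor.py's actual logic, and the function and variable names are assumptions:

```python
import random

def split_by_ratio(articles, train_ratio, test_ratio, seed=0):
    """Shuffle articles and partition them into train/test/dev lists.

    Whatever remains after the train and test slices becomes the dev set.
    """
    assert 0.0 < train_ratio + test_ratio <= 1.0
    shuffled = list(articles)
    random.Random(seed).shuffle(shuffled)
    n_train = int(len(shuffled) * train_ratio)
    n_test = int(len(shuffled) * test_ratio)
    train = shuffled[:n_train]
    test = shuffled[n_train:n_train + n_test]
    dev = shuffled[n_train + n_test:]
    return train, test, dev

train, test, dev = split_by_ratio(range(100), 0.8, 0.1)
print(len(train), len(test), len(dev))  # 80 10 10
```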