Deep-Unsupervised-Domain-Adaptation


PyTorch implementation of four neural-network-based domain adaptation techniques: DeepCORAL, DDC, CDAN and CDAN+E, evaluated on the Office31 benchmark dataset.

Paper: Evaluation of Deep Neural Network Domain Adaptation Techniques for Image Recognition

Abstract

It is well established that deep networks are effective at extracting features from a given labeled (source) dataset. However, they do not always generalize well to other (target) datasets, which very often follow a different underlying distribution. In this report, we evaluate four domain adaptation techniques for image classification tasks: Deep CORAL, Deep Domain Confusion (DDC), Conditional Adversarial Domain Adaptation (CDAN) and CDAN with Entropy Conditioning (CDAN+E). All four are unsupervised techniques: the target dataset carries no labels during the training phase. The experiments are conducted on the Office-31 dataset.
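
As a point of reference for the transfer losses being compared, the Deep CORAL loss aligns the second-order statistics (feature covariances) of a source batch and a target batch. Below is a minimal PyTorch sketch of that loss; it is an illustration written for this description rather than code taken from the repository, and the function name `coral_loss` is our own.

```python
import torch

def coral_loss(source, target):
    """CORAL loss: squared Frobenius distance between the covariance
    matrices of source and target feature batches of shape (batch, d)."""
    d = source.size(1)

    # Center the features of each batch.
    source = source - source.mean(dim=0, keepdim=True)
    target = target - target.mean(dim=0, keepdim=True)

    # Batch covariance matrices (d x d).
    cov_source = source.t() @ source / (source.size(0) - 1)
    cov_target = target.t() @ target / (target.size(0) - 1)

    # Squared Frobenius norm, scaled by 1 / (4 d^2) as in the Deep CORAL paper.
    return ((cov_source - cov_target) ** 2).sum() / (4 * d * d)
```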

Results

Accuracy on the Office31 dataset for the source and target data distributions (with and without transfer losses).

(Loss and accuracy plots: Deep CORAL, DDC, CDAN, CDAN+E)

Target accuracies for all six domain shifts in the Office31 dataset (amazon, webcam and dslr domains):

| Method        | A → W      | A → D      | W → A      | W → D      | D → A      | D → W      |
|---------------|------------|------------|------------|------------|------------|------------|
| No Adaptation | 43.1 ± 2.5 | 49.2 ± 3.7 | 35.6 ± 0.6 | 94.2 ± 3.1 | 35.4 ± 0.7 | 90.9 ± 2.4 |
| DeepCORAL     | 49.5 ± 2.7 | 40.0 ± 3.3 | 38.3 ± 0.4 | 74.4 ± 4.3 | 38.5 ± 1.5 | 89.1 ± 4.4 |
| DDC           | 41.7 ± 9.1 | ---        | ---        | ---        | ---        | ---        |
| CDAN          | 44.9 ± 3.3 | 49.5 ± 4.6 | 34.8 ± 2.4 | 93.3 ± 3.4 | 32.9 ± 3.4 | 88.3 ± 3.8 |
| CDAN+E        | 48.7 ± 7.5 | 53.7 ± 4.7 | 35.3 ± 2.7 | 93.6 ± 3.4 | 33.9 ± 2.2 | 87.7 ± 4.0 |

Training and inference

To train a model on your machine, download the Office31 dataset and place it in the data folder.

Execute training of a method by going to its folder (e.g. DeepCORAL):

cd DeepCORAL/
python main.py --epochs 100 --batch_size_source 128 --batch_size_target 128 --name_source amazon --name_target webcam
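
For orientation, the training loop of each method follows the same overall pattern: minimize a classification loss on labeled source batches plus a weighted transfer loss computed between source and target features. The sketch below shows one such step using the `coral_loss` above; the model interface (returning features and logits), the helper name `train_step` and the weight `lambda_coral` are assumptions made for illustration, not the repository's exact code.

```python
import torch.nn.functional as F

def train_step(model, optimizer, source_x, source_y, target_x, lambda_coral=1.0):
    """One unsupervised adaptation step: source classification loss
    plus CORAL loss between source and target features (no target labels)."""
    model.train()
    optimizer.zero_grad()

    source_feat, source_logits = model(source_x)   # labeled source batch
    target_feat, _ = model(target_x)               # unlabeled target batch

    cls_loss = F.cross_entropy(source_logits, source_y)
    transfer_loss = coral_loss(source_feat, target_feat)

    loss = cls_loss + lambda_coral * transfer_loss
    loss.backward()
    optimizer.step()
    return cls_loss.item(), transfer_loss.item()
```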

Loss and accuracy plots

Once the model is trained, you can generate plots like the ones shown above by running:

cd DeepCORAL/
python plot_loss_acc.py --source amazon --target webcam --no_epochs 10

The following is a list of the arguments the user can provide (a sketch of a matching argparse setup follows the list):

  • --epochs number of training epochs
  • --batch_size_source batch size of source data
  • --batch_size_target batch size of target data
  • --name_source name of source dataset
  • --name_target name of target dataset
  • --num_classes number of classes in the dataset
  • --load_model flag to load a pretrained model (AlexNet by default)
  • --adapt_domain boolean flag to train with or without the method's transfer loss
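
These flags map onto a standard argparse setup. A minimal sketch of how they could be declared is shown below; the defaults are illustrative and not necessarily those used by the repository.

```python
import argparse

parser = argparse.ArgumentParser(
    description="Deep unsupervised domain adaptation on Office31")
parser.add_argument("--epochs", type=int, default=100,
                    help="number of training epochs")
parser.add_argument("--batch_size_source", type=int, default=128,
                    help="batch size of source data")
parser.add_argument("--batch_size_target", type=int, default=128,
                    help="batch size of target data")
parser.add_argument("--name_source", type=str, default="amazon",
                    help="name of source dataset")
parser.add_argument("--name_target", type=str, default="webcam",
                    help="name of target dataset")
parser.add_argument("--num_classes", type=int, default=31,
                    help="number of classes in the dataset")
parser.add_argument("--load_model", action="store_true",
                    help="load a pretrained model (AlexNet by default)")
parser.add_argument("--adapt_domain", action="store_true",
                    help="train with the method's transfer loss")
args = parser.parse_args()
```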

Requirements

  • tqdm
  • PyTorch
  • matplotlib
  • numpy
  • pickle
  • scikit-image
  • torchvision

References
