Project Name | Description | Stars | Most Recent Commit | Open Issues | License | Language
---|---|---|---|---|---|---
Awesome Deep Learning Papers | The most cited deep learning papers | 21,874 | 3 years ago | 34 | | TeX
Awesome Datascience | :memo: An awesome Data Science repository to learn and apply for real world problems. | 21,290 | 5 days ago | | mit |
Awesome Deep Learning | A curated list of awesome Deep Learning tutorials, projects and communities. | 20,919 | 8 days ago | 26 | |
A To Z Resources For Students | ✅ Curated list of resources for college students | 15,382 | 4 days ago | 14 | mit |
Awesome Nlp | :book: A curated list of resources dedicated to Natural Language Processing (NLP) | 14,750 | 4 days ago | 9 | cc0-1.0 |
Qix | Machine Learning, Deep Learning, PostgreSQL, Distributed System, Node.js, Golang | 13,926 | 10 months ago | | other |
Awesome Pytorch List | A comprehensive list of pytorch related content on github, such as different models, implementations, helper libraries, tutorials etc. | 13,909 | 2 months ago | 3 | |
Awesome Production Machine Learning | A curated list of awesome open source libraries to deploy, monitor, version and scale your machine learning | 13,659 | 11 days ago | 14 | mit |
Machine Learning Tutorials | machine learning and deep learning tutorials, articles and other resources | 12,876 | 5 months ago | 33 | cc0-1.0 |
500 Ai Machine Learning Deep Learning Computer Vision Nlp Projects With Code | 500 AI Machine learning Deep learning Computer vision NLP Projects with code | 12,490 | 4 days ago | 19 | |
This project collects lightweight CNN architectures together with PyTorch implementations, for both researchers and engineers. Since 2017, lightweight networks have repeatedly reached SOTA results on benchmarks such as CIFAR-10/100 and ImageNet. Every model below follows the same short usage pattern shown in the snippets; install the library first:
pip install light_cnns
import torch
from light_cnns import resnet50_v1b

model = resnet50_v1b()
model.eval()  # switch to inference mode
print(model)
input = torch.randn(1, 3, 224, 224)  # dummy batch: one 3x224x224 image
y = model(input)
print(y.size())
MobileNetV1 (Google MobileNets): a lightweight architecture built from depthwise separable convolutions for mobile and embedded vision applications.
import torch
from light_cnns import mbv1
model = mbv1()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
MobileNetV2 introduces the inverted residual block: a point-wise expansion layer, a depthwise convolution, and a point-wise linear projection layer, with the shortcut connecting the narrow ends (a minimal sketch of the block follows the usage example below).
import torch
from light_cnns import mbv2
model = mbv2()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
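As a reference for the structure described above, here is a minimal sketch of an inverted residual block (expansion → depthwise → projection); the layer sizes are illustrative and not tied to the light_cnns implementation.

```python
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """Minimal MobileNetV2-style block: expand -> depthwise -> project."""
    def __init__(self, in_ch, out_ch, stride=1, expand_ratio=6):
        super().__init__()
        hidden = in_ch * expand_ratio
        self.use_res = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            # 1x1 point-wise expansion
            nn.Conv2d(in_ch, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 3x3 depthwise convolution
            nn.Conv2d(hidden, hidden, 3, stride, 1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 1x1 linear projection (no activation)
            nn.Conv2d(hidden, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_res else out

x = torch.randn(1, 32, 56, 56)
print(InvertedResidual(32, 32)(x).shape)  # torch.Size([1, 32, 56, 56])
```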
Searching for MobileNetV3
MobileNetV3 is found by hardware-aware NAS (building on MnasNet/MobileNetV2) refined with NetAdapt, and ships in Large and Small variants. Its main ingredients:
- the depthwise separable convolutions of MobileNetV1;
- the inverted residual structure with linear bottleneck of MobileNetV2;
- a lightweight squeeze-and-excitation (SE) attention module;
- the h-swish(x) activation function;
- architecture search combining platform-aware NAS with NetAdapt;
- a redesigned, cheaper last stage (head) compared to MobileNetV2.
A minimal sketch of h-swish and the SE block follows the usage example below.
import torch
from light_cnns import mbv3_small
#from light_cnns import mbv3_large
model_small = mbv3_small()
#model_large = mbv3_large()
model_small.eval()
print(model_small)
input = torch.randn(1, 3, 224, 224)
y = model_small(input)
print(y.size())
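A minimal sketch of two of the ingredients listed above, the h-swish activation and a squeeze-and-excitation block; this is illustrative code, not the light_cnns implementation.

```python
import torch
import torch.nn as nn

class HSwish(nn.Module):
    """h-swish(x) = x * ReLU6(x + 3) / 6."""
    def forward(self, x):
        return x * nn.functional.relu6(x + 3.0) / 6.0

class SEBlock(nn.Module):
    """Squeeze-and-excitation: global pooling + two FC layers -> channel weights."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)  # squeeze, then excite
        return x * w

x = torch.randn(1, 16, 32, 32)
print(SEBlock(16)(HSwish()(x)).shape)  # torch.Size([1, 16, 32, 32])
```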
MobileNeXt rethinks the inverted residual of MobileNetV2 and proposes the SandGlass block, which places depthwise convolutions at the bottom and top of the block and builds the shortcut between the high-dimensional ends (a minimal sketch follows the usage example below).
import torch
from light_cnns import mobilenext
model = mobilenext()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
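A minimal sketch of the sandglass idea (depthwise convolutions at both ends, point-wise reduction then expansion in the middle, shortcut between the wide ends); the layer configuration is illustrative, not the light_cnns implementation.

```python
import torch
import torch.nn as nn

class SandGlassBlock(nn.Module):
    """Illustrative MobileNeXt-style block: depthwise -> reduce -> expand -> depthwise."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        mid = channels // reduction
        self.block = nn.Sequential(
            nn.Conv2d(channels, channels, 3, 1, 1, groups=channels, bias=False),  # depthwise (bottom)
            nn.BatchNorm2d(channels),
            nn.ReLU6(inplace=True),
            nn.Conv2d(channels, mid, 1, bias=False),   # point-wise reduction
            nn.BatchNorm2d(mid),
            nn.Conv2d(mid, channels, 1, bias=False),   # point-wise expansion
            nn.BatchNorm2d(channels),
            nn.ReLU6(inplace=True),
            nn.Conv2d(channels, channels, 3, 1, 1, groups=channels, bias=False),  # depthwise (top)
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return x + self.block(x)  # shortcut between the high-dimensional ends

x = torch.randn(1, 64, 28, 28)
print(SandGlassBlock(64)(x).shape)  # torch.Size([1, 64, 28, 28])
```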
import torch
from light_cnns import shufflenetv1
model = shufflenetv1()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
import torch
from light_cnns import shufflenetv2
model = shufflenetv2()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
AdderNet replaces the multiplications inside convolution with additions, using the L1 distance between filters and input features as the response. On CIFAR and ImageNet it reaches accuracy comparable to ordinary CNNs (a toy sketch of the L1 "adder convolution" follows the usage example below).
import torch
from light_cnns import resnet20
model = resnet20()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
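A toy illustration of the adder idea: the output is the negative L1 distance between each filter and each input patch, computed here with torch.nn.functional.unfold. This is only a sketch of the operation, not the light_cnns/AdderNet implementation (which also uses a special gradient and scaling scheme).

```python
import torch
import torch.nn.functional as F

def adder_conv2d(x, weight, stride=1, padding=1):
    """L1-distance 'convolution': output = -sum |patch - filter| (AdderNet idea)."""
    b, c, h, w = x.shape
    out_ch, _, k, _ = weight.shape
    cols = F.unfold(x, k, padding=padding, stride=stride)          # (b, c*k*k, L)
    w_flat = weight.view(out_ch, -1)                               # (out_ch, c*k*k)
    # |patch - filter| summed over the c*k*k dimension, for every output position
    dist = (cols.unsqueeze(1) - w_flat.view(1, out_ch, -1, 1)).abs().sum(dim=2)
    out_h = (h + 2 * padding - k) // stride + 1
    out_w = (w + 2 * padding - k) // stride + 1
    return -dist.view(b, out_ch, out_h, out_w)

x = torch.randn(1, 3, 8, 8)
w = torch.randn(4, 3, 3, 3)
print(adder_conv2d(x, w).shape)  # torch.Size([1, 4, 8, 8])
```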
The Ghost module produces a few intrinsic feature maps with an ordinary convolution and then generates additional "ghost" feature maps with cheap linear operations; stacking Ghost modules gives the Ghost bottleneck, from which GhostNet is built. On ImageNet, GhostNet reaches 75.7% Top-1 accuracy, slightly above the 75.2% of MobileNetV3 (a minimal sketch of the Ghost module follows the usage example below).
import torch
from light_cnns import ghostnet
model = ghostnet()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
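A minimal sketch of the Ghost module described above: a small ordinary convolution produces the intrinsic feature maps, a cheap depthwise convolution produces the "ghost" maps, and the two halves are concatenated. The exact layer choices are illustrative.

```python
import torch
import torch.nn as nn

class GhostModule(nn.Module):
    """Half the channels from a normal conv, the other half from a cheap depthwise op."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        init_ch = out_ch // 2
        self.primary = nn.Sequential(                      # intrinsic feature maps
            nn.Conv2d(in_ch, init_ch, 1, bias=False),
            nn.BatchNorm2d(init_ch),
            nn.ReLU(inplace=True),
        )
        self.cheap = nn.Sequential(                        # ghost feature maps
            nn.Conv2d(init_ch, init_ch, 3, 1, 1, groups=init_ch, bias=False),
            nn.BatchNorm2d(init_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)

x = torch.randn(1, 16, 32, 32)
print(GhostModule(16, 32)(x).shape)  # torch.Size([1, 32, 32, 32])
```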
Unlike the channel-only squeeze-and-excitation (SE) attention, Coordinate Attention embeds positional information into channel attention: features are pooled along the two spatial directions separately, so the attention is aware of both channel and position (a minimal sketch follows the usage example below).
import torch
from light_cnns import mbv2_ca
model = mbv2_ca()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
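A minimal sketch of the coordinate-attention idea: pool along height and width separately, encode the two direction-aware descriptors together, then split them back into per-direction attention maps. The layer sizes are illustrative, not the light_cnns implementation.

```python
import torch
import torch.nn as nn

class CoordAttention(nn.Module):
    """Direction-aware channel attention: one pooled descriptor per spatial axis."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Sequential(
            nn.Conv2d(channels, mid, 1, bias=False),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
        )
        self.conv_h = nn.Conv2d(mid, channels, 1)
        self.conv_w = nn.Conv2d(mid, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        x_h = x.mean(dim=3, keepdim=True)                       # pool along width  -> (b, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)   # pool along height -> (b, c, w, 1)
        y = self.conv1(torch.cat([x_h, x_w], dim=2))            # joint encoding
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                               # (b, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))           # (b, c, 1, w)
        return x * a_h * a_w

x = torch.randn(1, 32, 28, 28)
print(CoordAttention(32)(x).shape)  # torch.Size([1, 32, 28, 28])
```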
ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
ECA-Net improves the channel attention of SENet: it removes the dimensionality reduction of the SE block and models local cross-channel interaction with a single fast 1D convolution, matching or beating SENet with far fewer extra parameters (a minimal sketch follows the usage example below).
import torch
from light_cnns import mbv2_eca
model = mbv2_eca()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
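A minimal sketch of the ECA mechanism: global average pooling followed by a 1D convolution across channels, with no dimensionality reduction. The kernel size is illustrative (the paper derives it adaptively from the channel count).

```python
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: a k-sized 1D conv over the pooled channel vector."""
    def __init__(self, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        b, c, _, _ = x.shape
        y = x.mean(dim=(2, 3)).view(b, 1, c)                # squeeze: (b, 1, c)
        w = torch.sigmoid(self.conv(y)).view(b, c, 1, 1)    # local cross-channel interaction
        return x * w

x = torch.randn(1, 64, 28, 28)
print(ECA()(x).shape)  # torch.Size([1, 64, 28, 28])
```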
ResNeSt: Split-Attention Networks
ResNeSt combines the multi-path design of ResNeXt with feature-map attention in the spirit of SENet and SKNet: its Split-Attention block applies attention at the feature-map group level, reweighting and aggregating the groups.
import torch
from light_cnns import resnest50_v1b
model = resnest50_v1b()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
SA-Net: Shuffle Attention for Deep Convolutional Neural Networks
Shuffle Attention (SA) splits the channels into groups, applies spatial attention and channel attention to each sub-feature in parallel, and then fuses the groups with the channel shuffle operation of the Shuffle unit (a minimal sketch of channel shuffle follows the usage example below).
import torch
from light_cnns import mbv2_sa
model = mbv2_sa()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
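The channel shuffle used by the Shuffle unit (and by ShuffleNet above) is just a reshape-transpose-reshape over the channel dimension; a minimal sketch:

```python
import torch

def channel_shuffle(x, groups):
    """Interleave channels across groups: reshape -> transpose -> flatten."""
    b, c, h, w = x.shape
    x = x.view(b, groups, c // groups, h, w)
    x = x.transpose(1, 2).contiguous()
    return x.view(b, c, h, w)

x = torch.randn(1, 8, 4, 4)
print(channel_shuffle(x, groups=2).shape)  # torch.Size([1, 8, 4, 4])
```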
Rotate to Attend: Convolutional Triplet Attention Module
Triplet Attention computes attention weights with three parallel branches that capture cross-dimension interaction between the channel and spatial dimensions; it is a lightweight, plug-and-play module that can be inserted into standard backbones.
import torch
from light_cnns import mbv2_triplet
model = mbv2_triplet()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
PP-LCNet: A Lightweight CPU Convolutional Neural Network
PP-LCNet is a lightweight network tuned for CPU inference; at comparable latency it outperforms ShuffleNetV2, MobileNetV2, MobileNetV3 and GhostNet.
import torch
from light_cnns import lcnet_baseline
model = lcnet_baseline()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
MobileViT combines the strengths of CNNs and ViTs:
- the spatial inductive biases and efficiency of CNNs;
- the global representation power of ViT self-attention.
import torch
from light_cnns import mobilevit_s
model = mobilevit_s()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
GoogLeNet (Inception v1):
import torch
from light_cnns import googlenet
model = googlenet()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
Inception v2 improves on Inception v1, most notably by introducing Batch Normalization into the Inception architecture.
import torch
from light_cnns import inception_v2
model = inception_v2()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
Inception Net v3 integrates the upgrades of Inception v2 and additionally uses:
- an efficient grid-size reduction that runs a strided convolution branch and a max-pooling branch in parallel and concatenates the results (a pattern also seen later in ShuffleNet);
- the RMSProp optimizer;
- factorized 7x7 convolutions;
- BatchNorm in the auxiliary classifiers;
- label smoothing (a minimal usage sketch follows the example below).
import torch
from light_cnns import inception_v3
model = inception_v3()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
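Label smoothing, one of the tricks listed above, is available directly in PyTorch's cross-entropy loss (PyTorch >= 1.10); a minimal usage sketch with dummy data:

```python
import torch
import torch.nn as nn

# Smoothed targets: the true class gets 1 - eps, the remaining eps is spread uniformly.
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

logits = torch.randn(4, 1000)           # dummy predictions: 4 images, 1000 classes
targets = torch.randint(0, 1000, (4,))  # dummy ground-truth labels
print(criterion(logits, targets))
```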
import torch
from light_cnns import inception_v4
model = inception_v4()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
https://arxiv.org/abs/1610.02357
Xception ("Extreme Inception") is Google's follow-up to Inception: it takes the Inception hypothesis to its extreme by replacing Inception modules with depthwise separable convolutions (a minimal sketch of such a convolution follows the usage example below).
import torch
from light_cnns import xception
model = xception()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
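A minimal sketch of the depthwise separable convolution that Xception is built from: a per-channel (depthwise) spatial convolution followed by a 1x1 point-wise convolution.

```python
import torch
import torch.nn as nn

class SeparableConv2d(nn.Module):
    """Depthwise 3x3 conv followed by a 1x1 point-wise conv."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

x = torch.randn(1, 32, 56, 56)
print(SeparableConv2d(32, 64)(x).shape)  # torch.Size([1, 64, 56, 56])
```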
IC-Conv proposes the inception (dilated) convolution and finds its dilation pattern with EDO (effective dilation search), a simple and efficient dilation-search algorithm.
import torch
from light_cnns import ic_resnet50
pattern = './pattern_zoo/detection/ic_resnet50_k9.json'
model = ic_resnet50(pattern_path=pattern)
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
ESPNet: Efficient Spatial Pyramid of Dilated Convolutions for Semantic Segmentation
ESPNet is built from the ESP module, which first reduces dimensionality with a point-wise convolution and then re-samples the features with a spatial pyramid of dilated convolutions; ESPNet runs at roughly 112 FPS on a GPU, 21 FPS on a laptop and 9 FPS on an edge device (a minimal sketch of the dilated pyramid follows the usage example below).
import torch
from light_cnns import espnetv1
model = espnetv1()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())
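A minimal sketch of the ESP idea: a point-wise convolution reduces the channels, a pyramid of dilated convolutions with growing dilation rates re-samples the reduced features, and the branch outputs are merged. Hierarchical feature fusion and other details of the real ESP module are omitted; the layer sizes are illustrative.

```python
import torch
import torch.nn as nn

class ESPLite(nn.Module):
    """Reduce with a 1x1 conv, then a spatial pyramid of dilated 3x3 convs (simplified)."""
    def __init__(self, in_ch, out_ch, branches=4):
        super().__init__()
        mid = out_ch // branches
        self.reduce = nn.Conv2d(in_ch, mid, 1, bias=False)       # point-wise reduction
        self.pyramid = nn.ModuleList([
            nn.Conv2d(mid, mid, 3, padding=2 ** i, dilation=2 ** i, bias=False)
            for i in range(branches)                              # dilation 1, 2, 4, 8
        ])

    def forward(self, x):
        r = self.reduce(x)
        return torch.cat([branch(r) for branch in self.pyramid], dim=1)

x = torch.randn(1, 64, 56, 56)
print(ESPLite(64, 64)(x).shape)  # torch.Size([1, 64, 56, 56])
```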
ESPNetv2 extends ESPNetv1 with:
- the EESP (Extremely Efficient Spatial Pyramid) unit, which replaces the point-wise convolutions of the ESP module with group point-wise convolutions;
- a cyclic learning rate scheduler in place of a fixed-schedule scheduler;
- a Strided EESP unit with a shortcut connection to the input image, used for downsampling.
import torch
from light_cnns import espnetv2
model = espnetv2()
model.eval()
print(model)
input = torch.randn(1, 3, 224, 224)
y = model(input)
print(y.size())