Awesome Open Source
Search results for "pretrained models pre training"
Active filters: pre-training, pretrained-models
16 search results found
Ofa (⭐ 2,142)
Official repository of OFA (ICML 2022). Paper: "OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework".
Spark (⭐ 1,355)
[ICLR'23 Spotlight 🔥] The first successful BERT/MAE-style pre-training on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling".
Knowlm (⭐ 870)
An open-sourced, knowledgeable large language model framework.
Entity (⭐ 596)
EntitySeg Toolbox: Towards Open-World and High-Quality Image Segmentation.
Uni Mol (⭐ 495)
Official repository for the Uni-Mol series methods.
Azureml Bert (⭐ 384)
End-to-end recipes for pre-training and fine-tuning BERT using the Azure Machine Learning service.
Awesome Pretraining For Graph Neural Networks (⭐ 100)
A curated list of papers on pre-training for graph neural networks (Pre-train4GNN).
Ontoprotein (⭐ 98)
Code and datasets for the ICLR 2022 paper "OntoProtein: Protein Pretraining With Gene Ontology Embedding".
Molgen (⭐ 64)
Code and pre-trained models for the paper "Domain-Agnostic Molecular Generation with Self-feedback".
Linkbert (⭐ 63)
[ACL 2022] LinkBERT: A Knowledgeable Language Model 😎 Pretrained with Document Links.
Cross Domain Recommendation (⭐ 36)
Papers and datasets on cross-domain recommendation, transfer learning, pre-training, and self-supervised learning.
Unicrs (⭐ 22)
[KDD 2022] Official PyTorch implementation of "Towards Unified Conversational Recommender Systems via Knowledge-Enhanced Prompt Learning".
Large Scale Pretraining Transfer (⭐ 11)
Code for reproducing the experiments in the paper "Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images" (https://arxiv.org/abs/2106.00116).
Audio Pretrained Model (⭐ 11)
A collection of audio and speech pre-trained models.
Knowqa (⭐ 8)
Competition on measuring the knowledge capacity of pre-trained models; baseline F1 of 0.35 using BERTForMaskedLM.
Sket (⭐ 8)
Source code for the Semantic Knowledge Extractor Tool (SKET), an unsupervised hybrid knowledge extraction system that combines a rule-based expert system with pre-trained machine learning models to extract cancer-related information from pathology reports.
Related Searches
Python Pretrained Models (395)
Pytorch Pretrained Models (190)
Deep Learning Pretrained Models (161)
Python Pre Training (73)
Copyright 2018-2024 Awesome Open Source. All rights reserved.