Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
An Open-Source Framework for Prompt-Learning.
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; Pytorch impl. of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
Pre-trained Chinese ELECTRA models
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
An Open-source Knowledgeable Large Language Model Framework.
Must-read Papers on Knowledge Editing for Large Language Models.
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
[MICCAI 2019 Young Scientist Award] [MEDIA 2020 Best Paper Award] Models Genesis
Official Repository for the Uni-Mol Series Methods
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
Self-supervised contrastive learning for time series via time-frequency consistency
[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
Eden AI: simplifies the use and deployment of AI technologies by providing a single API that connects to the best available AI engines
PERT: Pre-training BERT with Permuted Language Model
A work in progress to build out solutions in Rust for MLOps
[KDD 2024] "UrbanGPT: Spatio-Temporal Large Language Models"
Exploring Visual Prompts for Adapting Large-Scale Models
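Most of the repositories listed above ship checkpoints that can be consumed through standard tooling. As a minimal sketch only (assuming the Hugging Face `transformers` library; the checkpoint name and mean pooling below are illustrative choices, not taken from any specific repository above), loading a pre-trained encoder and producing a sentence representation might look like this:

```python
# Minimal sketch: load a pre-trained encoder and pool a sentence embedding.
# Assumes the Hugging Face `transformers` library is installed; the checkpoint
# name is a well-known public model used purely for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-chinese"  # illustrative checkpoint, not tied to the repos above
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

inputs = tokenizer("pre-trained models", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states into a single fixed-size sentence vector.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # e.g. torch.Size([1, 768])
```

Mean pooling is just one simple readout; many of the frameworks above (prompt learning, contrastive pre-training, knowledge editing) wrap the same loading step in their own higher-level APIs.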