😷 The Fill-Mask Association Test (FMAT): Measuring Propositions in Natural Language.
Updated Jun 13, 2024 - R
OpenMMLab Pre-training Toolbox and Benchmark
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, etc.
The official Python client for the Hugging Face Hub.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
🔮 SuperDuperDB: Bring AI to your database! Build, deploy and manage any AI application directly with your existing data infrastructure, without moving your data. Including streaming inference, scalable model training and vector search.
The largest collection of PyTorch image encoders / backbones. Includes train, eval, inference, and export scripts, and pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNetV4, MobileNet-V3 & V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more.
An open source implementation of CLIP.
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
On-device Inference of Whisper Speech Recognition Models for Apple Silicon
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
SwissArmyTransformer is a flexible and powerful library to develop your own Transformer variants.
GPT4V-level open-source multi-modal model based on Llama3-8B
[ICLR2024] Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How
Neural building blocks for speaker diarization: speech activity detection, speaker change detection, overlapped speech detection, speaker embedding
Official release of InternLM2 7B and 20B base and chat models, with 200K context support.
A treasure chest for visual classification and recognition powered by PaddlePaddle
Library for handling atomistic graph datasets focusing on transformer-based implementations, with utilities for training various models, experimenting with different pre-training tasks, and a suite of pre-trained models with huggingface integrations
Experience the power of Clarifai’s AI platform with the Python SDK. 🌟 Star to support our work!
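The FMAT in the title, like several of the libraries listed above, is built on masked language modeling: a prompt with a masked token is scored by a pretrained model, and the probabilities of candidate fillers measure the association being tested. A minimal sketch of such a fill-mask query using the 🤗 Transformers pipeline (the model name and prompt are illustrative assumptions, not the FMAT's own configuration):

```python
# Minimal fill-mask sketch using the Hugging Face Transformers pipeline.
# Assumptions: "bert-base-uncased" as the model and an illustrative prompt.
from transformers import pipeline

# Build a fill-mask pipeline; the model is downloaded on first use.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Ask for the top 5 completions of the masked token.
results = unmasker("Paris is the [MASK] of France.", top_k=5)

# Each result carries the predicted token and its probability.
for r in results:
    print(f"{r['token_str']}\t{r['score']:.3f}")
```

Comparing the scores the model assigns to contrasting filler words (e.g. attribute words for two target groups) is the basic measurement idea behind fill-mask association tests.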