moe
Here are 109 public repositories matching this topic.
Unify Efficient Fine-Tuning of 100+ LLMs
Updated Jun 2, 2024 - Python
An unofficial UI-first app client for https://bgm.tv on Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, ACG-focused anime-tracking third-party client for bgm.tv, similar to Douban. Redesigned for mobile, it ships many enhanced features that are hard to implement on the web version and offers extensive customization. Currently supports iOS / Android / WSA, phone / basic tablet layouts, light / dark themes, and the mobile web.
Updated Jun 1, 2024 - TypeScript
[arXiv'24] Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization
Updated May 31, 2024 - Python
⭐ Moe-Counter Compatible Website Hit Counter Written in Gleam
Updated May 31, 2024 - Gleam
Official LISTEN.moe Android app
Updated Jun 2, 2024 - Kotlin
MindSpore online courses: Step into LLM
Updated May 30, 2024 - Python
Moe counters for your projects, designed to display a wide range of statistics for your website and more
Updated Jun 1, 2024 - JavaScript
Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts", in PyTorch and Zeta
Updated May 18, 2024 - Python
Implementation of "the first large-scale multimodal mixture of experts models" from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts"
Updated May 18, 2024 - Python
Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"
Updated May 17, 2024 - Python
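Several entries above (Switch Transformers, Tutel, MoE-Mamba) center on sparse expert routing. As orientation, here is a minimal sketch of the top-1 (Switch-style) routing these repositories build on, in PyTorch; the class name and dimensions are illustrative, and real implementations also add expert-capacity limits and a load-balancing auxiliary loss on the router, which this sketch omits. It is not code from any listed project.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchRouterMoE(nn.Module):
    """Minimal top-1 (Switch-style) mixture-of-experts feed-forward layer."""

    def __init__(self, dim: int, num_experts: int, hidden: int) -> None:
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, dim); each token is routed to exactly one expert.
        gate_probs = F.softmax(self.router(x), dim=-1)  # (num_tokens, num_experts)
        gate, expert_idx = gate_probs.max(dim=-1)       # top-1 gate value and index
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i
            if mask.any():
                # Scale the chosen expert's output by its gate probability.
                out[mask] = gate[mask].unsqueeze(-1) * expert(x[mask])
        return out

moe = SwitchRouterMoE(dim=32, num_experts=4, hidden=64)
tokens = torch.randn(8, 32)
print(moe(tokens).shape)  # torch.Size([8, 32])
```

Because each token activates only one expert's feed-forward block, parameter count scales with the number of experts while per-token compute stays roughly constant, which is the efficiency argument the Switch Transformers paper makes.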
Mixture-of-Experts for Large Vision-Language Models
Updated May 15, 2024 - Python
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Updated May 14, 2024 - Python
This is the repo for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks), an attempt to first adapt it for training on text and later adjust it for other modalities.
Updated May 14, 2024 - Python
japReader is an app for breaking down Japanese sentences and tracking vocabulary progress
Updated May 3, 2024 - JavaScript
Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
Updated Apr 30, 2024 - Python