Hierarchical Mixture of Experts, Mixture Density Neural Network
Dataset, example models, and demonstration of our Interspeech 2019 paper
Using CCR to predict piezoresponse force microscopy datasets
Multi-Task Learning package built with TensorFlow 2 (Multi-Gate Mixture of Experts, Cross-Stitch, Uncertainty Weighting)
Mixtures-of-ExperTs modEling for cOmplex and non-noRmal dIsTributionS
Framework for Contextually Transferring Knowledge from Multiple Source Policies in Deep Reinforcement Learning
INTERSPEECH 2020, "Sparse Mixture of Local Experts for Efficient Speech Enhancement"
Several machine learning classifiers in Python
Some recent state-of-the-art generative models in ONE notebook: (MIX-)?(GAN|WGAN|BigGAN|MHingeGAN|AMGAN|StyleGAN|StyleGAN2)(\+ADA|\+CR|\+EMA|\+GP|\+R1|\+SA|\+SN)*
MoEL: Mixture of Empathetic Listeners
"Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts" (NeurIPS 2020), original PyTorch implementation
Implementations of mixture models for different tasks.
PyTorch implementation of moe, which stands for mixture of experts (a minimal sketch of the idea follows this list)
Machine learning code, derivative calculations, and optimization algorithms developed during the Machine Learning course at Universidade de Sao Paulo. All code is written in Python with NumPy and Matplotlib, with an example at the end of each file.
Implementation of the "Robust Federated Learning by Mixture of Experts" study.
PyTorch Implementation of the Multi-gate Mixture-of-Experts with Exclusivity (MMoEEx)
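
For orientation, here is a minimal sketch of the dense mixture-of-experts idea the repositories above build on. This is an assumed, generic formulation in PyTorch, not the code of any listed project; the class name MixtureOfExperts and all sizes are illustrative. A gating network produces softmax weights over several small expert networks, and the layer output is the weighted sum of the expert outputs.

import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    # Dense (soft) mixture of experts: every expert runs on every input,
    # and a learned gate decides how much each expert contributes.
    def __init__(self, in_dim, out_dim, n_experts=4, hidden=32):
        super().__init__()
        # Each expert is an independent small MLP.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                          nn.Linear(hidden, out_dim))
            for _ in range(n_experts)
        )
        # The gate maps the input to a distribution over experts.
        self.gate = nn.Linear(in_dim, n_experts)

    def forward(self, x):
        weights = torch.softmax(self.gate(x), dim=-1)            # (batch, n_experts)
        outputs = torch.stack([e(x) for e in self.experts], 1)   # (batch, n_experts, out_dim)
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)      # (batch, out_dim)

moe = MixtureOfExperts(in_dim=8, out_dim=2)
print(moe(torch.randn(5, 8)).shape)  # torch.Size([5, 2])

Sparse variants, as in the sparse and decentralized MoE entries above, keep only the top-k gate weights and skip the remaining experts, which is what makes very large expert counts affordable.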