There's also OpenMoE, an open-source effort to train a mixture-of-experts model. So far they've released a model with 8 billion parameters. [1]
[0] https://github.com/google-research/t5x/blob/main/docs/models...
[1] https://github.com/XueFuzhao/OpenMoE