Explore Tencent's Hunyuan-Large, a 389B-parameter MoE model with 52B active parameters. Discover its top benchmarks, technical innovations, and real-world applications.
So would the Granite models count as "open source"? They do publish the training data they used.
It seems they've outlined the datasets they used in Annex B of their paper. I haven't checked whether the list is exhaustive, or whether the training code and data-preparation scripts are available… If they are, I'd say this is indeed a proper open-source model. And the weights are licensed under an Apache license.