5 points | by FinnLobsien 2 days ago
3 comments
> The model was trained on over 1,800 languages
I've never really considered how models treat languages. Would these all be siloed, or are they all merged into one model with a shared vocabulary at tokenization? Anyone know?
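Typically not siloed: most modern LLMs train a single shared subword vocabulary (BPE or SentencePiece) over text from all languages, so every language maps into the same token-ID space and flows through one set of weights. Here is a minimal sketch using OpenAI's tiktoken library and its cl100k_base encoding as an illustration of the general approach (an assumption for demonstration; it is not this model's actual tokenizer):

```python
# Minimal sketch: one shared subword vocabulary serving many languages.
# Uses tiktoken's real cl100k_base encoding purely as an illustration of
# the general approach, not as this particular model's tokenizer.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

samples = {
    "English": "The model was trained on many languages.",
    "German": "Das Modell wurde mit vielen Sprachen trainiert.",
    "Japanese": "このモデルは多くの言語で訓練された。",
}

for lang, text in samples.items():
    ids = enc.encode(text)
    # Every language's token IDs come from the same 0..n_vocab-1 range,
    # so a single embedding table and one set of weights serve them all.
    print(f"{lang}: {ids[:8]} (vocab size {enc.n_vocab})")
```

Rarer characters just get split into more (byte-level) tokens, which is why low-resource languages often cost more tokens per sentence than English.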
great to see this - we need multi-polarity in AI, maybe even multi-polarity in AGI.
Sure, but will anyone use this model?