3 comments

  • beardyw 2 days ago

    > The model was trained on over 1,800 languages

    I've never really considered how models treat languages. Would these all be siloed, or are they all merged into one shared vocabulary at tokenization? Anyone know?
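    For context on the question: multilingual models typically do not silo languages; text from every language is mapped into one shared token vocabulary. A minimal sketch of the idea, using byte-level tokenization (the simplest shared vocabulary, 256 byte values covering all of UTF-8) with a hypothetical `byte_tokens` helper:

    ```python
    # One shared vocabulary (the 256 possible byte values) covers every
    # language, so nothing is siloed per language: all text becomes IDs
    # drawn from the same ID space.
    def byte_tokens(text: str) -> list[int]:
        """Tokenize text into UTF-8 byte IDs (0-255)."""
        return list(text.encode("utf-8"))

    print(byte_tokens("hi"))     # English  -> [104, 105]
    print(byte_tokens("日本"))   # Japanese -> [230, 151, 165, 230, 156, 172]
    ```

    Real models usually layer a learned subword vocabulary (BPE or similar) on top of this, but the principle is the same: one vocabulary, one model, all languages.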

  • hunglee2 2 days ago

    great to see this - we need multi-polarity in AI, maybe even multi-polarity in AGI.

    • FinnLobsien 2 days ago

      Sure, but will anyone use this model?