9 points | by simonw a day ago
3 comments
As Simon said in his post:
> Honestly, it’s pretty terrible. Talking with it feels more like chatting with a Markov chain than an LLM—the responses may have a delightfully Victorian flavor to them but it’s hard to get a response that usefully answers a question.
I tried it and I agree.
If you have uv installed, you can start a chat with the model (after a 2GB model download) with this one-liner:
uvx --with llm-mrchatterbox llm chat -m mrchatterbox
This is fantastically cool. I can't believe this is possible with such a small and narrow dataset.