Mistral AI, the French company behind the AI assistant Le Chat and several foundation models, is widely regarded as one of France’s most promising tech companies.
They don’t allow more than a handful of replies for unregistered users before the chat blocks itself: naturally, people who don’t want to bother registering and being tracked switch to another chatbot.
Why does Mistral have to shoot itself in the foot? Is it really such a massive expense, in terms of the electric bill, to keep Q&A open to anonymous guests?
There are US companies that allow unregistered Q&A: they know they need the broadest possible feedback, from everyone. If you’re registered on a platform, the kinds of questions you ask aren’t the same as when you’re anonymous: you need both (registered and anonymous) to understand how to improve the model.
Block unregistered users? They will go elsewhere.
We need more EU AI companies with enough investment to understand this.
Ollama is also a great option for running Mistral models locally: it’s super lightweight, and I’ve been running mistral-7b on my MacBook without issues. It even integrates nicely with audiobookshelf if you’re into that kind of self-hosted setup.
Has it been so long since people registered for ChatGPT that they’ve forgotten that it requires registration too?
Not sure why you’re getting downvoted; this is exactly the problem. As long as they hide behind a nag-wall, they won’t attract a larger user base.
If you want to run their models unregistered and unlimited, look into LM Studio and others.
You can run the LLM on your local machine.
Yeah, just run it with Ollama
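For anyone who wants to try the local route, a minimal sketch with the Ollama CLI (assuming Ollama is already installed; the model tag `mistral` pulls the 7B model by default, and the prompt is just a placeholder):

```shell
# Pull the Mistral 7B model from the Ollama registry (one-time download)
ollama pull mistral

# Start an interactive chat session in the terminal
ollama run mistral

# Or query it non-interactively via the local HTTP API that
# Ollama serves on port 11434 by default
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

No registration, API key, or rate limit involved; everything runs on your own machine.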