Indie: What is the least biased LLM? Many of the most popular ones, like ChatGPT and Grok, have political correctness filters or will outright refuse certain requests.
If you want total freedom, you can download a local model, like the Llama or Mistral GGUF models. You just need a GPU with at least 6GB of VRAM, which even cheap gaming PCs and laptops have nowadays.

What you do is find an "abliterated" GGUF file for a decent local model (currently the best are the Llama models on Hugging Face). It needs to say it's abliterated, otherwise it will still have restrictions. An abliterated model is one where the safety training has been stripped out, so in theory it should always answer.

These run locally on your GPU, obviously, because nobody can take the legal responsibility of hosting something like that. This means performance is limited by your own VRAM. That said, as the technology progresses we get better at doing more with less: even with a small GPU you can now run local models with a 128K token context window, which is better than GPT had two years ago. And it's free. You download it once and it's yours forever, at no cost. I believe local models will eventually become the norm.
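If you want to sanity-check whether a given model will fit in your VRAM before downloading it, here's a rough back-of-envelope calculator. This is just a sketch under stated assumptions: the 4.5 bits/weight figure approximates a typical Q4 quantization, and the example layer/head numbers are for a Llama-3-8B-class model — check the actual model card for your model.

```python
def model_weights_gb(n_params_billions: float, bits_per_weight: float) -> float:
    """Approximate size of the quantized weights in GB."""
    return n_params_billions * 1e9 * bits_per_weight / 8 / 1e9

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                n_ctx: int, bytes_per_value: int = 2) -> float:
    """Approximate KV-cache size in GB for a given context length.

    Stores one K and one V vector per layer per token (factor of 2),
    assuming fp16 cache entries (2 bytes each) by default.
    """
    return 2 * n_layers * n_kv_heads * head_dim * n_ctx * bytes_per_value / 1e9

# Hypothetical example: an 8B model at ~4.5 bits/weight (Q4-style quant),
# with Llama-3-8B-like dimensions: 32 layers, 8 KV heads, head_dim 128.
weights = model_weights_gb(8, 4.5)          # ~4.5 GB of weights
cache = kv_cache_gb(32, 8, 128, 8192)       # ~1.1 GB at an 8K context
total = weights + cache
```

An 8B model at Q4 with an 8K context lands around 5.5 GB, which is why 6 GB cards are enough; pushing toward a 128K context makes the KV cache the dominant cost, so runtimes offload layers or quantize the cache to compensate.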