Uncensored A.I.

Open-source models are essential, and so are uncensored models.

They can be found by going to https://huggingface.co and searching for "uncensored" or "abliterated".
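If you prefer to search from code, here is a minimal sketch using the huggingface_hub client (pip install huggingface_hub); the search term and sort order here are just one reasonable choice, not the only way to do it:

```python
# Minimal sketch: list popular "uncensored" models on the Hugging Face Hub.
from huggingface_hub import HfApi

api = HfApi()

# Search model names/cards for "uncensored", sorted by download count.
# Swap the search term for "abliterated" to find that family instead.
for model in api.list_models(search="uncensored", sort="downloads", direction=-1, limit=10):
    print(model.id, model.downloads)
```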

One version based on Mixtral-8x7b is Dolphin 2.5 Mixtral, which has been trained on a dataset filtered to remove alignment and bias, making it an uncensored version. This means the model is not just capable but can be used across a wide range of applications without refusing requests or favoring one group over another. The base model has a 32k context window and was fine-tuned at 16k. A rough loading sketch follows the changelog below. New in Dolphin 2.5:

  1. Removed Samantha and WizardLM
  2. Added Synthia and OpenHermes and PureDove
  3. Added new Dolphin-Coder dataset
  4. Added MagiCoder dataset
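Below is a rough sketch of loading Dolphin 2.5 Mixtral with transformers. The Hub repo id and the ChatML prompt format are assumptions based on the model card; the full-precision model needs substantial GPU memory, and quantized GGUF builds are a common lighter-weight alternative.

```python
# Rough sketch: run Dolphin 2.5 Mixtral with transformers (repo id assumed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/dolphin-2.5-mixtral-8x7b"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Dolphin is trained on the ChatML prompt format.
prompt = (
    "<|im_start|>system\nYou are Dolphin, a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat does 'abliterated' mean for an LLM?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```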

DeepSeek V4 (1 trillion parameters) is the current "heavyweight" champion. Using a sparse mixture-of-experts (MoE) architecture, it activates only a fraction of its parameters (around 50B) per token, allowing fast inference despite its massive total size. It is widely considered the first open-source model to consistently match the reasoning capabilities of GPT-5.
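To make the "activates only a fraction of its parameters" point concrete, here is a toy top-k routing sketch in NumPy. It is purely illustrative: the expert count, top-k value, and dimensions are made up, and this is not DeepSeek's actual routing code.

```python
# Toy illustration of sparse MoE routing (not DeepSeek's real implementation).
import numpy as np

n_experts, top_k, d_model = 8, 2, 16          # made-up sizes for illustration
rng = np.random.default_rng(0)

router_w = rng.normal(size=(d_model, n_experts))              # router projection
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def moe_layer(token):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = token @ router_w
    top = np.argsort(logits)[-top_k:]                         # pick the k best experts
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum() # softmax over the chosen k
    # Only top_k of n_experts experts run, so most parameters stay idle per token.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

out = moe_layer(rng.normal(size=d_model))
print(out.shape)  # (16,)
```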

For ways to run these models locally, see: https://kleiber.me/blog/2024/01/07/six-ways-running-llm-locally/

