This new AI is powerful and uncensored… Let’s run it

2025-11-25 17:55 · 8 min read

This video discusses the limitations of popular AI models such as Grok and Gemini, highlighting that they are neither free from censorship nor open-source. It introduces an open-source alternative called Mixtral 8x7B, which lets users run uncensored large language models on local machines, with performance approaching GPT-4. The narrator walks through setting up this model, emphasizes its open-source nature, and notes how it differs from more restrictive models. Tools such as Ollama and Hugging Face AutoTrain are suggested for deploying and retraining these models on individual data. Additionally, the video humorously addresses the challenges of navigating this powerful technology while poking fun at conspiratorial themes, ultimately encouraging viewers to embrace their autonomy in using AI.

Key Information

  • Both Grok and Gemini are not free as in freedom: they are censored and aligned with specific political ideologies.
  • A new open-source foundation model named Mixtral 8x7B offers hope for uncensored, locally run large language models.
  • The Mixtral model approaches the performance of GPT-4 and allows fine-tuning with user data, emphasizing the idea of freedom in AI.
  • The Mixtral model carries a true open-source license (Apache 2.0) that permits modification and monetization, unlike other models with restrictive terms.
  • Cloud resources such as AWS and Google Vertex AI can be used to train AI models, including the Mixtral Dolphin model, which requires substantial computational resources.
  • Users can easily create and manage training data for their models through tools like Hugging Face AutoTrain, where they can upload specific data to customize and uncensor models.

Content Keywords

Grok and Gemini

Both Grok and Gemini are not free as in freedom: they are censored, aligned with certain political ideologies, and closed-source, which limits what developers can do with them.

open source model

The introduction of a new open-source foundation model named Mixtral 8x7B offers hope for uncensored language model use and customization.

GPT-4 alternative

The Mixtral 8x7B model can be fine-tuned with user data and run locally as an uncensored language model, with performance approaching that of GPT-4.

AI Rebel

The existence of uncensored AI is presented as an act of rebellion, with a focus on using advanced tools for training and running AI locally.

Mistral AI

Mistral AI, the company behind the model and valued at $2 billion, launched an Apache 2.0-licensed model that is starting to outperform GPT-3.5 and Llama 2 while remaining truly open source.

Developer Tools

Tools like Ollama and its web UI make it easy to run open-source models locally, with support for multiple platforms including Linux, Mac, and Windows.
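As a sketch of what running a model through Ollama looks like programmatically: once a model has been pulled (e.g. `ollama pull mixtral`), Ollama serves a local REST API on port 11434. The snippet below builds a request payload for its `/api/generate` endpoint; the model name and prompt are illustrative, and the actual HTTP call is left as a comment so the sketch stands on its own without a running server.

```python
import json

# Ollama's local REST API listens here by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {
        "model": model,    # e.g. "mixtral" after `ollama pull mixtral`
        "prompt": prompt,
        "stream": False,   # ask for one complete response instead of chunks
    }

payload = build_generate_request("mixtral", "Why is the sky blue?")
print(json.dumps(payload))

# To actually query a running Ollama server (requires the `requests` package):
#   import requests
#   resp = requests.post(OLLAMA_URL, json=payload)
#   print(resp.json()["response"])
```

Setting `"stream": False` trades incremental output for a single JSON response, which is simpler to handle in scripts.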

Hugging Face AutoTrain

Hugging Face AutoTrain simplifies the process of creating customized AI models from user datasets.
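As an illustration of the data-preparation step, AutoTrain's LLM fine-tuning flow commonly accepts a CSV dataset with a single `text` column containing full prompt/response strings. The snippet below writes such a file with the Python standard library; the filename, prompt format, and example rows are made up for illustration, and the exact format AutoTrain expects may vary by task.

```python
import csv

# Hypothetical training examples: each row is one "text" field holding a
# complete prompt/response pair in a simple instruction format.
rows = [
    {"text": "### Human: What is Mixtral?\n### Assistant: An open-source mixture-of-experts LLM."},
    {"text": "### Human: Can I fine-tune it?\n### Assistant: Yes, with your own dataset."},
]

# Write the dataset in the one-column CSV shape AutoTrain can ingest.
with open("train.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["text"])
    writer.writeheader()
    writer.writerows(rows)
```

The resulting `train.csv` can then be uploaded through the AutoTrain web UI as the training dataset.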

Training Costs

Training powerful AI models like Mixtral Dolphin requires significant resources, including the cost of renting cloud GPU hardware.

Uncensored Model Creation

The outlined process for building customized models involves training on esoteric content and potentially unethical data requests.
