
Alignment and bias removed: Dolphin Mixtral 1x22b, a multi-functional text generation model

2024-05-23 14:19 · Source: ChinaZ.com (Webmaster's Home)

ChinaZ.com (Webmaster's Home) reported on May 23 that Dolphin 2.9.1 Mixtral 1x22b is a multi-functional text generation model created by the Cognitive Computations team.


This model has the following characteristics:

  • Multi-functional text generation: it handles instruction-following, conversational, and coding tasks, giving it a wide range of applications.

  • High performance: the model is based on Dolphin-2.9-Mixtral-8x22b and has a 64k context. Full-weight fine-tuning used a 16k sequence length and took 27 hours on 8x H100 GPUs.

  • SLERP extraction: all eight experts are merged with a SLERP-based extraction method rather than being fully converted into a dense model, preserving the performance of the original model (see the sketch after this list).

  • Preliminary agentic capability: the model supports function calling and has early agentic abilities, allowing it to handle more complex tasks.

  • Unrestricted model: the dataset was filtered to remove alignment and bias, and the model is designed to be uncensored and highly compliant. There is no strict restriction or filtering mechanism when generating text, so the model may comply with unethical requests and produce corresponding content.
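SLERP (spherical linear interpolation) blends two weight tensors along the great circle between them instead of averaging them linearly. The following is only a minimal sketch of that operation; the tensor names and shapes are hypothetical, and the actual extraction pipeline used by the Cognitive Computations team is not described in this article.

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors with mixing factor t."""
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    # Angle between the two (normalized) weight vectors.
    dot = torch.clamp((a / (a.norm() + eps)) @ (b / (b.norm() + eps)), -1.0, 1.0)
    theta = torch.acos(dot)
    if theta.abs() < 1e-4:
        # Nearly parallel vectors: plain linear interpolation is numerically safer.
        merged = (1 - t) * a + t * b
    else:
        s = torch.sin(theta)
        merged = (torch.sin((1 - t) * theta) / s) * a + (torch.sin(t * theta) / s) * b
    return merged.reshape(w_a.shape).to(w_a.dtype)

# Hypothetical example: merge two experts' feed-forward weights with equal weighting.
expert_0 = torch.randn(1024, 4096)
expert_1 = torch.randn(1024, 4096)
merged = slerp(expert_0, expert_1, t=0.5)
```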

The model is intended to be a text generation tool free of censorship and moral constraints. This openness also carries risks: when users make unethical requests, the model may comply with them.

Model address: https://huggingface.co/cognitivecomputations/dolphin-2.9.1-mixtral-1x22b
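For reference, the model can be loaded like any other Hugging Face causal language model. The sketch below uses the transformers library with the tokenizer's built-in chat template; the example messages are illustrative, the exact prompt format should be confirmed against the model card, and a 22B model requires substantial GPU memory even in bfloat16.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/dolphin-2.9.1-mixtral-1x22b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory use versus float32
    device_map="auto",           # spread layers across available GPUs
)

messages = [
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]

# The chat template shipped with the tokenizer applies the model's prompt format.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```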
