The Llama Model Family: Fine-Tuning the Pre-Trained Llama 3 Language Model with Supervised Fine-Tuning (SFT) (Part 1): An Introduction to LLaMA-Factory

Llama 3 blog series

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 1)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 2)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 3)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 4)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 5)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 6)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 7)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 8)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 9)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 10)

Core Techniques for Building Secure GenAI/LLMs Revealed: Adversarial Attacks on Large Models (Part 1)

Core Techniques for Building Secure GenAI/LLMs Revealed: Adversarial Attacks on Large Models (Part 2)

Core Techniques for Building Secure GenAI/LLMs Revealed: Adversarial Attacks on Large Models (Part 3)

Core Techniques for Building Secure GenAI/LLMs Revealed: Adversarial Attacks on Large Models (Part 4)

Core Techniques for Building Secure GenAI/LLMs Revealed: Adversarial Attacks on Large Models (Part 5)

Hello, GPT-4o!

Large-Model Tokenizers: Tokenizer Visualization (GPT-4o)

Large-Model Tokenizers: The Byte Pair Encoding (BPE) Algorithm Explained with Examples

Large-Model Tokenizers: Byte Pair Encoding (BPE) Source-Code Analysis

Large Models: The Self-Attention Mechanism (Part 1)

Large Models: The Self-Attention Mechanism (Part 2)

Large Models: The Self-Attention Mechanism (Part 3)

Deploying a Large Model Locally on Windows with Llama 3 + LangGraph (Part 11)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Llama (Part 1)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Llama (Part 2)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Llama (Part 3)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Llama (Part 4)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Llama (Part 5)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 1)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 2)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 3)

Large Models: Understanding Transformer Positional Embeddings in Depth

Large Models: Understanding Transformer Layer Normalization in Depth (Part 1)

Large Models: Understanding Transformer Layer Normalization in Depth (Part 2)

Large Models: Understanding Transformer Layer Normalization in Depth (Part 3)

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 1): A Starting Point for Beginners

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 2): A Walkthrough of the Matrix Operations

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 3): Initializing an Embedding Layer

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 4): Precomputing the RoPE Frequencies

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 5): Precomputing the Causal Mask

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 6): The First Normalization: Root Mean Square Normalization (RMSNorm)

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 7): Initializing Multi-Query Attention

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 8): Rotary Position Embeddings

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 9): Computing Self-Attention

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 10): Residual Connections and the SwiGLU FFN

Large Models: Writing Meta's Llama 3 Code in PyTorch, Step by Step (Part 11): The Output Probability Distribution and Loss Computation

Large Models: Writing Working Llama 3 Code in PyTorch (Part 1): Loading a Simplified Tokenizer and Setting Parameters

Large Models: Writing Working Llama 3 Code in PyTorch (Part 2): RoPE and the Attention Mechanism

Large Models: Writing Working Llama 3 Code in PyTorch (Part 3): FeedForward and Residual Layers

Large Models: Writing Working Llama 3 Code in PyTorch (Part 4): Building the Llama3 Model Class Itself

Large Models: Writing Working Llama 3 Code in PyTorch (Part 5): Training and Testing Your Own minLlama3

Large Models: Writing Working Llama 3 Code in PyTorch (Part 6): Loading the Trained miniLlama3 Model

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 4)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 5)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 6)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 7)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: Protecting Large-Model Conversations with Llama Guard (Part 8)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: CyberSecEval 2, a Benchmark for Quantifying LLM Safety and Capability (Part 1)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: CyberSecEval 2, a Benchmark for Quantifying LLM Safety and Capability (Part 2)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: CyberSecEval 2, a Benchmark for Quantifying LLM Safety and Capability (Part 3)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications: CyberSecEval 2, a Benchmark for Quantifying LLM Safety and Capability (Part 4)

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Shield (Part 1): An Introduction to Code Shield

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Shield (Part 2): Preventing LLMs from Generating Insecure Code

The Llama 3 Model Family: Building Secure, Trustworthy Enterprise AI Applications with Code Shield (Part 3): Code Shield Code Examples

The Llama Model Family: Fine-Tuning the Pre-Trained Llama 3 Language Model with Supervised Fine-Tuning (SFT) (Part 1): An Introduction to LLaMA-Factory

Fine-tuning a large model can be this easy…

LLaMA-Factory

Key features of the LLaMA-Factory project

  • Multiple models: LLaMA, LLaVA, Mistral, Mixtral-MoE, Qwen, Yi, Gemma, Baichuan, ChatGLM, Phi, and more.
  • Integrated methods: (continued) pre-training, (multimodal) supervised instruction fine-tuning, reward-model training, PPO training, DPO training, KTO training, and ORPO training.
  • Multiple precisions: 32-bit full-parameter fine-tuning, 16-bit freeze fine-tuning, 16-bit LoRA fine-tuning, and 2/4/8-bit QLoRA fine-tuning based on AQLM/AWQ/GPTQ/LLM.int8 (see the sketch after this list).
  • Advanced algorithms: GaLore, BAdam, DoRA, LongLoRA, LLaMA Pro, Mixture-of-Depths, LoRA+, LoftQ, and agent fine-tuning.
  • Practical tricks: FlashAttention-2, Unsloth, RoPE scaling, NEFTune, and rsLoRA.
  • Experiment tracking: LlamaBoard, TensorBoard, Wandb, MLflow, and more.
  • Fast inference: an OpenAI-style API, a browser UI, and a command-line interface backed by vLLM.
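
To make the precision options above concrete, here is a minimal sketch of what a 4-bit QLoRA setup looks like when assembled directly with the Hugging Face transformers, bitsandbytes, and peft libraries. LLaMA-Factory wires this up for you from its own configuration; the model id, LoRA rank, and target modules below are illustrative assumptions, not values taken from the project.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Illustrative choices only: any causal LM repo id and LoRA hyperparameters would do.
model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

# 4-bit NF4 quantization of the frozen base weights: the "Q" in QLoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Low-rank adapters on the attention projections; only these small matrices are trained.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # prints how few parameters the adapters add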

Performance metrics

Compared with ChatGLM's official P-Tuning fine-tuning, LLaMA-Factory's LoRA fine-tuning delivers a 3.7x speedup while achieving a higher Rouge score on an advertising-copy generation task. Combined with 4-bit quantization, LLaMA-Factory's QLoRA fine-tuning further reduces GPU memory consumption.

  • Training Speed: the number of samples processed per second during training (batch size = 4, cutoff length = 1024).
  • Rouge Score: the Rouge-2 score on the validation set of the advertising-copy generation task (batch size = 4, cutoff length = 1024); see the sketch after this list.
  • GPU Memory: peak GPU memory during 4-bit quantized training (batch size = 1, cutoff length = 1024).
  • ChatGLM's P-Tuning used pre_seq_len=128; LLaMA-Factory's LoRA fine-tuning used lora_rank=32.
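
As a reminder of what the Rouge-2 number measures (bigram overlap between the generated copy and the reference), here is a small, self-contained sketch using the open-source rouge-score package. The strings are made-up English examples; the benchmark itself scores Chinese advertising copy, which would need word segmentation before scoring.

from rouge_score import rouge_scorer

# Hypothetical reference/generation pair; Rouge-2 counts overlapping bigrams.
reference = "lightweight running shoes with breathable mesh and cushioned soles"
generated = "breathable mesh running shoes with cushioned soles for daily training"

scorer = rouge_scorer.RougeScorer(["rouge2"], use_stemmer=True)
score = scorer.score(reference, generated)["rouge2"]
print(f"precision={score.precision:.3f} recall={score.recall:.3f} f1={score.fmeasure:.3f}")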

Changelog

[24/05/20] The project added support for fine-tuning the PaliGemma series. Note that PaliGemma is a pre-trained model; you need to fine-tune it with the gemma template for it to acquire conversational ability.

[24/05/18] The project added support for the KTO preference-alignment algorithm. See the examples for detailed usage.

https://arxiv.org/pdf/2402.01306
[24/05/14] The project added support for training and inference on Ascend NPU devices.

Models

The default module is what the --lora_target argument falls back to; you can pass --lora_target all to adapt every module, which usually gives better results.
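
A rough sketch of what "all" amounts to: collect the names of every linear projection in the network and hand them to the LoRA config, instead of adapting only the family's default module. The helper below is my own illustration, not LLaMA-Factory's implementation; excluding the output head is a common convention.

import torch.nn as nn
from peft import LoraConfig

def all_linear_module_names(model: nn.Module) -> list:
    """Collect the short names (e.g. 'q_proj', 'gate_proj') of every nn.Linear sub-module."""
    names = set()
    for full_name, module in model.named_modules():
        if isinstance(module, nn.Linear):
            names.add(full_name.split(".")[-1])
    names.discard("lm_head")  # the output head is usually left un-adapted
    return sorted(names)

# Example (assumes `model` is already loaded):
# lora_config = LoraConfig(r=16, lora_alpha=32,
#                          target_modules=all_linear_module_names(model),
#                          task_type="CAUSAL_LM")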

For all base models, the --template argument can be any value such as default, alpaca, or vicuna. For instruct/chat models, however, be sure to use the template that matches the model.
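
To see why the template matters for chat models, here is a hand-rolled approximation of the llama3 chat format (the special tokens follow Meta's documented Llama 3 Instruct prompt layout); a base model, by contrast, can simply be fed raw or alpaca-style text.

def build_llama3_prompt(user_message: str, system_message: str = "") -> str:
    """Approximate the chat format expected by Llama 3 Instruct models."""
    prompt = "<|begin_of_text|>"
    if system_message:
        prompt += f"<|start_header_id|>system<|end_header_id|>\n\n{system_message}<|eot_id|>"
    prompt += f"<|start_header_id|>user<|end_header_id|>\n\n{user_message}<|eot_id|>"
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

print(build_llama3_prompt("Give me three facts about llamas."))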

The full list of models supported by the project:

from collections import OrderedDict, defaultdict
from enum import Enum
from typing import Dict, Optional

CHOICES = ["A", "B", "C", "D"]
DATA_CONFIG = "dataset_info.json"
DEFAULT_MODULE = defaultdict(str)
DEFAULT_TEMPLATE = defaultdict(str)
FILEEXT2TYPE = {
    "arrow": "arrow",
    "csv": "csv",
    "json": "json",
    "jsonl": "json",
    "parquet": "parquet",
    "txt": "text",
}
IGNORE_INDEX = -100
IMAGE_TOKEN = "<image>"
LAYERNORM_NAMES = {"norm", "ln"}
METHODS = ["full", "freeze", "lora"]
MOD_SUPPORTED_MODELS = ["bloom", "falcon", "gemma", "llama", "mistral", "mixtral", "phi", "starcoder2"]
PEFT_METHODS = ["lora"]
RUNNING_LOG = "running_log.txt"
SUBJECTS = ["Average", "STEM", "Social Sciences", "Humanities", "Other"]
SUPPORTED_MODELS = OrderedDict()
TRAINER_CONFIG = "trainer_config.yaml"
TRAINER_LOG = "trainer_log.jsonl"
TRAINING_STAGES = {
    "Supervised Fine-Tuning": "sft",
    "Reward Modeling": "rm",
    "PPO": "ppo",
    "DPO": "dpo",
    "KTO": "kto",
    "ORPO": "orpo",
    "Pre-Training": "pt",
}
STAGES_USE_PAIR_DATA = ["rm", "dpo", "orpo"]
SUPPORTED_CLASS_FOR_S2ATTN = ["llama"]
V_HEAD_WEIGHTS_NAME = "value_head.bin"
V_HEAD_SAFE_WEIGHTS_NAME = "value_head.safetensors"
VISION_MODELS = set()

class DownloadSource(str, Enum):
    DEFAULT = "hf"
    MODELSCOPE = "ms"

def register_model_group(
    models: Dict[str, Dict[DownloadSource, str]],
    module: Optional[str] = None,
    template: Optional[str] = None,
    vision: bool = False,
) -> None:
    prefix = None
    for name, path in models.items():
        if prefix is None:
            prefix = name.split("-")[0]
        else:
            assert prefix == name.split("-")[0], "prefix should be identical."
        SUPPORTED_MODELS[name] = path
    if module is not None:
        DEFAULT_MODULE[prefix] = module
    if template is not None:
        DEFAULT_TEMPLATE[prefix] = template
    if vision:
        VISION_MODELS.add(prefix)

register_model_group(
    models={
        "Baichuan-7B-Base": {
            DownloadSource.DEFAULT: "baichuan-inc/Baichuan-7B",
            DownloadSource.MODELSCOPE: "baichuan-inc/baichuan-7B",
        },
        "Baichuan-13B-Base": {
            DownloadSource.DEFAULT: "baichuan-inc/Baichuan-13B-Base",
            DownloadSource.MODELSCOPE: "baichuan-inc/Baichuan-13B-Base",
        },
        "Baichuan-13B-Chat": {
            DownloadSource.DEFAULT: "baichuan-inc/Baichuan-13B-Chat",
            DownloadSource.MODELSCOPE: "baichuan-inc/Baichuan-13B-Chat",
        },
    },
    module="W_pack",
    template="baichuan",
)register_model_group(models={"Baichuan2-7B-Base": {DownloadSource.DEFAULT: "baichuan-inc/Baichuan2-7B-Base",DownloadSource.MODELSCOPE: "baichuan-inc/Baichuan2-7B-Base",},"Baichuan2-13B-Base": {DownloadSource.DEFAULT: "baichuan-inc/Baichuan2-13B-Base",DownloadSource.MODELSCOPE: "baichuan-inc/Baichuan2-13B-Base",},"Baichuan2-7B-Chat": {DownloadSource.DEFAULT: "baichuan-inc/Baichuan2-7B-Chat",DownloadSource.MODELSCOPE: "baichuan-inc/Baichuan2-7B-Chat",},"Baichuan2-13B-Chat": {DownloadSource.DEFAULT: "baichuan-inc/Baichuan2-13B-Chat",DownloadSource.MODELSCOPE: "baichuan-inc/Baichuan2-13B-Chat",},},module="W_pack",template="baichuan2",
)register_model_group(models={"BLOOM-560M": {DownloadSource.DEFAULT: "bigscience/bloom-560m",DownloadSource.MODELSCOPE: "AI-ModelScope/bloom-560m",},"BLOOM-3B": {DownloadSource.DEFAULT: "bigscience/bloom-3b",DownloadSource.MODELSCOPE: "AI-ModelScope/bloom-3b",},"BLOOM-7B1": {DownloadSource.DEFAULT: "bigscience/bloom-7b1",DownloadSource.MODELSCOPE: "AI-ModelScope/bloom-7b1",},},module="query_key_value",
)register_model_group(models={"BLOOMZ-560M": {DownloadSource.DEFAULT: "bigscience/bloomz-560m",DownloadSource.MODELSCOPE: "AI-ModelScope/bloomz-560m",},"BLOOMZ-3B": {DownloadSource.DEFAULT: "bigscience/bloomz-3b",DownloadSource.MODELSCOPE: "AI-ModelScope/bloomz-3b",},"BLOOMZ-7B1-mt": {DownloadSource.DEFAULT: "bigscience/bloomz-7b1-mt",DownloadSource.MODELSCOPE: "AI-ModelScope/bloomz-7b1-mt",},},module="query_key_value",
)register_model_group(models={"BlueLM-7B-Base": {DownloadSource.DEFAULT: "vivo-ai/BlueLM-7B-Base",DownloadSource.MODELSCOPE: "vivo-ai/BlueLM-7B-Base",},"BlueLM-7B-Chat": {DownloadSource.DEFAULT: "vivo-ai/BlueLM-7B-Chat",DownloadSource.MODELSCOPE: "vivo-ai/BlueLM-7B-Chat",},},template="bluelm",
)register_model_group(models={"Breeze-7B": {DownloadSource.DEFAULT: "MediaTek-Research/Breeze-7B-Base-v1_0",},"Breeze-7B-Chat": {DownloadSource.DEFAULT: "MediaTek-Research/Breeze-7B-Instruct-v1_0",},},template="breeze",
)register_model_group(models={"ChatGLM2-6B-Chat": {DownloadSource.DEFAULT: "THUDM/chatglm2-6b",DownloadSource.MODELSCOPE: "ZhipuAI/chatglm2-6b",}},module="query_key_value",template="chatglm2",
)register_model_group(models={"ChatGLM3-6B-Base": {DownloadSource.DEFAULT: "THUDM/chatglm3-6b-base",DownloadSource.MODELSCOPE: "ZhipuAI/chatglm3-6b-base",},"ChatGLM3-6B-Chat": {DownloadSource.DEFAULT: "THUDM/chatglm3-6b",DownloadSource.MODELSCOPE: "ZhipuAI/chatglm3-6b",},},module="query_key_value",template="chatglm3",
)register_model_group(models={"ChineseLLaMA2-1.3B": {DownloadSource.DEFAULT: "hfl/chinese-llama-2-1.3b",DownloadSource.MODELSCOPE: "AI-ModelScope/chinese-llama-2-1.3b",},"ChineseLLaMA2-7B": {DownloadSource.DEFAULT: "hfl/chinese-llama-2-7b",DownloadSource.MODELSCOPE: "AI-ModelScope/chinese-llama-2-7b",},"ChineseLLaMA2-13B": {DownloadSource.DEFAULT: "hfl/chinese-llama-2-13b",DownloadSource.MODELSCOPE: "AI-ModelScope/chinese-llama-2-13b",},"ChineseLLaMA2-1.3B-Chat": {DownloadSource.DEFAULT: "hfl/chinese-alpaca-2-1.3b",DownloadSource.MODELSCOPE: "AI-ModelScope/chinese-alpaca-2-1.3b",},"ChineseLLaMA2-7B-Chat": {DownloadSource.DEFAULT: "hfl/chinese-alpaca-2-7b",DownloadSource.MODELSCOPE: "AI-ModelScope/chinese-alpaca-2-7b",},"ChineseLLaMA2-13B-Chat": {DownloadSource.DEFAULT: "hfl/chinese-alpaca-2-13b",DownloadSource.MODELSCOPE: "AI-ModelScope/chinese-alpaca-2-13b",},},template="llama2_zh",
)register_model_group(models={"CommandR-35B-Chat": {DownloadSource.DEFAULT: "CohereForAI/c4ai-command-r-v01",DownloadSource.MODELSCOPE: "AI-ModelScope/c4ai-command-r-v01",},"CommandR-Plus-104B-Chat": {DownloadSource.DEFAULT: "CohereForAI/c4ai-command-r-plus",DownloadSource.MODELSCOPE: "AI-ModelScope/c4ai-command-r-plus",},"CommandR-35B-4bit-Chat": {DownloadSource.DEFAULT: "CohereForAI/c4ai-command-r-v01-4bit",DownloadSource.MODELSCOPE: "mirror013/c4ai-command-r-v01-4bit",},"CommandR-Plus-104B-4bit-Chat": {DownloadSource.DEFAULT: "CohereForAI/c4ai-command-r-plus-4bit",},},template="cohere",
)register_model_group(models={"DBRX-132B-Base": {DownloadSource.DEFAULT: "databricks/dbrx-base",DownloadSource.MODELSCOPE: "AI-ModelScope/dbrx-base",},"DBRX-132B-Chat": {DownloadSource.DEFAULT: "databricks/dbrx-instruct",DownloadSource.MODELSCOPE: "AI-ModelScope/dbrx-instruct",},},module="Wqkv",template="dbrx",
)register_model_group(models={"DeepSeek-LLM-7B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-llm-7b-base",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-llm-7b-base",},"DeepSeek-LLM-67B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-llm-67b-base",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-llm-67b-base",},"DeepSeek-LLM-7B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-llm-7b-chat",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-llm-7b-chat",},"DeepSeek-LLM-67B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-llm-67b-chat",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-llm-67b-chat",},"DeepSeek-Math-7B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-math-7b-base",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-math-7b-base",},"DeepSeek-Math-7B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-math-7b-instruct",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-math-7b-instruct",},"DeepSeek-MoE-16B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-moe-16b-base",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-moe-16b-base",},"DeepSeek-MoE-16B-v2-Base": {DownloadSource.DEFAULT: "deepseek-ai/DeepSeek-V2-Lite",},"DeepSeek-MoE-236B-Base": {DownloadSource.DEFAULT: "deepseek-ai/DeepSeek-V2",DownloadSource.MODELSCOPE: "deepseek-ai/DeepSeek-V2",},"DeepSeek-MoE-16B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-moe-16b-chat",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-moe-16b-chat",},"DeepSeek-MoE-16B-v2-Chat": {DownloadSource.DEFAULT: "deepseek-ai/DeepSeek-V2-Lite-Chat",},"DeepSeek-MoE-236B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/DeepSeek-V2-Chat",DownloadSource.MODELSCOPE: "deepseek-ai/DeepSeek-V2-Chat",},},template="deepseek",
)register_model_group(models={"DeepSeekCoder-6.7B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-coder-6.7b-base",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-coder-6.7b-base",},"DeepSeekCoder-7B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-coder-7b-base-v1.5",},"DeepSeekCoder-33B-Base": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-coder-33b-base",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-coder-33b-base",},"DeepSeekCoder-6.7B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-coder-6.7b-instruct",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-coder-6.7b-instruct",},"DeepSeekCoder-7B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-coder-7b-instruct-v1.5",},"DeepSeekCoder-33B-Chat": {DownloadSource.DEFAULT: "deepseek-ai/deepseek-coder-33b-instruct",DownloadSource.MODELSCOPE: "deepseek-ai/deepseek-coder-33b-instruct",},},template="deepseekcoder",
)register_model_group(models={"Falcon-7B": {DownloadSource.DEFAULT: "tiiuae/falcon-7b",DownloadSource.MODELSCOPE: "AI-ModelScope/falcon-7b",},"Falcon-11B": {DownloadSource.DEFAULT: "tiiuae/falcon-11B",},"Falcon-40B": {DownloadSource.DEFAULT: "tiiuae/falcon-40b",DownloadSource.MODELSCOPE: "AI-ModelScope/falcon-40b",},"Falcon-180B": {DownloadSource.DEFAULT: "tiiuae/falcon-180b",DownloadSource.MODELSCOPE: "modelscope/falcon-180B",},"Falcon-7B-Chat": {DownloadSource.DEFAULT: "tiiuae/falcon-7b-instruct",DownloadSource.MODELSCOPE: "AI-ModelScope/falcon-7b-instruct",},"Falcon-40B-Chat": {DownloadSource.DEFAULT: "tiiuae/falcon-40b-instruct",DownloadSource.MODELSCOPE: "AI-ModelScope/falcon-40b-instruct",},"Falcon-180B-Chat": {DownloadSource.DEFAULT: "tiiuae/falcon-180b-chat",DownloadSource.MODELSCOPE: "modelscope/falcon-180B-chat",},},module="query_key_value",template="falcon",
)register_model_group(models={"Gemma-2B": {DownloadSource.DEFAULT: "google/gemma-2b",DownloadSource.MODELSCOPE: "AI-ModelScope/gemma-2b",},"Gemma-7B": {DownloadSource.DEFAULT: "google/gemma-7b",DownloadSource.MODELSCOPE: "AI-ModelScope/gemma-2b-it",},"Gemma-2B-Chat": {DownloadSource.DEFAULT: "google/gemma-2b-it",DownloadSource.MODELSCOPE: "AI-ModelScope/gemma-7b",},"Gemma-7B-Chat": {DownloadSource.DEFAULT: "google/gemma-7b-it",DownloadSource.MODELSCOPE: "AI-ModelScope/gemma-7b-it",},},template="gemma",
)register_model_group(models={"CodeGemma-2B": {DownloadSource.DEFAULT: "google/codegemma-1.1-2b",},"CodeGemma-7B": {DownloadSource.DEFAULT: "google/codegemma-7b",},"CodeGemma-7B-Chat": {DownloadSource.DEFAULT: "google/codegemma-1.1-7b-it",DownloadSource.MODELSCOPE: "AI-ModelScope/codegemma-7b-it",},},template="gemma",
)register_model_group(models={"InternLM-7B": {DownloadSource.DEFAULT: "internlm/internlm-7b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm-7b",},"InternLM-20B": {DownloadSource.DEFAULT: "internlm/internlm-20b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm-20b",},"InternLM-7B-Chat": {DownloadSource.DEFAULT: "internlm/internlm-chat-7b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm-chat-7b",},"InternLM-20B-Chat": {DownloadSource.DEFAULT: "internlm/internlm-chat-20b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm-chat-20b",},},template="intern",
)register_model_group(models={"InternLM2-7B": {DownloadSource.DEFAULT: "internlm/internlm2-7b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm2-7b",},"InternLM2-20B": {DownloadSource.DEFAULT: "internlm/internlm2-20b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm2-20b",},"InternLM2-7B-Chat": {DownloadSource.DEFAULT: "internlm/internlm2-chat-7b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm2-chat-7b",},"InternLM2-20B-Chat": {DownloadSource.DEFAULT: "internlm/internlm2-chat-20b",DownloadSource.MODELSCOPE: "Shanghai_AI_Laboratory/internlm2-chat-20b",},},module="wqkv",template="intern2",
)register_model_group(models={"Jambda-v0.1": {DownloadSource.DEFAULT: "ai21labs/Jamba-v0.1",DownloadSource.MODELSCOPE: "AI-ModelScope/Jamba-v0.1",}},
)register_model_group(models={"LingoWhale-8B": {DownloadSource.DEFAULT: "deeplang-ai/LingoWhale-8B",DownloadSource.MODELSCOPE: "DeepLang/LingoWhale-8B",}},module="qkv_proj",
)register_model_group(models={"LLaMA-7B": {DownloadSource.DEFAULT: "huggyllama/llama-7b",DownloadSource.MODELSCOPE: "skyline2006/llama-7b",},"LLaMA-13B": {DownloadSource.DEFAULT: "huggyllama/llama-13b",DownloadSource.MODELSCOPE: "skyline2006/llama-13b",},"LLaMA-30B": {DownloadSource.DEFAULT: "huggyllama/llama-30b",DownloadSource.MODELSCOPE: "skyline2006/llama-30b",},"LLaMA-65B": {DownloadSource.DEFAULT: "huggyllama/llama-65b",DownloadSource.MODELSCOPE: "skyline2006/llama-65b",},}
)register_model_group(models={"LLaMA2-7B": {DownloadSource.DEFAULT: "meta-llama/Llama-2-7b-hf",DownloadSource.MODELSCOPE: "modelscope/Llama-2-7b-ms",},"LLaMA2-13B": {DownloadSource.DEFAULT: "meta-llama/Llama-2-13b-hf",DownloadSource.MODELSCOPE: "modelscope/Llama-2-13b-ms",},"LLaMA2-70B": {DownloadSource.DEFAULT: "meta-llama/Llama-2-70b-hf",DownloadSource.MODELSCOPE: "modelscope/Llama-2-70b-ms",},"LLaMA2-7B-Chat": {DownloadSource.DEFAULT: "meta-llama/Llama-2-7b-chat-hf",DownloadSource.MODELSCOPE: "modelscope/Llama-2-7b-chat-ms",},"LLaMA2-13B-Chat": {DownloadSource.DEFAULT: "meta-llama/Llama-2-13b-chat-hf",DownloadSource.MODELSCOPE: "modelscope/Llama-2-13b-chat-ms",},"LLaMA2-70B-Chat": {DownloadSource.DEFAULT: "meta-llama/Llama-2-70b-chat-hf",DownloadSource.MODELSCOPE: "modelscope/Llama-2-70b-chat-ms",},},template="llama2",
)register_model_group(models={"LLaMA3-8B": {DownloadSource.DEFAULT: "meta-llama/Meta-Llama-3-8B",DownloadSource.MODELSCOPE: "LLM-Research/Meta-Llama-3-8B",},"LLaMA3-70B": {DownloadSource.DEFAULT: "meta-llama/Meta-Llama-3-70B",DownloadSource.MODELSCOPE: "LLM-Research/Meta-Llama-3-70B",},"LLaMA3-8B-Chat": {DownloadSource.DEFAULT: "meta-llama/Meta-Llama-3-8B-Instruct",DownloadSource.MODELSCOPE: "LLM-Research/Meta-Llama-3-8B-Instruct",},"LLaMA3-70B-Chat": {DownloadSource.DEFAULT: "meta-llama/Meta-Llama-3-70B-Instruct",DownloadSource.MODELSCOPE: "LLM-Research/Meta-Llama-3-70B-Instruct",},"LLaMA3-8B-Chinese-Chat": {DownloadSource.DEFAULT: "shenzhi-wang/Llama3-8B-Chinese-Chat",DownloadSource.MODELSCOPE: "LLM-Research/Llama3-8B-Chinese-Chat",},"LLaMA3-70B-Chinese-Chat": {DownloadSource.DEFAULT: "shenzhi-wang/Llama3-70B-Chinese-Chat",},},template="llama3",
)register_model_group(models={"LLaVA1.5-7B-Chat": {DownloadSource.DEFAULT: "llava-hf/llava-1.5-7b-hf",},"LLaVA1.5-13B-Chat": {DownloadSource.DEFAULT: "llava-hf/llava-1.5-13b-hf",},},template="vicuna",vision=True,
)register_model_group(models={"Mistral-7B-v0.1": {DownloadSource.DEFAULT: "mistralai/Mistral-7B-v0.1",DownloadSource.MODELSCOPE: "AI-ModelScope/Mistral-7B-v0.1",},"Mistral-7B-v0.1-Chat": {DownloadSource.DEFAULT: "mistralai/Mistral-7B-Instruct-v0.1",DownloadSource.MODELSCOPE: "AI-ModelScope/Mistral-7B-Instruct-v0.1",},"Mistral-7B-v0.2": {DownloadSource.DEFAULT: "alpindale/Mistral-7B-v0.2-hf",DownloadSource.MODELSCOPE: "AI-ModelScope/Mistral-7B-v0.2-hf",},"Mistral-7B-v0.2-Chat": {DownloadSource.DEFAULT: "mistralai/Mistral-7B-Instruct-v0.2",DownloadSource.MODELSCOPE: "AI-ModelScope/Mistral-7B-Instruct-v0.2",},},template="mistral",
)register_model_group(models={"Mixtral-8x7B-v0.1": {DownloadSource.DEFAULT: "mistralai/Mixtral-8x7B-v0.1",DownloadSource.MODELSCOPE: "AI-ModelScope/Mixtral-8x7B-v0.1",},"Mixtral-8x7B-v0.1-Chat": {DownloadSource.DEFAULT: "mistralai/Mixtral-8x7B-Instruct-v0.1",DownloadSource.MODELSCOPE: "AI-ModelScope/Mixtral-8x7B-Instruct-v0.1",},"Mixtral-8x22B-v0.1": {DownloadSource.DEFAULT: "mistralai/Mixtral-8x22B-v0.1",DownloadSource.MODELSCOPE: "AI-ModelScope/Mixtral-8x22B-v0.1",},"Mixtral-8x22B-v0.1-Chat": {DownloadSource.DEFAULT: "mistralai/Mixtral-8x22B-Instruct-v0.1",},},template="mistral",
)register_model_group(models={"OLMo-1B": {DownloadSource.DEFAULT: "allenai/OLMo-1B-hf",},"OLMo-7B": {DownloadSource.DEFAULT: "allenai/OLMo-7B-hf",},"OLMo-1.7-7B": {DownloadSource.DEFAULT: "allenai/OLMo-1.7-7B-hf",},},
)register_model_group(models={"OpenChat3.5-7B-Chat": {DownloadSource.DEFAULT: "openchat/openchat-3.5-0106",DownloadSource.MODELSCOPE: "xcwzxcwz/openchat-3.5-0106",}},template="openchat",
)register_model_group(models={"Orion-14B-Base": {DownloadSource.DEFAULT: "OrionStarAI/Orion-14B-Base",DownloadSource.MODELSCOPE: "OrionStarAI/Orion-14B-Base",},"Orion-14B-Chat": {DownloadSource.DEFAULT: "OrionStarAI/Orion-14B-Chat",DownloadSource.MODELSCOPE: "OrionStarAI/Orion-14B-Chat",},"Orion-14B-Long-Chat": {DownloadSource.DEFAULT: "OrionStarAI/Orion-14B-LongChat",DownloadSource.MODELSCOPE: "OrionStarAI/Orion-14B-LongChat",},"Orion-14B-RAG-Chat": {DownloadSource.DEFAULT: "OrionStarAI/Orion-14B-Chat-RAG",DownloadSource.MODELSCOPE: "OrionStarAI/Orion-14B-Chat-RAG",},"Orion-14B-Plugin-Chat": {DownloadSource.DEFAULT: "OrionStarAI/Orion-14B-Chat-Plugin",DownloadSource.MODELSCOPE: "OrionStarAI/Orion-14B-Chat-Plugin",},},template="orion",
)register_model_group(models={"PaliGemma-3B-pt-224": {DownloadSource.DEFAULT: "google/paligemma-3b-pt-224",},"PaliGemma-3B-pt-448": {DownloadSource.DEFAULT: "google/paligemma-3b-pt-448",},"PaliGemma-3B-pt-896": {DownloadSource.DEFAULT: "google/paligemma-3b-pt-896",},"PaliGemma-3B-mix-224": {DownloadSource.DEFAULT: "google/paligemma-3b-mix-224",},"PaliGemma-3B-mix-448": {DownloadSource.DEFAULT: "google/paligemma-3b-mix-448",},},vision=True,
)register_model_group(models={"Phi-1.5-1.3B": {DownloadSource.DEFAULT: "microsoft/phi-1_5",DownloadSource.MODELSCOPE: "allspace/PHI_1-5",},"Phi-2-2.7B": {DownloadSource.DEFAULT: "microsoft/phi-2",DownloadSource.MODELSCOPE: "AI-ModelScope/phi-2",},}
)register_model_group(models={"Phi3-3.8B-4k-Chat": {DownloadSource.DEFAULT: "microsoft/Phi-3-mini-4k-instruct",DownloadSource.MODELSCOPE: "LLM-Research/Phi-3-mini-4k-instruct",},"Phi3-3.8B-128k-Chat": {DownloadSource.DEFAULT: "microsoft/Phi-3-mini-128k-instruct",DownloadSource.MODELSCOPE: "LLM-Research/Phi-3-mini-128k-instruct",},},module="qkv_proj",template="phi",
)register_model_group(models={"Qwen-1.8B": {DownloadSource.DEFAULT: "Qwen/Qwen-1_8B",DownloadSource.MODELSCOPE: "qwen/Qwen-1_8B",},"Qwen-7B": {DownloadSource.DEFAULT: "Qwen/Qwen-7B",DownloadSource.MODELSCOPE: "qwen/Qwen-7B",},"Qwen-14B": {DownloadSource.DEFAULT: "Qwen/Qwen-14B",DownloadSource.MODELSCOPE: "qwen/Qwen-14B",},"Qwen-72B": {DownloadSource.DEFAULT: "Qwen/Qwen-72B",DownloadSource.MODELSCOPE: "qwen/Qwen-72B",},"Qwen-1.8B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-1_8B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen-1_8B-Chat",},"Qwen-7B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-7B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen-7B-Chat",},"Qwen-14B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-14B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen-14B-Chat",},"Qwen-72B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-72B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen-72B-Chat",},"Qwen-1.8B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-1_8B-Chat-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen-1_8B-Chat-Int8",},"Qwen-1.8B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-1_8B-Chat-Int4",DownloadSource.MODELSCOPE: "qwen/Qwen-1_8B-Chat-Int4",},"Qwen-7B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-7B-Chat-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen-7B-Chat-Int8",},"Qwen-7B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-7B-Chat-Int4",DownloadSource.MODELSCOPE: "qwen/Qwen-7B-Chat-Int4",},"Qwen-14B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-14B-Chat-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen-14B-Chat-Int8",},"Qwen-14B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-14B-Chat-Int4",DownloadSource.MODELSCOPE: "qwen/Qwen-14B-Chat-Int4",},"Qwen-72B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-72B-Chat-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen-72B-Chat-Int8",},"Qwen-72B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen-72B-Chat-Int4",DownloadSource.MODELSCOPE: "qwen/Qwen-72B-Chat-Int4",},},module="c_attn",template="qwen",
)register_model_group(models={"Qwen1.5-0.5B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-0.5B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-0.5B",},"Qwen1.5-1.8B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-1.8B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-1.8B",},"Qwen1.5-4B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-4B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-4B",},"Qwen1.5-7B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-7B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-7B",},"Qwen1.5-14B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-14B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-14B",},"Qwen1.5-32B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-32B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-32B",},"Qwen1.5-72B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-72B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-72B",},"Qwen1.5-110B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-110B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-110B",},"Qwen1.5-MoE-A2.7B": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-MoE-A2.7B",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-MoE-A2.7B",},"Qwen1.5-Code-7B": {DownloadSource.DEFAULT: "Qwen/CodeQwen1.5-7B",DownloadSource.MODELSCOPE: "qwen/CodeQwen1.5-7B",},"Qwen1.5-0.5B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-0.5B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-0.5B-Chat",},"Qwen1.5-1.8B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-1.8B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-1.8B-Chat",},"Qwen1.5-4B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-4B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-4B-Chat",},"Qwen1.5-7B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-7B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-7B-Chat",},"Qwen1.5-14B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-14B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-14B-Chat",},"Qwen1.5-32B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-32B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-32B-Chat",},"Qwen1.5-72B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-72B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-72B-Chat",},"Qwen1.5-110B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-110B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-110B-Chat",},"Qwen1.5-MoE-A2.7B-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-MoE-A2.7B-Chat",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-MoE-A2.7B-Chat",},"Qwen1.5-Code-7B-Chat": {DownloadSource.DEFAULT: "Qwen/CodeQwen1.5-7B-Chat",DownloadSource.MODELSCOPE: "qwen/CodeQwen1.5-7B-Chat",},"Qwen1.5-0.5B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-0.5B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-0.5B-Chat-GPTQ-Int8",},"Qwen1.5-0.5B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-0.5B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-0.5B-Chat-AWQ",},"Qwen1.5-1.8B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-1.8B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-1.8B-Chat-GPTQ-Int8",},"Qwen1.5-1.8B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-1.8B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-1.8B-Chat-AWQ",},"Qwen1.5-4B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-4B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-4B-Chat-GPTQ-Int8",},"Qwen1.5-4B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-4B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-4B-Chat-AWQ",},"Qwen1.5-7B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-7B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-7B-Chat-GPTQ-Int8",},"Qwen1.5-7B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-7B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-7B-Chat-AWQ",},"Qwen1.5-14B-int8-Chat": 
{DownloadSource.DEFAULT: "Qwen/Qwen1.5-14B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-14B-Chat-GPTQ-Int8",},"Qwen1.5-14B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-14B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-14B-Chat-AWQ",},"Qwen1.5-32B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-32B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-32B-Chat-AWQ",},"Qwen1.5-72B-int8-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-72B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-72B-Chat-GPTQ-Int8",},"Qwen1.5-72B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-72B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-72B-Chat-AWQ",},"Qwen1.5-110B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-110B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-110B-Chat-AWQ",},"Qwen1.5-MoE-A2.7B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/Qwen1.5-MoE-A2.7B-Chat-GPTQ-Int4",DownloadSource.MODELSCOPE: "qwen/Qwen1.5-MoE-A2.7B-Chat-GPTQ-Int4",},"Qwen1.5-Code-7B-int4-Chat": {DownloadSource.DEFAULT: "Qwen/CodeQwen1.5-7B-Chat-AWQ",DownloadSource.MODELSCOPE: "qwen/CodeQwen1.5-7B-Chat-AWQ",},},template="qwen",
)register_model_group(models={"SOLAR-10.7B": {DownloadSource.DEFAULT: "upstage/SOLAR-10.7B-v1.0",},"SOLAR-10.7B-Chat": {DownloadSource.DEFAULT: "upstage/SOLAR-10.7B-Instruct-v1.0",DownloadSource.MODELSCOPE: "AI-ModelScope/SOLAR-10.7B-Instruct-v1.0",},},template="solar",
)register_model_group(models={"Skywork-13B-Base": {DownloadSource.DEFAULT: "Skywork/Skywork-13B-base",DownloadSource.MODELSCOPE: "skywork/Skywork-13B-base",}}
)register_model_group(models={"StarCoder2-3B": {DownloadSource.DEFAULT: "bigcode/starcoder2-3b",DownloadSource.MODELSCOPE: "AI-ModelScope/starcoder2-3b",},"StarCoder2-7B": {DownloadSource.DEFAULT: "bigcode/starcoder2-7b",DownloadSource.MODELSCOPE: "AI-ModelScope/starcoder2-7b",},"StarCoder2-15B": {DownloadSource.DEFAULT: "bigcode/starcoder2-15b",DownloadSource.MODELSCOPE: "AI-ModelScope/starcoder2-15b",},}
)register_model_group(models={"Vicuna1.5-7B-Chat": {DownloadSource.DEFAULT: "lmsys/vicuna-7b-v1.5",DownloadSource.MODELSCOPE: "Xorbits/vicuna-7b-v1.5",},"Vicuna1.5-13B-Chat": {DownloadSource.DEFAULT: "lmsys/vicuna-13b-v1.5",DownloadSource.MODELSCOPE: "Xorbits/vicuna-13b-v1.5",},},template="vicuna",
)register_model_group(models={"XuanYuan-6B": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-6B",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-6B",},"XuanYuan-70B": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-70B",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-70B",},"XuanYuan-2-70B": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan2-70B",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan2-70B",},"XuanYuan-6B-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-6B-Chat",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-6B-Chat",},"XuanYuan-70B-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-70B-Chat",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-70B-Chat",},"XuanYuan-2-70B-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan2-70B-Chat",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan2-70B-Chat",},"XuanYuan-6B-int8-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-6B-Chat-8bit",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-6B-Chat-8bit",},"XuanYuan-6B-int4-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-6B-Chat-4bit",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-6B-Chat-4bit",},"XuanYuan-70B-int8-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-70B-Chat-8bit",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-70B-Chat-8bit",},"XuanYuan-70B-int4-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan-70B-Chat-4bit",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan-70B-Chat-4bit",},"XuanYuan-2-70B-int8-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan2-70B-Chat-8bit",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan2-70B-Chat-8bit",},"XuanYuan-2-70B-int4-Chat": {DownloadSource.DEFAULT: "Duxiaoman-DI/XuanYuan2-70B-Chat-4bit",DownloadSource.MODELSCOPE: "Duxiaoman-DI/XuanYuan2-70B-Chat-4bit",},},template="xuanyuan",
)register_model_group(models={"XVERSE-7B": {DownloadSource.DEFAULT: "xverse/XVERSE-7B",DownloadSource.MODELSCOPE: "xverse/XVERSE-7B",},"XVERSE-13B": {DownloadSource.DEFAULT: "xverse/XVERSE-13B",DownloadSource.MODELSCOPE: "xverse/XVERSE-13B",},"XVERSE-65B": {DownloadSource.DEFAULT: "xverse/XVERSE-65B",DownloadSource.MODELSCOPE: "xverse/XVERSE-65B",},"XVERSE-65B-2": {DownloadSource.DEFAULT: "xverse/XVERSE-65B-2",DownloadSource.MODELSCOPE: "xverse/XVERSE-65B-2",},"XVERSE-7B-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-7B-Chat",DownloadSource.MODELSCOPE: "xverse/XVERSE-7B-Chat",},"XVERSE-13B-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-13B-Chat",DownloadSource.MODELSCOPE: "xverse/XVERSE-13B-Chat",},"XVERSE-65B-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-65B-Chat",DownloadSource.MODELSCOPE: "xverse/XVERSE-65B-Chat",},"XVERSE-MoE-A4.2B": {DownloadSource.DEFAULT: "xverse/XVERSE-MoE-A4.2B",DownloadSource.MODELSCOPE: "xverse/XVERSE-MoE-A4.2B",},"XVERSE-7B-int8-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-7B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "xverse/XVERSE-7B-Chat-GPTQ-Int8",},"XVERSE-7B-int4-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-7B-Chat-GPTQ-Int4",DownloadSource.MODELSCOPE: "xverse/XVERSE-7B-Chat-GPTQ-Int4",},"XVERSE-13B-int8-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-13B-Chat-GPTQ-Int8",DownloadSource.MODELSCOPE: "xverse/XVERSE-13B-Chat-GPTQ-Int8",},"XVERSE-13B-int4-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-13B-Chat-GPTQ-Int4",DownloadSource.MODELSCOPE: "xverse/XVERSE-13B-Chat-GPTQ-Int4",},"XVERSE-65B-int4-Chat": {DownloadSource.DEFAULT: "xverse/XVERSE-65B-Chat-GPTQ-Int4",DownloadSource.MODELSCOPE: "xverse/XVERSE-65B-Chat-GPTQ-Int4",},},template="xverse",
)register_model_group(models={"Yayi-7B": {DownloadSource.DEFAULT: "wenge-research/yayi-7b-llama2",DownloadSource.MODELSCOPE: "AI-ModelScope/yayi-7b-llama2",},"Yayi-13B": {DownloadSource.DEFAULT: "wenge-research/yayi-13b-llama2",DownloadSource.MODELSCOPE: "AI-ModelScope/yayi-13b-llama2",},},template="yayi",
)register_model_group(models={"Yi-6B": {DownloadSource.DEFAULT: "01-ai/Yi-6B",DownloadSource.MODELSCOPE: "01ai/Yi-6B",},"Yi-9B": {DownloadSource.DEFAULT: "01-ai/Yi-9B",DownloadSource.MODELSCOPE: "01ai/Yi-9B",},"Yi-34B": {DownloadSource.DEFAULT: "01-ai/Yi-34B",DownloadSource.MODELSCOPE: "01ai/Yi-34B",},"Yi-6B-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-6B-Chat",DownloadSource.MODELSCOPE: "01ai/Yi-6B-Chat",},"Yi-34B-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-34B-Chat",DownloadSource.MODELSCOPE: "01ai/Yi-34B-Chat",},"Yi-6B-int8-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-6B-Chat-8bits",DownloadSource.MODELSCOPE: "01ai/Yi-6B-Chat-8bits",},"Yi-6B-int4-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-6B-Chat-4bits",DownloadSource.MODELSCOPE: "01ai/Yi-6B-Chat-4bits",},"Yi-34B-int8-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-34B-Chat-8bits",DownloadSource.MODELSCOPE: "01ai/Yi-34B-Chat-8bits",},"Yi-34B-int4-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-34B-Chat-4bits",DownloadSource.MODELSCOPE: "01ai/Yi-34B-Chat-4bits",},"Yi-1.5-6B": {DownloadSource.DEFAULT: "01-ai/Yi-1.5-6B",DownloadSource.MODELSCOPE: "01ai/Yi-1.5-6B",},"Yi-1.5-9B": {DownloadSource.DEFAULT: "01-ai/Yi-1.5-9B",DownloadSource.MODELSCOPE: "01ai/Yi-1.5-9B",},"Yi-1.5-34B": {DownloadSource.DEFAULT: "01-ai/Yi-1.5-34B",DownloadSource.MODELSCOPE: "01ai/Yi-1.5-34B",},"Yi-1.5-6B-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-1.5-6B-Chat",DownloadSource.MODELSCOPE: "01ai/Yi-1.5-6B-Chat",},"Yi-1.5-9B-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-1.5-9B-Chat",DownloadSource.MODELSCOPE: "01ai/Yi-1.5-9B-Chat",},"Yi-1.5-34B-Chat": {DownloadSource.DEFAULT: "01-ai/Yi-1.5-34B-Chat",DownloadSource.MODELSCOPE: "01ai/Yi-1.5-34B-Chat",},},template="yi",
)register_model_group(models={"YiVL-6B-Chat": {DownloadSource.DEFAULT: "BUAADreamer/Yi-VL-6B-hf",},"YiVL-34B-Chat": {DownloadSource.DEFAULT: "BUAADreamer/Yi-VL-34B-hf",},},template="yi_vl",vision=True,
)register_model_group(models={"Yuan2-2B-Chat": {DownloadSource.DEFAULT: "IEITYuan/Yuan2-2B-hf",DownloadSource.MODELSCOPE: "YuanLLM/Yuan2.0-2B-hf",},"Yuan2-51B-Chat": {DownloadSource.DEFAULT: "IEITYuan/Yuan2-51B-hf",DownloadSource.MODELSCOPE: "YuanLLM/Yuan2.0-51B-hf",},"Yuan2-102B-Chat": {DownloadSource.DEFAULT: "IEITYuan/Yuan2-102B-hf",DownloadSource.MODELSCOPE: "YuanLLM/Yuan2.0-102B-hf",},},template="yuan",
)register_model_group(models={"Zephyr-7B-Alpha-Chat": {DownloadSource.DEFAULT: "HuggingFaceH4/zephyr-7b-alpha",DownloadSource.MODELSCOPE: "AI-ModelScope/zephyr-7b-alpha",},"Zephyr-7B-Beta-Chat": {DownloadSource.DEFAULT: "HuggingFaceH4/zephyr-7b-beta",DownloadSource.MODELSCOPE: "modelscope/zephyr-7b-beta",},"Zephyr-141B-ORPO-Chat": {DownloadSource.DEFAULT: "HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1",},},template="zephyr",
)
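
A short sketch of how to read the registry above (this usage snippet is my own illustration, not part of the project code): each register_model_group call fills SUPPORTED_MODELS with download paths and records the model family's default LoRA module and chat template under its name prefix.

# Example lookups against the registry defined above (illustrative only).
name = "LLaMA3-8B-Chat"
prefix = name.split("-")[0]                                       # "LLaMA3"

hub_path = SUPPORTED_MODELS[name][DownloadSource.DEFAULT]         # "meta-llama/Meta-Llama-3-8B-Instruct"
ms_path = SUPPORTED_MODELS[name].get(DownloadSource.MODELSCOPE)   # ModelScope mirror, if registered

template = DEFAULT_TEMPLATE[prefix]   # "llama3": what --template should be for this family
module = DEFAULT_MODULE[prefix]       # "" here, i.e. no special default --lora_target module

print(name, hub_path, ms_path, template, module)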

The model names referenced in the code include:

  1. Baichuan-7B-Base
  2. Baichuan-13B-Base
  3. Baichuan-13B-Chat
  4. Baichuan2-7B-Base
  5. Baichuan2-13B-Base
  6. Baichuan2-7B-Chat
  7. Baichuan2-13B-Chat
  8. BLOOM-560M
  9. BLOOM-3B
  10. BLOOM-7B1
  11. BLOOMZ-560M
  12. BLOOMZ-3B
  13. BLOOMZ-7B1-mt
  14. BlueLM-7B-Base
  15. BlueLM-7B-Chat
  16. Breeze-7B
  17. Breeze-7B-Chat
  18. ChatGLM2-6B-Chat
  19. ChatGLM3-6B-Base
  20. ChatGLM3-6B-Chat
  21. ChineseLLaMA2-1.3B
  22. ChineseLLaMA2-7B
  23. ChineseLLaMA2-13B
  24. ChineseLLaMA2-1.3B-Chat
  25. ChineseLLaMA2-7B-Chat
  26. ChineseLLaMA2-13B-Chat
  27. CommandR-35B-Chat
  28. CommandR-Plus-104B-Chat
  29. CommandR-35B-4bit-Chat
  30. CommandR-Plus-104B-4bit-Chat
  31. DBRX-132B-Base
  32. DBRX-132B-Chat
  33. DeepSeek-LLM-7B-Base
  34. DeepSeek-LLM-67B-Base
  35. DeepSeek-LLM-7B-Chat
  36. DeepSeek-LLM-67B-Chat
  37. DeepSeek-Math-7B-Base
  38. DeepSeek-Math-7B-Chat
  39. DeepSeek-MoE-16B-Base
  40. DeepSeek-MoE-16B-v2-Base
  41. DeepSeek-MoE-236B-Base
  42. DeepSeek-MoE-16B-Chat
  43. DeepSeek-MoE-16B-v2-Chat
  44. DeepSeek-MoE-236B-Chat
  45. DeepSeekCoder-6.7B-Base
  46. DeepSeekCoder-7B-Base
  47. DeepSeekCoder-33B-Base
  48. DeepSeekCoder-6.7B-Chat
  49. DeepSeekCoder-7B-Chat
  50. DeepSeekCoder-33B-Chat
  51. Falcon-7B
  52. Falcon-11B
  53. Falcon-40B
  54. Falcon-180B
  55. Falcon-7B-Chat
  56. Falcon-40B-Chat
  57. Falcon-180B-Chat
  58. Gemma-2B
  59. Gemma-7B
  60. Gemma-2B-Chat
  61. Gemma-7B-Chat
  62. CodeGemma-2B
  63. CodeGemma-7B
  64. CodeGemma-7B-Chat
  65. InternLM-7B
  66. InternLM-20B
  67. InternLM-7B-Chat
  68. InternLM-20B-Chat
  69. InternLM2-7B
  70. InternLM2-20B
  71. InternLM2-7B-Chat
  72. InternLM2-20B-Chat
  73. Jambda-v0.1
  74. LingoWhale-8B
  75. LLaMA-7B
  76. LLaMA-13B
  77. LLaMA-30B
  78. LLaMA-65B
  79. LLaMA2-7B
  80. LLaMA2-13B
  81. LLaMA2-70B
  82. LLaMA2-7B-Chat
  83. LLaMA2-13B-Chat
  84. LLaMA2-70B-Chat
  85. LLaMA3-8B
  86. LLaMA3-70B
  87. LLaMA3-8B-Chat
  88. LLaMA3-70B-Chat
  89. LLaMA3-8B-Chinese-Chat
  90. LLaMA3-70B-Chinese-Chat
  91. LLaVA1.5-7B-Chat
  92. LLaVA1.5-13B-Chat
  93. Mistral-7B-v0.1
  94. Mistral-7B-v0.1-Chat
  95. Mistral-7B-v0.2
  96. Mistral-7B-v0.2-Chat
  97. Mixtral-8x7B-v0.1
  98. Mixtral-8x7B-v0.1-Chat
  99. Mixtral-8x22B-v0.1
  100. Mixtral-8x22B-v0.1-Chat
  101. OLMo-1B
  102. OLMo-7B
  103. OLMo-1.7-7B
  104. OpenChat3.5-7B-Chat
  105. Orion-14B-Base
  106. Orion-14B-Chat
  107. Orion-14B-Long-Chat
  108. Orion-14B-RAG-Chat
  109. Orion-14B-Plugin-Chat
  110. PaliGemma-3B-pt-224
  111. PaliGemma-3B-pt-448
  112. PaliGemma-3B-pt-896
  113. PaliGemma-3B-mix-224
  114. PaliGemma-3B-mix-448
  115. Phi-1.5-1.3B
  116. Phi-2-2.7B
  117. Phi3-3.8B-4k-Chat
  118. Phi3-3.8B-128k-Chat
  119. Qwen-1.8B
  120. Qwen-7B
  121. Qwen-14B
  122. Qwen-72B
  123. Qwen-1.8B-Chat
  124. Qwen-7B-Chat
  125. Qwen-14B-Chat
  126. Qwen-72B-Chat
  127. Qwen-1.8B-int8-Chat
  128. Qwen-1.8B-int4-Chat
  129. Qwen-7B-int8-Chat
  130. Qwen-7B-int4-Chat
  131. Qwen-14B-int8-Chat
  132. Qwen-14B-int4-Chat
  133. Qwen-72B-int8-Chat
  134. Qwen-72B-int4-Chat
  135. Qwen1.5-0.5B
  136. Qwen1.5-1.8B
  137. Qwen1.5-4B
  138. Qwen1.5-7B
  139. Qwen1.5-14B
  140. Qwen1.5-32B
  141. Qwen1.5-72B
  142. Qwen1.5-110B
  143. Qwen1.5-MoE-A2.7B
  144. Qwen1.5-Code-7B
  145. Qwen1.5-0.5B-Chat
  146. Qwen1.5-1.8B-Chat
  147. Qwen1.5-4B-Chat
  148. Qwen1.5-7B-Chat
  149. Qwen1.5-14B-Chat
  150. Qwen1.5-32B-Chat
  151. Qwen1.5-72B-Chat
  152. Qwen1.5-110B-Chat
  153. Qwen1.5-MoE-A2.7B-Chat
  154. Qwen1.5-Code-7B-Chat
  155. Qwen1.5-0.5B-int8-Chat
  156. Qwen1.5-0.5B-int4-Chat
  157. Qwen1.5-1.8B-int8-Chat
  158. Qwen1.5-1.8B-int4-Chat
  159. Qwen1.5-4B-int8-Chat
  160. Qwen1.5-4B-int4-Chat
  161. Qwen1.5-7B-int8-Chat
  162. Qwen1.5-7B-int4-Chat
  163. Qwen1.5-14B-int8-Chat
  164. Qwen1.5-14B-int4-Chat
  165. Qwen1.5-32B-int4-Chat
  166. Qwen1.5-72B-int8-Chat
  167. Qwen1.5-72B-int4-Chat
  168. Qwen1.5-110B-int4-Chat
  169. Qwen1.5-MoE-A2.7B-int4-Chat
  170. Qwen1.5-Code-7B-int4-Chat
  171. SOLAR-10.7B
  172. SOLAR-10.7B-Chat
  173. Skywork-13B-Base
  174. StarCoder2-3B
  175. StarCoder2-7B
  176. StarCoder2-15B
  177. Vicuna1.5-7B-Chat
  178. Vicuna1.5-13B-Chat
  179. XuanYuan-6B
  180. XuanYuan-70B
  181. XuanYuan-2-70B
  182. XuanYuan-6B-Chat
  183. XuanYuan-70B-Chat
  184. XuanYuan-2-70B-Chat
  185. XuanYuan-6B-int8-Chat
  186. XuanYuan-6B-int4-Chat
  187. XuanYuan-70B-int8-Chat
  188. XuanYuan-70B-int4-Chat
  189. XuanYuan-2-70B-int8-Chat
  190. XuanYuan-2-70B-int4-Chat
  191. XVERSE-7B
  192. XVERSE-13B
  193. XVERSE-65B
  194. XVERSE-65B-2
  195. XVERSE-7B-Chat
  196. XVERSE-13B-Chat
  197. XVERSE-65B-Chat
  198. XVERSE-MoE-A4.2B
  199. XVERSE-7B-int8-Chat
  200. XVERSE-7B-int4-Chat
  201. XVERSE-13B-int8-Chat
  202. XVERSE-13B-int4-Chat
  203. XVERSE-65B-int4-Chat
  204. Yayi-7B
  205. Yayi-13B
  206. Yi-6B
  207. Yi-9B
  208. Yi-34B
  209. Yi-6B-Chat
  210. Yi-34B-Chat
  211. Yi-6B-int8-Chat
  212. Yi-6B-int4-Chat
  213. Yi-34B-int8-Chat
  214. Yi-34B-int4-Chat
  215. Yi-1.5-6B
  216. Yi-1.5-9B
  217. Yi-1.5-34B
  218. Yi-1.5-6B-Chat
  219. Yi-1.5-9B-Chat
  220. Yi-1.5-34B-Chat
  221. YiVL-6B-Chat
  222. YiVL-34B-Chat
  223. Yuan2-2B-Chat
  224. Yuan2-51B-Chat
  225. Yuan2-102B-Chat
  226. Zephyr-7B-Alpha-Chat
  227. Zephyr-7B-Beta-Chat
  228. Zephyr-141B-ORPO-Chat

Large-model technology sharing

"Enterprise-Grade Generative AI and LLM Technologies, Algorithms, and Hands-On Case Studies" online advanced seminar

Module 1: Generative AI principles, technical internals, and the engineering practice lifecycle in detail
Module 2: Industrial-grade prompting techniques and an end-to-end LLM-based meeting assistant in practice
Module 3: The three Llama 2 models in detail and hands-on construction of a safe, reliable conversational system
Module 4: The five core problems of GenAI/LLMs in production and building robust applications in practice
Module 5: Large-model application development: agentic application techniques and case studies
Module 6: LLM fine-tuning and model quantization techniques with case studies
Module 7: Advanced parameter-efficient fine-tuning (PEFT) algorithms, techniques, workflows, and code practice
Module 8: LLM alignment techniques and workflows, with hands-on text toxicity analysis
Module 9: Core techniques for building secure GenAI/LLMs: red teaming demystified in practice
Module 10: Building a trustworthy, private, and secure enterprise large model: Responsible AI in practice

In-depth analysis of Llama 3's key technologies and hands-on construction of Responsible AI: algorithms and development practice

1. The Llama open-source model family: technologies, tools, and multimodality in detail. Participants gain a deep understanding of Meta Llama 3's innovations, such as its breakthroughs in language-model technology, and learn how to build trust-and-safety AI with Llama 3. They study Llama 3's five major technical branches and tools, and work through a hands-on case of Llama instruction fine-tuning on AWS.
2. Demystifying the Llama 3 foundation model's distinctive architecture and its code: a deep dive into techniques used in Llama 3, such as Tiktokenizer, the KV Cache, and Grouped Multi-Query Attention. Project 2 dissects the Llama 3 source code line by line to deepen understanding.
3. Demystifying the Llama 3 foundation model's core architecture and its code: the SwiGLU activation function, the FeedForward block, the Encoder block, and more. Project 3 studies Llama 3's inference code to strengthen practical understanding.
4. Hands-on Responsible AI with LangGraph on Llama 3: Project 4 builds a LangGraph-based Responsible AI project on Llama 3. Participants learn LangGraph's three core components, its execution mechanism, and its workflow steps, strengthening their practical Responsible AI skills.
5. Inside the Llama model family's techniques for building secure, trustworthy enterprise AI applications: key technologies such as Code Llama and Llama Guard. Project 5 builds an upgraded version of a safe, reliable conversational-intelligence project to reinforce security practice.
6. Fine-tuning techniques and algorithms for the Llama model family: Supervised Fine-Tuning (SFT), reward-model techniques, the PPO algorithm, the DPO algorithm, and more. Project 6 implements PPO and DPO by hand to deepen algorithmic understanding and application skills.
7. Demystifying reinforcement learning from AI feedback in the Llama model family: techniques such as RLAIF and RLHF. Project 7 builds Constitutional AI based on RLAIF.
8. DPO in Llama 3: principles, algorithm, components, concrete implementation, and advanced variants. Learn how PPO and DPO are combined with Llama 3, analyze DPO's principles and mechanics, and dissect its key algorithmic components. The capstone Project 8 implements and tests DPO from scratch, and the course also covers the advanced Iterative DPO and IPO algorithms.
9. Safety design and implementation in the Llama model family: safety in pretraining, safety fine-tuning, and more, applied to building safe and reliable GenAI/LLM projects.
10. Building a trustworthy, private, and secure enterprise Responsible AI system with Llama 3: master Llama 3's Constitutional AI and red teaming.

Decoding Sora: architecture, technology, and applications

I. Why is Sora a milestone on the road to AGI?
1. Explore the key shift from large language models (LLMs) to large vision models (LVMs) and its role in achieving artificial general intelligence (AGI).
2. Showcase successful cases of combining visual data with text data, and analyze the key role Sora plays in this process.
3. Detail how Sora generates video content with 3D consistency from text instructions.
4. Analyze the technical path by which Sora generates high-fidelity content from images or video.
5. Discuss Sora's practical value in different application scenarios, along with its challenges and limitations.

II. Decoding Sora's architecture
1. The DiT (Diffusion Transformer) architecture in detail.
2. How DiT helps Sora produce consistent, realistic, and imaginative video content.
3. Why a Transformer, rather than an architecture such as U-Net, was chosen as the core network for diffusion.
4. The principle and workflow of DiT's patchification, and its importance for processing video and image data.
5. The conditional diffusion process in detail and its role in content generation.

III. Decoding Sora's key technologies
1. How Sora uses Transformer and diffusion techniques to understand interactions between objects, and why this matters for simulating complex interactive scenes.
2. Why space-time patches are the core of Sora's technology and how they improve video-generation capability.
3. Spacetime latent patches in detail and their key role in video compression and generation.
4. How the Sora simulator uses space-time patches to build digital and physical worlds, and its ability to simulate real-world change.
5. How Sora generates content that faithfully follows the user's input text, and the technology and innovation behind it.
6. Why Sora generates content from abstract concepts rather than concrete pixels, and how this affects generation quality and diversity.
