<h1 class="wp-block-heading">Understanding dbrx-instruct</h1>

<p><em>2024-04-03</em></p>

<h2 class="wp-block-heading"><strong>0. Introduction</strong></h2>

<p>DBRX is a transformer-based, decoder-only large language model (LLM) trained with next-token prediction. It uses a fine-grained mixture-of-experts (MoE) architecture with 132B total parameters, of which 36B are active on any given input. It was pretrained on 12T tokens of text and code data. Compared with other open MoE models such as Mixtral-8x7B and Grok-1, DBRX is fine-grained, meaning it uses a larger number of smaller experts. DBRX has 16 experts and selects 4, whereas Mixtral-8x7B and Grok-1 have 8 experts and select 2. This provides 65&#215; more possible expert combinations, which the authors found improves model quality. DBRX uses rotary position embeddings (RoPE), gated linear units (GLU), and grouped-query attention (GQA). It uses the GPT-4 tokenizer as provided in the tiktoken repository. These choices were made based on exhaustive evaluation and scaling experiments.</p>

<p>DBRX was pretrained on 12T tokens of carefully curated data, with a maximum context length of 32K tokens. The authors estimate this data is at least 2&#215; better, token for token, than the data used to pretrain the MPT family of models. The new dataset was developed using the full suite of Databricks tools, including Apache Spark&#8482; and Databricks notebooks for data processing, and Unity Catalog for data management and governance. Pretraining used curriculum learning, changing the data mix during training in ways found to substantially improve model quality.</p>

<p>Input: DBRX accepts only text-based input, with a context length of up to 32,768 tokens.</p>

<p>Output: DBRX generates only text-based output.</p>

<p>Model: <a href="https://huggingface.co/databricks/dbrx-instruct">https://huggingface.co/databricks/dbrx-instruct</a></p>
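<p>The "65&#215; more expert combinations" figure above follows directly from binomial coefficients, and is easy to check:</p>

```python
from math import comb

# Expert combinations available per token:
# DBRX routes each token to 4 of 16 experts;
# Mixtral-8x7B and Grok-1 route to 2 of 8.
dbrx = comb(16, 4)    # 1820 possible expert subsets
others = comb(8, 2)   # 28 possible expert subsets
print(dbrx, others, dbrx // others)  # 1820 28 65
```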
<p>If you have roughly 550 GB of RAM or swap, you can load the model on CPU and print its structure:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">from transformers import AutoTokenizer
# modeling_dbrx.py ships with the checkpoint; "databricks/dbrx" below is the
# local directory the checkpoint files were downloaded to
from modeling_dbrx import DbrxForCausalLM

tokenizer = AutoTokenizer.from_pretrained("databricks/dbrx", trust_remote_code=True)
model = DbrxForCausalLM.from_pretrained("databricks/dbrx", trust_remote_code=True)

print(model)

prompt = "Hey, are you conscious? Can you talk to me?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate
generate_ids = model.generate(inputs.input_ids, max_length=30)
print(tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0])
</pre></div>

<p>Output:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">DbrxForCausalLM(
  (transformer): DbrxModel(
    (wte): Embedding(100352, 6144)
    (blocks): ModuleList(
      (0-39): 40 x DbrxBlock(
        (norm_attn_norm): DbrxNormAttentionNorm(
          (norm_1): LayerNorm((6144,), eps=1e-05, elementwise_affine=True)
          (attn): DbrxAttention(
            (Wqkv): Linear(in_features=6144, out_features=8192, bias=False)
            (out_proj): Linear(in_features=6144, out_features=6144, bias=False)
            (rotary_emb): DbrxRotaryEmbedding()
          )
          (norm_2): LayerNorm((6144,), eps=1e-05, elementwise_affine=True)
        )
        (ffn): DbrxFFN(
          (router): DbrxRouter(
            (layer): Linear(in_features=6144, out_features=16, bias=False)
          )
          (experts): DbrxExperts(
            (mlp): DbrxExpertGLU()
          )
        )
      )
    )
    (norm_f): LayerNorm((6144,), eps=1e-05, elementwise_affine=True)
  )
  (lm_head): Linear(in_features=6144, out_features=100352, bias=False)
)</pre></div>
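<p>A detail worth noticing in the printed structure: Wqkv projects 6144 &#8594; 8192, not 6144 &#8594; 3&#215;6144, because of grouped-query attention. Assuming the head configuration from the published DBRX config (48 query heads, 8 KV head groups, head dimension 128 &#8212; an assumption here, not shown in the printout), the numbers line up:</p>

```python
d_model = 6144
n_heads = 48       # query heads (assumed from the DBRX config)
kv_n_heads = 8     # KV head groups under GQA (assumed)
head_dim = d_model // n_heads       # 128

q_out = n_heads * head_dim          # 6144 for Q
kv_out = 2 * kv_n_heads * head_dim  # 2048 for K and V together
print(q_out + kv_out)  # 8192, matching Wqkv's out_features
```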
<h2 class="wp-block-heading"><strong>1. Analyzing the model.safetensors.index.json file</strong></h2>

<p>The model is large: the total_size field in model.safetensors.index.json puts it at 263,193,047,040 bytes, roughly 245 GiB:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">{
  "metadata": {
    "total_size": 263193047040
  },
  "weight_map": {
    "lm_head.weight": "model-00061-of-00061.safetensors",
    "transformer.blocks.0.ffn.experts.mlp.v1": "model-00002-of-00061.safetensors",
    "transformer.blocks.0.ffn.experts.mlp.w1": "model-00001-of-00061.safetensors",
    "transformer.blocks.0.ffn.experts.mlp.w2": "model-00002-of-00061.safetensors",
    "transformer.blocks.0.ffn.router.layer.weight": "model-00001-of-00061.safetensors",
    "transformer.blocks.0.norm_attn_norm.attn.Wqkv.weight": "model-00001-of-00061.safetensors",
    "transformer.blocks.0.norm_attn_norm.attn.out_proj.weight": "model-00001-of-00061.safetensors",
    "transformer.blocks.0.norm_attn_norm.norm_1.weight": "model-00001-of-00061.safetensors",
    "transformer.blocks.0.norm_attn_norm.norm_2.weight": "model-00001-of-00061.safetensors",
    "transformer.blocks.1.ffn.experts.mlp.v1": "model-00003-of-00061.safetensors",
    "transformer.blocks.1.ffn.experts.mlp.w1": "model-00003-of-00061.safetensors",
    "transformer.blocks.1.ffn.experts.mlp.w2": "model-00004-of-00061.safetensors",
    "transformer.blocks.1.ffn.router.layer.weight": "model-00002-of-00061.safetensors",
    "transformer.blocks.1.norm_attn_norm.attn.Wqkv.weight": "model-00002-of-00061.safetensors",
    "transformer.blocks.1.norm_attn_norm.attn.out_proj.weight": "model-00002-of-00061.safetensors",
    "transformer.blocks.1.norm_attn_norm.norm_1.weight": "model-00002-of-00061.safetensors",
    "transformer.blocks.1.norm_attn_norm.norm_2.weight": "model-00002-of-00061.safetensors",
...
    "transformer.blocks.39.ffn.experts.mlp.v1": "model-00060-of-00061.safetensors",
    "transformer.blocks.39.ffn.experts.mlp.w1": "model-00060-of-00061.safetensors",
    "transformer.blocks.39.ffn.experts.mlp.w2": "model-00061-of-00061.safetensors",
    "transformer.blocks.39.ffn.router.layer.weight": "model-00059-of-00061.safetensors",
    "transformer.blocks.39.norm_attn_norm.attn.Wqkv.weight": "model-00059-of-00061.safetensors",
    "transformer.blocks.39.norm_attn_norm.attn.out_proj.weight": "model-00059-of-00061.safetensors",
    "transformer.blocks.39.norm_attn_norm.norm_1.weight": "model-00059-of-00061.safetensors",
    "transformer.blocks.39.norm_attn_norm.norm_2.weight": "model-00059-of-00061.safetensors",
    "transformer.norm_f.weight": "model-00061-of-00061.safetensors",
    "transformer.wte.weight": "model-00001-of-00061.safetensors"
  }
}
</pre></div>
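<p>The "about 245 GB" figure is just total_size converted to GiB; at bfloat16 (two bytes per parameter) it also implies the parameter count:</p>

```python
total_size = 263_193_047_040        # bytes, from model.safetensors.index.json
print(total_size / 2**30)           # ~245.1 GiB
print(total_size // 2)              # 131,596,523,520 parameters, i.e. the "132B"
```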
<p>The weights fall into four groups of parameters, much like those of most models; only the names differ slightly:</p>

<ol class="wp-block-list">
<li>lm_head.weight</li>

<li>transformer.blocks, 40 of them (0-39)</li>

<li>transformer.norm_f.weight</li>

<li>transformer.wte.weight</li>
</ol>

<h3 class="wp-block-heading"><strong>1.1 Weight sizes</strong></h3>

<p>To inspect the sizes of the four weight groups above, model.safetensors.index.json shows that at least three .safetensors files are needed. The code:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">import os
from safetensors import safe_open

safetensors_path = "databricks/dbrx/"
safetensors_files = [
    "model-00001-of-00061.safetensors",
    "model-00002-of-00061.safetensors",
    "model-00061-of-00061.safetensors",
]

for file in safetensors_files:
    file_path = os.path.join(safetensors_path, file)
    with safe_open(file_path, 'pt') as f:
        for k in f.keys():
            tensor = f.get_tensor(k)
            total_bytes = tensor.numel() * tensor.element_size()
            formatted_bytes_size = "{:,}".format(total_bytes)
            print(f"{k}, {tensor.size()}, {tensor.dtype}, {formatted_bytes_size} bytes")
</pre></div>

<p>Output:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">python test05.py
transformer.blocks.0.ffn.experts.mlp.w1, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes
transformer.blocks.0.ffn.router.layer.weight, torch.Size([16, 6144]), torch.bfloat16, 196,608 bytes
transformer.blocks.0.norm_attn_norm.attn.Wqkv.weight, torch.Size([8192, 6144]), torch.bfloat16, 100,663,296 bytes
transformer.blocks.0.norm_attn_norm.attn.out_proj.weight, torch.Size([6144, 6144]), torch.bfloat16, 75,497,472 bytes
transformer.blocks.0.norm_attn_norm.norm_1.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
transformer.blocks.0.norm_attn_norm.norm_2.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
transformer.wte.weight, torch.Size([100352, 6144]), torch.bfloat16, 1,233,125,376 bytes
transformer.blocks.0.ffn.experts.mlp.v1, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes
transformer.blocks.0.ffn.experts.mlp.w2, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes
transformer.blocks.1.ffn.router.layer.weight, torch.Size([16, 6144]), torch.bfloat16, 196,608 bytes
transformer.blocks.1.norm_attn_norm.attn.Wqkv.weight, torch.Size([8192, 6144]), torch.bfloat16, 100,663,296 bytes
transformer.blocks.1.norm_attn_norm.attn.out_proj.weight, torch.Size([6144, 6144]), torch.bfloat16, 75,497,472 bytes
transformer.blocks.1.norm_attn_norm.norm_1.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
transformer.blocks.1.norm_attn_norm.norm_2.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
lm_head.weight, torch.Size([100352, 6144]), torch.bfloat16, 1,233,125,376 bytes
transformer.blocks.39.ffn.experts.mlp.w2, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes
transformer.norm_f.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes</pre></div>

<p>As you can see, all tensors are bfloat16, which takes two bytes per element.</p>

<p>We can verify the actual size of lm_head.weight: 100352 &#215; 6144 &#215; 2 = 1,233,125,376 bytes.</p>
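<p>The expert tensors' leading dimension of 172,032 is 16 experts stacked together; dividing it out gives a per-expert FFN hidden size of 10,752 (an inference from the shapes, consistent with DBRX's published config). The byte counts above reduce to plain arithmetic:</p>

```python
experts = 16
ffn_hidden = 172032 // experts   # 10752 per expert (inferred from the shape)
print(ffn_hidden)

# lm_head.weight: vocab_size x d_model x 2 bytes (bfloat16)
print(100352 * 6144 * 2)         # 1,233,125,376 bytes
# each stacked expert matrix (w1 / v1 / w2): 172032 x 6144 x 2 bytes
print(172032 * 6144 * 2)         # 2,113,929,216 bytes
```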
<h3 class="wp-block-heading"><strong>1.2 Saving each weight as its own file</strong></h3>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">import os
from safetensors import safe_open
import torch

safetensors_path = "databricks/dbrx/"
safetensors_files = [
    "model-00001-of-00061.safetensors",
    "model-00002-of-00061.safetensors",
    "model-00061-of-00061.safetensors",
]

tensors = {}
for file in safetensors_files:
    file_path = os.path.join(safetensors_path, file)
    with safe_open(file_path, 'pt') as f:
        for k in f.keys():
            tensor = f.get_tensor(k)
            tensors[k] = tensor

directory = "pt"
if not os.path.exists(directory):
    os.mkdir(directory)

for k, tensor in tensors.items():
    if any(x in k for x in ['wte', 'lm_head', 'norm_f', 'blocks.0']):
        total_bytes = tensor.numel() * tensor.element_size()
        formatted_bytes_size = "{:,}".format(total_bytes)
        file_path = os.path.join("pt", k)
        print(f"save {file_path}.pt, {formatted_bytes_size} bytes")
        torch.save(tensor, f"{file_path}.pt")
</pre></div>

<p>Output:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">python test06.py
save pt/transformer.blocks.0.ffn.experts.mlp.w1.pt, 2,113,929,216 bytes
save pt/transformer.blocks.0.ffn.router.layer.weight.pt, 196,608 bytes
save pt/transformer.blocks.0.norm_attn_norm.attn.Wqkv.weight.pt, 100,663,296 bytes
save pt/transformer.blocks.0.norm_attn_norm.attn.out_proj.weight.pt, 75,497,472 bytes
save pt/transformer.blocks.0.norm_attn_norm.norm_1.weight.pt, 12,288 bytes
save pt/transformer.blocks.0.norm_attn_norm.norm_2.weight.pt, 12,288 bytes
save pt/transformer.wte.weight.pt, 1,233,125,376 bytes
save pt/transformer.blocks.0.ffn.experts.mlp.v1.pt, 2,113,929,216 bytes
save pt/transformer.blocks.0.ffn.experts.mlp.w2.pt, 2,113,929,216 bytes
save pt/lm_head.weight.pt, 1,233,125,376 bytes
save pt/transformer.norm_f.weight.pt, 12,288 bytes

ls -l pt
total 8773904
-rwxrwxrwx 1 tony tony 1233126591 Apr  3 20:29 lm_head.weight.pt
-rwxrwxrwx 1 tony tony 2113930684 Apr  3 20:27 transformer.blocks.0.ffn.experts.mlp.v1.pt
-rwxrwxrwx 1 tony tony 2113930684 Apr  3 20:25 transformer.blocks.0.ffn.experts.mlp.w1.pt
-rwxrwxrwx 1 tony tony 2113930684 Apr  3 20:28 transformer.blocks.0.ffn.experts.mlp.w2.pt
-rwxrwxrwx 1 tony tony     198101 Apr  3 20:25 transformer.blocks.0.ffn.router.layer.weight.pt
-rwxrwxrwx 1 tony tony  100664829 Apr  3 20:26 transformer.blocks.0.norm_attn_norm.attn.Wqkv.weight.pt
-rwxrwxrwx 1 tony tony   75499089 Apr  3 20:26 transformer.blocks.0.norm_attn_norm.attn.out_proj.weight.pt
-rwxrwxrwx 1 tony tony      13806 Apr  3 20:26 transformer.blocks.0.norm_attn_norm.norm_1.weight.pt
-rwxrwxrwx 1 tony tony      13806 Apr  3 20:26 transformer.blocks.0.norm_attn_norm.norm_2.weight.pt
-rwxrwxrwx 1 tony tony      13622 Apr  3 20:29 transformer.norm_f.weight.pt
-rwxrwxrwx 1 tony tony 1233126695 Apr  3 20:26 transformer.wte.weight.pt</pre></div>
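<p>Note that each .pt file in the listing is slightly larger than the raw tensor it holds; the difference, a guess here, is torch.save's pickle and metadata overhead, roughly 1.2&#8211;1.5 KB per file regardless of tensor size:</p>

```python
# file size from `ls -l pt` vs. raw tensor bytes printed earlier
pairs = {
    "lm_head.weight": (1_233_126_591, 1_233_125_376),
    "transformer.blocks.0.ffn.experts.mlp.w1": (2_113_930_684, 2_113_929_216),
    "transformer.blocks.0.norm_attn_norm.norm_1.weight": (13_806, 12_288),
}
for name, (file_bytes, tensor_bytes) in pairs.items():
    print(name, file_bytes - tensor_bytes)  # overhead in bytes
```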
<h3 class="wp-block-heading"><strong>1.3 Storing tensors by category</strong></h3>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">import os
import torch
from safetensors import safe_open

# paths and the safetensors files to read
safetensors_path = "databricks/dbrx/"
safetensors_files = [
    "model-00001-of-00061.safetensors",
    "model-00002-of-00061.safetensors",
    "model-00061-of-00061.safetensors",
]

# tensors grouped by category
group1_tensors = {}  # 'wte', 'lm_head', 'norm_f'
group2_tensors = {}  # 'blocks.0'

# load the tensors
for file in safetensors_files:
    file_path = os.path.join(safetensors_path, file)
    with safe_open(file_path, 'pt') as f:
        for k in f.keys():
            tensor = f.get_tensor(k)
            if any(x in k for x in ['wte', 'lm_head', 'norm_f']):
                group1_tensors[k] = tensor
            elif 'blocks.0' in k:
                group2_tensors[k] = tensor

# make sure the output directory exists
directory = "pt"
os.makedirs(directory, exist_ok=True)

# save the two groups to separate files
torch.save(group1_tensors, os.path.join(directory, "group1_tensors.pt"))
torch.save(group2_tensors, os.path.join(directory, "group2_tensors.pt"))

print("saved: group1_tensors.pt contains 'wte', 'lm_head', 'norm_f'")
print("saved: group2_tensors.pt contains 'blocks.0'")
</pre></div>

<p>Output:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">python test07.py
saved: group1_tensors.pt contains 'wte', 'lm_head', 'norm_f'
saved: group2_tensors.pt contains 'blocks.0'

ls -l pt
total 17547772
-rwxrwxrwx 1 tony tony 2466264837 Apr  3 20:42 group1_tensors.pt
-rwxrwxrwx 1 tony tony 6518173056 Apr  3 20:45 group2_tensors.pt</pre></div>

<h3 class="wp-block-heading"><strong>1.4 Loading the tensors</strong></h3>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">import os
import torch

# paths and the .pt files to load
pt_path = "pt/"
pt_files = [
    "group1_tensors.pt",
    "group2_tensors.pt"
]

# load the tensors; on recent PyTorch, torch.load(file_path, weights_only=True)
# also avoids the pickle security warning
for file in pt_files:
    file_path = os.path.join(pt_path, file)
    tensors = torch.load(file_path)
    for k, tensor in tensors.items():
        total_bytes = tensor.numel() * tensor.element_size()
        formatted_bytes_size = "{:,}".format(total_bytes)
        print(f"{file_path}, {k}, {tensor.size()}, {tensor.dtype}, {formatted_bytes_size} bytes")
</pre></div>

<p>Output:</p>

<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="lang:python decode:true ">python test08.py
pt/group1_tensors.pt, transformer.wte.weight, torch.Size([100352, 6144]), torch.bfloat16, 1,233,125,376 bytes
pt/group1_tensors.pt, lm_head.weight, torch.Size([100352, 6144]), torch.bfloat16, 1,233,125,376 bytes
pt/group1_tensors.pt, transformer.norm_f.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
pt/group2_tensors.pt, transformer.blocks.0.ffn.experts.mlp.w1, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes
pt/group2_tensors.pt, transformer.blocks.0.ffn.router.layer.weight, torch.Size([16, 6144]), torch.bfloat16, 196,608 bytes
pt/group2_tensors.pt, transformer.blocks.0.norm_attn_norm.attn.Wqkv.weight, torch.Size([8192, 6144]), torch.bfloat16, 100,663,296 bytes
pt/group2_tensors.pt, transformer.blocks.0.norm_attn_norm.attn.out_proj.weight, torch.Size([6144, 6144]), torch.bfloat16, 75,497,472 bytes
pt/group2_tensors.pt, transformer.blocks.0.norm_attn_norm.norm_1.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
pt/group2_tensors.pt, transformer.blocks.0.norm_attn_norm.norm_2.weight, torch.Size([6144]), torch.bfloat16, 12,288 bytes
pt/group2_tensors.pt, transformer.blocks.0.ffn.experts.mlp.v1, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes
pt/group2_tensors.pt, transformer.blocks.0.ffn.experts.mlp.w2, torch.Size([172032, 6144]), torch.bfloat16, 2,113,929,216 bytes</pre></div>

<h2 class="wp-block-heading"><strong>2. Understanding transformer.blocks.0</strong></h2>

<p>From the analysis above, each block contains an ffn and a norm_attn_norm.</p>
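<p>As a closing sanity check, the per-tensor shapes collected above are enough to rebuild the checkpoint's total size: summing 40 identical blocks plus the embedding, LM head, and final norm reproduces total_size exactly. A sketch, using only the shapes printed earlier:</p>

```python
d = 6144
block = (
    3 * 172032 * d      # w1, v1, w2 stacked expert matrices
    + 16 * d            # router
    + 8192 * d          # Wqkv
    + d * d             # out_proj
    + 2 * d             # norm_1, norm_2
)
total_params = 40 * block + 2 * 100352 * d + d  # 40 blocks + wte + lm_head + norm_f
print(total_params)        # 131,596,523,520 parameters (the "132B")
print(total_params * 2)    # 263,193,047,040 bytes, matching total_size exactly
```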