{"id":4740,"date":"2024-09-09T12:30:29","date_gmt":"2024-09-09T04:30:29","guid":{"rendered":"https:\/\/www.aqwu.net\/wp\/?p=4740"},"modified":"2024-09-09T12:30:29","modified_gmt":"2024-09-09T04:30:29","slug":"%e5%a6%82%e4%bd%95%e7%9f%a5%e9%81%93%e4%bd%bf%e7%94%a8lora-%e6%98%af%e5%90%a6%e6%9c%89%e6%95%88%ef%bc%9f","status":"publish","type":"post","link":"https:\/\/www.aqwu.net\/wp\/?p=4740","title":{"rendered":"\u5982\u4f55\u77e5\u9053\u4f7f\u7528lora \u662f\u5426\u6709\u6548\uff1f"},"content":{"rendered":"\n<div class=\"wp-block-jetpack-markdown\"><p>\u4e3a\u4e86\u9a8c\u8bc1 LoRA \u662f\u5426\u6709\u6548\uff0c\u53ef\u4ee5\u901a\u8fc7\u4ee5\u4e0b\u51e0\u79cd\u65b9\u6cd5\u6765\u8bc4\u4f30\u5b83\u7684\u4f5c\u7528\u6548\u679c\uff1a<\/p>\n<h3>1. <strong>\u635f\u5931\u503c\u5bf9\u6bd4<\/strong><\/h3>\n<p>\u901a\u8fc7\u5bf9\u6bd4\u5fae\u8c03\u524d\u540e\u6a21\u578b\u7684 <strong>\u635f\u5931\u503c<\/strong>\uff08loss\uff09\uff0c\u53ef\u4ee5\u76f4\u63a5\u5224\u65ad LoRA \u7684\u5fae\u8c03\u662f\u5426\u5728\u6539\u8fdb\u6a21\u578b\u7684\u6027\u80fd\u3002\u5982\u679c\u4f7f\u7528 LoRA \u8fdb\u884c\u5fae\u8c03\u7684\u6a21\u578b\u5728\u8bad\u7ec3\u6570\u636e\u96c6\u4e0a\u7684\u635f\u5931\u503c\u660e\u663e\u4f4e\u4e8e\u539f\u59cb\u6a21\u578b\uff0c\u8bf4\u660e LoRA \u8d77\u5230\u4e86\u6709\u6548\u4f5c\u7528\u3002<\/p>\n<h4>\u5b9e\u73b0\u6b65\u9aa4\uff1a<\/h4>\n<ol>\n<li>\u4f7f\u7528\u539f\u59cb\u6a21\u578b\uff08\u672a\u4f7f\u7528 LoRA\uff09\u5728\u9a8c\u8bc1\u96c6\u6216\u8bad\u7ec3\u96c6\u4e0a\u8ba1\u7b97\u635f\u5931\u3002<\/li>\n<li>\u4f7f\u7528\u5fae\u8c03\u540e\u7684\u6a21\u578b\uff08\u5e94\u7528\u4e86 LoRA \u6743\u91cd\uff09\u5728\u76f8\u540c\u7684\u6570\u636e\u96c6\u4e0a\u91cd\u65b0\u8ba1\u7b97\u635f\u5931\u3002<\/li>\n<li>\u5bf9\u6bd4\u4e24\u8005\u7684\u635f\u5931\u503c\u3002\u5982\u679c\u4f7f\u7528 LoRA \u7684\u6a21\u578b\u635f\u5931\u66f4\u4f4e\uff0c\u8868\u660e LoRA \u5fae\u8c03\u8d77\u5230\u4e86\u4f5c\u7528\u3002<\/li>\n<\/ol>\n<pre><code class=\"language-python\">from 
transformers import AutoModelForCausalLM, AutoTokenizer\nimport torch\n\ndef calculate_loss(model, tokenizer, inputs, labels):\n    model.eval()\n    with torch.no_grad():\n        outputs = model(**inputs, labels=labels)\n        return outputs.loss.item()\n\n# \u52a0\u8f7d\u5206\u8bcd\u5668\ntokenizer = AutoTokenizer.from_pretrained(base_model_name)\n\n# \u793a\u4f8b\u8f93\u5165\u6587\u672c\ninput_text = &quot;Hello, how are you?&quot;\ninputs = tokenizer(input_text, return_tensors=&quot;pt&quot;).to(&quot;cuda&quot;)\nlabels = inputs[&quot;input_ids&quot;]\n\n# \u539f\u59cb\u6a21\u578b\u635f\u5931\nbase_model = AutoModelForCausalLM.from_pretrained(base_model_name, torch_dtype=torch.bfloat16).to(&quot;cuda&quot;)\nbase_loss = calculate_loss(base_model, tokenizer, inputs, labels)\nprint(f&quot;\u539f\u59cb\u6a21\u578b\u635f\u5931: {base_loss}&quot;)\n\n# \u4f7f\u7528 LoRA \u7684\u6a21\u578b\u635f\u5931\nlora_model = AutoModelForCausalLM.from_pretrained(&quot;base_model_with_lora&quot;, torch_dtype=torch.bfloat16).to(&quot;cuda&quot;)\nlora_loss = calculate_loss(lora_model, tokenizer, inputs, labels)\nprint(f&quot;LoRA \u6a21\u578b\u635f\u5931: {lora_loss}&quot;)\n<\/code><\/pre>\n<p><strong>\u671f\u671b\u7ed3\u679c<\/strong>\uff1a\u5982\u679c <code>LoRA<\/code> \u5fae\u8c03\u662f\u6709\u6548\u7684\uff0c\u901a\u5e38\u60c5\u51b5\u4e0b\uff0c\u5fae\u8c03\u540e\u7684\u6a21\u578b\uff08<code>LoRA \u6a21\u578b<\/code>\uff09\u7684\u635f\u5931\u503c\u4f1a\u4f4e\u4e8e\u539f\u59cb\u6a21\u578b\u3002<\/p>\n<h3>2. 
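Raw loss values can be hard to read on their own. Since the loss returned above is a mean cross-entropy in nats, it converts directly to perplexity, which is often easier to compare across runs (the two loss values below are made up for illustration):

```python
import math

def perplexity(loss: float) -> float:
    """Convert a mean cross-entropy loss (in nats) to perplexity."""
    return math.exp(loss)

# Hypothetical losses before and after LoRA fine-tuning
print(f"base ppl: {perplexity(3.2):.2f}")  # ≈ 24.53
print(f"lora ppl: {perplexity(2.5):.2f}")  # ≈ 12.18
```

A lower perplexity on the same held-out data is the same signal as a lower loss, just on a more interpretable scale.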
### 2. **Accuracy comparison**

For classification or generation tasks, **accuracy** is another key indicator of whether LoRA is effective. Compare the accuracy of the model before and after fine-tuning on a validation or test set.

#### Steps:

1. Using the same inputs, compare the predictions of the original model and the LoRA fine-tuned model on a specific task (such as text classification or question answering).
2. Count the correct predictions and compute the accuracy.
3. Compare the two accuracies; if the fine-tuned model scores higher, LoRA is effective.
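The counting in steps 2–3 is simple enough to factor into a helper. A minimal sketch, assuming both models' answers have already been collected as plain strings (the helper name and sample data below are hypothetical):

```python
def exact_match_accuracy(predictions, labels):
    """Fraction of predictions that exactly match their gold label (whitespace/case-insensitive)."""
    assert len(predictions) == len(labels)
    correct = sum(p.strip().lower() == g.strip().lower() for p, g in zip(predictions, labels))
    return correct / len(labels)

gold       = ["Paris", "George Orwell"]
base_preds = ["Paris", "Charles Dickens"]  # hypothetical base-model answers
lora_preds = ["Paris", "George Orwell"]    # hypothetical LoRA-model answers

print(exact_match_accuracy(base_preds, gold))  # 0.5
print(exact_match_accuracy(lora_preds, gold))  # 1.0
```

The same helper can be reused for both models so the two numbers are directly comparable.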
### 3. **Generation quality comparison**

For natural-language generation tasks, you can verify LoRA by comparing the quality of the text each model generates.

#### Steps:

1. Using the same inputs, have the original model and the fine-tuned model each generate text.
2. Compare the fluency, relevance, and accuracy of the generated text.
3. You can also combine this with quantitative metrics, such as **BLEU** or **ROUGE** scores, to rate the generations.

```python
# Load the tokenizer and both models (reusing base_model_name from section 1;
# "base_model_with_lora" is again a placeholder for the merged LoRA checkpoint)
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
base_model = AutoModelForCausalLM.from_pretrained(base_model_name, torch_dtype=torch.bfloat16).to("cuda")
lora_model = AutoModelForCausalLM.from_pretrained("base_model_with_lora", torch_dtype=torch.bfloat16).to("cuda")

# Sample input
input_text = "Once upon a time"
inputs = tokenizer(input_text, return_tensors="pt").to("cuda")

# Generation from the original model
outputs_base = base_model.generate(**inputs, max_length=50)
print(f"Base model output: {tokenizer.decode(outputs_base[0], skip_special_tokens=True)}")

# Generation from the LoRA fine-tuned model
outputs_lora = lora_model.generate(**inputs, max_length=50)
print(f"LoRA model output: {tokenizer.decode(outputs_lora[0], skip_special_tokens=True)}")
```

**Expected result**: if the LoRA fine-tune is effective, the text generated by the LoRA model should be more fluent, more relevant, and better aligned with the task requirements.

### 4. **Task-level performance comparison**

Apply the model to a concrete task, such as question answering, text classification, or summarization. Compare the original model with the LoRA fine-tuned model on these tasks and look for a performance gain.

#### Steps:

1. Prepare a dataset suited to the target task (for example, a QA or summarization dataset).
2. Run inference with the original model and with the fine-tuned model.
3. Compare the two using the task's standard metrics (accuracy, F1 score, BLEU score, etc.).

```python
# Example: comparing accuracy on a text task
def evaluate_accuracy(model, tokenizer, eval_data):
    correct = 0
    total = len(eval_data)

    model.eval()
    with torch.no_grad():
        for input_text, label in eval_data:
            inputs = tokenizer(input_text, return_tensors="pt").to("cuda")
            outputs = model.generate(**inputs, max_new_tokens=10)
            # Decode only the newly generated tokens, not the echoed prompt;
            # otherwise the prediction can never equal the bare label.
            generated = outputs[0][inputs["input_ids"].shape[1]:]
            predicted_label = tokenizer.decode(generated, skip_special_tokens=True).strip()
            if predicted_label == label:
                correct += 1

    accuracy = correct / total
    return accuracy

# eval_data is a list of (text, label) pairs
eval_data = [("What is the capital of France?", "Paris"), ("Who wrote 1984?", "George Orwell")]

# Evaluate the original model and the LoRA fine-tuned model
base_accuracy = evaluate_accuracy(base_model, tokenizer, eval_data)
lora_accuracy = evaluate_accuracy(lora_model, tokenizer, eval_data)

print(f"Base model accuracy: {base_accuracy}")
print(f"LoRA model accuracy: {lora_accuracy}")
```

### 5. **Quantitative evaluation metrics**

For natural-language generation tasks such as machine translation or text summarization, quantitative metrics like **BLEU**, **ROUGE**, and **METEOR** can judge the quality of the generated text.

#### BLEU/ROUGE evaluation:

```bash
pip install rouge_score
```

```python
from rouge_score import rouge_scorer

# Example strings; in practice these are the gold reference and the model's output
reference_text = "The cat sat on the mat."
generated_text = "A cat was sitting on the mat."

# Compute ROUGE scores for the generated text
scorer = rouge_scorer.RougeScorer(['rouge1', 'rougeL'], use_stemmer=True)
scores = scorer.score(reference_text, generated_text)
print(f"ROUGE scores: {scores}")
```
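BLEU is mentioned above but not demonstrated. To avoid another dependency, here is a deliberately simplified sentence-level BLEU (clipped n-gram precisions up to bigrams plus a brevity penalty); for real evaluations prefer an established implementation such as sacrebleu or NLTK. The example sentences are made up:

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def simple_bleu(reference, hypothesis, max_n=2):
    """Simplified BLEU: geometric mean of clipped n-gram precisions times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp = ngram_counts(hypothesis, n)
        ref = ngram_counts(reference, n)
        overlap = sum(min(count, ref[gram]) for gram, count in hyp.items())
        total = sum(hyp.values())
        if total == 0 or overlap == 0:
            return 0.0
        precisions.append(overlap / total)
    # Brevity penalty: punish hypotheses shorter than the reference
    bp = 1.0 if len(hypothesis) >= len(reference) else math.exp(1 - len(reference) / len(hypothesis))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

reference  = "the cat sat on the mat".split()
hypothesis = "the cat is on the mat".split()
print(f"BLEU: {simple_bleu(reference, hypothesis):.3f}")  # ≈ 0.707
```

As with ROUGE, score both models' outputs against the same references and compare the averages.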
### 6. **Inspecting weight differences**

In addition, you can visualize the weight changes between the model before and after LoRA fine-tuning to confirm that the LoRA A and B matrices have actually adjusted the model's weights.

### Conclusion:

- **Loss and accuracy comparisons** are the most direct way to tell whether the LoRA fine-tune is effective.
- **Generation quality comparison** helps assess the model on natural-language generation tasks.
- **Quantitative metrics** (such as BLEU and ROUGE) provide a reliable yardstick for generation tasks.

With these methods you can determine how effective the LoRA fine-tune is. If the LoRA fine-tuned model clearly outperforms the original model on the validation set, on the target task, or at inference time, that demonstrates LoRA is effective.
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[444,443,442],"tags":[395],"class_list":["post-4740","post","type-post","status-publish","format-standard","hentry","category-ai","category-llm","category-llms","tag-lora"],"views":2522,"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=\/wp\/v2\/posts\/4740","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=4740"}],"version-history":[{"count":1,"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=\/wp\/v2\/posts\/4740\/revisions"}],"predecessor-version":[{"id":4741,"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=\/wp\/v2\/posts\/4740\/revisions\/4741"}],"wp:attachment":[{"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=4740"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=4740"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aqwu.net\/wp\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=4740"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}