1. Added two papers to the Transformer (9) chapter: LIMA and Llama 2
2. Added a new PEFT chapter covering 10 popular papers on LoRA and Adapter methods (a minimal LoRA sketch follows the list):
* "Parameter-Efficient Transfer Learning for NLP"
* "BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models"
* "LoRA: Low-Rank Adaptation of Large Language Models"
* "Towards a Unified View of Parameter-Efficient Transfer Learning"
* "AdapterDrop: On the Efficiency of Adapters in Transformers"
* "AdapterFusion: Non-Destructive Task Composition for Transfer Learning"
* "QLoRA: Efficient Finetuning of Quantized LLMs"
* "AdapterHub: A Framework for Adapting Transformers"
* "Compacter: Efficient Low-Rank Hypercomplex Adapter Layers"
* "MAD-X: An Adapter-based Framework for Multi-task Cross-lingual Transfer"
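For readers unfamiliar with LoRA, the sketch below illustrates the core idea of the LoRA paper listed above: freeze the pretrained weight W and train only a low-rank update B·A scaled by alpha/r. This is a minimal illustration, not the paper's reference implementation; the class name `LoRALinear` and the hyperparameter values are my own choices.

```python
# Minimal LoRA sketch (illustrative, not the paper's reference code).
# The frozen weight W gets a trainable low-rank update B @ A scaled by
# alpha / r, so only r * (d_in + d_out) parameters are trained.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, d_in: int, d_out: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        # Pretrained weight: frozen during fine-tuning.
        self.weight = nn.Parameter(torch.randn(d_out, d_in), requires_grad=False)
        # Low-rank factors: A initialized with small Gaussian noise, B with
        # zeros, so the update starts as a no-op (as in the paper).
        self.lora_A = nn.Parameter(torch.randn(r, d_in) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(d_out, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W^T + (alpha / r) * x A^T B^T
        return x @ self.weight.T + self.scaling * (x @ self.lora_A.T) @ self.lora_B.T


if __name__ == "__main__":
    layer = LoRALinear(d_in=512, d_out=512, r=8)
    x = torch.randn(4, 512)
    print(layer(x).shape)  # torch.Size([4, 512])
```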
Full content is available at:
https://www.huaxiaozhuan.com/--