update main readme

Awni Hannun
2024-01-03 13:45:46 -08:00
parent e81cab43e4
commit 139fbf39bc

@@ -14,7 +14,7 @@ Some more useful examples are listed below.
 [Mistral](llms/mistral), [Phi-2](llms/phi2), and more in the [LLMs](llms)
 directory.
 - A mixture-of-experts (MoE) language model with [Mixtral 8x7B](llms/mixtral).
-- Parameter efficient fine-tuning with [LoRA](lora).
+- Parameter efficient fine-tuning with [LoRA or QLoRA](lora).
 - Text-to-text multi-task Transformers with [T5](t5).
 - Bidirectional language understanding with [BERT](bert).
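
The LoRA entry above refers to low-rank adaptation: a frozen base weight is augmented with a small trainable update `B @ A` of rank `r`. A minimal sketch of the idea follows (the class name `LoRALinear` is hypothetical, and NumPy is used instead of MLX to keep the example self-contained):

```python
import numpy as np

class LoRALinear:
    """Frozen weight W plus a trainable low-rank update B @ A, scaled by alpha/r."""

    def __init__(self, in_dim, out_dim, r=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen pretrained weight (stands in for a loaded checkpoint here).
        self.W = rng.normal(size=(out_dim, in_dim)) / np.sqrt(in_dim)
        # Trainable low-rank factors: A gets a small random init, B starts at
        # zero so the layer initially matches the frozen base layer exactly.
        self.A = rng.normal(size=(r, in_dim)) * 0.01
        self.B = np.zeros((out_dim, r))
        self.scale = alpha / r

    def __call__(self, x):
        base = x @ self.W.T
        update = (x @ self.A.T) @ self.B.T * self.scale
        return base + update

layer = LoRALinear(in_dim=4, out_dim=3)
x = np.ones((2, 4))
y = layer(x)
```

Because `B` is zero-initialized, the adapted layer reproduces the base layer's output at the start of fine-tuning; only `A` and `B` are updated during training, which is what makes the method parameter-efficient. QLoRA applies the same idea on top of a quantized base weight.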