From 0c1c500714aef1e05d3b1e032dda48667216fdd3 Mon Sep 17 00:00:00 2001
From: Awni Hannun
Date: Thu, 14 Dec 2023 08:37:34 -0800
Subject: [PATCH] update readme

---
 phi2/README.md | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/phi2/README.md b/phi2/README.md
index aef47cd1..198ac30c 100644
--- a/phi2/README.md
+++ b/phi2/README.md
@@ -1,9 +1,11 @@
 # Phi-2
 
-Phi-2 is a 2.7B parameter model released by Microsoft[^1] and trained on a mixture
-of GPT-4 outputs and clean web-text. Its performance rivals much larger models.
+Phi-2 is a 2.7B parameter language model released by Microsoft[^1] with
+performance that rivals much larger models. It was trained on a mixture of
+GPT-4 outputs and clean web text.
 
-Phi-2 efficiently runs on an Apple silicon device with 8 GB memory in 16-bit precision.
+Phi-2 efficiently runs on Apple silicon devices with 8GB of memory in 16-bit
+precision.
 
 ## Setup