It is Angle 📐, not Angel 👼.
| HF Model | STS Avg. |
|---|---|
| SeanLee97/angle-llama-7b-nli-20231027 | 0.8590 |
💬 The model above was trained using BERT's hyperparameters. We are currently searching for even better hyperparameters for AnglE-LLaMA and plan to release more advanced pre-trained models that will further enhance performance. Stay tuned 😉
📝 Training Details:
1) SeanLee97/angle-llama-7b-nli-20231027
We fine-tuned AnglE-LLaMA on 4 × RTX 3090 Ti (24GB) GPUs. The training script is as follows:
```
CUDA_VISIBLE_DEVICES=0,1,2,3 torchrun --nproc_per_node=4 --master_port=1234 train_angle.py \
    --task NLI-STS --save_dir ckpts/NLI-STS-angle-llama-7b \
    --w2 35 --learning_rate 2e-4 --maxlen 45 \
    --lora_r 32 --lora_alpha 32 --lora_dropout 0.1 \
    --save_steps 200 --batch_size 160 --seed 42 --do_eval 0 \
    --load_kbit 4 --gradient_accumulation_steps 4 --epochs 1
```

The evaluation script is as follows:
```
CUDA_VISIBLE_DEVICES=0,1 python eval.py \
    --load_kbit 16 \
    --model_name_or_path NousResearch/Llama-2-7b-hf \
    --lora_weight SeanLee97/angle-llama-7b-nli-20231027
```

- using transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel, PeftConfig

peft_model_id = 'SeanLee97/angle-llama-7b-nli-20231027'
config = PeftConfig.from_pretrained(peft_model_id)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path).bfloat16().cuda()
model = PeftModel.from_pretrained(model, peft_model_id).cuda()

def decorate_text(text: str):
    # Prompt template used to elicit a sentence-level representation.
    return f'Summarize sentence "{text}" in one word:"'

inputs = 'hello world!'
tok = tokenizer([decorate_text(inputs)], return_tensors='pt')
for k, v in tok.items():
    tok[k] = v.cuda()

# Take the last hidden state of the final token as the sentence embedding.
vec = model(output_hidden_states=True, **tok).hidden_states[-1][:, -1].float().detach().cpu().numpy()
print(vec)
```
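To compare two sentences, embed each one and take the cosine similarity of the resulting vectors. Below is a minimal sketch building on the snippet above; the `encode` helper is our own hypothetical wrapper around the same last-token pooling, not part of the repository.

```python
import numpy as np

def encode(text: str) -> np.ndarray:
    # Hypothetical helper: reuses tokenizer/model/decorate_text from above.
    tok = tokenizer([decorate_text(text)], return_tensors='pt')
    tok = {k: v.cuda() for k, v in tok.items()}
    hidden = model(output_hidden_states=True, **tok).hidden_states[-1]
    return hidden[:, -1].float().detach().cpu().numpy()[0]

a, b = encode('hello world!'), encode('hi there, world!')
cos_sim = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(cos_sim)  # closer to 1.0 means more semantically similar
```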
- using AnglE

Coming soon!
The training interface is still messy; we are working on improving it. For now, you can modify `train_angle.py` to train your own models.
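For orientation while reading `train_angle.py`: objectives in this family typically combine an in-batch contrastive term with a cosine ranking term over labeled pairs (the `--w2 35` flag above presumably weights one such component, though we have not verified its exact semantics). The sketch below shows a CoSENT-style cosine ranking loss, a common formulation for STS fine-tuning; it is an illustration, not the repository's exact implementation.

```python
import torch

def cosent_style_loss(cos_sim: torch.Tensor, labels: torch.Tensor,
                      scale: float = 20.0) -> torch.Tensor:
    """CoSENT-style ranking loss over a batch of sentence pairs.

    cos_sim: (N,) predicted cosine similarity for each pair
    labels:  (N,) gold relatedness scores (higher = more similar)
    """
    # diff[i, j] = cos_sim[j] - cos_sim[i]; penalized when pair i
    # is labeled more similar than pair j but scores lower.
    diff = cos_sim[None, :] - cos_sim[:, None]
    mask = labels[:, None] > labels[None, :]
    terms = diff[mask] * scale
    # loss = log(1 + sum(exp(terms))), computed stably via logsumexp.
    zero = torch.zeros(1, device=cos_sim.device)
    return torch.logsumexp(torch.cat([zero, terms]), dim=0)
```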
Install the requirements:

```
python -m pip install -r requirements.txt
```

Download multi_nli + snli:
```
$ cd data
$ sh download_data.sh
```

Download STS datasets:
```
$ cd SentEval/data/downstream
$ bash download_dataset.sh
```
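The STS datasets above are consumed through the SentEval toolkit vendored under `SentEval/`. For reference, a minimal evaluation harness in the style of SentEval's official examples looks roughly like the sketch below; `embed` is a hypothetical stand-in for your own encoder (for example, the prompt-based last-token pooling shown earlier).

```python
import numpy as np
import senteval  # the SentEval toolkit bundled with this repository

def prepare(params, samples):
    # No corpus-level preprocessing needed for this sketch.
    pass

def batcher(params, batch):
    # SentEval passes each sentence as a list of tokens.
    sentences = [' '.join(tokens) for tokens in batch]
    # embed() is a hypothetical placeholder for your own encoder.
    return np.stack([embed(s) for s in sentences])

params = {'task_path': 'SentEval/data', 'usepytorch': True, 'kfold': 10}
se = senteval.engine.SE(params, batcher, prepare)
print(se.eval(['STSBenchmark', 'SICKRelatedness']))
```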