How to use AleBurzio/bart-large-fleece2instructions-r1 with Transformers:

```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("AleBurzio/bart-large-fleece2instructions-r1")
model = AutoModelForSeq2SeqLM.from_pretrained("AleBurzio/bart-large-fleece2instructions-r1")
```

This model is a fine-tuned version of facebook/bart-large on the pszemraj/fleece2instructions dataset. It achieves the following results on the evaluation set:
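Once loaded, the model can generate an instruction from a passage of text. A minimal inference sketch; the input text and generation settings below are illustrative assumptions, not values from the model card:

```python
# Minimal inference sketch (input text and generation settings are
# illustrative assumptions, not documented defaults for this checkpoint)
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "AleBurzio/bart-large-fleece2instructions-r1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = (
    "Preheat the oven to 350F. Mix the flour, sugar, and eggs in a bowl, "
    "pour the batter into a greased pan, and bake for 30 minutes."
)
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Beam search with a short max length, in line with the ~14-token
# generation lengths reported in the results table
output_ids = model.generate(**inputs, num_beams=4, max_length=48)
instruction = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(instruction)
```

Since the checkpoint is a seq2seq (encoder-decoder) BART model, `generate` produces the instruction text directly rather than continuing the prompt.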
More information needed
The following results were logged during training:
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 1.0379 | 1.0 | 362 | 1.0932 | 62.4953 | 46.0277 | 60.6748 | 60.7667 | 14.5614 |
| 0.925 | 2.0 | 724 | 1.0436 | 64.4538 | 47.8829 | 62.5085 | 62.6165 | 14.4780 |