How to use MattBou00/SequentialLR001_2000samples_R1-checkpoint-epoch-40 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("MattBou00/SequentialLR001_2000samples_R1-checkpoint-epoch-40")
model = AutoModelForCausalLM.from_pretrained("MattBou00/SequentialLR001_2000samples_R1-checkpoint-epoch-40")
```
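Once the tokenizer and model are loaded, text can be generated with the standard `generate` API. A minimal sketch (the prompt, `max_new_tokens`, and sampling settings are illustrative choices, not values prescribed by this checkpoint):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "MattBou00/SequentialLR001_2000samples_R1-checkpoint-epoch-40"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize an example prompt and generate a continuation.
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,   # cap the length of the generated continuation
    do_sample=True,      # sample rather than greedy-decode
    temperature=0.7,     # illustrative sampling temperature
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For GPU inference, move both the inputs and the model to the device (e.g. `model.to("cuda")` and `inputs.to("cuda")`) before calling `generate`.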