Tags: Translation · Transformers · PyTorch · TensorFlow · JAX · Safetensors · t5 · text2text-generation · summarization · text-generation-inference
Instructions for using google-t5/t5-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use google-t5/t5-large with Transformers:

```python
# Use a pipeline as a high-level helper.
# Warning: the "translation" pipeline type is no longer supported in transformers v5.
# Either load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

# The pipeline call below works on transformers v4.x:
pipe = pipeline("translation", model="google-t5/t5-large")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-large")
```

A short usage sketch follows the links below.

- Inference
- Notebooks
  - Google Colab
  - Kaggle