Instructions to use c-bone/CrystaLLM-pi_SLME with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use c-bone/CrystaLLM-pi_SLME with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="c-bone/CrystaLLM-pi_SLME")

# Load model directly
from transformers import AutoTokenizer, PKVGPT

tokenizer = AutoTokenizer.from_pretrained("c-bone/CrystaLLM-pi_SLME")
model = PKVGPT.from_pretrained("c-bone/CrystaLLM-pi_SLME")
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use c-bone/CrystaLLM-pi_SLME with vLLM:
Install from pip and serve model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "c-bone/CrystaLLM-pi_SLME"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "c-bone/CrystaLLM-pi_SLME",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker
```shell
docker model run hf.co/c-bone/CrystaLLM-pi_SLME
```
- SGLang
How to use c-bone/CrystaLLM-pi_SLME with SGLang:
Install from pip and serve model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "c-bone/CrystaLLM-pi_SLME" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "c-bone/CrystaLLM-pi_SLME",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
  --model-path "c-bone/CrystaLLM-pi_SLME" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "c-bone/CrystaLLM-pi_SLME",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
- Docker Model Runner
How to use c-bone/CrystaLLM-pi_SLME with Docker Model Runner:
```shell
docker model run hf.co/c-bone/CrystaLLM-pi_SLME
```
Model Card for CrystaLLM-pi_SLME
Model Details
Model Description
CrystaLLM-pi_SLME is a conditional generative model designed for the discovery of high-performance photovoltaic materials. It is a fine-tuned version of the CrystaLLM-pi framework, based on a GPT-2 decoder-only architecture. This variant employs the Property-Key-Value (PKV) attention mechanism to condition the generation of Crystallographic Information Files (CIFs) on the Spectroscopic Limited Maximum Efficiency (SLME) metric.
The model generates crystal structures based on a single target scalar property:
- SLME (%) - A theoretical maximum efficiency metric for photovoltaic absorbers.
- Developed by: Bone et al. (University College London)
- Model type: Autoregressive Transformer with Prefix Attention Conditioning
- Language(s): CIF (Crystallographic Information File) syntax
- License: MIT
- Finetuned from model: c-bone/CrystaLLM-pi_base
Model Sources
- Repository: GitHub: CrystaLLM-pi
- Paper: Discovery and recovery of crystalline materials with property-conditioned transformers (arXiv:2511.21299)
- Dataset: HuggingFace: c-bone/mpdb-slme-full
Uses
Direct Use
The model is intended for the exploration of chemical space for new photovoltaic candidates. Users can condition generation on high SLME values (e.g., >25%) to discover novel materials with optimal optical and electronic properties for solar energy conversion.
Out-of-Scope Use
- Large Unit Cells: Structures whose CIF representations exceed the ~1024-token context window cannot be generated reliably.
- Production Deployment: Generated structures are theoretical predictions. Verification via Hybrid-DFT calculations and experimental synthesis is required.
Bias, Risks, and Limitations
- Implicit Learning: The model was not explicitly trained on band gap data, but implicitly learned to target the optimal Shockley-Queisser range (1.2-1.4 eV) via the SLME metric. It may be less effective at targeting SLME values driven by mechanisms outside the primary training distribution.
- Data Scarcity: The model was fine-tuned on a relatively small dataset (~5.3K materials).
How to Get Started with the Model
For instructions on how to load and run generation with this model, please refer to the _load_and_generate.py script in the CrystaLLM-pi GitHub Repository.
Training Details
Training Data
The model was fine-tuned on the MP SLME dataset, containing inorganic structures labeled with their calculated Spectroscopic Limited Maximum Efficiency.
- Source: Materials Project / derived from Walker and Butler (via c-bone/mpdb-slme-full)
- Preprocessing: CIFs are augmented and tokenized, and SLME values are normalized.
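As a minimal sketch of the normalization step, min-max scaling of SLME targets to [0, 1] is shown below; the scaling bounds (0-35%) and the exact scheme are assumptions, not the repository's actual preprocessing:

```python
# Min-max normalization of SLME targets (bounds are illustrative assumptions).
def normalize_slme(values, lo=0.0, hi=35.0):
    """Scale raw SLME percentages into [0, 1] for conditioning."""
    return [(v - lo) / (hi - lo) for v in values]

print(normalize_slme([0.0, 17.5, 35.0]))  # [0.0, 0.5, 1.0]
```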
Training Procedure
- Architecture: GPT-2 with Property-Key-Value (PKV) encoder layers. (~38.7M parameters)
- Mechanism: Prefix Tuning (PKV) is used to inject the SLME target directly into the attention mechanism.
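To illustrate the prefix-conditioning idea, the toy NumPy sketch below projects a scalar property into key/value vectors that are prepended to the token keys/values before attention, so every query can attend to the property. All dimensions and projections here are illustrative assumptions, not the model's actual weights or API:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_prefix, seq_len = 8, 1, 4

# A scalar property (e.g. normalized SLME) is projected into prefix key/value vectors.
slme = np.array([[0.7]])                        # shape (1, 1)
W_k = rng.normal(size=(1, n_prefix * d_model))  # hypothetical learned projections
W_v = rng.normal(size=(1, n_prefix * d_model))
prefix_k = (slme @ W_k).reshape(n_prefix, d_model)
prefix_v = (slme @ W_v).reshape(n_prefix, d_model)

# Ordinary token queries/keys/values for a short sequence.
q = rng.normal(size=(seq_len, d_model))
tok_k = rng.normal(size=(seq_len, d_model))
tok_v = rng.normal(size=(seq_len, d_model))

# Prepend the property-derived prefix so attention sees it at every position.
K = np.concatenate([prefix_k, tok_k], axis=0)   # (n_prefix + seq_len, d_model)
V = np.concatenate([prefix_v, tok_v], axis=0)

scores = q @ K.T / np.sqrt(d_model)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)        # softmax over prefix + tokens
out = attn @ V                                  # (seq_len, d_model)
print(out.shape)
```

The key design point is that the conditioning signal enters through attention keys/values rather than through the token sequence itself, leaving the CIF vocabulary untouched.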
Evaluation
Metrics
The model is evaluated based on:
- Hit-Rate: Fraction of generated materials with predicted SLME values near the target.
- VSUN: Validity, Stability, Uniqueness, and Novelty of the generated candidates.
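The hit-rate metric above can be sketched as a simple tolerance check; the tolerance value is an assumed parameter, not the paper's exact threshold:

```python
def hit_rate(predicted, target, tol=2.0):
    """Fraction of candidates whose predicted SLME lies within `tol`
    percentage points of the target (tolerance is an assumption)."""
    hits = sum(1 for p in predicted if abs(p - target) <= tol)
    return hits / len(predicted)

print(hit_rate([24.1, 26.5, 30.0, 25.2], target=25.0))  # 0.75
```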
Results
The model successfully generated stable, novel candidates (e.g., $Rb_2(NbBr_3)_3$) with high predicted efficiencies, demonstrating the ability to map complex structure-property relationships from limited data.
Citation
```bibtex
@misc{bone2025discoveryrecoverycrystallinematerials,
  title={Discovery and recovery of crystalline materials with property-conditioned transformers},
  author={Cyprien Bone and Matthew Walker and Kuangdai Leng and Luis M. Antunes and Ricardo Grau-Crespo and Amil Aligayev and Javier Dominguez and Keith T. Butler},
  year={2025},
  eprint={2511.21299},
  archivePrefix={arXiv},
  primaryClass={cond-mat.mtrl-sci},
  url={https://arxiv.org/abs/2511.21299},
}
```