XLM-RoBERTa fine-tuned on the Swedish subset of the MultiCoNER2 dataset for fine-grained Named Entity Recognition.
This model is part of the AWED-FiNER project, presented in the paper: *AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers*.
The MultiCoNER2 tagset is fine-grained. The fine-to-coarse mapping of the tags is as follows:
- Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
- Creative Work (CW) : VisualWork, MusicalWork, WrittenWork, ArtWork, Software
- Group (GRP) : MusicalGRP, PublicCORP, PrivateCORP, AerospaceManufacturer, SportsGRP, CarManufacturer, ORG
- Person (PER) : Scientist, Artist, Athlete, Politician, Cleric, SportsManager, OtherPER
- Product (PROD) : Clothing, Vehicle, Food, Drink, OtherPROD
- Medical (MED) : Medication/Vaccine, MedicalProcedure, AnatomicalStructure, Symptom, Disease
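For post-processing model predictions back to the coarse classes, the mapping above can be encoded as a lookup table. The following is an illustrative sketch (the dictionary and helper names are not part of the released code), which also handles BIO-prefixed labels:

```python
# Fine-to-coarse tag mapping for the MultiCoNER2 tagset, as listed above.
FINE_TO_COARSE = {
    # Location (LOC)
    "Facility": "LOC", "OtherLOC": "LOC", "HumanSettlement": "LOC", "Station": "LOC",
    # Creative Work (CW)
    "VisualWork": "CW", "MusicalWork": "CW", "WrittenWork": "CW",
    "ArtWork": "CW", "Software": "CW",
    # Group (GRP)
    "MusicalGRP": "GRP", "PublicCORP": "GRP", "PrivateCORP": "GRP",
    "AerospaceManufacturer": "GRP", "SportsGRP": "GRP",
    "CarManufacturer": "GRP", "ORG": "GRP",
    # Person (PER)
    "Scientist": "PER", "Artist": "PER", "Athlete": "PER", "Politician": "PER",
    "Cleric": "PER", "SportsManager": "PER", "OtherPER": "PER",
    # Product (PROD)
    "Clothing": "PROD", "Vehicle": "PROD", "Food": "PROD",
    "Drink": "PROD", "OtherPROD": "PROD",
    # Medical (MED)
    "Medication/Vaccine": "MED", "MedicalProcedure": "MED",
    "AnatomicalStructure": "MED", "Symptom": "MED", "Disease": "MED",
}

def to_coarse(label: str) -> str:
    """Map a fine-grained label (optionally BIO-prefixed) to its coarse class."""
    if label == "O":
        return label
    if label[:2] in ("B-", "I-"):
        return label[:2] + FINE_TO_COARSE[label[2:]]
    return FINE_TO_COARSE[label]
```

For example, `to_coarse("B-Athlete")` yields `"B-PER"`.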
Model performance:
- Precision: 85.10
- Recall: 84.19
- F1: 84.64
Training Parameters:
- Epochs: 6
- Optimizer: AdamW
- Learning Rate: 5e-5
- Weight Decay: 0.01
- Batch Size: 64
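These hyperparameters map naturally onto a Hugging Face `TrainingArguments` setup. The sketch below is illustrative only (the exact training script is not shown on this card); the variable names are assumptions:

```python
# Hyperparameters reported on this card; the keys mirror the
# transformers.TrainingArguments fields they would correspond to.
HYPERPARAMS = {
    "num_train_epochs": 6,
    "learning_rate": 5e-5,
    "weight_decay": 0.01,
    "per_device_train_batch_size": 64,
}

# With the transformers library installed, these could be passed directly:
#
#   from transformers import TrainingArguments
#   args = TrainingArguments(output_dir="multiconer2-swedish-xlm", **HYPERPARAMS)
#
# AdamW is the default optimizer in the transformers Trainer, so it needs
# no extra configuration.

def steps_per_epoch(n_examples: int, batch_size: int = 64) -> int:
    """Optimizer updates per epoch at the given batch size (ceiling division)."""
    return (n_examples + batch_size - 1) // batch_size
```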
AWED-FiNER collection | Paper | GitHub Repository | Interactive Demo
Sample Usage of Agentic Tool
The AWED-FiNER agentic tool can be used to interact with expert models trained using this framework. Below is an example:
```bash
pip install smolagents gradio_client
```

```python
from tool import AWEDFiNERTool

# Point the tool at the AWED-FiNER Hugging Face Space.
tool = AWEDFiNERTool(
    space_id="prachuryyaIITG/AWED-FiNER"
)

# Run fine-grained NER on a sentence, specifying the input language.
result = tool.forward(
    text="Jude Bellingham joined Real Madrid in 2023.",
    language="English"
)
print(result)
```
Citation
If you use this model, please cite the following papers:
```bibtex
@inproceedings{fetahu2023multiconer,
  title     = {MultiCoNER v2: a Large Multilingual dataset for Fine-grained and Noisy Named Entity Recognition},
  author    = {Fetahu, Besnik and Chen, Zhiyu and Kar, Sudipta and Rokhlenko, Oleg and Malmasi, Shervin},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2023},
  pages     = {2027--2051},
  year      = {2023}
}

@misc{kaushik2026awedfineragentswebapplications,
  title         = {AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers},
  author        = {Prachuryya Kaushik and Ashish Anand},
  year          = {2026},
  eprint        = {2601.10161},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CL},
  url           = {https://arxiv.org/abs/2601.10161},
}

@inproceedings{kaushik2026sampurner,
  title     = {SampurNER: Fine-grained Named Entity Recognition Dataset for 22 Indian Languages},
  author    = {Kaushik, Prachuryya and Anand, Ashish},
  booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence},
  volume    = {40},
  year      = {2026}
}
```
Model: prachuryyaIITG/MultiCoNER2_Swedish_XLM
Base model: FacebookAI/xlm-roberta-large