HintsOfTruth: A Multimodal Checkworthiness Detection Dataset with Real and Synthetic Claims
Type of publication: Conference paper
Citation: vanderMeer_ACL2025_2025
Booktitle: The 63rd Annual Meeting of the Association for Computational Linguistics
Year: 2025
Month: July
Abstract: Misinformation can be countered with fact-checking, but the process is costly and slow. Identifying checkworthy claims is the first step, where automation can help scale fact-checkers’ efforts. However, detection methods struggle with content that is (1) multimodal, (2) from diverse domains, and (3) synthetic. We introduce HintsOfTruth, a public dataset for multimodal checkworthiness detection with 27K real-world and synthetic image/claim pairs. The mix of real and synthetic data makes this dataset unique and ideal for benchmarking detection methods. We compare fine-tuned and prompted Large Language Models (LLMs). We find that well-configured lightweight text-based encoders perform comparably to multimodal models, but the former focus only on identifying non-claim-like content. Multimodal LLMs can be more accurate but come at a significant computational cost, making them impractical for large-scale applications. When faced with synthetic data, multimodal models perform more robustly.
Main Research Program: Sustainable & Resilient Societies
Keywords:
Projects: Idiap FACTCHECK
Authors:
Added by: [UNK]
Total mark: 0
Notes
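The abstract above compares fine-tuned and prompted (multimodal) LLMs for checkworthiness detection. As a minimal, hypothetical sketch of the prompted zero-shot setup (not the paper's models, prompts, or evaluation code), the snippet below asks an off-the-shelf multimodal chat model to label a single image/claim pair. The model name, prompt wording, label vocabulary, and file path are illustrative assumptions.

```python
"""Illustrative sketch only: zero-shot prompting of a multimodal LLM to label a
single image/claim pair as checkworthy or not. This is NOT code from the paper;
the model name, prompt, and label scheme are assumptions made for illustration."""

import base64

from openai import OpenAI  # assumes the `openai` client library is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "You are assisting fact-checkers. Given the image and the claim below, answer "
    "with a single word: 'checkworthy' if the claim makes a factual, verifiable "
    "assertion worth fact-checking, or 'not-checkworthy' otherwise.\n\nClaim: {claim}"
)


def encode_image(path: str) -> str:
    """Read a local image file and return it as a base64 data URL."""
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    return f"data:image/jpeg;base64,{b64}"


def classify_checkworthiness(image_path: str, claim: str) -> str:
    """Return the model's one-word checkworthiness label for an image/claim pair."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; the paper benchmarks other models
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": PROMPT.format(claim=claim)},
                {"type": "image_url", "image_url": {"url": encode_image(image_path)}},
            ],
        }],
        temperature=0.0,  # near-deterministic output for classification
    )
    return response.choices[0].message.content.strip().lower()


if __name__ == "__main__":
    # Hypothetical example pair; HintsOfTruth itself provides 27K such pairs.
    print(classify_checkworthiness("example.jpg",
                                   "The chart shows unemployment fell by 40% in 2024."))
```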