Scoring research methods, not conclusions
The Citation Integrity Dashboard applies a uniform methodological rubric to published research reports. We score the process used to generate claims, not the conclusions themselves. A report can score poorly on rigor while being factually correct, and a widely cited report can have poor methodology.
Every dimension score is supported by specific evidence. Every flag links to primary documentation. Our rubric, weights, and all scoring sheets are published openly before any report is evaluated.
- 27 reports evaluated
- 12 organizations
- 1999–2026 years covered
- 4.5 avg. score / 10
Organizations
Each organization has been evaluated on one or more published reports. Scores reflect the specific reports scored, not the organization as a whole.
Single-report organizations
Evaluated Reports
Document Types
Each report is classified by document type. The type determines which rubric dimensions apply and how weights are redistributed when dimensions are N/A.
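The redistribution described above can be sketched in code. This is a hypothetical illustration, not the dashboard's actual implementation: the dimension names and weights below are assumptions, and the sketch simply rescales the remaining weights proportionally so they again sum to 1 when some dimensions are N/A.

```python
# Hypothetical sketch of proportional weight redistribution when
# rubric dimensions are N/A for a given document type. Dimension
# names and weights are illustrative, not the published rubric.

def redistribute_weights(weights, applicable):
    """Drop N/A dimensions and rescale the rest to sum to 1."""
    kept = {dim: w for dim, w in weights.items() if dim in applicable}
    total = sum(kept.values())
    return {dim: w / total for dim, w in kept.items()}

# Assumed rubric weights (for illustration only).
weights = {
    "data_transparency": 0.30,
    "sampling": 0.25,
    "citation_accuracy": 0.25,
    "reproducibility": 0.20,
}

# A document type for which "sampling" does not apply.
applicable = {"data_transparency", "citation_accuracy", "reproducibility"}

rescaled = redistribute_weights(weights, applicable)
# The remaining three weights keep their relative proportions
# (0.30 : 0.25 : 0.20) and now sum to 1.
```

Other schemes are possible (e.g. redistributing an N/A dimension's weight only to related dimensions); proportional rescaling is shown here because it preserves the relative importance of the dimensions that remain.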