Funded by the National Science Foundation | Award #2535149
Research Design

Research Methodology

SPARC employs a tiered evidence-gathering strategy combining national-level environmental scans, systematic literature reviews, institutional document analysis, semi-structured interviews, and participatory feedback cycles.

Research Methods

Phase I

National Policy Scan

Comparative analysis of research security policies across 46+ U.S. higher education institutions, examining Institutional Review Board (IRB), Responsible Conduct of Research (RCR), export control, and NSPM-33 compliance.

Phase I

Systematic Literature Review

Comprehensive review of scholarly and policy literature from 2010-2025 using Scopus, Web of Science, and Semantic Scholar databases.

Phase II

Stakeholder Interviews

IRB-approved semi-structured interviews with 30+ individuals, including research administrators, IT professionals, faculty, and compliance officers.

Phase I

NLP Discourse Analysis

Natural language processing (NLP) analysis of public digital discourse from 2024-2026, including topic modeling, sentiment analysis, and TF-IDF term weighting.

[Figure: SPARC research methodology illustration]

Phased Evidence Building

Phase I (Months 1-6)

National Landscape Scan

  • Environmental scan of 50+ institutional policies
  • Systematic literature review (2010-2025)
  • Social media discourse analysis via NLP
  • Field Scan Report & Literature Review Report

Phase II (Months 7-12)

Institutional Case Studies

  • Selection of 3-5 diverse case study institutions
  • IRB-approved semi-structured interviews
  • Thematic coding of qualitative data
  • Secondary Data Analysis Report

Phase III (Months 13-18)

TWG Convenings

  • Recruitment of 5 national experts
  • Three virtual TWG convenings
  • Feedback integration into RSPF draft
  • Preliminary RSPF Design Memo

Phase IV (Months 19-24)

Framework Finalization

  • Usability testing at case study institutions
  • Iterative refinement based on feedback
  • Toolkit Bundle with templates and guides
  • Issue Brief & NSF Proposal Strategy Memo

Data Sources

SPARC utilizes a range of high-value data sources selected for their relevance to the research security landscape, their accessibility, and their potential to inform institutional profiling, policy review, and empirical analysis.

Source | Purpose | Access | Used For
IPEDS | Institutional characteristics (size, type) | Open | Field Scan, Secondary Data
NSF HERD Survey | R&D expenditure trends | Open | Secondary Data
NSF Research.gov | NSF awards & security-related projects | Open | Field Scan, Issue Brief
NCARS Reports | Academic threat models & mitigation | Open | Field Scan, Issue Brief
NIST CSF & 800-171 | Cybersecurity standards in education | Open | Issue Brief, RSPF
Scopus / Web of Science | Literature on compliance & gaps | Subscription | Literature Review
Semantic Scholar | NLP-mined trends in research literature | Open | Literature Review
Data.gov | Cybersecurity & education datasets | Open | Field Scan, Secondary Data

Key Findings

What We Have Discovered

The Compliance Divide

A stark and widening gap exists between Tier 1 institutions with comprehensive security programs and Tier 2 institutions operating in a 'compliance desert' with minimal infrastructure.

Literature Gap

Peer-reviewed scholarly literature on research security at small-to-mid-sized institutions is remarkably thin. The most actionable information exists in gray literature and federal policy documents.

Universal Ethics, Uneven Security

While IRB and RCR policies are universally adopted, a significant compliance maturity gap exists in export controls and NSPM-33 responses, strongly correlated with R&D expenditure levels.

Practitioner Disconnect

NLP analysis of public discourse reveals a growing disconnect between federal policy intent and practitioner experience, dominated by themes of administrative overload and compliance fatigue.