AI Strategy, Policy, and Workshops
For universities and research units (and other high-scrutiny non-profits).
Tailored workshops and strategy support to develop practical AI policies, governance frameworks, and pilot implementations that safeguard assessment integrity and research ethics.
Workshops are a standalone offer and begin with a short scoping call (30-45 minutes).
The Problem
You are under pressure to respond to AI in teaching, assessment, and research, but:
- Policies are fragmented, copied from generic templates, or do not reflect discipline-specific contexts
- Staff and students use AI anyway, often without guidance or disclosure
- There is anxiety around assessment integrity, academic misconduct, and IP ownership
- Leadership asks for "AI strategy" but there is no capacity to design or pilot anything practical
You need practical guidance that moves beyond policy anxiety to concrete pilots and workflows, not another generic presentation on "AI transformation."
What I Offer
Tailored Workshops
Half-day or full-day sessions on AI in teaching, assessment, research workflows, or creative practice. Designed for your discipline and institutional context, not generic corporate training.
Policy Development Support
Co-design discipline-specific AI disclosure frameworks, student-facing guidance with concrete examples, and staff guidelines that are practical rather than punitive.
Short Diagnostic Projects
Map current AI use (official and unofficial), assess risks and readiness, and propose priorities for pilots rather than abstract debates.
Pilot Implementation
Where appropriate, I can supervise small-scale pilots (e.g., AI-assisted feedback drafting, grant-writing helpers, assessment design tools) with clear governance and evaluation.
Typical Workshop Topics
AI in Assessment: Beyond Detection
Moving from reactive policing to proactive assessment design. What constitutes appropriate vs. inappropriate AI use in your discipline? How do you design assessments that remain robust when AI is widely available?
Practical AI Governance for Research Teams
Data classification, IP protection, compliance with funder requirements, and responsible AI use in grant applications and research workflows.
AI-Assisted Feedback: Risks and Guardrails
How to use AI to draft routine feedback while maintaining quality and fairness. What works, what fails, and how to pilot responsibly.
Student-Facing AI Guidance: Disclosure Frameworks
Co-designing clear, discipline-specific guidance with examples of permitted and prohibited AI use. Moving from vague warnings to concrete boundaries.
AI in Creative Practice and Performance
For creative arts and performance disciplines: ethical use of AI in composition, design, and collaborative work. IP, attribution, and pedagogical implications.
How It Works
Pre-Workshop Scoping Call
30-45 minute conversation to understand your institutional context, current policies (if any), key concerns, and desired outcomes. I tailor the session to your needs, not a generic template.
Workshop Delivery
Half-day or full-day session (on-site or online) with:
- Clear framework for thinking about AI in your context
- Interactive exercises grounded in your institutional context
- Small-group discussions to map risks and opportunities
- Concrete next steps (not vague aspirations)
Optional Follow-Up
Written report with policy recommendations, suggested pilot projects, or ongoing advisory support for implementation.
Typical Outcomes
- Clearer shared understanding of AI risks and opportunities in your discipline
- Policy language that staff can work with (not generic university-wide boilerplate)
- Concrete next steps for pilots rather than abstract debate or anxiety
- Reduced staff anxiety about AI use in teaching and assessment
- Better student engagement: clear guidance increases disclosure and reduces misconduct
- An evidence base for leadership: you can report on AI strategy with concrete actions taken
Format and Audience
Workshop Format
Half-day (3 hours) or full-day (6 hours) sessions, delivered on-site or online. Interactive, not lecture-based. Includes small-group work and structured discussion.
Typical Audience
- Faculty and academic staff
- Research leaders and Principal Investigators
- Programme directors and teaching leads
- Professional services staff (academic administration, research support)
- Senior leadership (Pro-Vice-Chancellors, Deans, Heads of School)
More detail on this approach
Read AI in higher education: from policy anxiety to practical pilots for the full framework.
Interested in a workshop?
Get in touch to discuss your institution's needs and schedule a session.
Get in Touch
Questions? Email [email protected]