Independent privacy and AI governance advice for life sciences and pharmaceutical organisations.
Who we are.
Skandor Advisory was founded by Carl Ohrwall, a senior privacy and AI governance leader with over a decade of strategic and operational experience inside global pharmaceutical organisations, where he served as Data Protection Officer and leader of the AI/Data Governance Council spanning 15+ markets.
The foundation of every Skandor engagement is that lived experience: running DPIAs across R&D, Digital, and Commercial functions; embedding privacy into AI deployment processes; managing regulators and internal boards. This judgment was not formed in a consulting practice.
It was formed in the room where decisions were made.
Executives and boards navigating privacy and AI governance in pharma need advisors who have lived the complexity they face, not theorists who have studied it from the outside. Skandor provides that perspective. The transition to independent advisory is deliberate: every engagement is principal-led from first conversation to final deliverable, bringing considered, proportionate judgment to the specific regulatory question in front of you.
Credentials & Qualifications
Fellow of Information Privacy (FIP) - IAPP
Certified Information Privacy Manager (CIPM) - IAPP
Certified Information Privacy Professional / Europe (CIPP/E) - IAPP
OneTrust Certified Privacy Professional - OneTrust
PRINCE2 - AXELOS
Sector Experience
Global pharmaceutical R&D · Digital health · Clinical data governance · GDPR · Cross-border data transfers · AI in pharmaceutical organisations
How Skandor engages.
AI Governance Assurance Review
3–4 weeks
An independent assessment of whether your AI governance framework is defensible under EU AI Act scrutiny and whether your leadership has adequate visibility of AI risk. Designed for pharmaceutical and life sciences organisations deploying AI in clinical, commercial, or operational pathways.
Scope includes
AI Act high-risk classification review for active systems
DPIA adequacy assessment for AI-driven data processing
Governance framework gap analysis
C-level reporting quality review
Senior management accountability mapping
Written assessment and board presentation
Privacy & AI Governance De-risking
3–9 months
Senior advisory embedded into transformation programmes (digital platform deployments, AI tool integrations, cloud migrations, EHDS data sharing arrangements), ensuring GDPR obligations, AI Act conformity, and clinical governance are managed as integrated risk, not addressed after the fact.
Scope includes
Privacy and AI risk embedded into programme governance
Article 9 health data treatment and transfer analysis
Third-party and vendor AI governance oversight
DPIA execution for novel AI processing activities
Cross-border data flow assessment and documentation
Regulator-ready documentation and audit trail support
Senior Management Assurance
Annual retainer
Sustained senior advisory for audit committees and C-suite executives on privacy and AI governance accountability.
Provides the independent perspective and the preparedness that regulators increasingly expect to see evidenced at the top.
Scope includes
Quarterly C-suite or audit committee briefings
Senior manager accountability and attestation support
Management information quality review on AI and privacy risk
Regulatory horizon monitoring for pharma and life sciences
Priority access for emerging regulatory questions
Preparation for supervisory authority engagement
Why now.
Pharmaceutical and life sciences organisations in Europe are navigating multiple simultaneous regulatory frameworks, none of which can be addressed in isolation.
The EU AI Act classifies most clinical and patient-facing AI as high-risk, with board-level accountability now in force. GDPR's Article 9 protections for health data face growing scrutiny.
The European Health Data Space is creating secondary data obligations most organisations haven't fully assessed. And EMA's emerging AI guidance is beginning to shape what regulators will expect to see evidenced internally.
The organisations best positioned are those with clear senior judgment about which risks are material and which regulatory positions will withstand scrutiny. That is what Skandor provides.
Regulatory Frameworks
EU AI Act: High-risk classification for clinical AI, conformity assessment obligations, board-level accountability provisions.
GDPR Art. 9: Special category health data obligations across R&D, commercial, and digital health functions. Enforcement intensifying across EU member states.
EHDS: European Health Data Space regulation creating new secondary-use obligations. Implementation timelines now active across member states.
EMA Guidance: Emerging EMA expectations on AI in clinical trials, pharmacovigilance, and drug development, increasingly shaping supervisory expectations.
National Laws: Material variation in health data law across EU member states requires jurisdiction-specific analysis for cross-border operations.