Clinic workflow demo

Use AI for clinical queries with consent checks, patient information protection, and a retrievable record.

This walkthrough shows one real scenario: a clinician asks a clinical question, the governed session runs, and the clinic can explain what happened. You can run both paths below.

Governed clinical query

QiXChat · Example Primary Care

Patient on atorvastatin and clarithromycin. Drug interactions?

✓ Consent checked ✓ Identifiers protected ✓ Session recorded

Elevated myopathy risk. Consider pharmacist review and a temporary statin hold during the antibiotic course.

Who this is for

Built for clinics that already use AI, and for those still deciding how to use it safely.

Clinicians asking clinical questions

Drug interactions, summaries, guideline lookups, and consult follow-up during care.

Practice managers and admins

Teams that need visibility, policy controls, and a clear answer when someone asks what happened.

Clinics already trying AI

Useful when staff are already exploring tools like ChatGPT, Heidi, or Copilot but governance is not in place.

Advanced users

Clinicians who want stronger trust signals, optional multi-model review, and retrievable session records.

The problem

AI use in clinics is already happening. Governance usually is not.

How clinics use AI today

  • Patient details copied into general AI tools
  • Consent handled informally or outside the tool
  • Different staff using different tools with no central oversight
  • No reliable record of which model was used or why

Where the risk appears

  • Hard to explain later what happened in a consultation
  • Unclear whether identifying details were removed before model calls
  • No clinic-wide evidence trail for patient or regulator questions
  • More friction when practices try to scale AI use safely

Patient journey: 🏥 Arrives → 📋 Checks in (consent captured here) → 👨‍⚕️ Consultation (AI used here) → 💊 Prescription

See the difference

Two ways to handle the same clinical query.

Scenario A

Dr. Chen needs to check a drug interaction. She opens a standard AI tool and types the patient's details in for context.


What happens in practice

Three outcomes a clinic needs to understand.

No valid consent

The governed session pauses before the model call.

The clinician sees why the query was paused and can use check-in consent, in-room QR, or recorded verbal consent to continue.

Valid consent and safe context

The query is allowed.

QiXHealth checks consent status, applies patient information handling, and routes the query through approved controls.

Session completed

A retrievable record is created.

The clinic can later review what was asked, what was checked, what was protected, and what was returned.
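The three outcomes above amount to a gate that runs before every model call. A minimal sketch of that idea, assuming a simple consent-state check; the names and states here are illustrative, not QiXHealth's actual API:

```python
from dataclasses import dataclass
from enum import Enum

class ConsentState(Enum):
    VALID = "valid"        # right consent state for the intended use
    MISSING = "missing"    # never captured for this patient
    EXPIRED = "expired"    # past its renewal interval

@dataclass
class GateDecision:
    allowed: bool
    reason: str

def gate_query(consent: ConsentState) -> GateDecision:
    """Pause the session before the model call unless consent is valid."""
    if consent is ConsentState.VALID:
        return GateDecision(True, "consent valid: query routed through approved controls")
    return GateDecision(False, f"paused: consent {consent.value}; capture consent to continue")
```

In this sketch a paused session returns a reason the clinician can act on (check-in consent, in-room QR, or recorded verbal consent) rather than failing silently.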

What is protected

Trust signals a clinic owner or practice manager will ask about.

Consent status

Checks whether the session has the right consent state for the intended use.

Patient information handling

Removes or blocks identifying details before governed model calls in the supported workflow.

Model controls

Routes through approved model policy and records model context with the session.

Audit and evidence

Creates a session-level record so the clinic can explain what was allowed, blocked, and returned.

Clinic governance settings

Supports clinic-level choices for consent method, workflow scope, and review posture.

Patient-facing confirmation

Supports front-desk QR, in-room confirmation, and recorded verbal consent flows.
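The "patient information handling" step above can be pictured as a redaction pass that runs before any governed model call. This is an illustrative sketch only: real de-identification needs far more than a few regular expressions, and the patterns below (an NHI-style number, a date of birth) are assumed formats, not QiXHealth's implementation. Name removal is deliberately omitted; it needs language models or lookup tables, not regexes.

```python
import re

# Illustrative identifier patterns only; a real de-identification pass
# covers many more identifier types than this.
PATTERNS = {
    "NHI": re.compile(r"\b[A-Z]{3}\d{4}\b"),      # assumed NHI-style number format
    "DOB": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),  # dd/mm/yyyy date of birth
}

def redact(text: str) -> tuple[str, list[str]]:
    """Return the redacted text plus a list of what was removed."""
    removed = []
    for label, pattern in PATTERNS.items():
        text, count = pattern.subn(f"[{label} removed]", text)
        if count:
            removed.append(label)
    return text, removed
```

The point of returning the `removed` list alongside the text is that the session record can later state which protections were applied, without storing the identifiers themselves.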

Evidence view

After each governed session, the clinic can retrieve a plain-language summary of what happened.

Not just an answer — a record the clinic can come back to later.

Patient request summary

A clinic can show when consent was confirmed, how the governed query ran, and what record was saved.

2:14 pm Consent confirmed for the consultation

2:15 pm Governed clinical query executed

2:15 pm Patient information handling and model checks recorded

Now Summary retrieved for later review

This is what helps a clinic answer: "What happened in that session?"
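Underneath a timeline like the one above sits a session-level record. A minimal sketch of what such a record might hold and how a plain-language summary could be produced from it; every field name here is an assumption for illustration, not QiXHealth's schema:

```python
from datetime import datetime, timezone

def session_record(query, consent_method, model, protections):
    """Build a retrievable evidence record for one governed session."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "consent": {"status": "confirmed", "method": consent_method},
        "query": query,
        "protections_applied": protections,
        "model": model,
    }

def summarise(record):
    """Plain-language summary a clinic can retrieve later."""
    return (f"Consent confirmed via {record['consent']['method']}; "
            f"query run against {record['model']} with "
            f"{', '.join(record['protections_applied'])} applied.")
```

The summary is derived from the record rather than stored separately, so "what happened in that session?" is always answered from the same evidence the audit trail holds.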

Why this is different

AI tools help people use AI. QiXHealth helps clinics govern AI use.

Area | General AI tools | QiXHealth workflow
Clinical query workflow | Usually handled inside a general AI tool | Handled inside a governed clinical workflow
Consent handling | Often manual, verbal, or outside the tool | Checked as part of the session flow
Patient information protection | Depends on user behaviour and provider settings | Protection checks applied before the model call
Audit trail | Usually fragmented or unavailable | Session record created for later retrieval
Clinic-wide governance | Difficult to standardise across staff | One controlled workflow with one review surface

Advanced feature

Roundtable: a higher-confidence review mode for complex clinical queries.

For higher-trust or ambiguous clinical queries, QiXHealth can support a governed multi-model review path under one controlled session. Multiple model perspectives, one consent state, one policy context, and one evidence trail.

Useful when a clinician wants broader AI input without losing governance, visibility, or auditability. Not required to understand the core product.
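As a sketch, a Roundtable-style review fans one query out to several models while every call shares a single session context. `call_model` below stands in for whatever client each model exposes; nothing here is QiXHealth's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class GovernedSession:
    consent_confirmed: bool
    evidence: list = field(default_factory=list)   # one trail for all calls

def roundtable(session, query, models, call_model):
    """Run one query past several models under one consent state."""
    if not session.consent_confirmed:
        raise PermissionError("session paused: no valid consent")
    answers = {}
    for model in models:
        answers[model] = call_model(model, query)  # same query, same policy context
        session.evidence.append({"model": model, "query": query})
    return answers
```

Because every model call appends to the same evidence list and runs behind the same consent check, the clinic keeps one record even when several models were consulted.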

Pilot offer

Start with one workflow, a small group, and assisted onboarding.

Suggested first pilot

  • One clinic, one or two workflows
  • Small clinician group, assisted onboarding
  • Weekly review during the pilot period
  • 45-day design partner pilot at no cost

Pricing starting points

  • Solo or small practice: from $149/month
  • Group practice (3–10 clinicians): from $399/month
  • Hospital groups and multi-site: custom
  • All plans include the full governance layer and audit log

Pricing is in NZD. AUD, USD, and GBP available on request. The right starting point is a walkthrough, not a checkout.

Questions clinics usually ask

Short answers first.

What is the difference between QiXChat and QiXHealth?

QiXChat is the clinician-facing interface. QiXHealth is the governance layer underneath it. The point is not just to provide answers, but to control how clinical AI is used and what record is left behind.

Does QiXHealth replace the tools we already use?

QiXHealth does not replace the tools you use for documentation or admin. It provides a governed interface specifically for clinical queries involving patient data. Many clinics continue using Heidi for scribing while using QiXChat for anything that touches patient-identifiable information. Unlike multiple unprotected tools, QiXHealth consolidates AI risk into one auditable layer.

Does the AI model see patient identifiers?

In the governed clinical-query workflow shown here, identifying details are removed before supported model calls. The AI receives only the clinical context needed for the query. The model never sees the patient's name or date of birth.

What does it cost, and what does it save?

Clinic plans start from $149/month for a solo practice. A single privacy breach notification can cost significantly more in staff time, legal fees, and reputational damage. The EU AI Act sets fines at up to 7% of global annual turnover for the most serious violations. HIPAA civil penalties in the US run from $100 to $50,000 per violation. The cost of QiXHealth is the cost of not having to worry about those scenarios.

How long does setup take?

Most clinics are live within a week. We handle the configuration. For clinicians, QiXChat is designed to feel like any familiar chat tool with no new clinical workflow to learn. For practice managers, the admin dashboard takes around an hour to become comfortable with. No integration with your practice management system is required to get started.

How is patient consent captured?

The low-friction path is front-desk capture before the consult via QR code. Returning patients can be set to auto-confirm or prompted at a renewal interval. QiXHealth also supports in-room QR codes and verbal consent recorded by the clinician. All flows create an audit record regardless of how consent was captured.

How does QiXChat relate to scribing tools like Heidi?

QiXChat is purpose-built for clinical advisory queries during consultations. It complements scribing tools like Heidi rather than replacing them. Heidi handles the documentation layer; QiXChat handles the clinical question-and-answer layer with governance. Some clinics run both in the same consultation window without conflict.

Is QiXHealth suitable for US practices under HIPAA?

Yes. QiXHealth removes identifying details before governed model calls in a way designed to align with HIPAA's Safe Harbor de-identification method (45 CFR §164.514(b)). US-based practices can use QiXHealth as a HIPAA-aligned AI governance layer. The audit records also support documentation requirements under the updated Security Rule expected in 2026.

Can practice managers see activity without reading clinical content?

Yes. The QiXHealth admin dashboard is separate from the clinician QiXChat interface. Practice managers can view consent records, session audit logs, model verification history, and usage summaries without accessing any clinical content. Reports can be exported for regulatory inspection or internal governance reviews.

Can complex queries get a multi-model review?

Yes. QiXHealth can support a governed Roundtable mode where multiple model perspectives are reviewed under one controlled session with the same consent state, policy controls, and evidence trail. Useful for higher-risk or ambiguous clinical queries where a clinician wants broader AI input without losing auditability.

Is there a free pilot?

Yes. Design partner clinics get a 45-day pilot at no cost. During the pilot your team uses QiXChat and QiXHealth in real consultations, you receive a compliance readiness report at the end, and there is no obligation to continue. To apply, book a 20-minute walkthrough.

Regulatory context

Expectations around privacy, consent, and auditability are getting stricter.

Current requirements and timelines by region.

🇳🇿 New Zealand Coming 1 May 2026
  • IPP 3A: Clinics must notify patients when AI tools collect their information indirectly. No consent record means no compliant notification.
  • MCNZ guidance: Doctors must document AI use in consultations and ensure patients are informed. Professional discipline applies.
  • HIPC 2020: Health information privacy code governs all health data. Any breach carries Privacy Commissioner enforcement risk.
🇦🇺 Australia Coming Dec 2026
  • Privacy Act reform: Mandatory disclosure of automated decision-making. OAIC active enforcement — see the Medibank case.
  • TGA: AI tools with diagnostic or treatment recommendations may require registration on the Australian Register of Therapeutic Goods.
  • State laws: VIC Health Records Act and NSW HRIPA impose additional obligations on health information holders.
🇺🇸 United States In effect + updating
  • HIPAA: AI systems handling protected health information must comply with the Privacy and Security Rules. QiXHealth is designed to align with Safe Harbor de-identification (45 CFR §164.514(b)).
  • HIPAA Security Rule update: First major update in 20 years, expected 2026. AI systems processing ePHI must be documented in risk analysis.
  • California SB 1120: Physicians must make final clinical decisions. AI cannot be the sole decision-maker in medical necessity reviews.
🇪🇺 European Union In effect Aug 2025+
  • EU AI Act: Healthcare AI is classified high-risk. Full compliance required by August 2026 for existing systems. Fines up to 7% of global annual turnover for the most serious violations.
  • Requirements: Conformity assessment, algorithm audit trails, post-market monitoring, human oversight, and explainability documentation.
  • GDPR: Health data requires explicit consent or narrow Article 9 exceptions. DPIA mandatory before deploying AI in clinical settings.
🇬🇧 United Kingdom Framework due 2026
  • MHRA National Commission: New regulatory rulebook for AI in healthcare expected 2026. 28% of UK GPs already use AI with no regulatory oversight.
  • ICO guidance: DPIA mandatory before deploying AI in health settings. Privacy notices must disclose AI use.
  • Penalties: ICO fines up to £17.5 million or 4% of annual revenue under UK GDPR.

Explore the products

Start with the right care setting: Telehealth · Allied Health · Aged Care · Midwifery · Dental

Next step

See the workflow in your clinic. Design partner spots are available.

We run 20-minute guided walkthroughs for GPs and practice managers. No setup required.