Privacy & Security Tips

Expert privacy and security tips from OntarioMD’s General Counsel and Chief Privacy Officer Ariane Siegel to help you protect your patient and practice data.

March 2026

Navigating medico-legal risks in the age of AI

It can be challenging for physicians to keep pace with rapidly changing healthcare technology, especially with the use of AI in clinical practice. Yet many physicians use digital technologies every day to provide exceptional patient care. It’s crucial to know the legal, privacy, and security risks of using these technologies and how to address them. 

The OMD Educates: Digital Health Virtual Symposium on April 24 will discuss the emerging medico-legal and privacy issues associated with using these tools in clinical settings, accounting for the current Canadian legal and regulatory environment and the recent guidance from the Information and Privacy Commissioner of Ontario on AI scribes. Register now to hear these insights.

The Information and Privacy Commissioner of Ontario (IPC) recently published guidelines to provide a comprehensive framework for adopting AI scribe tools safely, ethically, and in compliance with Ontario’s privacy regulations. Here are the key points:

  • Build trust through strong governance and accountability. This means healthcare groups and organizations must establish clear governance and accountability measures before implementing AI scribes.  
  • Choose and evaluate vendors carefully. Clinicians should assess if the technology aligns with their operational needs and privacy and security requirements.
  • Manage privacy, security, and bias risks. Privacy impact assessments, bias testing, strong safeguards, and continuous monitoring of AI systems help to ensure they remain safe, accurate, and fair.
  • Set clear expectations for developers and users. For developers, this means creating safe and reliable products. For clinician users, it means thoroughly evaluating potential vendors and understanding their own accountability for obtaining patient consent and maintaining the accuracy of medical notes generated by an AI scribe.

Trust is fundamental to the Canadian healthcare system. Through initiatives like the Ontario AI Scribe Program, OMD continues to support clinicians in efficiently adopting digital health tools that support their needs and satisfy privacy and data handling obligations to patients.

As AI becomes further embedded in clinical and administrative workflows, questions about privacy, accountability, and patient trust have become increasingly important. The Information and Privacy Commissioner of Ontario’s Privacy Day Event, “Trustworthy AI in Health: The Promise, Perils, and Protections” on January 28, 2026, will explore these issues and examine how increasingly autonomous AI tools can support the healthcare sector while complying with PHIPA and addressing emerging risks. Physicians should consider recent IPC advice for procuring, implementing, and using AI tools, including:

  • Obtaining patient consent before their use
  • Being transparent about their purpose and risks with patients
  • Taking reasonable steps to ensure they’re evaluated for reliability and accuracy for ongoing patient safety

OMD is updating its Privacy & Security Training Modules to reflect these considerations and evolving realities. The refreshed modules will provide practical, scenario-based guidance on the use of AI, virtual meetings, personal devices, and approved technologies, to support innovation safely without adding unnecessary burden.

Clinicians are increasingly relying on digital tools to improve efficiency, reduce administrative burden, and enhance patient care. But a recent incident investigated by the Information and Privacy Commissioner of Ontario (IPC) serves as an important reminder that technology—particularly AI—can introduce new risks to patient privacy. Last fall, an AI transcription tool, or bot, attended and recorded a virtual “rounds” meeting in an Ontario hospital, without the knowledge of participants. The AI bot then sent a summary and full transcript to the full list of meeting invitees, which included former hospital employees. 

How did this happen? A former physician employee still included on the meeting invite list had installed an AI bot on a personal device. The tool automatically joined the meeting in their place, captured the discussion, and shared it broadly. It was an AI-driven, not human, action, highlighting how so-called agentic AI systems can independently initiate tasks without prompting. The hospital notified the IPC of the privacy breach and took steps to contain the damage, directing staff to delete the unauthorized e-mails and remove transcription tools from devices. It also updated its AI policies, changing its firewalls to block access to similar platforms. 

Take these steps to protect yourself and your patients:

  • Avoid using personal or unapproved apps (including transcription and AI-powered notetaking tools) during clinical discussions.
  • Audit your meeting lists and notify organizers about outdated or incorrect recipients. 
  • Be alert to AI autonomy. If an app can “listen”, “summarize”, or “join” meetings for you, it may take unintended actions, including sharing personal information.

Protecting patient privacy is not just a legal requirement – it is a core component of patient trust. As tools become more sophisticated it is important to truly understand and mitigate the risks. 

Physicians are increasingly being asked to upload forms or documents containing personal health information to insurance or third-party portals. These platforms often claim they’re “PHIPA- and PIPEDA-compliant,” but how can physicians be sure? What happens if there is a privacy breach?

When physicians send patient information through these portals, they are doing so at the patient’s direction — not for their own purposes. Patient consent governs the disclosure, and by asking physicians to upload the information, patients have already given their consent (which is usually implied). In this situation, physicians are essentially acting as agents for patients, not as independent data controllers.

Because of this, the privacy risk does not fall on physicians. Insurers or portal providers — the parties collecting and storing the data — are responsible for meeting privacy and security requirements under PHIPA and PIPEDA. It is also reasonable for physicians to rely on assurances of compliance from insurers or portal providers.

Key takeaways:

  • Patients direct the disclosure and assume the risk.
  • Insurers and/or portal providers must ensure compliance.
  • Physicians can rely on representations of compliance from insurers and/or portal providers.

The bottom line: while privacy vigilance is always important, physicians can upload patient forms or documents with confidence, with no additional action required.

October is Cybersecurity Awareness Month, a timely reminder that the scariest thing this Halloween may not be ghosts and goblins; it may be lurking in your inbox or on your phone! In health care, we are seeing increasingly complex threats: phishing emails are more convincing than ever, urging you to click on a message or attachment that appears to come from a colleague or hospital. Attackers are also turning to “quishing” (QR-code phishing), where an image or email asks you to scan a QR code that leads to a malicious site, and to phone-based “vishing” scams that spoof official phone numbers, mimic caller ID, and even use AI to trick targets into sharing sensitive information. Attackers are also targeting business associates and smaller vendors as a back door into larger health systems.

Personal health information is extremely valuable to cyber attackers. They know the stakes are high: when systems go down, critical data can be temporarily inaccessible or lost entirely. In recent months, healthcare organizations in Canada and beyond have seen major breaches and ransomware attacks that have impacted millions of patient records. Furthermore, a recent decision clarified that under PHIPA, even the unauthorized encryption of records still counts as a “use” of personal health information and must be reported to the IPC. This is another reminder that privacy duties apply not just to charts and conversations, but to digital security events.

To stay ahead of these ghouls: 

  • Always verify unexpected requests, especially those marked “urgent” or “critical”
  • Think before you click and don’t click on suspicious links or QR codes
  • Enable multi-factor authentication
  • Make sure your clinic’s privacy breach protocol includes steps for dealing with cybersecurity events, and
  • Promptly report any unusual messages to your IT or security personnel.

This past month, Ontario’s Information and Privacy Commissioner (IPC) issued PHIPA Decision 298, its first administrative monetary penalty under the Personal Health Information Protection Act (PHIPA), against a physician who accessed patient records from a hospital without authorization in order to solicit business for his clinic’s private services. The IPC imposed financial penalties on both the physician and the clinic involved. The doctor was ordered to pay a $5,000 penalty for accessing and using patients’ hospital records without authorization for personal financial gain. For its part, the clinic was ordered to pay a penalty of $7,500 for failing to meet its most basic obligations under PHIPA. The clinic could not produce any evidence of privacy policies or data governance, which formed the basis for the fines imposed. The decision underscores that custodians must be able to demonstrate that they have reasonable safeguards and governance systems in place, and that these are applied in practice, not just on paper.

Some critical obligations include:

  • Protecting personal health information (PHI) from unauthorized use or disclosure, through:
    • administrative, technical and physical measures or safeguards
    • privacy policies, procedures and practices, with auditing functionality
    • privacy training, awareness programs, and initiatives
    • confidentiality agreements
  • Reviewing measures or safeguards from time-to-time to ensure continued protection of PHI
  • Ensuring no agent of the custodian collects, uses, discloses, retains or disposes of PHI contrary to PHIPA
  • Ensuring that PHI is not collected without authority

For physicians, this decision is a stark reminder about evolving expectations around privacy, digital tools, and accountability. As indicated above, the IPC emphasized the importance of regular and appropriate privacy training for which an individual can demonstrate records of completion. OMD provides physicians and support staff with free training and certificates upon completion.

Healthcare professionals rely on an increasing array of new digital health tools, and every year, cybersecurity threats become more frequent and sophisticated. In 2025, no clinic is too small to be a target. 

Smaller clinics can be attractive targets because attackers know they may have more limited IT infrastructure, including fewer cybersecurity safeguards. Last year, one third of privacy incidents reported to the Information and Privacy Commissioner came from the healthcare sector. These incidents came in many forms, from ransomware attacks to email phishing campaigns. More recently, the U.S. Cybersecurity and Infrastructure Security Agency and the FBI issued a joint alert highlighting the malware group Interlock’s targeting of critical infrastructure, with an apparent focus on the healthcare sector. Several high-profile U.S. specialty medical clinics have already been subject to such attacks, with Interlock often demanding a ransom or extracting data to sell on the dark web.

The legal and regulatory obligations of physicians in an increasingly complex digital health environment will be a critical topic at this year’s OntarioMD Digital Health Conference. On Thursday, September 18, I will host “Emerging Medico-legal and Privacy Issues: Ask a Lawyer, Regulator and Futurist”. We will tackle your questions, including those related to artificial intelligence. Cybersecurity will be featured on Friday, September 19 at “Cybersecurity Check-Up 2025 Edition: Strategies to Keep Your Practice Secure”. A physician leader, regulator, and vendor will provide practical, actionable advice on how to defend your practice. You’ll learn how to spot red flags like phishing scams, recognize everyday habits and policies that can present risks, and identify vulnerabilities in clinic workflows. This year’s panel will feature Dr. Sharon Domb of Sunnybrook Health Sciences, John Solomos, CEO of BlueBird IT Solutions, and Andrew Drummond from the Office of the Information and Privacy Commissioner of Ontario. Please join me at both of these valuable sessions and bring your questions. Conference registration is now open – I look forward to speaking with you!

Most health information custodians and their clinics now use Electronic Medical Record systems to store patient records. However, questions continue to arise about how to manage older paper charts, archived files, or outdated data. Recent events have shown how important it is for clinics to have a plan for securely disposing of these documents when they’re no longer needed.

While it may seem simple, tossing out old records, or tearing or shredding them improperly, can put patients’ privacy at risk and expose you or your clinic to reputational damage. Under the Personal Health Information Protection Act (PHIPA), physicians have a duty to securely dispose of patient records once their retention period ends. It’s not enough to keep health records secure only while they’re in use: every practice needs clear, documented processes, policies, and staff training covering how long records are kept and how they are securely destroyed, whether by shredding, secure disposal bins, or full digital deletion. Without these policies and procedures in place, disposing of records improperly can amount to a privacy breach.

If you’re reviewing your clinic’s recordkeeping practices, it may be useful to take stock of your records! OntarioMD’s Privacy and Security Training also includes practical tips on records retention and disposal, and other ways to keep your clinic’s data management and privacy practices up to date, informed, and secure.

The OMD team was pleased to help lead the legal and regulatory evaluation of AI scribes in Ontario and support the development of requirements for the Ontario AI Scribe Program. 

  • Supply Ontario established the AI Scribe Vendor of Record (VOR) arrangement through a rigorous, open, and competitive procurement process.
  • This comprehensive evaluation considered a wide range of criteria, including technical, clinical, legal, and business requirements, as well as critical privacy and security considerations. 
  • OntarioMD sought input from a variety of stakeholders, including medical associations and regulators such as the OMA and the CMPA. This feedback helped shape requirements and contracts that align with the medico-legal obligations of physicians.

Supply Ontario’s Vendor of Record list includes vendors who have met the requirements set out in the procurement process. This approach was designed to ensure that qualified vendors will deliver exceptional value and service to Ontario clinicians.

When integrating AI tools into your practice, it is important to ensure patient data is protected and that your vendor maintains the standards of privacy and security required by PHIPA and other privacy laws. As you may be aware, Ontario’s AI Scribe Program and its Vendor of Record approach is coming soon. It will help support your AI scribe adoption process and compliance obligations. In the interim, if you conduct trials of different AI scribe solutions, consider these steps:

  • Choose PHIPA-Compliant Tools: Select tools from AI vendors that comply with PHIPA and PIPEDA, include security measures to protect PHI, and limit data use to authorized purposes. Ask vendors to provide proof of their compliance and data handling protocols. The Ontario AI Scribe VOR Program has already done this for you; check out the OMD Practice Hub for more information.
  • Obtain Valid Consent: PHIPA requires informed, valid consent from patients before you use an AI scribe for the first time. Download OMD’s Patient Consent Toolkit to learn more and simplify this process. 
  • Research and Implement Security & Privacy Safeguards: PHIPA requires reasonable security safeguards to protect PHI against unauthorized access, use, disclosure, or destruction. Confirm your chosen AI scribe includes encryption (for data at rest and in transit), access logs, and regular security updates. Some vendors’ tools may have certifications (e.g., SOC 2, ISO 27001), which can help demonstrate adherence to industry-recognized cybersecurity standards.
  • Ensure the Vendor Performs Regular Audits: Regular audits can help verify AI tools remain compliant with appropriate privacy laws. Audits may include reviewing access logs, checking for unauthorized access, assessing and reviewing data storage practices, and identifying any non-compliance in vendor operations.
  • Ensure Vendors Have Clear Data Retention and Disposal Policies: PHIPA specifies how long PHI should be retained, requiring records to be accessible for as long as necessary for the patient’s recourse under PHIPA. Your vendor should establish very short retention limits for AI-generated data and enforce secure disposal practices to prevent unauthorized access post-termination.

Healthcare technology is accelerating so rapidly that it’s challenging to keep up, yet many doctors use digital technologies daily to provide exceptional patient care. With so many new and emerging technologies, it’s important to consider their legal, privacy, and security implications, particularly whether they have been vetted by appropriate regulators for compatibility with clinicians’ professional obligations. What privacy standards and guardrails govern these solutions? Most importantly, does the tool readily allow clinicians to meet regulatory expectations?

At the OMD Educates: Digital Health Virtual Symposium on April 25, I’ll be discussing strategies to safeguard patient and practice data through cybersecurity best practices. Other speakers will tackle topics such as AI, roster management, and clinical efficiencies. Register now to learn more about these issues and ask your questions! If you can’t make it in person, your conference registration gives you on-demand access to session recordings for up to 90 days after the symposium.

OntarioMD’s privacy and security training provides essential, tailored guidance to clinicians and administrative staff to meet your professional obligations and safeguard your patients’ data. The training has been widely used across the province by almost 9,000 clinicians and support staff, with a 90% satisfaction rating. Some key features of the training are:

  • No cost – our training is accessible to healthcare providers for free!
  • Comprehensive content on a range of topics, including regulatory compliance, consent, virtual care, cybersecurity risks, and protecting personal health information (PHI).
  • Physician-led, regulator-informed content tailored for Ontario’s healthcare sector. Our training was developed with input from the College of Physicians and Surgeons of Ontario, the Canadian Medical Protective Association, the Ontario Medical Association, the Information and Privacy Commissioner of Ontario and Ontario Health, ensuring it meets the needs of healthcare providers.
  • User tracking & completion certificates – upon completion of each module, users can print a proof of training certificate. This feature facilitates tracking of user progress and completion, which can also be shared with clinics for auditing/tracking purposes.
  • CME credits (Mainpro+ & MOC) – earn while you learn! 

Implementing AI in clinical practice raises many questions about both benefits and risks, but OMD’s AI Knowledge Zone can help you navigate them with confidence. From tips on implementation and workflow, to an evaluation matrix, to a patient consent toolkit, our site helps you understand what to consider when using an AI scribe. As AI tools in health care evolve, so too will OMD’s AI Knowledge Zone. Check back for updates.

Privacy and Security Tips for the Holidays (and Always) 

The start of a new year is the best time to refresh your privacy and security awareness by completing OMD’s training modules. As our reliance on a variety of digital health technologies grows, it’s crucial to understand the risks and accountability obligations associated with using these technologies to be better positioned to protect your patients’ data and your practice. The training modules are comprehensive, covering key privacy and cybersecurity issues, and best of all, are free!

Using OMD-certified EMRs can also provide peace of mind as they meet a minimum set of privacy and security requirements (and give you access to provincial digital tools like HRM and more). OMD’s additional gifts to you this month are reminder tips to stay vigilant when using your devices as you browse and shop this holiday season. This is the time of year when scams and hacking abound. We hope you stay safe and your personal information stays protected.

There can be many benefits to AI scribes, including saving time in documenting patient encounters and reducing your cognitive load. You can read about these benefits in OMD’s AI Scribe Evaluation Report, a summary of the results of the AI scribe study led by OMD. With those benefits come legal and privacy risks, such as how prospective AI scribe vendors access and use patient data.

It’s important to manage the risks of using an AI scribe. Keep key contractual elements in mind before signing an agreement with an AI vendor, such as data ownership, control, and storage; term length; regulatory compliance; and more. Similarly, obtain express consent from patients before you first use an AI scribe. The consent must be valid and meaningful, and a consent form can help you document it. To learn more about consent requirements, visit OMD’s new AI Knowledge Zone, which provides a Patient Consent Toolkit with ideas for how to seek patient consent and provide patients with the necessary information on the use and risks of your AI scribe. Watch our latest OMD Educates webinar to help you choose the right AI scribe for your practice.

If you are thinking of adopting an AI scribe in your practice, choosing the right tool is crucial to enhancing your workflows and clinical documentation. There are many things to think about when deciding on a tool – its functionality, usability, cost, and reviewing an AI vendor’s contract, among other considerations. Watch our latest OMD Educates webinar to learn more about how to approach choosing an AI scribe and incorporating it into your workflow. With over 500 attendees and OMD Peer Leaders Dr. Kevin Samson and Dr. Kevin Brophy joining the session, it was one of our best yet!

AI has the potential to improve health care, but how do clinicians implement it in a way that satisfies our legal and accountability obligations while fostering trust with patients? Here are some potential approaches to strike the right balance between progress and governance.

  • Practical solutions are needed to foster AI innovation and build trust in AI tools. Humans need to trust AI technologies and the use of the data feeding the algorithms. This includes creating legal, regulatory, and governance mechanisms that identify the benefits of a technology and promote confidence in its use.
  • Governance guardrails should be flexible to keep pace with AI’s evolutionary nature but tailored to its use. AI governance must consider risks, be targeted to its use, adaptable and continuously monitored and updated. This could include implementing audit mechanisms, evaluating AI usage, establishing public and private structures for clear and transparent reporting and developing technical solutions that prioritize principles such as safety, privacy, equality, fairness and transparency. Allowing for a controlled environment where developers can innovate AI products while regulators can monitor provides another avenue for governance.
  • Change management and collaboration are crucial to achieve AI’s potential and address concerns such as algorithmic bias and privacy issues. Open dialogue and fewer silos between stakeholders (including policymakers, regulators, technological developers, clinicians, advocacy groups and patients) are needed to facilitate the continuous adaptation necessary for keeping up with the workflow and regulatory challenges of AI. Collaboration between the technology industry and clinician users can serve to foster innovation and develop AI solutions that enhance patient care.
  • Patients must be engaged in the use of AI in their care. Patients’ interest in, and desire for, transparency and control must be addressed. In addition to obtaining express consent for the use of AI during medical visits, clinicians should communicate the reasons for using the AI tool and reassure patients about data management practices. To foster transparency and acceptance of AI, clinicians should solicit and genuinely consider patient perspectives through open dialogue.

Legal and ethical principles should serve as the bedrock of AI strategy in health care, underpinning governance efforts for a practical path for future implementation and adoption.

For physicians, privacy is key to building patient trust. As regulatory guidance on AI scribe use continues to develop, physicians using these tools should understand the associated privacy risks and how to mitigate them. Here are some tips to consider:

  • Inform patients about how the scribe will be used for documentation purposes (recording conversations, note-taking, etc.) and obtain their express consent before its first use with each patient.
  • Update your privacy policies to address AI use, the de-identification of personal health information for AI purposes, and data storage, and post signage (such as notices of recording and data retention) where AI is used during visits.
  • Provide patients with the ability to “opt out” of the use of AI during visits without it impacting their health care.
  • Review your AI scribe output after each patient encounter to ensure notes and documentation summarized by the AI scribe are accurate and complete. Recordkeeping remains your responsibility as a physician.

These aren’t all-encompassing tips. Physicians should consult the Canadian Medical Protective Association, the College of Physicians and Surgeons of Ontario, or the Ontario Medical Association for AI scribe guidance.

As health information custodians, physicians have legal, ethical and professional obligations to protect personal health information (PHI). It’s a daunting task in a virtual world, but there are precautions you can take.

Governance and training

  • Ensure you and your staff maintain security awareness by incorporating OMD’s Privacy and Security Training modules in your annual training process. The modules are comprehensive, allow you to earn continuing professional development credits and best of all, they’re free!
  • Establish proper clinic governance agreements.

Static safeguards

  • Regularly update software and hardware, including operating systems, antivirus software and firewalls.
  • Perform regular data back-ups and test the reliability of data restoration.
  • Use end-to-end encryption and two-factor authentication protocols.
  • Clean out your inbox and empty trash routinely.

Privacy

  • Use unique passwords (or consider passphrases) and change them often.
  • Obtain express consent from patients, especially where AI is used.
  • Think before you click on links in emails and ensure you (and your staff) know phishing signs and risks.
  • Only use secure wireless networks (no public wi-fi).
  • Maintain electronic audit logs.

Adopting a security-first mindset in protecting patient data can help to build patient trust in family physicians and deepen their connection.

The COVID-19 pandemic spurred unprecedented interest in virtual health care, defined as live interactions with physicians by telephone or through audio or video teleconferencing apps. Ontario’s health privacy law, the Personal Health Information Protection Act (PHIPA), applies to virtual care as it does to in-person care, so it’s important to note the policies and expectations in place. Here are simple tips to help ensure privacy compliance during virtual visits:

  • Book appointments as you would for in-person visits, whether patients call your reception or go online to secure them.
  • Check in patients, whether they need to call reception prior to their appointment, log on to a secure portal, or click on a link sent via email or secure message. This also helps to verify patient identity.
  • Use secure technology only. Avoid communication via texting, email, FaceTime or free versions of Zoom as they could expose you and your clinic to privacy breaches.
  • Mind your background. Remember to consider what is visible on screen behind and around you to ensure your background is professional.

For information on legislative and consent requirements, take OMD’s Virtual Care Privacy & Security Training Module and earn CME credits and certificates.

We also encourage you to review the guidance and best practices offered by Ontario Health. 

Digitization and cybersecurity go hand in hand. As health care becomes more digitized, vigilance against cyberattacks is that much more important. The unfortunate truth is that all clinics, regardless of their size or patient roster, are at risk. The good news is there are simple, preventive measures you can take to help protect your data and patients: 

  • Hold recurring cybersecurity training sessions for physicians and staff
  • Do not use generic or shared passwords
  • Use a secure Wi-Fi network
  • Enable multi-factor authentication
  • Ensure anti-virus software, computer systems, and devices (including internet browsers) are up to date
  • Use caution when visiting websites (look for https:// or a lock symbol in every URL) and never open email attachments or links from unknown or suspicious sources

It’s incumbent on all of us to use best cybersecurity practices. Access our two Privacy & Security Training Modules to learn more and earn CME credits and certificates upon completion. 

In our digital age, privacy and security are more important than ever. As health-care professionals, you must comply with the privacy laws that govern your practice. These include the Personal Health Information Protection Act (PHIPA) and the Personal Information Protection and Electronic Documents Act (PIPEDA).

PHIPA is provincial legislation that protects the confidentiality and privacy of personal health information (PHI) with rules on how it is collected, used and shared by health information custodians (HICs). HICs are individuals (e.g., physicians) or organizations with access to or control of PHI in their roles and duties related to providing health care.

PIPEDA, on the other hand, is federal legislation that protects personal data more broadly; unlike PHIPA, it is not specific to health information. Your obligation to protect the privacy of PHI remains the same, but technologies such as AI scribes add a layer of complexity to the task. Remember to always exercise care and due diligence with all patient information. To learn about the latest in privacy and security compliance, register for our Digital Health Virtual Symposium on June 14 and attend my afternoon sessions, “Staying on Top of Your Legal, Privacy and Security Obligations” and “Informed Innovation: A Physician’s Legal and Privacy Guide to AI Scribe Use”. I look forward to seeing you!

Doctors are modern-day heroes in white coats, working tirelessly to provide exceptional patient care in a rapidly evolving world. With the rapid proliferation of new technologies, it is imperative to scrutinize their legal, privacy and security implications. Have these technologies been vetted with the requisite expertise to assess their suitability for clinician use? What privacy standards govern these solutions? How do these technologies align with existing regulatory frameworks and what steps are being taken to ensure their compliance? These are some of the questions that run through my mind in my role as General Counsel and Chief Privacy Officer.

At this year’s OMD Educates: Digital Health Virtual Symposium on June 14, we will discuss these considerations, along with other topics such as AI, enhanced billing, secure messaging, and front office efficiencies, to name a few. To learn the latest information on these subjects, and to hear from our keynote speaker André Picard, award-winning author, journalist, and health columnist, register by April 25 for early bird savings (over 30%).

If you can’t attend the virtual symposium, your conference fee gives you access to the recordings of the sessions to view at your convenience for four months. 

As health-care professionals, you rely on a growing list of new digital health tools. OMD is here to help you realize the benefits of these technologies while managing their risks. Formed in collaboration with the Ontario Regional Security Operations Centres (RSOCs), OMD’s Privacy & Security Training Module for the Health Care Sector includes cybersecurity content and scenarios informed by real-world experiences. We recommend taking the module annually to fulfill training requirements, keep up with the latest practices, and stay informed about contemporary cyberthreats.

The training is relevant for clinicians in any setting in the Ontario health-care sector and is certified by the College of Family Physicians of Canada’s Ontario Chapter for 2 Continuing Medical Education Mainpro+ credits. Specialists can claim credit(s) under the Royal College Maintenance of Certification Program as a Section 2: Personal Learning Project for 2 credits/hour.

Change is a constant in everyday life. AI technologies present a whole new level of change, with promising opportunities to enhance health care by reducing physicians’ administrative burden, assisting in the diagnosis of medical conditions, and improving patient outcomes. 

To ensure AI technologies, such as AI scribes, align with health-care needs, it’s important that the health-care industry engage vendors and regulators early. This includes creating a framework with defined principles for the development, provision, and use of generative AI systems – for example, requiring that vendors comply with applicable laws and health-care regulations and conduct privacy impact and threat risk assessments.

The input and alignment of diverse stakeholders, including the health-care industry, in managing AI’s possibilities and risks will lay the groundwork for effective and responsible AI implementation and adoption.

The start of a new year typically brings about change, including new laws and regulations. OMD reminds you that, as of January 1, 2024, changes to the Personal Health Information Protection Act (PHIPA) allow the province’s Information and Privacy Commissioner (IPC) to penalize individuals or organizations who inappropriately access or share a patient’s health information.  

This regulation gives the IPC a range of options for addressing PHIPA breaches, with less severe remediation for unintentional infractions and fines reserved for serious violations only (such as the unauthorized sale of personal health information).

To learn more about compliance with PHIPA and how to protect the privacy and security of your patients and your practice, we recommend that you take OMD’s privacy and security training and share it with colleagues. Additional insights on the implications of these new changes will be provided in the OMD Educates: Privacy & Security webinar on Feb. 28.