How to Conduct a HIPAA Risk Assessment: A Practical Step-by-Step Guide
If your organization handles electronic patient data, a HIPAA risk assessment isn’t just recommended – it’s required. But here’s the thing: required doesn’t have to mean overwhelming.
A Security Risk Analysis (SRA) might sound like something that belongs in a Pentagon war room, but in reality, it’s a straightforward process of understanding your organization’s vulnerabilities, assessing what could go wrong, and creating a realistic plan to prevent it. Think of it less as a compliance checkbox and more as a health exam for your patient data protection systems.
We’ve seen healthcare organizations of every size successfully conduct their own risk assessments. No special degrees required. Just clarity, honesty, and a willingness to follow a structured process. This guide walks you through it, step by step.
Before You Begin: What You’ll Need
Before you dive in, gather your team and your materials. A successful risk assessment is a team sport.
Your Core Team:
– IT/System Administrator: Knows your technology infrastructure inside and out
– Compliance Officer or Manager: Understands your regulatory obligations
– Clinical Leadership: Can speak to workflows, patient interactions, and operational realities
– Administrative/Office Manager: Knows physical workflows, access patterns, and day-to-day operations
– Facility/Security Manager: Understands physical security, access controls, and environmental risks
Don’t have a dedicated compliance officer? No problem. One person can wear multiple hats, but make sure all these perspectives are represented.
Materials You’ll Gather:
– Current IT system inventory (servers, workstations, mobile devices, printers, etc.)
– Network diagram or documentation
– List of all software and cloud services that touch patient data
– Current security policies and procedures (if they exist)
– Access control logs or documentation
– Previous audit reports or findings
– Incident history (if any)
– Vendor agreements and Business Associate Agreements (BAAs)
– Physical layout of your facility
– A copy of the HHS Security Risk Assessment Tool (free download on HHS.gov)
Time and Resources:
– Budget 40-80 hours for a small practice (under 50 people)
– Budget 100-200+ hours for larger organizations
– Spread this over 4-8 weeks rather than trying to do it in a marathon session
– You’ll need a quiet space to meet, collaborative documents (a spreadsheet or shared drive), and honest conversation
Step 1: Define Your Scope – What Are You Actually Assessing?
This is where many organizations stumble. They either scope too narrowly (missing critical systems) or too broadly (wasting time on irrelevant details).
Your scope should answer these questions:
- Which locations are we assessing? (Main office only, or branch locations too?)
- Which systems and applications touch ePHI?
- Which departments or workflows are we evaluating?
- What devices are involved? (Workstations, servers, mobile devices, tablets, printers, fax machines, USB drives?)
- Are we including remote workers and their home networks?
- What about vendors and third parties who access your systems?
A real-world scenario: We once worked with a dental practice that completed a comprehensive risk assessment – and completely overlooked the ancient fax machine in the back office. It was collecting incoming patient data with no encryption, no access controls, and a paper tray that sat exposed all day. They’d focused on their fancy new electronic health record system while the biggest vulnerability was sitting right there in plain sight. Don’t be that practice.
Pro tip: Create a simple scope document that lists what’s in and what’s out. Get stakeholder sign-off before moving forward. This prevents scope creep and ensures everyone’s working from the same blueprint.
Step 2: Inventory Your ePHI – Map Every Touchpoint
Now it’s time to become a detective. Where does patient data live in your organization? The answer is almost certainly more places than you initially think.
Patient health information doesn’t just exist in your primary EHR system. It’s scattered across multiple systems, devices, and workflows. Your job is to create a comprehensive map.
Digital Systems to Inventory:
- Electronic Health Records (EHR) system
- Practice management software
- Billing and accounting systems
- Lab management systems
- Imaging systems (DICOM viewers, PACS)
- Email systems
- Shared drives and file storage (on-premises and cloud)
- Cloud services (backup, collaboration, telehealth, etc.)
- Mobile apps and devices
- Workstations and desktop computers
- Servers and network infrastructure
- Printers, copiers, multifunction devices
- Fax machines and modems
- Phones and voicemail systems
Physical Locations Where ePHI Exists:
- Patient reception areas (sign-in sheets, scheduling)
- Clinical areas (workstations, wall-mounted displays, paper records)
- Back office (billing, data entry, scanning)
- Provider offices (work computers, monitors)
- Waiting areas (check-in systems)
- Drug/supply areas (prescription records)
- Conference rooms (patient discussions, meetings with data displays)
- Break rooms (unattended workstations, discarded materials)
- Parking lot and exterior (dumpsters with unshredded documents)
Vendors and Third Parties:
- EHR and software vendors
- Cloud service providers (Microsoft, Google, AWS)
- Backup and disaster recovery services
- IT support and managed service providers
- Transcription services
- Medical records destruction services
- Insurance companies and clearinghouses
- Patient portal companies
- Telehealth platforms
Pro tip: For each system, document:
– What data does it contain? (Patient names, medical history, financial info, diagnoses?)
– How sensitive is it? (High sensitivity = immediate risk if breached)
– Who has access?
– How is it stored and transmitted?
– What’s the backup and disaster recovery plan?
– What’s the retention requirement?
Create a simple spreadsheet with columns for System Name, Type (EHR, Email, Cloud, etc.), Data Classification, Owner, and Location. This becomes your master inventory – and it’s invaluable for ongoing compliance.
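If you'd rather start the master inventory programmatically, a minimal sketch might look like the following. The system names, owners, and locations here are hypothetical placeholders, and the filename `ephi_inventory.csv` is just an example; the columns mirror the spreadsheet described above.

```python
import csv

# Columns mirror the master inventory described above.
FIELDS = ["System Name", "Type", "Data Classification", "Owner", "Location"]

# Hypothetical example rows; replace these with your actual systems.
systems = [
    {"System Name": "Main EHR", "Type": "EHR", "Data Classification": "High",
     "Owner": "IT Administrator", "Location": "Cloud (vendor-hosted)"},
    {"System Name": "Front-desk fax", "Type": "Fax", "Data Classification": "High",
     "Owner": "Office Manager", "Location": "Back office"},
]

with open("ephi_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(systems)
```

From here you can sort, filter, and extend the file in any spreadsheet tool as the inventory grows.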
Step 3: Identify Threats and Vulnerabilities
This is where you shift from “what we have” to “what could go wrong.” HIPAA regulations (specifically 45 CFR § 164.308, 164.310, and 164.312) require you to assess the threats to your ePHI, which fall into five broad categories:
Natural Disasters and Environmental Events:
– Flooding or water damage
– Fires
– Earthquakes or severe weather
– Power outages
– System failures
Example: A clinic in a flood-prone area discovered during their risk assessment that their server room had no elevated shelving and was located in the basement. Water damage could wipe out their entire patient database. They elevated critical hardware and improved drainage as a result.
Human Error and Unintentional Misuse:
– Data entry mistakes
– Emails sent to the wrong recipient
– Unencrypted devices lost or stolen
– Passwords written on sticky notes
– Failure to lock workstations
– Sharing login credentials
– Accidentally uploading patient data to public cloud storage
Example: One of the most common incidents we’ve seen is a clinician emailing a patient list to what they thought was a personal email, but it went to a lookalike address belonging to someone else. It happens. The safest organizations acknowledge this and build safeguards accordingly.
Malicious Attacks and Cybercrime:
– Ransomware and malware
– Phishing and social engineering
– Brute force attacks on passwords
– Denial of service attacks
– Insiders with access selling or sharing data
– Unpatched system vulnerabilities
Example: Ransomware is currently the dominant threat to healthcare. A single unpatched server or one employee clicking a malicious link can encrypt an entire patient database, making it inaccessible until a ransom is paid – or forever, if the organization won’t pay.
System and Equipment Failures:
– Hardware failures
– Software crashes
– Database corruption
– Backup failures
– Vendor discontinuing service
– Network connectivity loss
Example: We worked with a practice whose backup solution had silently failed six months earlier. They didn’t notice until their main database crashed and they had no recovery option. They recovered, barely, but it was a costly lesson in the importance of testing backup and recovery processes.
Vendor and Third-Party Risks:
– Vendor breach exposing your data
– Inadequate vendor security controls
– Vendor going out of business
– Inadequate or missing Business Associate Agreements
– Insecure data transfers to vendors
Pro tip: Don’t just identify threats in the abstract. For each major threat category, ask: “Is this likely in our environment?” and “What could trigger it?” Make it real and specific to your organization.
Step 4: Evaluate Your Current Safeguards
Now you assess what’s already protecting you. HIPAA organizes safeguards into three categories: Administrative, Physical, and Technical. This framework helps ensure you’re not missing any layer of protection.
Administrative Safeguards (Policies and Procedures):
These are the rules and processes that govern how patient data is accessed and protected.
- Do you have documented security policies? (They don’t need to be 200 pages, but they should exist and be accessible)
- Is there a Security Officer or designated security responsibility?
- Do staff members receive security training? (Required annually, and new hires should receive it during onboarding)
- Is there a documented process for granting and removing system access?
- Do you have a policy for acceptable use of IT resources?
- Is there an incident response plan? (What do people do if they suspect a breach?)
- Do you maintain an audit log of who accessed patient data and when?
- Are there documented procedures for handling patient requests for records?
- Do you have Business Associate Agreements (BAAs) with all vendors who touch patient data?
- Is there a risk assessment process itself? (Yes, you’re doing it right now!)
Physical Safeguards (The Physical World):
These protect the actual devices and locations where patient data exists.
- Are servers and networking equipment locked in a secured room?
- Are workstations password-protected?
- Do users lock their screens when stepping away? (Or is there a timeout setting?)
- Are confidential areas restricted to authorized personnel?
- Are patient reception areas designed so that sensitive information isn’t visible to waiting patients?
- Are printers, copiers, and fax machines in secure locations or equipped with access controls?
- Are windows covered if patient information might be visible from outside?
- Are mobile devices (laptops, tablets, phones) encrypted?
- Is there a device disposal procedure for old equipment?
- Are destroyed documents actually being shredded, or just thrown in a dumpster?
Real scenario: One health clinic we assessed had excellent digital security – encryption, strong passwords, audit logs. But their patient check-in area had a clipboard visible to anyone in the waiting room. Everyone could see every patient’s name, appointment time, and sometimes visit reason. They implemented a simple privacy board to block the view. Low-cost, high-impact improvement.
Technical Safeguards (The Tech Layer):
These are the digital controls that prevent unauthorized access and protect data in transit and at rest.
- Is patient data encrypted when stored? (At rest)
- Is patient data encrypted when transmitted over networks? (In transit – think SSL/TLS)
- Are firewalls in place and properly configured?
- Is antivirus software deployed and kept current?
- Are systems patched and updated regularly?
- Are user IDs unique and not shared?
- Are there strong password requirements, or better yet, multi-factor authentication?
- Are audit logs maintained and reviewed?
- Is there a vulnerability scanning or penetration testing program?
- Are database backups performed and tested regularly?
- Is access limited to what each person needs? (The principle of least privilege)
- Is PHI purged according to your retention policy?
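One item on the checklist above – backups that are performed but never tested – shows up again and again in assessments. A minimal way to spot-check a file-level backup is to compare checksums between the source and the backup copy. This sketch is an assumption-laden illustration, not a full recovery test (it doesn't exercise database restores or retention), but it catches silently missing or corrupted files:

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Hash a file in chunks so large files don't load into memory at once."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source_dir: Path, backup_dir: Path) -> list[str]:
    """Return relative paths that are missing or altered in the backup."""
    problems = []
    for src in source_dir.rglob("*"):
        if src.is_file():
            rel = src.relative_to(source_dir)
            dst = backup_dir / rel
            if not dst.exists() or file_sha256(src) != file_sha256(dst):
                problems.append(str(rel))
    return problems
```

An empty result means every source file exists in the backup with identical contents; anything returned is a file your recovery plan cannot currently restore.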
The Honest Assessment:
Be honest about what’s working and what’s not. Common gaps we see:
- No documented policies at all (we’ll fix this)
- Policies that exist but nobody follows (we’ll address this)
- Missing or outdated BAAs (we’ll remediate this)
- Inadequate encryption (we’ll upgrade this)
- Default or shared passwords (we’ll change this)
- No audit logging (we’ll implement this)
- Backup procedures that haven’t been tested (we’ll validate this)
- Staff who receive no security training (we’ll establish this)
None of these are deal-breakers. They’re all fixable. The point of the assessment is not to beat yourself up – it’s to identify where to focus your improvement efforts.
Step 5: Assess Risk Levels – Likelihood and Impact
Now comes the quantitative part. For each vulnerability you’ve identified, you need to assess its risk level using a simple formula:
Risk = Likelihood × Impact
This isn’t rocket science, but it does require honest, consistent judgment.
Likelihood: How probable is this threat in your environment?
- High (3): Very likely to occur. It’s already happened, or you have clear conditions that make it inevitable.
- Medium (2): Reasonably possible. You have some conditions that could trigger it, or it happens in your industry regularly.
- Low (1): Unlikely, but theoretically possible.
Impact: How serious would the damage be if this threat occurred?
- High (3): Would severely disrupt patient care, cause significant financial loss, trigger regulatory action, damage reputation, or expose sensitive data.
- Medium (2): Would cause operational disruption, financial loss, or expose some patient data.
- Low (1): Minor inconvenience, limited data exposure, easily recoverable.
Calculate the Risk Score:
| | High Impact (3) | Medium Impact (2) | Low Impact (1) |
|---|---|---|---|
| High Likelihood (3) | 9 (CRITICAL) | 6 (HIGH) | 3 (MEDIUM) |
| Medium Likelihood (2) | 6 (HIGH) | 4 (MEDIUM) | 2 (LOW) |
| Low Likelihood (1) | 3 (MEDIUM) | 2 (LOW) | 1 (LOW) |
Real Examples:
- Unencrypted laptop with patient data, at high risk of theft: Likelihood = High (3), Impact = High (3), Risk Score = 9 (CRITICAL)
- Lack of annual security training: Likelihood = High (3), Impact = Medium (2), Risk Score = 6 (HIGH)
- No multi-factor authentication on EHR login: Likelihood = Medium (2), Impact = High (3), Risk Score = 6 (HIGH)
- Minor software bug that occasionally duplicates patient records: Likelihood = Medium (2), Impact = Low (1), Risk Score = 2 (LOW)
- No documented incident response procedure: Likelihood = Medium (2), Impact = Medium (2), Risk Score = 4 (MEDIUM)
Pro tip: Document your reasoning. Write a sentence or two for each risk explaining why you assigned those likelihood and impact scores. “High likelihood because we’ve experienced this before” or “Medium impact because it would affect billing but not clinical care.” This helps when you’re prioritizing remediation and when you reassess next year.
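The likelihood × impact scoring above reduces to a few lines of code. This sketch reproduces the matrix and its severity bands exactly as shown in the table, which is handy if you want your risk register spreadsheet or script to compute scores consistently:

```python
def risk_score(likelihood: int, impact: int) -> tuple[int, str]:
    """Multiply a 1-3 likelihood by a 1-3 impact and map to a severity band."""
    assert likelihood in (1, 2, 3) and impact in (1, 2, 3)
    score = likelihood * impact
    if score == 9:
        band = "CRITICAL"
    elif score == 6:
        band = "HIGH"
    elif score in (3, 4):
        band = "MEDIUM"
    else:  # scores 1 and 2
        band = "LOW"
    return score, band

# The unencrypted-laptop example from above:
print(risk_score(3, 3))  # → (9, 'CRITICAL')
```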
Step 6: Build Your Remediation Plan – From Risk to Action
This is where your SRA becomes actionable. Risk identification is meaningless without a plan to address it.
Prioritization Strategy:
Start with your Critical (score 9) and High (score 6+) risks. These get addressed first. Don’t try to fix everything at once â that’s how projects fail. A realistic, phased approach beats an overambitious plan that stalls.
For Each High/Critical Risk, Determine Your Approach:
- Mitigate (reduce the risk): Implement controls to reduce the likelihood or impact. Example: Encrypt laptops (reduces impact of theft), implement MFA (reduces likelihood of unauthorized access).
- Accept (acknowledge and monitor): You understand the risk and accept it because the cost of mitigation outweighs the benefit. Example: A small practice might accept low risk of natural disaster affecting their server room if they have excellent backups. They’re not eliminating the risk, but they’re protecting against the impact.
- Transfer (move the risk elsewhere): Insurance or contractual responsibility. Example: Cyber liability insurance transfers part of the financial risk.
- Avoid (eliminate the risk entirely): Stop doing the thing that creates the risk. Example: If paper-based patient records are a security liability, digitize them.
Most risks are mitigated, not eliminated.
Create Your Remediation Plan:
For each high-priority risk, document:
- Risk Description: What’s the vulnerability? (e.g., “Patient laptops are not encrypted”)
- Current State: What’s the situation today? (e.g., “0 of 5 laptops have encryption enabled”)
- Target State: What’s the goal? (e.g., “100% of laptops with patient data encrypted with AES-256 or equivalent”)
- Remediation Action: What specific steps will you take? (e.g., “Deploy BitLocker to all Windows laptops, FileVault to all Macs”)
- Owner: Who’s responsible? (Assign to a specific person)
- Timeline: When will this be complete? (Be realistic. 30 days is tight; 90 days is more achievable)
- Budget/Resources: What does this cost in money or time? (Help justify the effort)
- Success Metric: How will you know when it’s done? (e.g., “100% of laptops encrypted, verified by audit”)
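The fields above map naturally onto a simple structured record. This sketch is one possible way to capture them (the field names are our own shorthand, and the example values come from the unencrypted-laptop scenario used earlier):

```python
from dataclasses import dataclass, asdict

@dataclass
class RemediationItem:
    """One entry in the remediation plan, mirroring the fields listed above."""
    risk_description: str
    current_state: str
    target_state: str
    action: str
    owner: str
    timeline_days: int
    budget_note: str
    success_metric: str
    complete: bool = False

item = RemediationItem(
    risk_description="Patient laptops are not encrypted",
    current_state="0 of 5 laptops have encryption enabled",
    target_state="100% of laptops with patient data encrypted",
    action="Deploy BitLocker to Windows laptops, FileVault to Macs",
    owner="IT Administrator",
    timeline_days=90,
    budget_note="Staff time only; encryption tools are built into the OS",
    success_metric="100% of laptops encrypted, verified by audit",
)
```

Because `asdict(item)` turns each record into a plain dictionary, a list of these items exports cleanly to the CSV risk register discussed later.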
Create a Timeline:
- Immediate (1-30 days): Quick wins and critical issues. Policy updates, security training, basic access control fixes.
- Short-term (1-3 months): Implementation of major controls. Encryption deployment, access control changes, system upgrades.
- Medium-term (3-6 months): Vendor assessments, process changes, larger technology upgrades.
- Long-term (6-12 months): Advanced controls, comprehensive monitoring, culture change initiatives.
You’re not trying to become perfect overnight. You’re trying to follow a logical progression that balances quick wins with sustainable improvements.
Step 7: Document Everything – Your Evidence
Here’s a hard truth: if it’s not documented, it doesn’t count.
Regulators, auditors, and breach investigators are going to look at your documentation. They want to see that you thought about risks, that you assessed your vulnerabilities, and that you acted on your findings. Your risk assessment document is your evidence that you took HIPAA seriously.
What to Document:
- The SRA Report itself: Your full risk assessment with scope, inventory, threats, vulnerabilities, risk scores, and remediation plan.
- Supporting Materials: System diagrams, network architecture, vendor contracts and BAAs, current policies, interview notes.
- Risk Register: A spreadsheet or database that tracks all identified risks, scores, ownership, and remediation progress.
- Remediation Log: As you address each risk, document what you did, when, and how you verified completion.
- Reassessment Records: When you conduct your annual reassessment or respond to significant changes, document the process and findings.
Level of Detail:
The HHS Security Risk Assessment Tool (which we recommend using) is an excellent template. Your SRA doesn’t need to be a 200-page tome, but it should be detailed enough that someone could review it a year from now and understand your reasoning.
Aim for 15-50 pages, depending on organization size. Include:
– Executive summary (2-3 pages)
– Methodology (1 page)
– Scope and inventory (3-5 pages)
– Threat and vulnerability analysis (5-10 pages)
– Risk assessment matrix (1-2 pages)
– Remediation plan (3-5 pages)
– Appendices (policies, diagrams, supporting documents)
How Long to Keep It:
The HIPAA Security Rule requires you to retain your risk assessment documentation for a minimum of 6 years. Some organizations keep assessments longer for historical comparison and trend analysis. We recommend keeping at least the last 3 years readily accessible for year-over-year comparison, with older assessments archived to satisfy the 6-year retention requirement.
Make It Useful:
Create a version you use internally and operationally. This is your risk register – the living document you update as you remediate. This is different from the formal SRA report, which is more like a snapshot in time. Your risk register should be accessible to the team, updated monthly, and reviewed quarterly in team meetings. “Where are we on remediation? What blockers do we have? What’s next?”
Step 8: Monitor, Review, and Repeat – Building Continuous Compliance
Your risk assessment isn’t a one-time event. It’s the beginning of an ongoing cycle.
When to Reassess:
- Annually (minimum): HIPAA requires you to periodically conduct your SRA. Most organizations do this annually.
- After significant changes: New system implementation, vendor change, new workflow, new location, significant staff expansion.
- After security incidents: If you experience any type of data breach or security event, a reassessment is mandatory.
- After failed audits or findings: If an external auditor identifies gaps, you’ll need to validate improvements.
- When required by regulators: If you’re audited by HHS OCR, they may require a new SRA.
Building a Compliance Culture:
The best organizations don’t just complete their risk assessment and shelve it. They build it into their operations:
- Share findings with staff: Help employees understand the risks you’re protecting against. This builds buy-in for security practices.
- Include compliance in team meetings: “Last quarter, we mitigated 3 high-risk vulnerabilities. This quarter, we’re focusing on vendor assessment.”
- Make it everyone’s job: Security isn’t just IT’s responsibility. Clinicians, admin staff, and leadership all play a role.
- Celebrate progress: When you complete a major remediation, acknowledge it. “This month, we encrypted all patient devices – excellent work, team.”
- Learn from incidents: If something goes wrong, use it as a teaching moment. “Here’s what happened, here’s what we learned, here’s what we’re changing.”
Updating Your Remediation Plan:
As you complete remediation actions:
- Mark them complete with completion date and verification method
- Adjust timelines if needed – delays are normal
- Add new risks as they’re identified
- Review and adjust risk scores based on changes you’ve made
- Identify new vulnerabilities as your environment evolves
Your risk assessment should never be “finished” – it should be part of your operating rhythm.
Common Pitfalls – Mistakes to Avoid
We’ve seen organizations make these mistakes. You don’t have to.
1. Scope Too Narrow
The mistake: Assessing only your main EHR system and ignoring everything else.
Why it matters: Patient data exists everywhere – email, backup systems, vendor systems, cloud services, mobile devices. Missing these creates blind spots.
How to avoid it: Use the HHS SRA Tool’s scoping worksheet. Ask: “Where does patient data touch our organization?” If you’re unsure, assume it’s in scope.
2. Ignoring Vendor and Third-Party Risks
The mistake: Conducting a thorough assessment of your own systems, but not asking vendors about theirs.
Why it matters: A vendor breach can expose your patient data even if your own security is excellent. You’re only as secure as your weakest link.
How to avoid it: Request audit reports or security certifications from vendors (SOC 2, etc.). Have signed Business Associate Agreements. Ask specific questions about their encryption, access controls, and breach notification procedures.
3. Risk Scores That Don’t Match Reality
The mistake: Assigning scores based on “what should be” rather than “what is.”
Why it matters: If you underestimate risk, you won’t allocate resources properly. If you overestimate, you’ll create a remediation plan you can’t execute.
How to avoid it: Be brutally honest. If you don’t have encryption, score it as high risk. If staff aren’t getting trained, that’s a high-likelihood threat. Don’t score based on your policies; score based on your actual practices.
4. Remediation Plans That Aren’t Realistic
The mistake: Planning to fix everything in 30 days when you don’t have the budget or resources.
Why it matters: Unrealistic timelines lead to missed deadlines, demoralized teams, and incomplete remediation. Better to be honest and achieve 80% than promise 100% and deliver 40%.
How to avoid it: Break large projects into phases. Get budget approval before committing to timelines. Build in realistic time for vendor implementation, staff training, and testing. A 90-180 day remediation cycle is more sustainable than 30 days.
5. Forgetting About Compliance Culture
The mistake: Completing the SRA, addressing the technical fixes, and expecting culture to change on its own.
Why it matters: Most healthcare breaches involve human error or insider threats. You can have perfect technology and still fail if staff don’t understand why security matters.
How to avoid it: Train regularly. Share incident stories (sanitized, obviously). Make security part of your hiring and onboarding. Recognize good security practices. Discuss compliance in team meetings.
6. No Documentation, No Proof
The mistake: Conducting a thorough assessment verbally, with notes scattered across emails and sticky notes.
Why it matters: When you’re audited or investigated, your documentation is your defense. “We knew about this risk and deliberately accepted it” is a valid position if you can prove it.
How to avoid it: Create a formal SRA report. Maintain a risk register. Document remediation progress. Keep this for at least 6 years.
Tools and Resources – What You’ll Actually Use
You don’t need to start from scratch. Here are the standard tools and frameworks:
The HHS Security Risk Assessment Tool:
The Department of Health and Human Services offers a free SRA tool template at hitech.hrsa.gov. It’s comprehensive, HIPAA-aligned, and specifically designed for healthcare organizations. Start here.
NIST SP 800-66r2 (HIPAA Security Rule Implementation Guide):
This NIST publication translates the HIPAA Security Rule into practical controls. It’s the bridge between regulatory language and real-world implementation. Available free at nvlpubs.nist.gov.
NIST SP 800-30 (Guide for Conducting Risk Assessments):
If you want deep-dive methodology on risk assessment itself, this is the gold standard. It’s detailed and technical, but it’s the framework most organizations use.
Spreadsheet or Risk Management Software:
You can absolutely use Excel. Create a Risk Register with columns for: Risk ID, Description, Likelihood, Impact, Risk Score, Owner, Due Date, Status, Notes. Larger organizations often use dedicated risk management platforms (AuditBoard, LogicGate, etc.), but honestly, a well-organized spreadsheet gets the job done.
Your Existing Documentation:
– Current IT inventory
– System diagrams and network documentation
– Existing security policies
– Incident logs
– Vendor contracts and BAAs
These are your foundation. If you don’t have them, creating them is part of your risk assessment.
Frequently Asked Questions
Q: How long does a HIPAA risk assessment actually take?
A: For a small practice (under 50 people), budget 40-80 hours spread over 4-8 weeks. For a larger organization, 100-200+ hours. This isn’t an 8-hour sprint; it’s a thoughtful process. If you’re trying to complete it in less than a week, you’re either cutting corners or you have unusual resources.
Q: Do we need to hire an external consultant?
A: Not necessarily. If you have IT expertise, compliance knowledge, and leadership buy-in internally, you can conduct your own SRA. Many organizations do this successfully. That said, an external consultant brings perspective and experience. If this is your first time, or if you’re a small organization with limited internal expertise, a consultant can be valuable. Budget $5,000-$20,000 depending on scope. Many consultants also offer SRA support on an hourly basis, so you can do some of the work internally and outsource the complex parts.
Q: What if we discover major vulnerabilities we can’t afford to fix immediately?
A: This is common and doesn’t mean you’re non-compliant. HIPAA requires you to conduct the assessment and develop a remediation plan. It doesn’t require you to fix everything instantly. Document the risk, prioritize it, and include it in your remediation plan with realistic timelines. Get buy-in from leadership. Communicate with your compliance attorney if you have concerns. The worst approach is to ignore known risks and hope nobody finds out.
Q: How often should we update our risk assessment?
A: Minimum annually. But realistically, you should be updating your risk register quarterly (quick reviews) and conducting a full reassessment annually or after significant changes. Organizations that wait 2-3 years between full assessments usually have nasty surprises.
Q: Who should see the final risk assessment report?
A: Your executive leadership and board should see a summary. Your operational teams should understand the risks that affect their area. Your full SRA report (with detailed vulnerabilities) should be protected and available only to those who need it. It’s sensitive information – in the wrong hands, it’s a roadmap for attackers.
Your Next Step: You’ve Got This
Conducting a HIPAA risk assessment feels daunting the first time. All those regulations, all those systems, all those potential vulnerabilities. But here’s what we’ve learned: organizations of all sizes can do this successfully.
The process itself is logical: Define what you’re assessing. Inventory what you have. Identify what could go wrong. Evaluate your current protections. Score the risks. Build a remediation plan. Document everything. Monitor progress.
You don’t need to be perfect. You need to be thoughtful, honest, and committed to continuous improvement.
Start with the HHS SRA Tool. Bring together your team. Walk through the eight steps we’ve outlined. Document your findings. Create a realistic remediation plan. And then get to work on your highest-priority risks.
Your patients’ data deserves this attention. Your organization’s reputation depends on it. And honestly, you’ll sleep better knowing you’ve done the work.
Have questions about your specific risk assessment? Need help developing your remediation plan? That’s where Medcurity comes in. We’ve helped healthcare organizations of every size conduct successful risk assessments and implement sustainable compliance programs. Talk to our team about your security risk analysis needs.
References:
- 45 CFR § 164.308(a)(1)(ii)(A) – HIPAA Security Rule: Conduct risk assessment
- 45 CFR § 164.308(a)(1)(ii)(B) – HIPAA Security Rule: Implement security measures based on risk assessment
- NIST SP 800-30 Rev. 1 – Guide for Conducting Risk Assessments
- NIST SP 800-66 Rev. 2 – An Introductory Resource Guide for Implementing the HIPAA Security Rule
- HHS Security Risk Assessment Tool – hitech.hrsa.gov