Top 10 Nonconformities in ISO 27001 Audits

1. Risk Assessment Methodology That Exists Only on Paper

This is the single most common major nonconformity I raise, and it stems from a fundamental misunderstanding of Clause 6.1.2 requirements. Organizations create beautiful risk assessment methodologies—sometimes 30+ page documents with elaborate matrices, scoring systems, and workflows—then proceed to ignore them entirely when actually conducting assessments.

I audited a mid-sized fintech company where the documented methodology specified a 5x5 likelihood/impact matrix with clearly defined criteria for each level. When I reviewed their actual risk register, I found risks scored with decimal values (3.7 likelihood, 2.3 impact), risks scored on a 1-3 scale instead of 1-5, and no evidence that anyone had consulted the criteria definitions. When I asked the risk owner how they determined a particular threat had "high likelihood," they shrugged and said, "It felt right."

The fix isn't complicated: either follow your methodology or change your methodology to reflect what you actually do. Both are acceptable under Clause 6.1.2. What's not acceptable is documented processes that don't match reality.
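Checking whether practice matches methodology doesn't require tooling, but a small script makes the gap visible fast. Here's a minimal sketch in Python, assuming the risk register can be exported as a list of records; the field names ("likelihood", "impact", "risk_id") are illustrative, not from any standard:

```python
# Hypothetical consistency check: flag risk-register entries that don't
# conform to a documented 5x5 methodology (integer scores from 1 to 5).
# Field names are assumptions about the export format.

VALID_SCORES = {1, 2, 3, 4, 5}

def find_methodology_deviations(risk_register):
    """Return entries whose scores fall outside the documented 5x5 scale."""
    deviations = []
    for entry in risk_register:
        for field in ("likelihood", "impact"):
            score = entry.get(field)
            # Decimal values like 3.7, or scores outside 1-5, are evidence
            # that the documented criteria were not actually applied.
            if not isinstance(score, int) or score not in VALID_SCORES:
                deviations.append((entry.get("risk_id"), field, score))
    return deviations

register = [
    {"risk_id": "R-01", "likelihood": 4, "impact": 5},    # conforms
    {"risk_id": "R-02", "likelihood": 3.7, "impact": 2},  # decimal score
    {"risk_id": "R-03", "likelihood": 2, "impact": 7},    # off-scale
]
print(find_methodology_deviations(register))
```

Run against the fintech register described above, a check like this would have flagged every deviation before the auditor did.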

2. Orphaned Controls in the Statement of Applicability

The Statement of Applicability (SoA) requirement under Clause 6.1.3(d) demands that organizations include necessary controls with clear justifications for inclusion or exclusion. What I routinely find is a complete disconnect between the risk treatment process and the SoA compilation.

Organizations select controls from Annex A based on what seems sensible rather than what their risk assessment actually demands. I'll ask, "Why did you implement Control 8.12 Data leakage prevention?" and receive blank stares followed by, "Because it's in the standard?" That's not how this works. Every control should trace back to a specific identified risk, legal requirement, or contractual obligation.

Even worse is when organizations exclude controls without adequate justification. "Not applicable" is not a justification—it's a cop-out. I recently raised a minor nonconformity when an organization excluded Control 5.19 Information security in supplier relationships with the justification "We don't have suppliers." They had 47 active SaaS subscriptions. They had suppliers—they just hadn't recognized cloud services as part of a supply chain subject to Control 5.20, Addressing information security within supplier agreements.
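The traceability requirement lends itself to a mechanical check. The sketch below assumes the SoA is exported as a list of rows with "applicable", "drivers", and "justification" fields; that layout is an assumption for illustration, not a prescribed format:

```python
# Illustrative traceability check: every control marked applicable should
# trace to at least one risk, legal, or contractual driver, and every
# exclusion needs a real justification. Field names are assumptions.

EMPTY_JUSTIFICATIONS = ("", "not applicable", "n/a")

def untraceable_controls(soa):
    """Return controls with missing drivers or cop-out exclusions."""
    problems = []
    for row in soa:
        if row["applicable"] and not row.get("drivers"):
            problems.append((row["control"], "no risk/legal/contractual driver"))
        if not row["applicable"] and \
                row.get("justification", "").strip().lower() in EMPTY_JUSTIFICATIONS:
            problems.append((row["control"], "exclusion lacks justification"))
    return problems

soa = [
    {"control": "8.12", "applicable": True, "drivers": ["RISK-014"]},
    {"control": "5.19", "applicable": False, "justification": "Not applicable"},
]
print(untraceable_controls(soa))
```

Anything this flags is exactly the kind of entry that draws a "because it's in the standard?" answer during an audit.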

3. Management Review That Ticks Boxes Without Adding Value

Clause 9.3 requires management review at planned intervals with specific inputs that must be considered: status of actions from previous reviews, changes in external and internal issues, feedback on information security performance, results of risk assessment, and opportunities for continual improvement.

What I typically find are meeting minutes that read like a checkbox exercise. "Risk assessment status: reviewed. Audit findings: discussed. Security metrics: presented." There's no evidence of actual management decision-making, no documented outputs, no resource allocation decisions, no strategic direction.

In one memorable audit, the entire ISMS performance discussion was captured as: "The ISMS continues to perform adequately." When I asked what metrics were presented, the management representative couldn't recall. When I asked what decisions were made, they pointed to the single action item: "Continue monitoring." This earned a major nonconformity because management review is supposed to be the steering mechanism for the entire ISMS—not a ceremonial rubber-stamping exercise.

4. Incomplete or Absent Competence Evidence

Clause 7.2 requires organizations to determine necessary competence, ensure persons are competent based on education, training, or experience, take actions to acquire competence where needed, and retain documented information as evidence. Most organizations nail the first part—they define required competencies in job descriptions—but fail spectacularly on maintaining evidence.

I regularly ask, "Show me evidence that your database administrator is competent to manage the security configurations on your SQL servers." The response is often a blank stare followed by, "Well, they've been doing it for five years." That's not evidence—that's assumption. I need to see training certificates, professional qualifications, documented skills assessments, or at minimum, records of successful completion of relevant security training modules.

The ISO 27001 SME guide emphasizes that competence evidence doesn't need to be elaborate—a simple spreadsheet tracking relevant training, certifications, and experience can suffice. But it must exist and be current.
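That "simple spreadsheet" can be kept honest with an equally simple currency check. This sketch assumes one record per person per competence and a 24-month validity window; both the window and the field names are my assumptions, not requirements of Clause 7.2:

```python
from datetime import date, timedelta

# Minimal sketch of the spreadsheet approach: flag competence evidence
# older than a chosen validity window. The 730-day window is an assumption.

VALIDITY = timedelta(days=730)

def stale_records(records, today):
    """People whose most recent evidence is older than the validity window."""
    return [r["person"] for r in records if today - r["evidenced_on"] > VALIDITY]

records = [
    {"person": "DBA", "competence": "SQL Server hardening",
     "evidence": "vendor security course", "evidenced_on": date(2021, 3, 1)},
    {"person": "SecEng", "competence": "incident handling",
     "evidence": "GCIH certificate", "evidenced_on": date(2024, 9, 1)},
]
print(stale_records(records, today=date(2025, 1, 1)))
```

"They've been doing it for five years" becomes auditable the moment each row carries a dated piece of evidence.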

5. Superficial Supplier Security Management

Organizations consistently underestimate the scope of Control 5.19 and Control 5.20. They focus on traditional IT suppliers while completely ignoring the security implications of their cloud services, facilities management, cleaning services, or temporary staffing agencies.

I audited a healthcare organization that had rigorous supplier security assessments for their main EMR vendor but no security evaluation whatsoever for the cloud backup service storing patient data, the document destruction company handling confidential records, or the cleaning company with after-hours access to workstations. When I pointed this out, they genuinely hadn't considered these as "information security suppliers."

The cross-reference to ISO 27036 for supplier relationship security becomes crucial here. Organizations need to map all suppliers that could impact information security—not just the obvious technology vendors.
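One way to do that mapping is to classify every external party by what it can touch, not by whether it looks like an IT vendor. The categories and examples below mirror the healthcare case above; the data shape is an assumption for the sketch:

```python
# Hedged illustration: a supplier needs a security evaluation if it handles
# information, accesses systems, or has physical access to facilities.
# Field names and the triage rule are illustrative assumptions.

def security_relevant(suppliers):
    """Suppliers warranting evaluation under Controls 5.19/5.20."""
    return [s["name"] for s in suppliers
            if s["handles_data"] or s["system_access"] or s["physical_access"]]

suppliers = [
    {"name": "EMR vendor",   "handles_data": True,  "system_access": True,  "physical_access": False},
    {"name": "Cloud backup", "handles_data": True,  "system_access": False, "physical_access": False},
    {"name": "Cleaning co.", "handles_data": False, "system_access": False, "physical_access": True},
    {"name": "Landscaping",  "handles_data": False, "system_access": False, "physical_access": False},
]
print(security_relevant(suppliers))
```

Applied to the healthcare organization above, this triage would have surfaced the backup service, the shredding company, and the cleaners on day one.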

6. Access Rights Management Without Regular Reviews

Control 5.18 requires regular review of access rights, but "regular" doesn't mean "whenever we remember." I've seen organizations with documented quarterly access reviews where the last actual review was eighteen months ago. When challenged, they often claim they review access "as needed" or "when people leave."

Reactive access management isn't access management—it's damage control. I need to see evidence of systematic, scheduled reviews that verify users still need their current access levels. This is particularly critical for privileged access rights, covered under Control 8.2.

In one financial services audit, I found a contractor who'd left the organization six months earlier but still had administrative access to production databases. The organization's "access review" process consisted of HR notifying IT when someone left—but only for permanent employees. Contractors fell through the cracks entirely.
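The contractor gap is exactly what a scheduled reconciliation catches: compare active accounts against the full roster, employees and contractors alike. The data shapes below are assumptions for the sketch, not any particular IAM or HR system's format:

```python
# Hedged sketch: reconcile active accounts against the combined HR roster
# so leavers with lingering access surface in a scheduled review rather
# than by accident. Field names are illustrative assumptions.

def lingering_access(active_accounts, current_people):
    """Accounts whose owner no longer appears in the roster."""
    roster = {p["id"] for p in current_people}
    return sorted(a["owner"] for a in active_accounts if a["owner"] not in roster)

accounts = [
    {"owner": "emp-101", "role": "db_admin"},
    {"owner": "ctr-442", "role": "db_admin"},  # contractor who left months ago
]
people = [{"id": "emp-101", "type": "employee"}]  # contractor feed missing
print(lingering_access(accounts, people))
```

The design point is the roster: if the contractor feed never reaches it, the review silently inherits the same blind spot the process had.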

7. Asset Management That's Just an Excel Spreadsheet

Control 5.9 requires an inventory of information and other associated assets, but organizations often interpret this as "list the servers somewhere." A proper asset inventory must include ownership, classification levels as required by Control 5.12, and the acceptable use and handling rules covered by Control 5.10.

I regularly find comprehensive hardware inventories maintained by facilities teams but zero visibility into information assets like databases, file shares, or cloud repositories. The disconnect becomes critical when trying to trace how sensitive data flows through the organization—you can't protect what you don't know exists.
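A completeness check over the inventory exposes the "hardware list" problem immediately. The required attributes below follow the ones discussed above; the record layout is an assumption for illustration:

```python
# Illustrative completeness check: flag inventory entries missing the
# attributes a proper asset record should carry. Field names are assumptions.

REQUIRED_FIELDS = ("owner", "classification", "handling")

def incomplete_assets(inventory):
    """Assets missing any required attribute, with the gaps listed."""
    return [(a.get("name"), [f for f in REQUIRED_FIELDS if not a.get(f)])
            for a in inventory
            if any(not a.get(f) for f in REQUIRED_FIELDS)]

inventory = [
    {"name": "hr-db", "owner": "HR Ops", "classification": "confidential",
     "handling": "encrypt at rest"},
    {"name": "shared-drive"},  # typical facilities-style entry: name only
]
print(incomplete_assets(inventory))
```

An entry with a name and nothing else is a hardware list row, not an asset record—and this check makes that distinction countable.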

8. Incident Response That's Never Been Tested

Control 5.26 requires response to information security incidents, but most incident response procedures I review are theoretical frameworks that have never faced real-world testing. Organizations document elaborate escalation matrices and communication trees but haven't validated whether these procedures actually work under pressure.

During one audit, I asked to see evidence of incident response testing. The security manager proudly showed me their tabletop exercise from two years prior—a discussion-based scenario that lasted 45 minutes. When I asked about technical response procedures, communications testing, or coordination with external parties, I got more blank stares.

The ISO 27035 series provides detailed guidance on incident management that goes far beyond what most organizations implement. Regular testing isn't optional—it's the only way to verify your procedures work when they're needed most.
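Keeping testing on schedule is itself a small record-keeping exercise. The sketch below tracks the last run of each exercise type against a target interval; the exercise types and intervals are my assumptions, not requirements from ISO 27035:

```python
from datetime import date

# Sketch of a testing tracker: record the last exercise of each kind and
# flag anything older than its target interval. Intervals are assumptions.

TARGET_MONTHS = {"tabletop": 12, "technical": 12, "comms": 6}

def months_between(earlier, later):
    return (later.year - earlier.year) * 12 + later.month - earlier.month

def overdue_exercises(last_run, today):
    """Exercise types never run, or run longer ago than their target."""
    return sorted(kind for kind, limit in TARGET_MONTHS.items()
                  if kind not in last_run
                  or months_between(last_run[kind], today) > limit)

last_run = {"tabletop": date(2023, 1, 15)}  # the 45-minute session, two years back
print(overdue_exercises(last_run, today=date(2025, 1, 20)))
```

A one-off tabletop two years ago shows up here as three overdue items, which is the honest picture.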

9. Change Management That Bypasses Security Reviews

Control 8.32 requires change management procedures that include information security implications, but I consistently find change processes where security reviews are optional, inconsistent, or entirely absent for "minor" changes.

In a manufacturing audit, I discovered that "emergency" changes—which accounted for nearly 40% of all production changes—automatically bypassed all security reviews. When I asked what constituted an emergency, the criteria were so vague that almost anything qualified. Network configuration changes, software updates, and even new system deployments had all been pushed through as "emergencies" to avoid the security review process.
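Two numbers tell this story in any audit: what fraction of changes take the emergency path, and how many of those received any security review at all. The sketch below computes both; the change-record fields are assumptions for illustration:

```python
# Illustrative metric: emergency-path usage and unreviewed emergency changes.
# Field names ("type", "security_review") are assumptions about the log format.

def emergency_bypass_rate(changes):
    """Return (share of changes marked emergency, unreviewed emergency IDs)."""
    emergencies = [c for c in changes if c["type"] == "emergency"]
    unreviewed = [c["id"] for c in emergencies if not c.get("security_review")]
    return len(emergencies) / len(changes), unreviewed

changes = [
    {"id": "CHG-1", "type": "normal",    "security_review": True},
    {"id": "CHG-2", "type": "emergency", "security_review": False},
    {"id": "CHG-3", "type": "emergency", "security_review": False},
    {"id": "CHG-4", "type": "normal",    "security_review": True},
    {"id": "CHG-5", "type": "emergency", "security_review": True},
]
rate, unreviewed = emergency_bypass_rate(changes)
print(round(rate, 2), unreviewed)
```

A 40% emergency rate, as in the manufacturing case above, is not an incident statistic—it's a process design flaw, and this metric makes it impossible to miss.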

10. Internal Audit That's Actually Consulting

Clause 9.2 requires internal audits at planned intervals, but many organizations confuse auditing with consulting. I've reviewed internal audit reports that read like improvement recommendations rather than objective assessments of conformity.

Internal auditors should be asking, "Does this comply with our documented procedures?" not "How can we make this better?" When I see audit findings phrased as "Consider implementing additional controls to enhance security posture," I know the auditor has missed the point entirely. Audit findings should clearly state whether requirements are met or not met, with objective evidence to support the conclusion.

The guidance in ISO 27007 emphasizes that internal audits must maintain independence and objectivity. Auditors who spend their time providing improvement suggestions aren't auditing—they're consulting.

What the Auditor Looks For

When I'm conducting certification or surveillance audits, I'm looking for specific evidence that requirements are not just documented but actually implemented:

  • Risk Assessment: Evidence that the documented methodology was actually followed, with clear traceability from risk identification through treatment decisions
  • Statement of Applicability: Clear justification for each control inclusion/exclusion that traces back to specific risks, legal requirements, or business needs
  • Management Review: Meeting minutes that show actual management decisions, resource allocations, and strategic direction based on ISMS performance data
  • Competence: Current training records, certifications, or skills assessments for personnel with information security responsibilities
  • Supplier Management: Security assessments for all suppliers that could impact information security, not just obvious IT vendors
  • Access Reviews: Dated evidence of systematic access rights reviews with clear approval trails
  • Asset Management: Comprehensive inventories that include both physical and information assets with appropriate classification
  • Incident Response: Evidence of testing, not just documentation, with records of lessons learned and improvements
  • Change Management: Security review checkpoints embedded in all change procedures, with no blanket exemptions
  • Internal Audit: Objective findings focused on conformity assessment rather than improvement suggestions

Common Mistake Scenario

Here's a scenario I encounter repeatedly: An organization implements comprehensive security controls but fails to maintain the documented evidence required by the standard. Their technical security is excellent—firewalls properly configured, access controls working effectively, backup procedures functioning—but they can't demonstrate compliance because the supporting documentation is either missing, outdated, or inconsistent with actual practices.

In one technology company audit, their information security was genuinely robust from a technical perspective, but I raised eight nonconformities because they couldn't provide evidence of management review decisions, risk treatment plans were three versions behind current practice, and their competence records hadn't been updated in two years. The lesson: implementing good security isn't enough—you must maintain evidence that demonstrates compliance with your documented ISMS.

Understanding these common pitfalls can help you focus your ISMS maintenance efforts on areas where nonconformities are most likely to arise. Remember, the standard requires both effective implementation and documented evidence of that implementation.

For more detailed guidance on avoiding these common nonconformities and maintaining an effective ISMS, connect with our community of ISO 27001 practitioners at the IX ISO 27001 Info Hub or consider scheduling a consultation to review your current implementation approach.

Need personalized guidance? Reach our team at ix@isegrim-x.com.

