Chapter 7 of 8

Inside the SOC 2 Report: Opinions, Findings, and Use

Walks through how to read and interpret a SOC 2 report, including the auditor’s opinion, description of tests and results, and how customers use the report for risk assessments.

15 min read

1. Orienting Yourself: What Is a SOC 2 Report For?

A SOC 2 report is an independent assurance report about a service organization’s controls related to the AICPA Trust Services Criteria (TSC): Security (required), and optionally Availability, Processing Integrity, Confidentiality, and Privacy.

You’ve already learned how controls are designed and how auditors test them. In this module, we focus on how to read the finished report and how customers use it for risk assessments.

Typical SOC 2 Report Structure (Type II example)

When you open a SOC 2 report, you’ll usually see:

  1. Independent Service Auditor’s Report (the opinion letter)
  2. Management’s Assertion
  3. System Description (what the service does and how it works)
  4. Description of Controls, Tests of Controls, and Results
  5. Complementary User Entity Controls (CUECs)
  6. Subservice Organizations and Carve‑outs
  7. Other Information (e.g., management responses, additional metrics)

For this module, we’ll zoom in on:

  • The auditor’s opinion (Sections 2–3)
  • The tests and results section (Sections 4–6)
  • How customers use the report for third‑party risk (Sections 7–10)

> Today’s context (December 2025): SOC 2 reports are still issued under the AICPA Attestation Standards (AT‑C sections 105 & 205) and the Trust Services Criteria (2017 TSC, updated periodically). Many cloud and SaaS vendors now treat SOC 2 as a baseline for security assurance, often supplemented with ISO 27001, CSA STAR, or similar frameworks.

2. Finding and Decoding the Auditor’s Opinion

The Independent Service Auditor’s Report is usually in the first few pages. It answers two core questions:

  1. Were the controls suitably designed (and, for Type II, operating effectively) for the period?
  2. Did the system description fairly present the service as of the date / during the period?

Look for a paragraph that starts with something like:

> In our opinion, in all material respects…

This is the opinion paragraph, and it references:

  • The criteria used (e.g., AICPA Trust Services Criteria for Security, Availability, and Confidentiality).
  • The type of report (Type I: design at a point in time; Type II: design and operating effectiveness over a period, often 12 months).
  • The period covered (e.g., October 1, 2024 through September 30, 2025).

Practical reading tip

When you first receive a SOC 2:

  1. Skim to the opinion paragraph and identify the opinion type.
  2. Confirm the period matches your vendor risk needs (e.g., covers your go‑live date).
  3. Note the TSC categories included (Security only vs Security + others).

You’ll use this snapshot to decide how much reliance you can place on the report.
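The three-step snapshot above can be sketched as a small triage record. This is an illustrative sketch only: `OpinionSnapshot` and `covers_go_live` are hypothetical names, not part of any standard schema or tool.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical triage record for the three skim steps above;
# field names are illustrative, not from any standard schema.
@dataclass
class OpinionSnapshot:
    opinion_type: str           # "unmodified", "qualified", "adverse", "disclaimer"
    period_start: date          # start of the Type II coverage period
    period_end: date            # end of the Type II coverage period
    tsc_categories: list[str]   # e.g. ["Security", "Availability"]

def covers_go_live(snapshot: OpinionSnapshot, go_live: date) -> bool:
    """Step 2: does the report's coverage period include our go-live date?"""
    return snapshot.period_start <= go_live <= snapshot.period_end

snap = OpinionSnapshot(
    opinion_type="unmodified",
    period_start=date(2024, 10, 1),
    period_end=date(2025, 9, 30),
    tsc_categories=["Security", "Availability"],
)
print(covers_go_live(snap, date(2025, 6, 1)))  # True
```

A go-live date after the period end would return `False`, which is your cue to ask the vendor about bridge letters or the timing of the next report.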

3. Types of Auditor Opinions: Clean vs Problematic

Auditors use standard opinion types. These terms are widely used beyond SOC 2 (e.g., in financial audits) and are still current as of 2025.

1. Unmodified Opinion ("Clean")

  • Wording: In our opinion, in all material respects…
  • Meaning: The auditor found that:
    • The system description is fairly presented, and
    • Controls were suitably designed (and for Type II, operated effectively throughout the period),
    • With no material issues identified.
  • Interpretation: This is the best outcome. There might still be some exceptions, but none that are material.

2. Qualified Opinion

  • Wording: Similar to unmodified, but with “except for…” describing one or more specific issues.
  • Meaning: Most things are OK, but there is at least one material issue (e.g., a key control didn’t operate effectively).
  • Interpretation: Partial reliability. You need to understand the scope and impact of the exception.

3. Adverse Opinion

  • Wording: Indicates that the criteria were not met in all material respects.
  • Meaning: The auditor believes there are widespread, material problems with design or operating effectiveness.
  • Interpretation: High risk. Customers usually treat this as a major red flag.

4. Disclaimer of Opinion

  • Wording: The auditor “does not express an opinion”.
  • Meaning: The auditor could not obtain enough appropriate evidence (e.g., severe scope limitations, non‑cooperation).
  • Interpretation: No assurance. You cannot rely on the report for comfort.

> Key nuance: Even a clean (unmodified) opinion does not mean perfect security. It means, in the auditor’s professional judgment, controls met the criteria in all material respects during the period tested.

4. Opinion Scenarios: What Would You Conclude?

Imagine you are on a vendor risk team reviewing three SOC 2 reports.

Scenario A

The opinion paragraph says:

> In our opinion, in all material respects, based on the criteria described in [reference], the description fairly presents the system… and the controls were suitably designed and operated effectively throughout the period…

Scenario B

The opinion paragraph says:

> In our opinion, except for the matter described in the Basis for Qualified Opinion paragraph, the controls were suitably designed and operated effectively…

Scenario C

The opinion paragraph says:

> Because of the significance of the matter described in the Basis for Disclaimer of Opinion paragraph, we were not able to obtain sufficient appropriate evidence to provide a basis for an opinion…

Your task

For each scenario, jot down (mentally or on paper):

  1. Opinion type (unmodified, qualified, adverse, disclaimer).
  2. One risk question you would ask the vendor.

Then compare to this guide:

  • Scenario A: Unmodified. Ask: “Can you confirm there were no material exceptions for the Security category during the period?”
  • Scenario B: Qualified. Ask: “What was the qualified issue, and how have you remediated it?”
  • Scenario C: Disclaimer. Ask: “Why couldn’t the auditor obtain evidence, and how can we gain assurance another way?”
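The cue phrases in Scenarios A–C can be sketched as a rough triage heuristic. This is only a sketch: real classification always requires a human read of the full opinion letter, and `classify_opinion` is a hypothetical helper, not a standard tool.

```python
def classify_opinion(paragraph: str) -> str:
    """Rough heuristic mapping opinion-paragraph wording to the four
    standard opinion types, using the cue phrases from Scenarios A-C."""
    text = paragraph.lower()
    # Disclaimer cues must come first: disclaimer wording often also
    # contains negations like "were not able to obtain".
    if "disclaimer of opinion" in text or "does not express an opinion" in text:
        return "disclaimer"
    if "except for" in text:
        return "qualified"
    if "adverse opinion" in text or "did not" in text:
        return "adverse"
    if "in all material respects" in text:
        return "unmodified"
    return "unknown"

print(classify_opinion(
    "In our opinion, in all material respects, the description "
    "fairly presents the system..."))  # unmodified
print(classify_opinion(
    "In our opinion, except for the matter described in the "
    "Basis for Qualified Opinion paragraph..."))  # qualified
```

Note the ordering: Scenario C's wording ("we were not able to obtain sufficient appropriate evidence") would be misread as adverse if the disclaimer check did not run first.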

5. Navigating the Controls, Tests, and Results Tables

The heart of a SOC 2 report for risk analysis is the section often titled:

  • “Controls, Tests of Controls, and Results of Tests” or
  • “Controls and Related Tests of Operating Effectiveness.”

This is usually a table with columns like:

| Trust Services Criteria | Control | Test Performed | Results of Test |
|-------------------------|---------|----------------|-----------------|

How to read a row

A typical row might look like this (simplified):

  • Criteria: CC6.1 – The entity implements logical access security software, infrastructure, and architectures over protected information assets…
  • Control: User access to production systems is restricted to authorized personnel through SSO and MFA; access is reviewed quarterly.
  • Test Performed: Inspected access control policy; obtained a listing of users with production access; selected a sample of users and inspected evidence of quarterly access reviews; observed SSO and MFA configuration.
  • Results: No exceptions noted.

What you should focus on

  1. Coverage: Which TSC criteria and which controls are tested?
  2. Nature of tests: Inspection, observation, re‑performance, inquiry — what did the auditor actually do?
  3. Results: “No exceptions noted” vs. detailed exception descriptions.

> Important (2025 practice): Many auditors now include more detailed descriptions of sampling (e.g., sample sizes, how samples were selected) to increase transparency. This helps you judge how strong the testing was.
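When scanning a long results table, the first pass is simply separating clean rows from rows with exceptions. A minimal sketch, assuming a hypothetical in-memory version of the four-column table above (the field names and the `rows_with_exceptions` helper are illustrative, not from any report schema):

```python
# Hypothetical rows mirroring the four-column table structure above.
rows = [
    {"criteria": "CC6.1",
     "control": "Production access restricted via SSO and MFA; quarterly reviews",
     "test": "Inspected policy; sampled users; observed MFA configuration",
     "result": "No exceptions noted."},
    {"criteria": "CC8.1",
     "control": "Peer review required for all production changes",
     "test": "Inspected a sample of change tickets",
     "result": "4 of 50 changes lacked evidence of peer review."},
]

def rows_with_exceptions(rows):
    """First-pass scan: flag any row whose result isn't the clean phrase."""
    return [r for r in rows
            if r["result"].strip().lower() != "no exceptions noted."]

for r in rows_with_exceptions(rows):
    print(r["criteria"], "-", r["result"])
```

The flagged rows then get the deeper frequency-and-severity analysis described in the next section; the clean rows mainly inform your view of coverage.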

6. Interpreting Exceptions in Test Results

Exceptions are not automatically deal‑breakers. You need to understand their frequency, severity, and remediation.

Example 1: Low‑impact exception

Results of Test:

> For one of 40 new hires selected, we noted that the security awareness training was completed 3 days after the required 30‑day onboarding window.

Interpretation:

  • Frequency: 1/40 (2.5%)
  • Impact: Minor delay; training still completed.
  • Likely conclusion: Low risk, especially if management has a remediation plan.

Example 2: Higher‑impact exception

Results of Test:

> For 3 of 25 terminated users, we noted that access to the production environment remained active for 5–7 days after the termination date.

Interpretation:

  • Frequency: 3/25 (12%)
  • Impact: Former employees retained access to production systems.
  • Risk: Moderate to high, depending on data sensitivity.

You should check:

  1. Management’s Response (often in the same section or an appendix):
    • Do they acknowledge the issue?
    • Have they implemented corrective actions (e.g., HR‑IT integration, automated de‑provisioning)?
  2. Materiality:
    • Did the auditor treat this as a material issue affecting the opinion (often explained in a “Basis for Qualified Opinion” section) or as a non‑material exception within a clean opinion?

> Practical rule of thumb: An exception becomes more concerning when it affects high‑risk areas (e.g., production access, encryption keys, incident response) or shows a pattern rather than a one‑off error.
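The exception-rate arithmetic from the two examples, combined with the rule of thumb, can be sketched as a toy triage function. The thresholds and the `assess_exception` name are illustrative assumptions, not from any standard; real materiality judgments depend on context.

```python
def assess_exception(exceptions: int, sample_size: int,
                     high_risk_area: bool) -> str:
    """Toy triage: exception rate plus whether the control guards a
    high-risk area (production access, encryption keys, incident response).
    The 10% threshold is illustrative only."""
    rate = exceptions / sample_size
    if high_risk_area and rate >= 0.10:
        return f"{rate:.1%} in a high-risk area: escalate"
    if rate >= 0.10:
        return f"{rate:.1%}: review remediation"
    return f"{rate:.1%}: likely low risk"

# Example 1: late security training (1 of 40 new hires)
print(assess_exception(1, 40, high_risk_area=False))  # 2.5%: likely low risk
# Example 2: lingering production access (3 of 25 terminations)
print(assess_exception(3, 25, high_risk_area=True))   # 12.0% in a high-risk area: escalate
```

The point of the sketch is the two-dimensional judgment: a 12% rate in a low-risk control reads very differently from the same rate in production access.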

7. Quick Check: Reading Test Results

Apply what you’ve learned about interpreting exceptions.

An auditor tested 50 changes to production systems. For 4 of them, they found missing evidence of peer review, even though a change management policy requires peer review for all changes. The auditor still issued an unmodified opinion. What is the most appropriate interpretation?

  A. The SOC 2 report is unreliable because any exception should result in a qualified opinion.
  B. There is a control deficiency in change management, but the auditor judged it not material overall.
  C. Because the opinion is unmodified, the exceptions in the test results can be ignored.

Answer: B) There is a control deficiency in change management, but the auditor judged it not material overall.

Option B is correct. An unmodified (clean) opinion can still include non‑material exceptions in specific controls. The missing peer reviews show a control deficiency, but the auditor concluded that, overall, controls met the criteria in all material respects. You should still evaluate the risk and remediation, rather than dismissing or overreacting to the exceptions.

8. User Considerations, CUECs, and Report Limitations

Every SOC 2 report includes a section often called “User Entity Considerations” or “Complementary User Entity Controls (CUECs)”. These are controls the customer is expected to implement for the system to be secure overall.

Complementary User Entity Controls (CUECs)

Examples of CUECs:

  • The user entity is responsible for managing and securing user devices that access the service.
  • The user entity is responsible for reviewing user access to the service at least quarterly.

If you, as the customer, do not implement these controls, the overall security posture may be weaker than the SOC 2 report suggests.

Common Limitations of SOC 2 Reports

  1. Point‑in‑time or period‑bound
    • Type I: only a single date.
    • Type II: a defined period (e.g., 12 months ending September 2025). Events after that period are not covered.
  2. Scope limitations
    • Not all products or regions may be in scope.
    • Some subservice organizations may be carved out (you must assess them separately).
  3. Reasonable assurance, not guarantee
    • Audits provide reasonable, not absolute, assurance.
    • They focus on material issues; small problems may not affect the opinion.
  4. Confidentiality and use restrictions
    • SOC 2 reports are usually labeled “Restricted Use” and intended for existing or prospective user entities and their auditors, not for general public distribution.

> 2025 nuance: As vendor ecosystems have grown more complex, organizations increasingly use multiple artifacts (SOC 2, ISO 27001 certificates, penetration test summaries, security questionnaires) together, rather than relying only on SOC 2.

9. Using a SOC 2 Report in Vendor Due Diligence

Imagine you are evaluating a new SaaS vendor that will store customer PII and some payment data.

You have their latest SOC 2 Type II report (covering October 1, 2024 – September 30, 2025).

Activity: Build a Mini Review Checklist

Take 2–3 minutes to sketch (mentally or on paper) how you would use the report:

  1. Scope & Relevance
    • Which TSC categories are included? (Security only, or also Availability, Confidentiality, etc.?)
    • Are the specific products/regions you’ll use in scope?
  2. Opinion & Period
    • What is the opinion type?
    • Does the coverage period include your planned go‑live date?
  3. Key Risk Areas to Scan in the Test Results
    • Logical access and authentication (MFA, least privilege).
    • Change management and deployment.
    • Data encryption (in transit and at rest).
    • Backup and recovery.
    • Incident detection and response.
  4. Exceptions & Remediation
    • Any exceptions in high‑risk controls?
    • Did management implement corrective actions? Are timelines reasonable?
  5. CUECs and Your Responsibilities
    • Which user responsibilities must your organization implement (e.g., access reviews, endpoint security, network controls)?

After listing these, compare to this sample checklist:

  • ✅ Confirm scope and services match your use case.
  • ✅ Confirm unmodified or understand any qualification.
  • ✅ Review exceptions in high‑risk areas and request remediation evidence if needed.
  • ✅ Map CUECs to your internal controls to ensure coverage.
  • ✅ Store the report and your review notes in your third‑party risk management system and set a reminder to request the next report when available.
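The sample checklist can also live as a simple review tracker. A minimal sketch, assuming shorthand item names of my own (this is not the schema of any real TPRM tool):

```python
# Illustrative review tracker mirroring the sample checklist above;
# item names are shorthand assumptions, not from any TPRM system.
checklist = {
    "scope and services match use case": False,
    "opinion unmodified (or qualification understood)": False,
    "high-risk exceptions reviewed, remediation evidence requested": False,
    "CUECs mapped to internal controls": False,
    "report stored; next-report reminder set": False,
}

def open_items(checklist: dict[str, bool]) -> list[str]:
    """Items still outstanding before the vendor review can be signed off."""
    return [item for item, done in checklist.items() if not done]

# Mark the first item complete after confirming scope.
checklist["scope and services match use case"] = True
print(len(open_items(checklist)))  # 4
```

Whatever form it takes, the point is that the review leaves an auditable trail: which checks were done, what was found, and when the next report is due.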

10. Flashcards: Key Terms and Concepts

Use these flashcards to reinforce the main ideas about SOC 2 opinions, findings, and use.

Unmodified (Clean) Opinion
An auditor’s conclusion that, in their opinion, the description is fairly presented and controls were suitably designed (and, for Type II, operated effectively) in all material respects for the stated period.
Qualified Opinion
An opinion stating that, except for one or more specific material issues described in the report, the description and controls met the applicable criteria.
Adverse Opinion
An opinion stating that the description and/or controls did NOT meet the applicable criteria in all material respects, indicating significant, widespread issues.
Disclaimer of Opinion
A statement that the auditor does not express an opinion, usually because they could not obtain sufficient appropriate evidence to form a conclusion.
Type I vs. Type II SOC 2
Type I: Opinion on the fairness of the description and suitability of control design at a point in time. Type II: Opinion on description and control design AND operating effectiveness over a specified period.
Complementary User Entity Controls (CUECs)
Controls that the user organization (customer) is expected to implement for the service organization’s controls to achieve the Trust Services Criteria effectively.
Tests of Controls
Procedures performed by the auditor (e.g., inspection, observation, re‑performance) to obtain evidence about whether controls operated effectively during the period.
Exception (in SOC 2 testing)
A deviation from the described control identified during testing (e.g., a missing access review). May or may not be material, depending on frequency and impact.
Materiality (in SOC 2 context)
A threshold used by auditors to judge whether an issue is significant enough to affect the overall conclusion about whether controls meet the criteria in all material respects.
User Considerations / Limitations
Sections of the SOC 2 report explaining how the report should be used, its scope, period, and restrictions, and highlighting that it provides reasonable, not absolute, assurance.

Key Terms

SOC 2
A type of System and Organization Controls report defined by the AICPA that evaluates a service organization’s controls relevant to the Trust Services Criteria (Security, Availability, Processing Integrity, Confidentiality, and Privacy).
Materiality
The significance of an omission or misstatement that could influence the judgment of report users; used by auditors to decide whether issues affect the overall opinion.
Adverse Opinion
An opinion stating that the description and/or controls did not meet the applicable criteria in all material respects, indicating significant deficiencies.
Qualified Opinion
An opinion that is mostly positive but includes at least one material issue described in a ‘Basis for Qualified Opinion’ section.
Tests of Controls
Procedures performed by auditors (such as inspection, observation, inquiry, and re‑performance) to obtain evidence about the effectiveness of controls.
Carve‑out Method
An approach in SOC reporting where certain subservice organizations’ controls are excluded from the scope, requiring user entities to assess them separately.
Unmodified Opinion
A clean auditor’s opinion stating that, in all material respects, the description is fairly presented and controls were suitably designed and (for Type II) operated effectively.
Type I SOC 2 Report
A report that opines on the fairness of the system description and the suitability of control design at a specific point in time.
Reasonable Assurance
A high but not absolute level of assurance provided by an audit, recognizing that audits are subject to inherent limitations such as sampling and judgment.
Type II SOC 2 Report
A report that opines on the fairness of the system description and the suitability of control design and operating effectiveness over a defined period (commonly 12 months).
Disclaimer of Opinion
A statement that the auditor does not express an opinion, typically due to an inability to obtain sufficient appropriate evidence.
User Entity Considerations
Sections in a SOC 2 report that describe responsibilities and controls that user organizations should implement and highlight limitations and proper use of the report.
Trust Services Criteria (TSC)
A set of control criteria developed by the AICPA for evaluating controls over security, availability, processing integrity, confidentiality, and privacy.
Third‑Party Risk Management (TPRM)
The process organizations use to assess, monitor, and manage risks associated with their external vendors and service providers.
Complementary User Entity Controls (CUECs)
Controls that must be implemented by the user organization to complement the service organization’s controls for the overall control objectives to be achieved.