A practical guide to the System Security Plan — what it actually is, what goes in it, and what makes the difference between a draft that holds up under assessment and one that doesn't.
The most common misconception about a System Security Plan is that it's a compliance attestation — a document where you claim you've implemented every required control. It isn't. The SSP is a description of your current state for each of the 110 NIST 800-171 controls. What you have in place. What's partially in place. What hasn't been addressed yet.
This distinction matters because it changes what the document is for. An SSP is not a sales document; it's an honest record. Assessors expect to see gaps in it. They're far more concerned with discrepancies between what your SSP claims and what your environment actually does than they are with the gaps themselves — gaps are tracked separately in the POA&M, where you commit to closing them.
Practically, the SSP serves three audiences. First, you and your team — it's the operational baseline you maintain over time. Second, your assessor — it's the document they read first and the framework they use to walk through your environment. Third, your customers and primes — they may ask to see it as part of subcontractor due diligence.
Honesty in the SSP is what passes assessment, not blanket attestations. Describe your current state accurately, and let the POA&M document the gaps and your plan to close them.
An SSP isn't just 110 control narratives. The document has a standard structure that an assessor expects to see, and most of the upfront sections set the context that makes the control narratives interpretable.
System identification: your organization's name, address, CAGE code, DUNS or UEI number, the system being described, and the system owner. Basic identifying information that lives on the cover and in the opening section.
System description: a plain-English account of what the system does, what business processes it supports, and what types of CUI it handles. This is where an assessor first orients themselves.
System environment: the cloud services, infrastructure, applications, and endpoints that make up the system. Assessors use this to verify that what your control narratives reference actually exists.
Authorization boundary: the single most consequential section. It defines what's in scope for assessment — which components handle CUI, where the edges are, and what's explicitly excluded. Wrong scoping here cascades into every control narrative.
Roles and responsibilities: specific named individuals responsible for security functions — the ISSO, the ISSM, the incident response lead, the privacy officer. Assessors will ask these people direct questions during the assessment.
Control implementation narratives: the bulk of the document. One narrative per NIST 800-171 control, describing how that control is implemented (or partially implemented, or not yet implemented) in your specific environment.
Appendices: supporting policies, network diagrams, data flow diagrams, evidence references, and a glossary. The narratives often point into these for specifics.
The 110 control narratives are where most SSPs succeed or fail. The technical content is roughly the same across all defense contractors of similar size — what separates a defensible SSP from a brittle one is the specificity and honesty of how each control is described.
Here are the principles that make a narrative hold up:
Be specific about your environment. Don't write "the organization implements multi-factor authentication." Write "Acme Corp enforces multi-factor authentication through Okta Workforce Identity, with FIDO2 security keys required for all privileged accounts and Okta Verify push notifications acceptable for standard users." Specificity is what an assessor uses to verify the narrative against reality; it also makes the claim checkable by a script, as in the sketch after these principles.
Describe what you actually do, not what NIST says you should. Restating the control language back to the assessor is a red flag. They wrote it; they don't need you to repeat it. They need to know what your specific implementation looks like.
Acknowledge partial implementation honestly. If a control is half-implemented, say so. "Acme Corp performs annual access reviews for the production environment but has not yet extended this practice to the development environment, which is tracked as POA&M item AC-2." That sentence is more credible than a vague claim of full compliance.
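Specificity pays off because specific claims can be checked mechanically. As an illustration of what verifying the multi-factor authentication example above might look like, here is a minimal sketch against Okta's Users and Factors APIs; the tenant URL, group ID, and token handling are placeholders for your own environment, and real use would need pagination and error handling.

```python
"""Spot-check that every member of a privileged-access group in Okta
has a FIDO2 (WebAuthn) factor enrolled.

Illustrative sketch only: the tenant URL, group ID, and token source
are placeholders for your own environment.
"""
import os

import requests

OKTA_BASE = "https://acme.okta.com/api/v1"       # placeholder tenant
HEADERS = {"Authorization": f"SSWS {os.environ['OKTA_API_TOKEN']}"}
PRIVILEGED_GROUP_ID = "00g1exampleprivgrp"       # placeholder group ID


def group_members(group_id: str) -> list[dict]:
    """Return the users in an Okta group (first page; paginate for real use)."""
    resp = requests.get(f"{OKTA_BASE}/groups/{group_id}/users", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()


def has_fido2(user_id: str) -> bool:
    """True if the user has an enrolled WebAuthn (FIDO2) factor."""
    resp = requests.get(f"{OKTA_BASE}/users/{user_id}/factors", headers=HEADERS)
    resp.raise_for_status()
    return any(f.get("factorType") == "webauthn" for f in resp.json())


if __name__ == "__main__":
    for user in group_members(PRIVILEGED_GROUP_ID):
        login = user["profile"]["login"]
        print(f"{login}: {'ok' if has_fido2(user['id']) else 'MISSING FIDO2'}")
```

A check like this only exists because the narrative named a tool and a requirement; "the organization implements multi-factor authentication" gives you nothing to test.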
Compare these two versions of the same narrative:
"The organization implements the principle of least privilege. Users are granted access only to the systems and information necessary to perform their job functions. Access is reviewed periodically."
"Acme Corp enforces least privilege through role-based groups in Okta. Each role is mapped to specific Microsoft 365 GCC High and Azure Government resources via group claims. The IT Director conducts quarterly access reviews and documents results in the access review log. Privileged Identity Management is enforced for all admin roles, requiring just-in-time elevation with MFA."
Every claim in a control narrative should be traceable to something concrete — a tool, a process, an artifact, a named person, a date. If a sentence doesn't have one of these anchors, an assessor can't verify it, and verification is what an assessment is.
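The quarterly access review in the strong version is exactly this kind of anchor: an artifact you can regenerate on demand. Here's a minimal sketch of producing it from Okta group membership, assuming role-based groups like those the narrative describes; the tenant URL, token source, and role-to-group mapping are placeholders, not a recommendation.

```python
"""Generate a quarterly access-review artifact from Okta group membership.

A minimal sketch: the tenant URL, token source, and role-to-group
mapping below are placeholders for your own environment.
"""
import csv
import os
from datetime import date

import requests

OKTA_BASE = "https://acme.okta.com/api/v1"       # placeholder tenant
HEADERS = {"Authorization": f"SSWS {os.environ['OKTA_API_TOKEN']}"}

# Placeholder role-based groups, as described in the narrative above.
ROLE_GROUPS = {
    "Engineering-CUI": "00g1exampleeng",
    "Finance": "00g2examplefin",
}


def members(group_id: str) -> list[dict]:
    """Return the users in a group (first page; paginate for real use)."""
    resp = requests.get(f"{OKTA_BASE}/groups/{group_id}/users", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()


def write_review_log(path: str) -> None:
    """Write a reviewable CSV: one row per (role, member) pair."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["review_date", "role", "login", "status"])
        for role, group_id in ROLE_GROUPS.items():
            for user in members(group_id):
                writer.writerow([date.today().isoformat(), role,
                                 user["profile"]["login"], user["status"]])


if __name__ == "__main__":
    write_review_log(f"access-review-{date.today().isoformat()}.csv")
```

The resulting CSV is the "access review log" the narrative points to: a dated artifact a named person can sign off on and an assessor can examine.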
The authorization boundary defines what's in scope for assessment. It identifies the specific systems, applications, services, and components that process, store, or transmit CUI. Everything inside the boundary is subject to NIST 800-171 controls. Everything outside is not.
This section deserves its own discussion because it's the single most common source of assessment failure. Two patterns recur:
The boundary is too broad. Some contractors describe their entire IT environment as in-scope, which forces every general-purpose system — corporate email, HR tools, marketing websites — into CMMC scope unnecessarily. This dramatically expands the work required and produces a scope that's nearly impossible to maintain.
The boundary is too narrow. Other contractors define an enclave that excludes systems that genuinely touch CUI in practice. A laptop used by an engineer to draft proposals containing CUI is in scope, even if that wasn't the intent. An assessor finds the gap by asking simple questions about how work actually happens.
Getting the boundary right requires understanding three things: where CUI actually flows in your business, what systems support that flow, and what protections you've architected to keep CUI inside that scope. The boundary description should make these connections explicit, often supported by data flow diagrams in the appendices.
Walk through a typical workday for someone who handles CUI in your organization. Note every system they touch, every place CUI could land. The boundary should reflect that map. If a system is touched but not in scope, your narrative needs to explain why — usually that it's the technical layer below CUI, or that protections prevent CUI from reaching it.
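One way to make that exercise concrete is to keep the map as a small, reviewable inventory rather than prose. A sketch follows, with entirely hypothetical systems and rationales standing in for your own:

```python
"""Sketch of a CUI scoping inventory: every system a CUI handler touches,
whether CUI can land there, and the rationale for in/out of boundary.

All system names and rationales here are illustrative placeholders.
"""
from dataclasses import dataclass


@dataclass
class System:
    name: str
    touches_cui: bool   # can CUI be stored, processed, or transmitted here?
    rationale: str      # why it is (or isn't) inside the boundary


# Built by walking through a CUI handler's workday, as described above.
INVENTORY = [
    System("GCC High SharePoint", True, "Primary CUI storage"),
    System("Engineer laptops", True, "Proposals with CUI drafted locally"),
    System("Marketing website", False, "No path for CUI to reach it"),
    System("Corporate HR tool", False, "DLP policy blocks CUI uploads"),
]


def boundary_report(inventory: list[System]) -> None:
    """Print the scope map an assessor would walk through."""
    for s in inventory:
        scope = "IN SCOPE" if s.touches_cui else "out of scope"
        print(f"{scope:12} {s.name:22} {s.rationale}")


if __name__ == "__main__":
    boundary_report(INVENTORY)
```

The point isn't the code; it's that every out-of-scope decision carries an explicit rationale, and the whole map is easy to diff when the environment changes.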
Most SSP failures aren't catastrophic — they're small accumulations of the same patterns. Here are the recurring ones to watch for in your draft.
Writing "the organization implements account management" instead of describing what your actual account management process is. Assessors want to see your environment, not their own document quoted back at them.
"All systems are configured according to security best practices." Vague phrases like this can't be verified, and they raise immediate suspicion that the contractor doesn't actually know what's configured.
Internal inconsistencies: your access control narrative names Okta as the identity provider; your audit logging narrative implies you use Active Directory. These contradictions are common when narratives are written in isolation by different people; a simple cross-check like the sketch after this list can surface them.
Missing evidence references: an assessor reads a narrative and thinks "I'd like to see this." If the narrative doesn't reference a specific document, log, or artifact they can examine, they have to ask — and asking takes time and erodes confidence.
Stale content: the SSP describes your 2023 environment, but you migrated to a new identity provider last quarter and never updated the document. Stale narratives erode trust across the entire SSP, even for sections that are still accurate.
Overstated implementation: describing a control as fully implemented when it's actually partial. Better to acknowledge the gap and reference a POA&M item than to overstate and have an assessor catch the discrepancy during testing.
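For the inconsistency pattern in particular, a few lines of scripting can flag contradictions before an assessor does. A minimal sketch, assuming one narrative file per control in a narratives/ directory; the directory layout and the tool list are assumptions for illustration:

```python
"""Scan per-control narrative files for tool names to surface internal
inconsistencies (e.g., one narrative says Okta, another implies AD).

The narratives/ layout and the TOOLS list are illustrative assumptions.
"""
import re
from collections import defaultdict
from pathlib import Path

# Tools whose mentions should be consistent across narratives.
TOOLS = ["Okta", "Active Directory", "Azure AD", "Duo", "Entra"]


def mentions_by_tool(narrative_dir: str) -> dict[str, list[str]]:
    """Map each tool name to the narrative files that mention it."""
    hits: dict[str, list[str]] = defaultdict(list)
    for path in sorted(Path(narrative_dir).glob("*.md")):
        text = path.read_text()
        for tool in TOOLS:
            if re.search(rf"\b{re.escape(tool)}\b", text):
                hits[tool].append(path.name)
    return hits


if __name__ == "__main__":
    for tool, files in mentions_by_tool("narratives").items():
        print(f"{tool}: {', '.join(files)}")
    # If both Okta and Active Directory appear as the identity provider,
    # reconcile the narratives before an assessor finds the contradiction.
```

Running something like this before each SSP revision catches the Okta-versus-Active-Directory class of contradiction while it's still cheap to fix.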
Baseline's interview turns thirty questions about your environment into a draft SSP, ready for review and refinement. Faster than starting from scratch, more concrete than a template.