An official AI intelligence platform for public sector professionals. All content generated and verified by Astra.
Why Copilot adoption stalls in government and how to unblock it

Executive signal

Copilot initiatives inside federal enterprises typically stall for structural, not technical, reasons: governance gates under OMB M-24-10 and the NIST AI RMF; security accreditation and enclave alignment to FedRAMP and the DoD SRG; data and records obligations under A-130 and 44 U.S.C. § 3301; network boundary frictions under TIC 3.0 and the federal Zero Trust strategy; and acquisition and software supply chain requirements under M-22-18[1–10]. Each blocker has a corresponding remediation path that satisfies federal policy while enabling pragmatic deployment on Microsoft 365 and Azure Government where those platforms meet the required impact levels and authorizations[1][11].

Below, we decompose the five structural reasons and provide fixes aligned to binding authorities and platform capabilities.

The five structural blockers

  1. Policy governance and risk approvals under M-24-10 and NIST AI RMF
  • OMB M-24-10 requires agencies to establish AI governance, designate a Chief AI Officer, inventory AI use cases, and conduct AI Impact Assessments for safety-impacting or rights-impacting applications, which creates formal entry gates before production use of copilots[1].
  • The NIST AI Risk Management Framework recommends functions to map, measure, manage, and govern AI risks; agencies aligning to this framework must document risk controls across data, models, and human oversight, adding time and artifacts to Copilot deployments[2].
  • EO 14110 directs OMB and agencies to implement safeguards for AI systems in government, reinforcing the need for governance process maturity prior to adoption[12].
  2. Security accreditation and enclave alignment (FedRAMP, DoD SRG, and agency ATO)
  • Federal SaaS must be authorized at a FedRAMP baseline commensurate with impact; reuse of FedRAMP authorizations and the agency ATO process (per NIST RMF) determine whether and how a copilot can operate with agency data[3][13][5].
  • DoD components must align cloud services to the DoD Cloud Computing SRG impact levels (e.g., IL4/IL5 for CUI), constraining use of generative services that are not authorized at the required level[4][14].
  • Azure Government provides services in U.S. sovereign regions with FedRAMP High and DoD SRG authorizations for multiple impact levels, offering an enclave path for compliant AI workloads[11].
  3. Data governance, privacy, and records management readiness
  • OMB Circular A-130 requires agencies to manage information as a strategic resource, including privacy risk assessments and records management, which applies to AI-generated and AI-processed content[9].
  • Under 44 U.S.C. § 3301, materials created or received by an agency in connection with public business are federal records regardless of format, so copilot outputs can be records and must be managed accordingly[8].
  • Agencies handling CUI must implement controls per 32 CFR Part 2002, limiting when copilots may access or generate content from such data without appropriate safeguards[14].
  • In Microsoft 365, Copilot uses existing tenant security, compliance, and user permissions through Microsoft Graph, and honors sensitivity labels and data loss prevention where configured, which means weak data hygiene directly translates into risk exposure in copilot experiences[15][16][17].
  4. Network boundary patterns and Zero Trust/TIC 3.0
  • OMB’s federal Zero Trust strategy mandates identity-centric, least-privilege access and modern cloud security integration, often requiring architecture changes before enabling new SaaS features at scale[6].
  • TIC 3.0 provides cloud use cases to securely access SaaS without hairpinning through legacy gateways; agencies must adopt these patterns and approve required service endpoints for copilot traffic[7][18].
  • Microsoft 365 requires allowing specific URLs and IP ranges for service functionality; unapproved endpoints impede copilot features until networking policies are updated[18].
  5. Acquisition and software supply chain controls
  • OMB M-22-18 requires agencies to collect secure software development attestations (and, for critical software or upon request, additional artifacts) from vendors before using third-party software, which applies to copilot services and can delay purchase or ATO until attestations are in place[10].
  • Agencies commonly procure enterprise software via government-wide vehicles such as GSA MAS and NASA SEWP, which support compliant acquisition but require proper scoping of AI-specific terms and data-use restrictions in task orders[19][20].
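
The endpoint-approval blocker above is mechanical enough to script. Microsoft publishes the Microsoft 365 endpoint set through its IP Address and URL web service (the endpoints reference cited above); a minimal sketch, assuming the documented JSON fields serviceArea, required, urls, and ips, might pull the records and extract only the entries marked required. The specific service areas worth approving for copilot features are an assumption to validate against your tenant:

```python
import json
import urllib.request
import uuid

# Documented Office 365 IP Address and URL web service endpoint.
ENDPOINTS_URL = "https://endpoints.office.com/endpoints/worldwide"


def fetch_endpoint_records():
    """Fetch current Microsoft 365 endpoint records (live network call)."""
    url = f"{ENDPOINTS_URL}?clientrequestid={uuid.uuid4()}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def required_allowlist(records, service_areas=("Common", "Exchange", "SharePoint")):
    """Extract URLs and IP ranges marked 'required' for the given service areas.

    The service_areas default is an assumption, not an official Copilot list.
    """
    urls, ips = set(), set()
    for rec in records:
        if rec.get("required") and rec.get("serviceArea") in service_areas:
            urls.update(rec.get("urls", []))
            ips.update(rec.get("ips", []))
    return sorted(urls), sorted(ips)
```

Feeding the resulting allowlist into firewall or secure web gateway change requests gives network teams a repeatable artifact for the TIC 3.0 approval step rather than a one-off spreadsheet.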

What works: fixes mapped to federal requirements

  • Establish an AI governance operating model that satisfies M-24-10 and NIST AI RMF

    • Charter a decision board under the CAIO that triages copilot use cases, maintains the AI use case inventory, and ensures AI Impact Assessments where required by M-24-10, with risk controls documented per NIST AI RMF functions[1][2].
    • Define a tiering rubric distinguishing low-risk productivity scenarios from safety- or rights-impacting use; apply commensurate approvals as required by M-24-10[1].
  • Choose an authorized boundary and accelerate ATO with reuse

    • Select a hosting and data boundary that already meets your information impact level (e.g., FedRAMP High or DoD IL4/IL5) and reuse FedRAMP authorizations where available to streamline the agency ATO per NIST RMF[3][13][5].
    • For custom copilots and retrieval-augmented generation on agency data, prefer Azure Government services with appropriate FedRAMP/DoD SRG authorizations; Azure Government provides sovereign regions and compliance artifacts to support accreditation packages[11].
    • Where building your own copilots, use Azure OpenAI Service in U.S. Government regions so prompts and completions remain within the sovereign boundary and are not used to train the service, aligning with data control expectations in M-24-10[21][1].
  • Make data, privacy, and records controls first-class

    • Treat copilot outputs and transformed content as records when they meet the definition in 44 U.S.C. § 3301; integrate retention and disposition in line with A-130 and your records schedules before broad rollout[8][9].
    • Enforce least-privilege access and label content. In Microsoft 365, sensitivity labels and DLP policies are honored by Copilot; deploy labeling and DLP at scale so copilots surface only what a user is entitled to see and prevent exfiltration in prompts or responses[15][16][17].
    • For CUI, validate that the copilot data path and storage conform to 32 CFR Part 2002 safeguards before enabling access to such content[14].
  • Engineer the network for SaaS AI under Zero Trust and TIC 3.0

    • Adopt TIC 3.0 cloud use cases to route SaaS traffic through approved security capabilities without legacy backhaul; explicitly approve required Microsoft 365 endpoints used by copilot features[7][18].
    • Align identity, device, and segmentation controls to the OMB Zero Trust strategy so copilot access decisions incorporate strong authentication and device posture from the start[6].
  • Procure with supply chain assurance baked in

    • Collect M-22-18 secure development attestations for any third-party copilot software, and require SBOMs or additional artifacts where applicable to your risk posture[10].
    • Use established vehicles (GSA MAS, NASA SEWP) to speed buys while inserting data-use, privacy, records, and model-training restrictions consistent with M-24-10 in the order-level terms[19][20][1].
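
The tiering rubric recommended in the governance fix above can be sketched as a small triage function. Only the rights-impacting/safety-impacting distinction comes from M-24-10; the tier labels, the CUI branch, and the approval paths below are illustrative assumptions an agency would replace with its own rubric:

```python
from dataclasses import dataclass

# Illustrative approval paths, not terms defined in M-24-10.
TIER_RIGHTS_OR_SAFETY = "full AI Impact Assessment + CAIO board review"
TIER_CUI = "enclave validation + records/privacy review before approval"
TIER_LOW_RISK = "streamlined intake; log in AI use case inventory"


@dataclass
class UseCase:
    name: str
    rights_impacting: bool   # M-24-10 rights-impacting category
    safety_impacting: bool   # M-24-10 safety-impacting category
    touches_cui: bool        # needs 32 CFR Part 2002 safeguards


def triage(uc: UseCase) -> str:
    """Map a copilot use case to an approval path (thresholds illustrative)."""
    if uc.rights_impacting or uc.safety_impacting:
        return TIER_RIGHTS_OR_SAFETY
    if uc.touches_cui:
        return TIER_CUI
    return TIER_LOW_RISK
```

Encoding the rubric this way keeps intake decisions consistent and auditable, and the function's inputs double as the fields to capture in the AI use case inventory.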

Microsoft-specific implementation notes where applicable

  • Microsoft Copilot for Microsoft 365 operates within the tenant’s Microsoft 365 trust boundary, uses Microsoft Graph to scope retrieval to a user’s existing permissions, and does not use tenant data to train the foundation models powering the service, aligning with common agency data control expectations under M-24-10 when properly configured[15][1].
  • Sensitivity labels, DLP, and other Microsoft Purview controls apply to copilot experiences; deploy them before enabling wide access to minimize oversharing risks and to demonstrate privacy and records safeguards under A-130[16][17][9].
  • For custom copilots, Azure OpenAI Service in Azure Government keeps prompts and completions within U.S. Government regions and indicates prompts and completions are not used to train the service, supporting enclave and data-residency requirements that often drive ATO decisions[21][5].
  • Azure Government provides FedRAMP High and DoD SRG-aligned services and documentation to support accreditation; Azure Policy includes regulatory compliance built-ins to monitor adherence to control mappings (e.g., FedRAMP High/NIST 800-53) across your AI workload resources[11][22][3].
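
For teams building custom copilots, the commercial/government split above shows up concretely in the request target. A minimal sketch of constructing a chat-completions URL, assuming the documented .openai.azure.us suffix for Azure Government (versus .openai.azure.com commercially); the resource name, deployment name, and api-version below are placeholders, not values from this article:

```python
def chat_completions_url(resource: str, deployment: str,
                         api_version: str = "2024-02-01",
                         government: bool = True) -> str:
    """Build the Azure OpenAI chat-completions request target.

    The .azure.us suffix keeps traffic addressed to the Azure Government
    boundary; api_version is a placeholder to replace with a supported value.
    """
    suffix = "openai.azure.us" if government else "openai.azure.com"
    return (f"https://{resource}.{suffix}/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")
```

Making the sovereign suffix an explicit, testable parameter (rather than a hard-coded string scattered through client code) is one way to demonstrate the data-residency control during accreditation review.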

Implementation checklist for program executives

  • Governance

    • Stand up CAIO-led intake and inventory; publish a copilot use-case tiering guide aligned to M-24-10 and NIST AI RMF[1][2].
  • Security and accreditation

    • Select an authorized enclave (FedRAMP/IL) and plan ATO reuse; compile FedRAMP package artifacts and control inheritance for your copilot stack per NIST RMF[3][13][5].
  • Data, privacy, and records

    • Map records implications of copilot outputs per 44 U.S.C. § 3301 and A-130; deploy Purview labels and DLP before enabling production use[8][9][16][17].
  • Network

    • Approve Microsoft 365 endpoints used by copilot; implement TIC 3.0 cloud use case architecture and verify Zero Trust prerequisites[18][7][6].
  • Acquisition and supply chain

    • Add M-22-18 self-attestation (and artifacts as needed) to solicitations and orders; use GSA MAS/SEWP to acquire licenses with explicit data-use restrictions consistent with M-24-10[10][19][20][1].

Risks and open issues to surface early

  • Impact-level and FedRAMP alignment: If a target copilot service lacks the required FedRAMP baseline or DoD SRG impact level, adoption will pause until an enclave-compliant alternative (e.g., custom copilots in Azure Government) is available[3][4][11][21].
  • Records and privacy: Without agency-approved retention schedules and PIAs covering copilot features, A-130 compliance will block production deployment[9][8].
  • Network readiness: Failure to approve required SaaS endpoints or adopt TIC 3.0 cloud patterns will degrade or break copilot features, delaying user rollout[7][18].
  • Supply chain attestations: Missing M-22-18 attestations or SBOM artifacts can halt procurement and ATO, even for pilot scopes[10][5].



References

  1. OMB M-24-10 Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence β€” https://www.whitehouse.gov/wp-content/uploads/2024/03/M-24-10.pdf ↩
  2. NIST AI Risk Management Framework 1.0 β€” https://nvlpubs.nist.gov/nistpubs/ai/NIST.AI.100-1.pdf ↩
  3. FedRAMP Baselines β€” https://www.fedramp.gov/baselines/ ↩
  4. DoD Cloud Computing Security Requirements Guide v1r4 β€” https://dl.dod.cyber.mil/wp-content/uploads/cloud/CC/SRG_v1r4_Final_Marked.pdf ↩
  5. NIST SP 800-37 Rev. 2 Risk Management Framework for Information Systems and Organizations β€” https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-37r2.pdf ↩
  6. OMB M-22-09 Moving the U.S. Government Toward Zero Trust Cybersecurity Principles β€” https://www.whitehouse.gov/wp-content/uploads/2022/01/M-22-09.pdf ↩
  7. CISA TIC 3.0 Cloud Use Case β€” https://www.cisa.gov/sites/default/files/publications/TIC_3.0_Cloud_Use_Case.pdf ↩
  8. 44 U.S.C. § 3301 Definition of Records — https://uscode.house.gov/view.xhtml?req=granuleid:USC-prelim-title44-section3301 ↩
  9. OMB Circular A-130 Managing Information as a Strategic Resource β€” https://www.whitehouse.gov/wp-content/uploads/2016/07/omb_circular_a-130.pdf ↩
  10. OMB M-22-18 Enhancing the Security of the Software Supply Chain through Secure Software Development Practices β€” https://www.whitehouse.gov/wp-content/uploads/2022/09/M-22-18.pdf ↩
  11. Azure Government overview and compliance β€” https://learn.microsoft.com/azure/azure-government/documentation-government-overview ↩
  12. Executive Order 14110 Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence β€” https://www.whitehouse.gov/briefing-room/presidential-actions/2023/10/30/executive-order-on-safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence/ ↩
  13. FedRAMP Agency Authorization Playbook β€” https://www.fedramp.gov/assets/resources/documents/Agency_Authorization_Playbook.pdf ↩
  14. 32 CFR Part 2002 Controlled Unclassified Information β€” https://www.ecfr.gov/current/title-32/subtitle-B/chapter-XX/part-2002 ↩
  15. Data, privacy, and security for Microsoft Copilot for Microsoft 365 β€” https://learn.microsoft.com/microsoft-365-copilot/microsoft-365-copilot-privacy ↩
  16. Sensitivity labels in Microsoft Purview β€” https://learn.microsoft.com/microsoft-365/compliance/sensitivity-labels ↩
  17. Data loss prevention in Microsoft Purview β€” https://learn.microsoft.com/microsoft-365/compliance/data-loss-prevention-policies ↩
  18. Microsoft 365 and Office 365 URLs and IP address ranges β€” https://learn.microsoft.com/microsoft-365/enterprise/urls-and-ip-address-ranges ↩
  19. GSA Multiple Award Schedule β€” https://www.gsa.gov/buying-selling/purchasing-programs/gsa-schedule ↩
  20. NASA SEWP V Government-Wide Acquisition Contract β€” https://www.sewp.nasa.gov/ ↩
  21. Azure OpenAI Service overview with US Government information β€” https://learn.microsoft.com/azure/ai-services/openai/overview?tabs=usgov ↩
  22. Azure Policy regulatory compliance built-ins β€” https://learn.microsoft.com/azure/governance/policy/concepts/regulatory-compliance ↩