
Cloud Security Maturity at the GovExperience Summit

April 19, 2026 | Articles, Event Summaries

Signals from the Field: What Government Cloud Security Panels Reveal About Enterprise Architecture Maturity

Observations and Recommendations from the Carahsoft GovExperience Summit

The federal government’s cloud security journey reflects both genuine progress and persistent architectural gaps.

Setting the Scene

The Carahsoft GovExperience Summit 2026 convened government leaders, technology partners, and public-sector innovators at the Carahsoft Conference & Collaboration Center in Reston, Virginia. Organized by Government Executive Media Group and underwritten by Carahsoft — with gold sponsors including Knox Systems and Salesforce — the event featured two parallel tracks: Change Management and Emerging Technology. Sessions spanned a broad and timely agenda: AI-enhanced efficiency, cloud-powered digital services, cybersecurity user experience, data analytics, and the perennial challenge of organizational silos.

The summit’s stated mission was to explore how AI, automation, and data intelligence are transforming government service delivery. Featured sessions included “Cloud-Powered Digital Services: Secure, Scalable, Inclusive Government,” “The UX in Cybersecurity,” “Leveraging AI to Enhance Efficiency and Readiness,” and “Overcoming Organizational Silos.” Taken together, the agenda painted a picture of a federal technology community that is ambitious, aware of its challenges, and actively seeking operational answers to strategic questions.

I attended the summit with the kind of interest that any enterprise architecture practitioner brings to these convenings: not to judge, but to listen. Summits like GovExperience are valuable precisely because they surface the real state of practice — the unscripted moments, the questions that produce long pauses, the gaps between the polished slide decks and the lived reality of implementation. What I heard over the course of the day confirmed something I have observed across dozens of government and industry forums over the past several years: the federal cloud security conversation has matured considerably in its ambitions but has not yet matured proportionally in its architectural foundations.

That observation is not a criticism. It is a diagnosis — and diagnoses are only useful if they lead to treatment plans. This article offers both.

What the Panels Revealed: An Architecture Maturity Reading

I sat through two panels that, taken together, provided an unusually clear reading of where much of the federal government stands on cloud security architecture. I will not name individuals — the value of this analysis lies in the patterns, not the personalities — but I will describe what I observed with the candor that our profession requires.

Panel One: Integrating Security into Cloud Platforms

The first panel was organized around a compelling premise: how to integrate security into cloud platforms to identify and block threats. The panelists were positioned as government experts, and the topic could not have been more relevant. Cloud-native security integration is one of the defining challenges of modern enterprise architecture, and the federal government — with its complex compliance landscape, multi-vendor ecosystems, and high-consequence data environments — faces this challenge at a scale and sensitivity that few private-sector organizations can match.

What I observed, however, was a panel discussion that stayed at the surface. Responses lacked architectural depth. When the moderator posed a direct and well-framed question — How would a multi-cloud environment complicate threat detection, and what strategies exist or are planned to master multi-cloud security? — the panelists could not answer. There was no discussion of cross-cloud control plane integration, no mention of federated identity architectures spanning providers, no reference to the security control mapping challenges that arise when workloads are distributed across AWS GovCloud, Azure Government, and other FedRAMP-authorized environments. Frankly, the panel was neither well designed nor well facilitated.

A recurring theme was “managing the data supply chain” — a phrase that appeared multiple times throughout the discussion. The concept is sound and increasingly important. But it was discussed without an operational framework behind it: no data lineage models, no trust boundary definitions, no custodial handoff protocols. The phrase functioned more as a shared vocabulary term than as an architectural discipline, which is a pattern I have seen before and which warrants its own analysis below.

A thread that ran through much of Panel One — and would resurface even more forcefully in Panel Two — was the bureaucratic nightmare of Authority to Operate (ATO) verification. Panelists described months-long ATO timelines as a source of deep frustration, particularly in an era where AI-assisted attackers iterate their techniques in hours, not quarters. The temporal mismatch is stark: defenders are locked in sequential certification workflows governed by FAR, FedRAMP, and FISMA while adversaries run continuous, parallel assault loops that exploit the very windows of vulnerability those certification delays create.

What was most striking was the resignation. When pressed on whether ATO workflows could be reformed to match the pace of the threat landscape, the panelists offered little hope. Procurement culture, statutory frameworks, and risk-averse institutional norms were cited not as obstacles to be overcome but as immovable features of the environment. The candor was refreshing; the conclusion was alarming. If the defenders themselves do not believe that their authorization processes can evolve, the architecture community must ask whether continuous assurance models can be designed to satisfy the intent of existing frameworks while dramatically compressing the vulnerability window.

Panel Two: Managing Multiple Clouds

The second panel, which was well facilitated, addressed multi-cloud management directly, and here the picture was more nuanced — and more instructive. Of four panelists, two were fluent and clearly operational. They spoke from experience. Their answers referenced specific architectural decisions, real trade-offs they had navigated, and concrete strategies for maintaining security posture across heterogeneous cloud environments. Their contributions demonstrated that operational excellence in multi-cloud governance is not theoretical — it is achievable, and some corners of the federal government are achieving it.

The other two panelists, however, appeared to struggle to add value to the session. Their responses were often vague, relying on general statements about “the importance of security” and “working with our cloud providers” without offering specifics about how that work was structured, governed, or measured. AI was discussed briefly but superficially — treated as a feature to add to existing tools rather than as a transformation to the security operating model itself.

The contrast on that stage was striking, and it told a story that no slide deck could have communicated as effectively.

Notably, no panelist on either panel arrived with slides, a structured framework, or a visual aid of any kind. The discussions remained entirely conversational — reactive rather than commanding. An audience member hoping to see at least one presenter step forward with crisp visuals, a compelling architecture diagram, or a rehearsed narrative that demonstrated mastery would have been disappointed. Even the two stronger panelists on the second panel operated conversationally rather than authoritatively; none arrived prepared to teach the room. Slides would also have helped the audience follow the panelists who were not fluent English speakers.

This is not a minor point. Government panels at industry summits are how the federal technology community projects competence to vendors, partners, the public, and — critically — to the workforce pipeline it needs to attract and retain. Underprepared panelists, however well-intentioned, reinforce a narrative of government lagging behind the private sector. That narrative, once established, becomes self-fulfilling: top talent gravitates toward organizations that project mastery, and vendors calibrate their engagement based on the sophistication they observe. The presentation readiness of government representatives at public forums is, in this sense, a strategic asset — or a strategic liability.

The value of these panels was not in the polished answers but in what the silences and hesitations revealed. They provided an unfiltered maturity reading of where much of the federal government stands on cloud security architecture. And that reading, while sobering, is instructive.

Seven Maturity Signals and What They Mean

From these two panels, I identified seven distinct maturity signals — recurring patterns that, taken together, compose a coherent picture of the current state of federal cloud security architecture. Each signal is a data point, and each points toward specific architectural interventions that the enterprise architecture community can and should provide.

Signal 1: “Security as Tooling, Not Architecture”

Across both panels, cloud security was consistently framed as a product procurement problem — which tools should we buy? — rather than as an architecture governance problem — how do we design integrated security across the enterprise? This distinction is fundamental. Tools are components; architecture is the discipline of organizing components into coherent, governed, and evolvable systems. When an organization treats security as a collection of tools rather than as an architectural concern, it inevitably ends up with fragmented coverage, redundant capabilities, and gaps that no single product can close.

On most enterprise architecture maturity models, this orientation corresponds to Level 1 (Initial) or Level 2 (Developing) — stages characterized by ad hoc, project-driven decision-making without enterprise-level coordination. The panelists were not describing immature organizations in pejorative terms; they were describing organizations that have not yet made the leap from tool-centric to architecture-centric security governance. That leap is achievable, but it requires deliberate investment in architecture practice, not just technology procurement.

Signal 2: “The Multi-Cloud Blind Spot”

The inability to articulate a multi-cloud mastery strategy — when directly asked — was perhaps the most revealing moment across both panels. It suggests that many agencies adopted multiple cloud providers tactically, workload by workload, without an enterprise cloud architecture governing those decisions. This is not unusual; it reflects the organic way that cloud adoption unfolded across the federal government, driven by individual program offices, FedRAMP authorization timelines, and vendor relationships rather than by a unified architectural vision.

But the consequences are significant. Fragmented multi-cloud adoption without architectural governance produces exactly the conditions that the NIST Multi-Cloud Security Public Working Group (MCSPWG) was established to address: fragmented visibility, inconsistent security controls across providers, identity federation gaps, and dangerous blind spots in threat detection. When your security monitoring architecture was designed for a single cloud and your workloads now span three, you do not have a multi-cloud security strategy — you have three single-cloud security strategies and a prayer.

Signal 3: “Data Supply Chain Without a Supply Chain Architecture”

The phrase “data supply chain” appeared repeatedly across both panels, which is itself a sign of progress — the concept has entered the working vocabulary of federal technology leaders. But vocabulary is not architecture. In every instance I observed, the term was used without an operational definition, without reference to a lineage model, and without a governance framework that would make it actionable.

Data supply chain management, properly understood, is an architecture discipline. It requires data lineage tracing — the ability to track every data element from its source, through every transformation, to its point of consumption. It requires trust boundary mapping — clear definitions of where data crosses organizational, jurisdictional, or classification boundaries and what controls apply at each crossing. It requires custodial handoff protocols — formalized processes for transferring data stewardship responsibility as data moves through the chain. Without these architectural foundations, “data supply chain management” remains an aspiration rather than a capability.

Signal 4: “AI as Afterthought, Not Operating Model”

When AI was discussed — and it was discussed only briefly, despite its prominence on the summit agenda — it was framed as an enhancement to existing processes. We are looking at AI to improve our threat detection. We are exploring AI for automating compliance checks. These are reasonable starting points, but they reflect a tool-level understanding of AI integration rather than an operating model transformation.

The distinction matters enormously. Using AI tools within an existing security operating model produces incremental improvements. Redesigning the security operating model around AI-augmented workflows — where AI agents handle continuous monitoring, pattern recognition, anomaly detection, and initial triage while human architects and analysts focus on consequential decisions, strategic response, and governance oversight — produces a fundamentally different capability. This is the difference between adding a spell-checker to a manual typewriter and redesigning the writing process around a word processor. The latter requires rethinking workflows, roles, decision authorities, and performance metrics. It is architecture work, and it was notably absent from the discussion.

Signal 5: “The Competency Divide”

The sharp contrast between the panelists who were operationally fluent and those who struggled was not a commentary on individual capability — it was a window into a systemic workforce challenge. Cloud security architecture requires practitioners who understand both cloud-native architectures (container orchestration, serverless patterns, cloud-native identity, infrastructure as code) and enterprise governance (architecture review boards, decision rights frameworks, portfolio management, compliance traceability). This combination of skills is rare, and current federal workforce development pipelines do not reliably produce it.

The fluent panelists likely developed their expertise through direct operational experience — learning by doing, in environments that demanded both technical depth and governance discipline. That path works for individuals, but it does not scale. The competency divide revealed on that stage is a structural problem that requires a structural solution: deliberate, funded, sustained investment in competency development programs that integrate cloud-native technical skills with enterprise architecture governance frameworks.

Signal 6: “The ATO-Agility Paradox”

The most consequential finding may have been the panelists’ candid acknowledgment that Authority to Operate (ATO) verification timelines — often stretching six to eighteen months — are fundamentally incompatible with the pace of modern cyber threats. AI-assisted attack toolkits now enable adversaries to discover vulnerabilities, generate exploits, test evasion techniques, and launch campaigns in days or weeks. The federal authorization model, designed for an era of annual security assessments and stable threat landscapes, creates a temporal mismatch that no amount of tool procurement can close. Defenders operating on quarterly or annual certification cycles are structurally disadvantaged against attackers operating on continuous iteration cycles.

What elevated this from a familiar complaint to a maturity signal was the resignation that accompanied it. The panelists did not describe ATO reform as difficult but achievable; they described it as essentially impossible given current procurement culture and statutory constraints. This learned helplessness — the belief that authorization workflows cannot evolve — is itself a maturity indicator. Organizations at higher maturity levels do not accept process constraints as permanent; they architect around them, designing continuous assurance models that satisfy the intent of FISMA and FedRAMP while compressing the window of unmonitored vulnerability. The absence of this architectural thinking on the panels suggests that the ATO-agility paradox is not just a process problem but a conceptual one: the practitioners closest to the problem have not yet framed it as an architecture challenge amenable to an architecture solution.

Signal 7: “The Presentation Readiness Gap”

Public conference appearances are how government projects competence, shapes vendor behavior, attracts talent, and reassures the public that critical infrastructure is in capable hands. The presentation readiness observed across both panels — no slides, no structured frameworks, no visual aids, no rehearsed narratives — suggests a systemic underinvestment in preparing government practitioners for public-facing roles. This is not about stage presence or charisma; it is about the ability to communicate architectural thinking with the clarity, structure, and authority that the subject matter demands.

The gap is addressable but requires deliberate investment. Organizations at higher maturity levels treat public communication as a professional competency, not a personality trait. They invest in structured briefing preparation — developing visual frameworks, rehearsing key narratives, vetting talking points, and ensuring that every representative who takes a public stage is equipped to project mastery. The enterprise architecture community should advocate for and help design “conference readiness” modules within government competency development programs — not as vanity exercises but as investments in institutional credibility. A government panelist who arrives with a compelling architecture diagram, a clear framework, and a rehearsed narrative does more for public confidence in federal cybersecurity than any number of press releases. The upskilling of government panelists and public communicators is not a nice-to-have; it is a strategic imperative that directly affects the government’s ability to attract talent, command vendor respect, and maintain public trust.

1. Security as Tooling, Not Architecture
   Observed: Cloud security framed as a procurement decision rather than an architecture governance discipline.
   Indicates: Level 1–2 maturity; ad hoc, project-driven security without enterprise coordination.

2. The Multi-Cloud Blind Spot
   Observed: Inability to articulate a multi-cloud mastery strategy when directly asked.
   Indicates: Tactical, workload-by-workload cloud adoption without enterprise cloud architecture.

3. Data Supply Chain Without Architecture
   Observed: “Data supply chain” used repeatedly without operational definition or governance model.
   Indicates: Vocabulary has outpaced practice; concept adoption without architectural foundations.

4. AI as Afterthought
   Observed: AI discussed as a feature enhancement, not as a transformation of the operating model.
   Indicates: Tool-level AI adoption without redesign of security workflows, roles, or decision authorities.

5. The Competency Divide
   Observed: Stark contrast between operationally fluent panelists and those who struggled with specifics.
   Indicates: Systemic workforce gap; cloud-native skills and EA governance rarely combined in one practitioner.

6. The ATO-Agility Paradox
   Observed: Months-long ATO timelines acknowledged as incompatible with AI-accelerated threats, with no reform path articulated.
   Indicates: Learned helplessness around authorization processes; continuous assurance not yet framed as an architecture solution.

7. The Presentation Readiness Gap
   Observed: No panelist brought slides, frameworks, or visual aids; discussions stayed conversational rather than commanding.
   Indicates: Systemic underinvestment in preparing government practitioners as public communicators; upskilling needed.

The Positive Case: Why These Signals Matter

It would be easy — and wrong — to read the maturity signals above as an indictment. They are not. They are a baseline measurement, and the fact that we can take this measurement at all is itself a sign of how far the federal cloud security conversation has come.

Five years ago, these panels would not have existed. The questions that were asked — about multi-cloud security integration, about data supply chain management, about AI-augmented threat detection — would not have been on the agenda because they were not yet in the operational vocabulary of most federal technology leaders. The fact that government leaders are now publicly discussing these challenges, standing on stages and engaging with them in front of peers and industry partners, indicates that the conversation has moved decisively from “whether” to “how.” That transition is significant, and it should be recognized.

Moreover, the two strong panelists on the second panel demonstrated something critically important: operational excellence in multi-cloud governance is achievable within government. Their fluency was not theoretical — it was earned through practice, and it showed. They are proof of concept. They demonstrate that the federal government can develop practitioners who combine deep cloud-native technical expertise with enterprise governance discipline, and that those practitioners can operate at a level that rivals or exceeds their private-sector counterparts. The question is not whether this is possible; it is how to make it the norm rather than the exception.

Events like the GovExperience Summit create the conditions for this kind of honest assessment. When government leaders step onto a stage and engage with hard questions — even when they cannot fully answer them — they are performing an act of institutional courage that deserves respect. The silences and hesitations that I observed were not failures; they were data. And data, properly analyzed and acted upon, is the foundation of improvement.

Nine Recommendations for the Enterprise Architecture Community

The maturity signals from the GovExperience Summit point toward specific, actionable interventions. These recommendations are written for the enterprise architecture community — for the practitioners, leaders, and educators who have the expertise and the responsibility to provide the structural frameworks that turn aspiration into operational capability.

Recommendation 1: Establish Multi-Cloud Reference Architectures

Every agency operating with more than one cloud provider needs a published, maintained multi-cloud reference architecture. This is not a nice-to-have; it is a governance necessity. The reference architecture should define security control mappings across providers — showing how a given NIST 800-53 control is implemented in AWS GovCloud, Azure Government, Google Cloud, and Oracle Cloud, respectively, and how those implementations are validated as equivalent. It should define identity federation patterns that ensure consistent authentication and authorization regardless of which cloud hosts a given workload. And it should establish data residency policies that reflect both statutory requirements and operational realities.

The NIST Multi-Cloud Security Public Working Group is developing guidance that will inform these reference architectures. Enterprise architecture teams should engage with the MCSPWG now — contributing operational experience, reviewing draft guidance, and building internal expertise — rather than waiting for final publications. Reference architectures are living artifacts; building them is a continuous discipline, not a one-time project.
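To make the idea concrete, a control mapping can start life as a simple machine-readable artifact that governance tooling can validate automatically. The sketch below is a hypothetical illustration: the control-to-service mappings and provider names are assumptions for demonstration, not an official NIST 800-53 or FedRAMP mapping.

```python
# Hypothetical sketch of a cross-provider control mapping. The service
# names assigned to each control are illustrative assumptions only.

CONTROL_MAP = {
    "AC-2": {"aws_govcloud": "IAM + Access Analyzer",
             "azure_government": "Entra ID + PIM",
             "google_cloud": "Cloud IAM + Policy Intelligence"},
    "AU-6": {"aws_govcloud": "CloudTrail + Security Hub",
             "azure_government": "Azure Monitor + Sentinel"},
}

REQUIRED_PROVIDERS = {"aws_govcloud", "azure_government", "google_cloud"}

def coverage_gaps(control_map, providers=REQUIRED_PROVIDERS):
    """Return {control_id: [missing providers]} for incomplete mappings."""
    gaps = {}
    for control_id, impls in control_map.items():
        missing = providers - impls.keys()
        if missing:
            gaps[control_id] = sorted(missing)
    return gaps

print(coverage_gaps(CONTROL_MAP))  # AU-6 lacks a google_cloud mapping here
```

A real reference architecture would version an artifact like this alongside the architecture repository and fail a governance check whenever any in-scope control lacks a validated implementation in an authorized provider.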

Recommendation 2: Architect the Data Supply Chain

The data supply chain is an architecture artifact, not a compliance checkbox. Enterprise architecture teams should treat data supply chain management as a first-class architecture discipline, applying the same rigor to data flows that they apply to application integration or network topology. This means implementing data lineage models that trace every data element from source to consumption, including every transformation, aggregation, and derivation along the way. It means defining trust boundaries at each custodial handoff — the points where data responsibility transfers from one organization, system, or classification domain to another — and documenting the controls that apply at each boundary. And it means establishing provenance verification for data entering AI and machine learning pipelines, where the quality and integrity of training data directly determine the reliability of the models that consume it.

The phrase “data supply chain” has entered the vocabulary. Now the architecture community must give it structure.
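One way to give it structure is to treat lineage, trust boundaries, and custodial handoffs as explicit records that can be audited programmatically. The minimal sketch below assumes a simplified notion of trust boundary and invents the steward and boundary names for illustration; it is not a reference implementation.

```python
# Hypothetical data supply chain sketch: stewards, boundary categories,
# and the example record are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Handoff:
    """One custodial transfer of a data element between stewards."""
    from_steward: str
    to_steward: str
    boundary: str                 # e.g. "intra-agency", "cross-agency"
    controls: list = field(default_factory=list)

@dataclass
class LineageRecord:
    element: str
    source: str
    handoffs: list = field(default_factory=list)

    def unguarded_crossings(self):
        """Handoffs that cross a trust boundary with no documented control."""
        return [h for h in self.handoffs
                if h.boundary != "intra-agency" and not h.controls]

record = LineageRecord(
    element="case_outcomes.csv",
    source="agency_a.case_mgmt",
    handoffs=[
        Handoff("agency_a", "agency_a.analytics", "intra-agency"),
        Handoff("agency_a.analytics", "agency_b.ml_pipeline", "cross-agency"),
    ],
)
print([h.to_steward for h in record.unguarded_crossings()])
```

Even a toy model like this makes the discipline enforceable: a governance pipeline can reject any lineage record that crosses a trust boundary without a documented control at the handoff.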

Recommendation 3: Integrate Security into the Architecture Governance Framework

Security should not operate as a parallel governance track with its own review processes, its own decision authorities, and its own documentation standards. It should be embedded in the enterprise architecture governance framework — the same framework that governs application architecture, infrastructure architecture, data architecture, and business architecture decisions. Architecture review boards should include security architecture as a standing agenda item, not an occasional guest. Security decisions should follow the same decisioning model as other architecture decisions, with clear escalation paths, documented authority boundaries, and full traceability from requirement to implementation to validation.

When security governance operates in parallel to architecture governance, the result is exactly what we observed on the panels: security discussions that lack architectural depth, and architecture discussions that treat security as someone else’s problem. Integration is the remedy.

Recommendation 4: Adopt AI-Augmented Security Operations as an Architecture Pattern

The enterprise architecture community should move beyond “AI for security” — which operates at the tool level — to “AI-augmented security architecture,” which operates at the operating model level. This means designing human-agent collaboration patterns for threat detection, incident response, and compliance monitoring. In these patterns, AI agents handle continuous monitoring, pattern recognition across high-volume data streams, anomaly scoring, and initial triage — tasks that benefit from machine speed and consistency. Human architects and analysts focus on consequential decisions: threat classification, response strategy, governance implications, and strategic adaptation.

The Augmented Architecture Office (AAO) model provides a framework for this transformation. The AAO model recognizes that AI does not replace architects; it augments them, handling the computational and pattern-recognition workload while human practitioners focus on judgment, context, and governance. Applying this model to security operations requires deliberate architecture work: defining the interfaces between human and AI decision-making, establishing escalation thresholds, designing feedback loops that improve AI performance over time, and documenting the governance framework that ensures accountability.
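The escalation-threshold idea can be sketched as a small routing rule: the agent disposes of low-risk events with an audit trail, enriches the ambiguous middle band, and hands consequential decisions to a human. The thresholds, score source, and queue names below are assumptions for illustration, not published interfaces of the AAO model.

```python
# Hypothetical human-agent escalation sketch. Thresholds and queue names
# are illustrative assumptions.

ESCALATE_THRESHOLD = 0.8    # at or above: a human analyst must decide
AUTO_CLOSE_THRESHOLD = 0.2  # at or below: the agent may close, with audit log

def triage(event):
    """Route an anomaly event to a human, the agent, or enrichment."""
    score = event["anomaly_score"]  # assumed to come from an upstream scorer
    if score >= ESCALATE_THRESHOLD:
        return "human_review"       # consequential decision: human authority
    if score <= AUTO_CLOSE_THRESHOLD:
        return "auto_close"         # agent acts alone; action is logged
    return "agent_enrich"           # agent gathers context, then re-scores

print(triage({"anomaly_score": 0.95}))  # routed to human review
```

The architecture work lies in governing these thresholds: who sets them, how they are tuned from feedback, and how every automated disposition remains traceable for accountability.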

Recommendation 5: Close the Competency Gap with Structured Development

The competency divide observed on the panels is addressable, but not with ad hoc training or conference attendance alone. Agencies should invest in structured competency development programs that combine cloud-native architecture skills with enterprise governance frameworks. The goal is to produce practitioners who are fluent in both domains — who can design a Kubernetes security policy and present an architecture decision record to a governance board, who understand both the technical mechanics of cloud-native identity federation and the organizational dynamics of cross-agency data sharing agreements.

The SAFe competency model and TOGAF’s architecture skills framework both provide templates that can be adapted for cloud security architecture roles. But frameworks alone are not sufficient. Agencies also need structured mentorship programs that pair emerging practitioners with experienced architects, rotational assignments that expose practitioners to multiple cloud environments and governance contexts, and communities of practice that sustain learning beyond formal training. The two fluent panelists at the summit did not become fluent by accident; they became fluent through sustained, deliberate practice. The challenge is to create pathways that make that journey accessible to many more practitioners.

Recommendation 6: Create Cross-Cloud Observability as an Architecture Requirement

Unified observability across cloud providers should be an architecture requirement, documented in the multi-cloud reference architecture and enforced through architecture governance. Enterprise architects should define observability standards — log aggregation patterns, metrics normalization rules, trace correlation protocols — that ensure security teams have enterprise-wide visibility regardless of which cloud hosts a given workload. When a threat actor moves laterally from a compromised workload in one cloud to a target in another, the security team’s ability to detect and respond depends entirely on whether their observability architecture spans the boundary between providers.

This is not an operations concern to be delegated to cloud engineering teams after the architecture is complete. It is a foundational architecture requirement that shapes technology selection, integration design, and governance processes from the outset. Cross-cloud observability should be a first-class artifact in every multi-cloud reference architecture, with defined standards, validation criteria, and governance oversight.
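A core piece of such a standard is event normalization: mapping each provider's native log shape onto one enterprise schema before correlation. The sketch below is illustrative only; the field names stand in for provider event formats and are assumptions, not the actual CloudTrail or Azure Activity Log schemas.

```python
# Hypothetical cross-cloud log normalization sketch. Provider field names
# are placeholder assumptions, not the providers' real schemas.

def normalize(provider, raw):
    """Map a provider-specific log event onto one enterprise schema."""
    if provider == "aws":
        return {"ts": raw["eventTime"], "actor": raw["userIdentity"],
                "action": raw["eventName"], "cloud": "aws"}
    if provider == "azure":
        return {"ts": raw["time"], "actor": raw["caller"],
                "action": raw["operationName"], "cloud": "azure"}
    raise ValueError(f"no normalization mapping for provider: {provider}")

event = normalize("azure", {"time": "2026-04-19T10:00:00Z",
                            "caller": "svc-deploy",
                            "operationName": "roleAssignments/write"})
print(event["action"])
```

With every event reduced to the same schema, lateral movement that spans providers becomes one correlated query instead of two disconnected investigations.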

Recommendation 7: Use Maturity Models to Drive Honest Assessment

The maturity gaps revealed at the summit are not unique to the panelists who happened to be on stage that day. They reflect a systemic condition across much of federal IT — a condition that is well-documented in GAO reports, inspector general assessments, and FITARA scorecards, even if it is rarely discussed with the specificity that architecture practitioners require.

Enterprise architecture maturity models — whether ACMM (Architecture Capability Maturity Model), NASCIO’s EA maturity framework, or domain-specific models for cloud security, data governance, or AI readiness — should be used regularly and transparently to assess organizational readiness. The key word is transparently. Maturity assessments that are conducted quietly and filed away produce no improvement. Maturity assessments that are shared with leadership, discussed openly, and used to prioritize investment and capability development produce measurable, sustained progress. Honest assessment is not a sign of weakness; it is the first step toward targeted improvement, and it is a discipline that the enterprise architecture community is uniquely positioned to lead.

Recommendation 8: Champion Continuous Authorization as an Architecture Capability

The ATO-agility paradox will not be resolved by policy reform alone — the statutory and procurement frameworks that govern federal authorization are deeply embedded and slow to change. But enterprise architects can design continuous authorization architectures that satisfy the intent of FISMA and FedRAMP while dramatically compressing the vulnerability window. This means architecting infrastructure-as-code pipelines where security configurations are version-controlled, automatically validated against NIST 800-53 control baselines, and continuously monitored for drift. It means implementing policy-as-code frameworks where compliance rules are executable, testable, and integrated into CI/CD pipelines rather than documented in static spreadsheets reviewed annually. And it means building real-time configuration drift detection that alerts security teams the moment a production environment deviates from its authorized baseline — not six months later during the next assessment cycle.

The architecture community should champion continuous authorization not as a replacement for ATO but as an architectural implementation of ATO’s intent: ensuring that systems operate within their authorized security parameters at all times, not just at the moment of certification. This reframing — from ATO as a gate to ATO as a continuous assurance architecture — is precisely the kind of conceptual shift that enterprise architects are trained to provide and that the panelists at the summit were reaching for but could not articulate.

Recommendation 9: Invest in Government Panelist Readiness and Public Communication Excellence

Government agencies should invest systematically in preparing their technical leaders for public-facing roles at conferences, congressional hearings, industry forums, and media engagements. This investment should include structured briefing preparation protocols — developing visual frameworks, architecture diagrams, and clear narratives before every public appearance. It should include presentation coaching that focuses not on generic public speaking but on communicating architectural thinking with precision and authority. And it should include rehearsal processes where panelists practice fielding difficult questions — like the multi-cloud mastery question that stalled Panel One — with substantive, framework-grounded responses.

The enterprise architecture community can contribute directly to this effort by developing a “conference readiness” competency module within existing government training frameworks. Such a module would cover: how to translate complex architecture concepts into clear visual frameworks; how to prepare structured talking points that demonstrate operational fluency; how to anticipate and prepare for challenging questions; and how to project institutional competence without overpromising. The upskilling of government panelists is an investment in the government’s most visible asset — its people. Every underprepared panel appearance erodes public confidence and reinforces the narrative that government cannot keep pace with the private sector. Every well-prepared appearance reverses that narrative and demonstrates that the federal government possesses — and is developing — the architectural talent that the nation’s security demands.

| # | Recommendation | Core Action | Addresses Signal |
|---|----------------|-------------|------------------|
| 1 | Establish Multi-Cloud Reference Architectures | Publish and maintain cross-provider security control mappings, identity federation patterns, and data residency policies | Multi-Cloud Blind Spot |
| 2 | Architect the Data Supply Chain | Implement lineage models, trust boundaries, and provenance verification as architecture artifacts | Data Supply Chain Without Architecture |
| 3 | Integrate Security into Architecture Governance | Embed security as a standing agenda item in architecture review boards with shared decisioning models | Security as Tooling |
| 4 | Adopt AI-Augmented Security Operations | Design human-agent collaboration patterns using the Augmented Architecture Office (AAO) model | AI as Afterthought |
| 5 | Close the Competency Gap | Build structured development programs combining cloud-native skills with EA governance frameworks | The Competency Divide |
| 6 | Create Cross-Cloud Observability | Define log aggregation, metrics normalization, and trace correlation as architecture requirements | Multi-Cloud Blind Spot |
| 7 | Use Maturity Models Transparently | Conduct regular, shared maturity assessments to prioritize investment and capability development | All Signals |
| 8 | Champion Continuous Authorization | Design infrastructure-as-code, policy-as-code, and real-time drift detection to implement ATO as continuous assurance | ATO-Agility Paradox |
| 9 | Invest in Panelist Readiness and Upskilling | Develop conference readiness modules, briefing protocols, and presentation coaching for government technical leaders | Presentation Readiness Gap |

Closing Reflection

The Carahsoft GovExperience Summit demonstrated something important: the federal government is asking the right questions. It is convening cross-agency panelists. It is creating spaces where the gap between aspiration and operational readiness can be seen clearly — and where that gap can be addressed with intellectual honesty and professional commitment.

The seven maturity signals I observed are not cause for alarm. They are cause for action. They tell us precisely where the enterprise architecture community can add the most value: in providing the reference architectures, governance frameworks, competency development models, and operating model designs that transform scattered good intentions into coherent, governable, and evolvable capabilities. The strong panelists on that second panel proved that this transformation is possible. The others proved that it is necessary.

The enterprise architecture community has both a responsibility and an opportunity in this moment. The responsibility is to engage — not from the sidelines, and not with abstract frameworks that never touch operational reality, but with specific, actionable, and architecturally rigorous guidance that meets government practitioners where they are and helps them get where they need to be. The opportunity is to demonstrate, through that engagement, that enterprise architecture is not overhead, not bureaucracy, not a compliance exercise, but the essential discipline that turns technological potential into organizational capability.

The signals from the field are clear: the architecture community’s moment to lead is now.


About the Author

Steve Else, Ph.D., is the Founder and Editor-in-Chief of the Enterprise Architecture Professional Journal (EAPJ.org) and a recognized thought leader in enterprise architecture. His research and contributions related to conferences and summits focus on the intersection of architecture governance, emerging technology integration, and organizational capability development.

Event Reference: The Carahsoft GovExperience Summit 2026: Advancing Government Service Delivery & CX was held at the Carahsoft Conference & Collaboration Center, 11493 Sunset Hills Road, Reston, Virginia. The summit was organized by Government Executive Media Group and underwritten by Carahsoft Technology Corp., with gold sponsorship from Knox Systems and Salesforce.

© 2026 Enterprise Architecture Professional Journal (EAPJ.org). All rights reserved. This article may be shared with attribution for professional and educational purposes.
