Module 7

LEAD — Becoming an Orchestration Leader

Teaching others and scaling the discipline


Module 7A: LEAD — Theory

R — Reveal

Case Study: Thornton Manufacturing — When Success Becomes a Bottleneck


Diana Okafor had done everything right.

As Thornton Manufacturing's process improvement manager, she had been the first to recognize the opportunity when the company's quality inspection documentation was drowning in inefficiency. Inspectors were spending forty percent of their time on paperwork. Critical defect patterns were buried in spreadsheets no one analyzed. Customer complaints took weeks to trace back to root causes.

Diana had learned the discipline through a regional workshop—the A.C.O.R.N. methodology for AI-augmented operations. She had applied it rigorously to the inspection documentation problem:

Assess: She had mapped the entire quality documentation workflow, identifying twelve distinct friction points where value leaked from the process. The biggest: inspectors hand-entering the same data into three separate systems, then manually cross-referencing defect patterns that should have been automatically correlated.

Calculate: She had built a business case projecting $280,000 in annual value—primarily from reduced documentation time and faster defect pattern identification. The calculation was transparent, conservative, defensible.

Orchestrate: She had designed a workflow where AI handled the data correlation and pattern recognition while inspectors retained judgment authority for classification decisions. The human-AI collaboration was clean, with clear handoff points and maintained accountability.

Realize: She had piloted the system on one production line, measured results for eight weeks, and validated that the projections held. Actually, they exceeded—the final calculation showed $340,000 in annual value, twenty percent above projection.

Nurture: She had established monitoring, assigned ownership, documented procedures, and created a sustainability plan that kept the system performing month after month.

One implementation. One success. $340,000 in proven value.


The Call from Corporate

Thornton Manufacturing was a mid-sized industrial equipment company—four plants across two states, 2,200 employees, $180 million in revenue. Diana's success at the Greenville plant had attracted attention.

James Mitchell, the VP of Operations, called her six months after the quality documentation project stabilized.

"Diana, I need you to scale this. What you did for quality inspection—I want that capability across all four plants. Not just quality, either. Procurement, inventory, customer service. Every process where this method applies."

Diana felt the weight of what he was asking.

"James, I appreciate the confidence. But what I did took eight months of my full attention. I don't have the bandwidth to replicate that across four plants simultaneously."

"I'm not asking you to do it all yourself. Train others. Build a team. Make this a company-wide capability."

"That's... different than what I've been doing. I know how to execute. I'm not sure I know how to teach it at scale."

"Figure it out. You have eighteen months to establish Orchestrated Intelligence as a Thornton Manufacturing capability. The executive team is committed. Budget won't be a constraint. What we need is someone who actually understands how this works—and that's you."

The call ended. Diana stared at her notes.

She had mastered execution. Now she needed to master something entirely different: leadership.


The Replication Trap

Diana's first instinct was to clone herself.

She identified four high-potential employees across the plants—one from each location—and brought them to Greenville for two weeks of intensive training. She walked them through the methodology, showed them her documentation, had them shadow her daily work, and sent them back to their plants with instructions to "find opportunities and apply the method."

It failed comprehensively.


Plant 2: Oak Ridge

Sarah, the Oak Ridge representative, was a supply chain analyst with strong quantitative skills. She identified an opportunity in inventory management—excess stock tying up $2.3 million in working capital—and built what she thought was a business case.

But the calculation was wrong. She had quantified the carrying cost of excess inventory without accounting for the service-level risk of reducing stock. The "opportunity" she identified would have saved $180,000 annually while creating $400,000 in stockout costs.

Diana caught the error during a review call. But she was already stretched thin across all four initiatives. How many errors was she not catching?


Plant 3: Riverside

Tom, the Riverside representative, was an operations supervisor who had been skeptical from the start. His "opportunity assessment" was a thinly veiled argument for a project he had wanted to do for years—automating the maintenance scheduling system. The assessment didn't reveal hidden friction; it justified a predetermined solution.

When Diana pushed back, Tom became defensive. "You did it your way in Greenville. Let me do it my way in Riverside."

He wasn't wrong that different contexts might require different approaches. But he also wasn't following the methodology—he was ignoring it while using its vocabulary.


Plant 4: Lakeside

Maria, the Lakeside representative, was the most promising. She genuinely understood the methodology, identified a legitimate opportunity in customer service documentation, and built a solid business case showing $95,000 in annual value.

But Maria couldn't get traction. The customer service director at Lakeside didn't trust the analysis. "Who is this person? She's been here three months and now she's telling me my department is broken?"

Maria lacked the organizational credibility that Diana had built over four years at Greenville. The methodology was sound, but the messenger couldn't land it.


Plant 1: Greenville

And at Greenville—Diana's home plant—the quality inspection system she had built was drifting. With her attention divided across four locations, she wasn't reviewing the monitoring dashboards as frequently. Usage had dropped from 91% to 78%. Two inspectors had developed workarounds because "it's faster the old way for these particular cases."

By month six of the eighteen-month initiative, Diana had accomplished less than nothing. No new implementations at scale. One degrading implementation at home. Four frustrated trainees. And an executive sponsor who was starting to ask uncomfortable questions.


Module 7A: LEAD — Theory

O — Observe

From Practitioner to Leader: The Five Leadership Principles


The transition from practitioner to leader is one of the most difficult shifts in professional development. What made you successful as a practitioner—personal execution, direct control, individual expertise—becomes a liability when your mandate expands to organizational capability.

Diana Okafor experienced this directly. Her first instinct was to replicate herself: train others to do what she did. But expertise doesn't transfer through exposure. The skills that made Diana effective—judgment developed through practice, credibility earned through results, authority established through relationships—weren't teachable in a two-week workshop.

Module 7 introduces five leadership principles that guide the transition from individual mastery to organizational capability.


Anchor Principle

The measure of mastery is whether others can do it without you.

This principle reframes success entirely. As a practitioner, Diana was measured by what she produced. As a leader, she must be measured by what others produce when she's not involved.

This is uncomfortable for most practitioners. Personal execution provides control, recognition, and security. If others can do the work without you, what's your role? What's your value?

The answer: your value shifts from doing the work to building the capability to do the work. This is more leveraged, more impactful, and more durable—but it requires surrendering the direct control that made practitioner success satisfying.


Principle 1: Codification Before Scale

If the method lives in your head, it can't scale.

Diana's first attempt at training failed because she tried to transfer tacit knowledge—the intuitions, judgments, and pattern recognition she had developed through practice—directly to trainees. This doesn't work. Tacit knowledge transfers slowly, requires extensive shared experience, and loses fidelity in transmission.

Codification is the process of making tacit knowledge explicit:

  • Documenting procedures that capture not just "what to do" but "why to do it" and "how to know if it's working"
  • Creating decision trees that guide judgment in common situations
  • Building templates that embed the logic of the methodology so users don't need to reinvent it
  • Developing examples that illustrate what good looks like—and what failure looks like

Codification isn't about replacing human judgment. It's about reducing the judgment burden to what actually requires expertise, while systematizing everything that can be systematized.

The goal is to get practitioners to seventy percent of expert capability through codified materials. The remaining thirty percent comes from supervised practice and developed judgment—but starting at seventy percent instead of zero changes everything.
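
To make the idea concrete, here is a minimal sketch of what one codified decision aid might look like: a "when to escalate" rule expressed explicitly rather than held as intuition. The field names and thresholds are illustrative assumptions, not part of the A.C.O.R.N. standard.

```python
# Hypothetical sketch: encoding one piece of tacit judgment ("should this
# opportunity be escalated to the methodology owner?") as an explicit rule.
# Field names and thresholds are illustrative assumptions, not A.C.O.R.N. standards.
from dataclasses import dataclass

@dataclass
class Opportunity:
    projected_annual_value: float   # dollars per year
    confidence: str                 # "high", "medium", or "low"
    crosses_functions: bool         # spans multiple departments?
    novel_pattern: bool             # no prior implementation to copy?

def escalation_needed(opp: Opportunity) -> bool:
    """Return True if the practitioner should consult the methodology owner."""
    if opp.projected_annual_value >= 250_000:      # transformational-sized bets
        return True
    if opp.confidence == "low":                    # shaky business case
        return True
    if opp.crosses_functions and opp.novel_pattern:
        return True                                # new territory plus broad impact
    return False                                   # practitioner decides within guidelines

print(escalation_needed(Opportunity(95_000, "high", False, False)))  # False
```

A decision aid like this does not replace judgment; it records the criteria an expert already applies so a newer practitioner starts from them instead of from zero.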


Principle 2: Teaching as Test

If you can't teach it, you don't fully understand it.

Many practitioners overestimate their own understanding because they've never had to make it explicit. They know how to do the work, but they can't articulate how they know. They make good judgments, but they can't explain the criteria they're applying.

Teaching forces clarity. When a practitioner must explain the methodology to someone who doesn't already understand it, gaps in understanding become visible:

  • "I just know when an opportunity is worth pursuing" becomes inadequate when a trainee asks for criteria
  • "You'll develop judgment over time" doesn't help someone making a decision today
  • "It depends on the situation" is true but useless without guidance on what situational factors matter

The discipline of teaching—of answering the questions that novices ask—refines the teacher's own understanding. Teaching isn't just knowledge transfer; it's knowledge development.

This principle also provides a quality check: if trainees consistently struggle with specific elements, the problem may be in how those elements are being taught, not in the trainees themselves.


Principle 3: Governance Enables Speed

Clear decision rights prevent bottlenecks and paralysis.

One of Diana's early failures was creating dependency: her trainees couldn't make decisions without her approval, but she didn't have bandwidth to approve everything. The result was either bottlenecks (waiting for Diana) or freelancing (making decisions without understanding).

Governance solves this by making decision authority explicit:

Decision Type | Authority Level
Methodology interpretation | Methodology owner (final authority)
Strategic prioritization | Governance board
Business case approval | Executive sponsor
Implementation tactics | Practitioner (within guidelines)
Escalation triggers | Documented thresholds

With clear governance, practitioners know what they can decide independently, what requires consultation, and what requires approval. This actually increases speed because it eliminates the uncertainty that causes people to wait for permission they don't need—or make decisions they shouldn't.

Governance also distributes accountability. When decisions are explicit, outcomes can be traced to decision-makers. This creates learning: good decisions are reinforced; poor decisions are visible and correctable.


Principle 4: Relationships Don't Transfer

Authority must be established, not borrowed indefinitely.

Diana had credibility at Greenville because she had earned it—four years of demonstrated competence, relationships built through collaboration, trust developed through delivered results. Her trainees had none of this. They were outsiders telling insiders that their processes were broken.

This is a fundamental challenge of scaling: the credibility that enables execution doesn't come with the methodology. A practitioner can borrow credibility temporarily (through executive sponsorship, through association with proven success), but eventually must establish their own.

The leadership response has three components:

  1. Executive sponsorship: Local leaders who trust the methodology lend their credibility to practitioners. "I'm sponsoring this initiative" creates space for the practitioner to operate while establishing their own credibility.

  2. Quick wins: Small, visible successes build local credibility faster than comprehensive business cases. A practitioner who delivers one visible improvement has more organizational standing than one who projects large future value.

  3. Relationship investment: Practitioners must spend time building relationships, not just executing methodology. Understanding local context, respecting local expertise, and collaborating rather than dictating—these create the conditions for sustained influence.

The implication: scaling takes longer than pure execution because relationship-building time must be included. Leaders who try to skip this step find that technically sound implementations fail for organizational reasons.


Principle 5: The Organization Owns the Transformation

Individual heroics don't create sustainable capability.

Diana's initial model positioned her as the hero: the expert who would transform the organization through personal brilliance. This is seductive—it feels important—but it creates fragility. If Diana left, the capability would degrade rapidly.

The alternative is to position the organization as the owner of the transformation, with Diana as the architect:

  • Governance structures that persist beyond any individual
  • Documented methods that don't require Diana to interpret
  • Certified practitioners who can execute independently
  • Executive commitment that provides ongoing resources and attention
  • Success metrics that the organization tracks, not just Diana

This means Diana's success is measured not by what she produces, but by what the organization can produce when she's not involved. It means building systems that would survive her departure.

For practitioners, this requires ego management. The transformation isn't "Diana's initiative"—it's Thornton Manufacturing's capability. Recognition is shared. Credit is distributed. The practitioner becomes less visible even as their impact grows.


The Leadership Mindset Shift

Together, these five principles represent a fundamental mindset shift:

Practitioner Mindset | Leader Mindset
Success = what I produce | Success = what others produce
Knowledge = what I know | Knowledge = what's documented
Quality = my standards | Quality = governance standards
Influence = my relationships | Influence = organizational structures
Control = my decisions | Control = clear decision rights

This shift is difficult because practitioner success habits must be unlearned. The behaviors that earned recognition—personal execution, visible contribution, direct control—are exactly wrong for leadership.

But the shift is necessary because individual capacity doesn't scale. Diana had twenty-four hours in a day. No matter how effective she became, she could only execute so many initiatives. Leadership leverage comes from building capability that multiplies beyond individual limits.


When to Make the Transition

Not every practitioner should become a leader. Leadership is a different role, not a promotion. Some practitioners should remain practitioners—deepening expertise, executing complex initiatives, serving as methodology experts.

The transition to leadership makes sense when:

  • Organizational demand exceeds individual capacity
  • The practitioner has genuine interest in developing others
  • The methodology is mature enough to codify
  • Organizational support exists for building capability infrastructure

The transition is premature when:

  • The practitioner hasn't yet mastered the methodology
  • Organizational support is uncertain or transactional
  • The practitioner would rather execute than develop others
  • The method isn't understood well enough to teach

Module 7 equips practitioners to make this transition successfully—but the first step is recognizing that leadership is a choice, not an inevitable progression.


The Practitioner's Dilemma

Diana faced a genuine dilemma. She loved execution. There was satisfaction in personally solving complex problems, in seeing direct results from her work, in having control over quality.

Leadership meant giving up that satisfaction. It meant watching others execute with less skill than she would bring. It meant accepting that some implementations would be imperfect—that scaling capability meant tolerating variance she could have prevented through personal involvement.

But leadership also meant impact she could never achieve alone. Four plants. Multiple initiatives. Hundreds of thousands in value. A capability that would persist after she moved on.

The measure of mastery is whether others can do it without you.

For Diana, accepting this principle was the first step toward genuine organizational transformation.


The five leadership principles synthesize research on organizational learning, knowledge management, and leadership development. Key theoretical foundations include Nonaka's knowledge creation theory, Argyris and Schön's work on organizational learning, and the communities of practice literature.


Module 7A: LEAD — Theory

O — Observe

Teaching the A.C.O.R.N. Method


Teaching methodology is fundamentally different from executing methodology. Execution requires applying knowledge to specific situations. Teaching requires making knowledge explicit, sequencing learning appropriately, and developing judgment in others who lack the teacher's experience.

Diana discovered this difference painfully. Her first attempt at training—two weeks of intensive shadowing—assumed that exposure would create capability. It didn't. Her trainees watched her work but couldn't replicate her judgment.

Effective methodology teaching requires understanding what knowledge is, how judgment develops, and how to structure learning that builds genuine capability.


The Knowledge Hierarchy

Not all knowledge transfers the same way. Understanding the hierarchy helps teachers design appropriate interventions.


Level 1: Information

Information is declarative knowledge—facts, definitions, procedures. The A.C.O.R.N. methodology has substantial informational content:

  • The five phases and their purposes
  • The deliverables for each phase
  • The ROI lenses for measurement
  • The workflow design patterns

Information transfers relatively easily through documentation, instruction, and examples. A trainee can learn what A.C.O.R.N. means and what each phase involves within hours.

But information alone doesn't create capability. Knowing that the Assess phase identifies friction points doesn't mean a trainee can identify friction points effectively.


Level 2: Procedural Knowledge

Procedural knowledge is how-to knowledge—the steps to accomplish specific tasks. The methodology's templates and processes embed procedural knowledge:

  • How to conduct a friction inventory
  • How to calculate ROI using the three lenses
  • How to design a pilot measurement plan
  • How to build a sustainability plan

Procedural knowledge transfers through guided practice. A trainee learns not just what to do but how to do it, with feedback that corrects errors and reinforces success.

Diana's playbook—the step-by-step implementation guide with decision trees and templates—captured procedural knowledge effectively. Trainees could follow the procedures even without deep understanding.


Level 3: Judgment

Judgment is the ability to make appropriate decisions in novel or ambiguous situations. This is the knowledge that experienced practitioners apply intuitively:

  • When an opportunity is worth pursuing versus deferring
  • How to handle stakeholder resistance
  • When a business case is strong enough to present
  • How to adapt the methodology to local context
  • When to escalate versus decide independently

Judgment doesn't transfer through documentation or instruction. It develops through experience with feedback—making decisions, observing outcomes, receiving correction. This is why Diana's two-week shadowing failed: watching someone exercise judgment doesn't build judgment in the observer.


Level 4: Wisdom

Wisdom is strategic knowledge—understanding how things connect, what matters most, when rules should be broken. Examples:

  • Understanding which organizational dynamics will enable or obstruct transformation
  • Recognizing when methodology purity should yield to pragmatic adaptation
  • Knowing how to build sustainable capability rather than dependent relationships

Wisdom develops over years of practice. It cannot be directly taught but can be accelerated through mentorship, reflection, and exposure to diverse situations.

For most practitioners, the goal is to develop through Level 3 (judgment). Level 4 (wisdom) emerges for those who lead methodology implementation across multiple contexts over extended periods.


The Teaching Architecture

Effective methodology teaching structures learning to develop each knowledge level appropriately.


Phase 1: Foundation (Weeks 1-4)

Focus: Information and procedural knowledge

Content:

  • Complete playbook study (self-paced, approximately 20 hours)
  • Instructor-led sessions on core concepts (8 hours total)
  • Template practice with provided cases (16 hours)
  • Knowledge assessment (quiz, not judgment test)

Outcome: Trainee understands the methodology conceptually and can execute procedures using templates and documentation.

Quality check: Can the trainee correctly apply templates to a practice case? Can they explain the purpose of each phase?


Phase 2: Supervised Execution (Months 2-4)

Focus: Procedural knowledge and early judgment development

Structure:

  • Trainee executes a real initiative from Assess through Nurture
  • Methodology leader reviews every major deliverable before finalization
  • Explicit feedback on quality, completeness, and judgment calls
  • Graduated responsibility as competence is demonstrated

The Supervision Model:

Deliverable | Review Depth | Decision Authority
Friction inventory | Full review, detailed feedback | Trainee drafts, leader approves
Opportunity prioritization | Full review, discussion of judgment | Joint decision
Business case | Full review, challenge assumptions | Leader approves before presentation
Workflow design | Full review, design critique | Trainee leads, leader validates
Pilot plan | Light review, focus on measurement | Trainee decides, leader advises
Sustainability plan | Full review | Joint decision

The key principle: trainees make decisions but receive feedback before those decisions become final. This builds judgment through experience while preventing costly errors.

Outcome: Trainee has successfully executed one initiative with support. Judgment is developing but not reliable for independent operation.

Quality check: Did the initiative succeed? Did the trainee require less intervention as the initiative progressed? Can the trainee articulate why they made specific choices?


Phase 3: Supported Independence (Months 5-8)

Focus: Judgment refinement

Structure:

  • Trainee executes a second initiative with reduced supervision
  • Methodology leader available for consultation but not reviewing everything
  • Scheduled check-ins (biweekly) rather than deliverable reviews
  • Trainee makes most decisions independently

The Consultation Model:

Trainee initiates consultation when:

  • Facing an unfamiliar situation
  • Uncertain about methodology interpretation
  • Encountering significant stakeholder resistance
  • Results diverging from projections

Leader initiates consultation when:

  • Monitoring indicates concerns
  • Strategic questions arise
  • Quality issues appear in outputs
  • Trainee seems stuck

Outcome: Trainee demonstrates reliable judgment in routine situations. Complex or novel situations may still require consultation.

Quality check: How often does the trainee need consultation? Are consultations about genuine complexity, or about uncertainty the trainee should resolve independently?


Phase 4: Certification (Month 9+)

Focus: Validated judgment and independent capability

Certification requires:

  1. Two successful initiative completions
  2. Demonstrated judgment in handling ambiguity
  3. Stakeholder feedback confirming effective collaboration
  4. Peer review of deliverable quality
  5. Methodology leader attestation of capability

Post-Certification:

  • Trainee operates independently
  • Leader involvement only for strategic decisions
  • Trainee may begin mentoring new trainees
  • Ongoing participation in community of practice

Outcome: Trainee is a certified practitioner capable of independent methodology execution.


Common Teaching Mistakes


Mistake 1: Assuming Exposure Equals Learning

Diana's original approach—watch me work, then do what I do—assumes that observation transfers capability. It doesn't. Observation provides information about what experts do, but not the judgment that guides their decisions.

The correction: structured learning with graduated practice, not shadowing alone.


Mistake 2: Teaching to the Average

When training groups, there's pressure to teach to the middle. But methodology trainees vary dramatically in background, aptitude, and learning speed. The average-focused approach bores quick learners and loses slow ones.

The correction: self-paced foundational learning with instructor time focused on judgment development, which requires individualized attention anyway.


Mistake 3: Delaying Real Work

Some teaching approaches delay real implementation until trainees have "mastered" the methodology. This creates artificial environments where judgment can't develop because the stakes aren't real.

The correction: get trainees into real work early, with supervision that prevents catastrophic errors while allowing learning from smaller mistakes.


Mistake 4: Supervising Too Long

The opposite error: supervisors who can't release control. If supervision never decreases, trainees never develop independence, and the methodology leader becomes a bottleneck.

The correction: explicit graduation criteria and deliberate reduction in supervision as capability is demonstrated.


Mistake 5: Certifying Too Fast

Pressure to show results can lead to premature certification. A trainee who succeeded once with heavy support isn't ready for independence—but organizations want to claim capability exists.

The correction: certification based on demonstrated independent judgment, not just successful outcomes with supervision.


The Mentorship Relationship

Beyond structured teaching, ongoing mentorship accelerates judgment development:


After-Action Reviews

Following each initiative phase, mentor and trainee discuss:

  • What worked and why
  • What didn't work and why
  • What the trainee would do differently
  • What patterns emerge across situations

This reflection transforms experience into learning rather than just accumulated activity.


Judgment Calibration

Mentor and trainee periodically review decisions together:

  • Cases where trainee judgment was correct
  • Cases where trainee judgment was off (and why)
  • Criteria the trainee is applying (explicitly or implicitly)
  • Refinements to decision-making approach

The goal is making trainee judgment explicit so it can be examined and refined.


Stretch Assignments

As trainees mature, mentors assign progressively challenging situations:

  • More complex opportunities
  • More difficult stakeholders
  • Less familiar domains
  • Higher organizational visibility

Stretch assignments accelerate development by pushing trainees beyond their comfort zones with mentor support available.


Teaching as Organizational Investment

Teaching methodology isn't free. It requires:

  • Leader time: 10-15 hours per trainee during foundation, plus ongoing supervision
  • Trainee time: 40+ hours during foundation, plus learning curve impact on execution speed
  • Organizational patience: First trainee initiatives take longer and produce more modest results
  • Tolerance for imperfection: Trainee work won't match expert quality initially

Organizations that invest in teaching build capability that persists. Organizations that skip teaching—expecting practitioners to figure it out—get either failure or continued dependence on the original expert.

The question isn't whether to invest in teaching. It's whether the organization is committed to building capability or just buying expertise.


Teaching methodology draws on situated learning theory, particularly Lave and Wenger's work on communities of practice and legitimate peripheral participation. The progression from observation to supervised execution to independence reflects established models of professional development.


Module 7A: LEAD — Theory

O — Observe

Building a Center of Excellence


Diana's pivot from training individuals to building a Center of Excellence (CoE) was the breakthrough that made scaling possible. A CoE is an organizational structure that concentrates expertise, provides governance, and enables distributed execution.

The concept isn't new—organizations have built Centers of Excellence for decades in areas like project management, data analytics, and quality assurance. But applying the model to AI-augmented operations requires understanding both the general principles and the specific adaptations the discipline requires.


Why Individual Practitioners Fail at Scale

Diana's initial approach—train individuals and send them to execute—failed for structural reasons:

Authority gap: Junior practitioners lack the organizational standing to challenge senior leaders about process inefficiencies.

Credibility gap: New practitioners haven't demonstrated success, so stakeholders reasonably question their recommendations.

Support gap: Isolated practitioners have no one to consult when facing ambiguous situations.

Quality gap: Without review mechanisms, methodology drift and errors go undetected.

Sustainability gap: Individual departures eliminate local capability entirely.

A Center of Excellence addresses these gaps by creating organizational infrastructure that individual practitioners cannot create alone.


Center of Excellence Models

Three primary CoE models exist, each with distinct trade-offs:


Model 1: Centralized CoE

All practitioners work within a dedicated organizational unit. Local business units request services from the CoE, which dispatches practitioners to execute initiatives.

Structure:

  • CoE reports to a senior executive (often CFO, COO, or Chief Digital Officer)
  • Full-time practitioners dedicated to methodology execution
  • Business units are "clients" of the CoE
  • Governance is internal to the CoE

Advantages:

  • Strong quality control (all practitioners under one management structure)
  • Consistent methodology application
  • Clear career path for practitioners
  • Economies of scale in training and development
  • CoE has organizational visibility and authority

Disadvantages:

  • "Consulting" relationship can create distance from business units
  • Practitioners may lack deep domain knowledge
  • Business units may resist "outside" recommendations
  • Can create bottleneck if demand exceeds CoE capacity
  • Risk of becoming bureaucratic gatekeeper

Best for: Large organizations with multiple business units, high demand for methodology application, and leadership commitment to centralized capability.


Model 2: Federated CoE

Practitioners are embedded in business units but connect through a network. A small central team maintains methodology standards, provides training, and coordinates across the network.

Structure:

  • Practitioners report to local business unit leadership
  • Central team (often 1-3 people) maintains methodology and provides development
  • Community of practice connects practitioners across units
  • Governance is distributed with central standards

Advantages:

  • Practitioners have deep domain knowledge and local relationships
  • Business units have ownership of their practitioners
  • Methodology adapts to local context
  • No bottleneck through central team
  • Lower overhead than centralized model

Disadvantages:

  • Methodology can drift as local adaptations accumulate
  • Quality varies across business units
  • Harder to maintain consistent standards
  • Practitioners may be pulled toward local priorities at methodology's expense
  • Central team has limited authority to enforce standards

Best for: Organizations with distinct business units, strong local leadership commitment, and culture that resists centralization.


Model 3: Hub-and-Spoke CoE

Hybrid model combining central expertise with local presence. A central team handles strategy, training, and complex initiatives while local "champions" support routine execution in their units.

Structure:

  • Central hub of full-time practitioners (methodology experts)
  • Local champions (part-time practitioners) in each business unit
  • Hub handles methodology development, training, complex cases
  • Champions handle routine execution and local relationship management
  • Governance shared between hub and business unit leadership

Advantages:

  • Combines depth of centralized model with reach of federated model
  • Hub maintains methodology integrity
  • Champions provide local credibility and domain knowledge
  • Scalable—can add champions without proportional hub growth
  • Champions create distributed leadership pipeline

Disadvantages:

  • More complex governance structure
  • Champions have split attention (methodology plus other responsibilities)
  • Hub-champion relationship requires active management
  • Risk of champions becoming isolated without hub support
  • Requires clear escalation criteria between champion and hub

Best for: Mid-sized organizations wanting to scale capability without building large central team, or organizations piloting methodology before committing to full CoE investment.


Diana's Hub-and-Spoke Model

Diana chose the hub-and-spoke model for Thornton Manufacturing:

The Hub:

  • Diana as methodology owner (full-time focus on CoE)
  • One additional full-time practitioner supporting Diana

The Spokes:

  • Four plant champions (20-30% of role dedicated to methodology)
  • Each champion embedded in their local plant organization

Why this model:

  • Thornton wasn't ready to invest in a large central team
  • Local credibility required embedded champions
  • Diana needed to develop capability, not just execute initiatives
  • Four plants created natural spoke structure

Essential CoE Functions

Regardless of model, an effective CoE performs five essential functions:


Function 1: Methodology Stewardship

Someone must own the methodology—maintaining standards, interpreting ambiguities, evolving the approach as learning accumulates.

Activities:

  • Maintaining and updating the playbook
  • Resolving methodology interpretation questions
  • Incorporating lessons learned into standard practices
  • Evaluating and approving methodology adaptations

Authority required: Final say on what constitutes correct methodology application.


Function 2: Quality Assurance

Work product must meet standards. Without quality assurance, methodology drift degrades outcomes over time.

Activities:

  • Reviewing deliverables at key checkpoints
  • Conducting post-implementation audits
  • Identifying common quality issues and addressing root causes
  • Certifying practitioners who meet quality standards

Authority required: Ability to require rework when quality is inadequate.


Function 3: Practitioner Development

The organization needs a pipeline of capable practitioners. Development doesn't happen automatically.

Activities:

  • Operating the training program (foundation through certification)
  • Providing ongoing coaching and mentorship
  • Conducting competency assessments
  • Managing practitioner career progression

Resources required: Significant time investment from experienced practitioners.


Function 4: Portfolio Governance

Not all opportunities should be pursued. The CoE ensures resources go to highest-value initiatives.

Activities:

  • Reviewing proposed opportunities for alignment and viability
  • Prioritizing the initiative portfolio
  • Allocating practitioner resources across initiatives
  • Tracking portfolio performance and adjusting priorities

Authority required: Decision rights over which initiatives proceed (or at minimum, advisory authority to executive decision-makers).


Function 5: Organizational Advocacy

The CoE must advocate for the methodology within the organization—securing resources, building executive support, and removing barriers.

Activities:

  • Communicating results and value to leadership
  • Building relationships with key stakeholders
  • Identifying and addressing organizational barriers
  • Positioning the methodology within organizational strategy

Relationship required: Direct access to executive leadership.


CoE Governance Structure

Effective governance requires clear decision rights at multiple levels:


Executive Sponsor

A senior leader (VP or above) who champions the CoE and ensures organizational support.

Responsibilities:

  • Securing budget and resources
  • Removing organizational barriers
  • Holding CoE accountable for results
  • Connecting CoE to strategic priorities

Governance Board

A cross-functional group that provides oversight and strategic direction.

Typical composition:

  • Executive sponsor (chair)
  • CoE leader (Diana, in this case)
  • Representatives from major business units
  • Finance representative (for business case validation)

Responsibilities:

  • Approving major initiatives
  • Reviewing portfolio performance
  • Resolving cross-functional issues
  • Guiding methodology evolution

Meeting cadence: Monthly or quarterly, depending on initiative volume.


CoE Leadership

The operational leader of the CoE (Diana) who manages daily activities.

Responsibilities:

  • Managing practitioners and champions
  • Ensuring quality of work product
  • Developing and maintaining methodology
  • Reporting to governance board
  • Representing CoE in organizational discussions

Practitioner Teams

The people executing initiatives.

Responsibilities:

  • Delivering high-quality methodology execution
  • Following governance requirements
  • Participating in continuous improvement
  • Developing junior practitioners

RACI for Key Decisions

Decision | Executive Sponsor | Governance Board | CoE Leader | Practitioner
Initiative approval (major) | A | R | C | I
Initiative approval (minor) | I | I | A | R
Methodology interpretation | I | I | A | C
Practitioner certification | I | I | A | R
Budget allocation | A | C | R | I
Quality standards | I | C | A | R
Strategic priorities | A | R | C | I

R = Responsible, A = Accountable, C = Consulted, I = Informed
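
One way to keep these decision rights unambiguous in practice is to treat the RACI table as configuration rather than tribal knowledge. The sketch below is an illustration under that assumption; the role keys and decision names simply mirror the table above.

```python
# Sketch: the RACI table above expressed as data, so "who holds the A for this?"
# becomes a lookup rather than a judgment call. Keys mirror the table; this is
# an illustrative encoding, not a prescribed tool.
RACI = {
    "initiative_approval_major":  {"sponsor": "A", "board": "R", "coe_leader": "C", "practitioner": "I"},
    "initiative_approval_minor":  {"sponsor": "I", "board": "I", "coe_leader": "A", "practitioner": "R"},
    "methodology_interpretation": {"sponsor": "I", "board": "I", "coe_leader": "A", "practitioner": "C"},
    "practitioner_certification": {"sponsor": "I", "board": "I", "coe_leader": "A", "practitioner": "R"},
    "budget_allocation":          {"sponsor": "A", "board": "C", "coe_leader": "R", "practitioner": "I"},
    "quality_standards":          {"sponsor": "I", "board": "C", "coe_leader": "A", "practitioner": "R"},
    "strategic_priorities":       {"sponsor": "A", "board": "R", "coe_leader": "C", "practitioner": "I"},
}

def accountable_for(decision: str) -> str:
    """Return the single role holding the 'A' (Accountable) for a decision."""
    roles = RACI[decision]
    return next(role for role, code in roles.items() if code == "A")

print(accountable_for("budget_allocation"))  # sponsor
```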


Establishing the CoE

Building a CoE follows a predictable sequence:

Phase 1: Charter (Month 1)

  • Define CoE mission and scope
  • Secure executive sponsorship
  • Establish governance structure
  • Allocate initial budget and resources

Phase 2: Foundation (Months 2-3)

  • Recruit or assign CoE leader
  • Develop playbook and templates
  • Design training program
  • Identify initial champions

Phase 3: Pilot (Months 4-8)

  • Train initial cohort of champions
  • Execute first wave of initiatives
  • Refine methodology based on experience
  • Demonstrate early results

Phase 4: Scale (Months 9-18)

  • Expand champion network
  • Increase initiative volume
  • Develop second-generation practitioners
  • Institutionalize governance processes

Phase 5: Maturity (Months 18+)

  • Sustainable operation without hero dependence
  • Continuous improvement embedded
  • Clear career paths for practitioners
  • Recognized organizational capability

Common CoE Failure Patterns


Failure 1: Inadequate Executive Support

CoEs require sustained executive attention. When executives lose interest or change priorities, CoEs wither from resource starvation.

Prevention: Secure multi-year commitment, tie CoE to strategic objectives, deliver visible wins regularly.


Failure 2: All Governance, No Execution

Some CoEs become bureaucratic oversight bodies that slow work without adding value. Practitioners resent the overhead; business units route around the CoE.

Prevention: Ensure CoE provides genuine support, not just approval gates. Value should flow to practitioners, not just from them.


Failure 3: Hero Dependence

The CoE is built around one expert (like Diana) who becomes indispensable. When they leave, the capability collapses.

Prevention: Build leadership depth, document everything, certify multiple practitioners before the founder moves on.


Failure 4: Methodology Ossification

The CoE treats the methodology as fixed, resisting adaptation even when evidence suggests improvement. The methodology becomes disconnected from reality.

Prevention: Build continuous improvement into CoE operations. Methodology should evolve based on accumulated learning.


Failure 5: Isolation from Business

The CoE becomes an ivory tower—technically sophisticated but disconnected from business priorities. Initiatives are methodologically pure but strategically irrelevant.

Prevention: Keep governance connected to business leadership. Measure CoE on business outcomes, not methodology compliance.


The CoE as Transformation Engine

A well-functioning CoE does more than execute initiatives. It transforms how the organization approaches AI-augmented operations:

  • Building capability rather than buying consulting
  • Creating standards that enable scaling
  • Developing talent that grows the organization's capacity
  • Demonstrating value that justifies continued investment
  • Embedding learning so each initiative makes the next one better

Diana's Center of Excellence at Thornton Manufacturing became this kind of engine—not just delivering projects, but building the organizational muscle that would deliver projects long after Diana moved on.


Center of Excellence design draws on organizational design literature, particularly Galbraith's star model and the IT governance frameworks (COBIT, ITIL) that have shaped shared services approaches across industries.


Module 7A: LEAD — Theory

O — Observe

Portfolio Management for Orchestrated Intelligence


Individual initiatives are tactical. A portfolio is strategic.

When Diana's mandate expanded from one initiative to organizational capability, she wasn't just running multiple projects simultaneously—she was managing a portfolio of opportunities with limited resources. Portfolio management is the discipline of choosing what to pursue, what to defer, and what to decline.

This is fundamentally different from executing individual initiatives well. An organization can have excellent individual implementations and still fail at portfolio level—pursuing wrong opportunities, misallocating resources, or creating unsustainable demands on scarce expertise.


The Portfolio Mindset

Individual initiative thinking focuses on: How do we make this initiative succeed?

Portfolio thinking focuses on: Given limited resources, which initiatives should we pursue to maximize organizational value?

This shift has several implications:

Trade-offs become explicit. Resources spent on one initiative aren't available for another. Every "yes" is implicitly a "no" to something else.

Timing matters. Some initiatives are more urgent; others can wait. Sequencing affects total value delivered.

Dependencies emerge. Some initiatives enable others. Portfolio order affects what becomes possible.

Risk must be managed. Concentrating all resources on one initiative maximizes risk. Diversification reduces single-point-of-failure exposure.


Portfolio Dimensions

A healthy portfolio balances across multiple dimensions:


Dimension 1: Value Size

Portfolios should contain a mix of initiative sizes:

Size | Annual Value | Typical Duration | Role in Portfolio
Quick wins | $25,000-$75,000 | 1-3 months | Build credibility, develop practitioners
Core initiatives | $75,000-$250,000 | 4-8 months | Primary value delivery
Transformational | $250,000+ | 9-18 months | Strategic differentiation

The trap: Pursuing only transformational initiatives. They take too long, carry too much risk, and don't build capability as fast as smaller initiatives.

The balance: 60% core, 30% quick wins, 10% transformational (by resource allocation, not count).


Dimension 2: Complexity

Complexity affects execution difficulty and resource requirements:

Complexity | Characteristics | Practitioner Level
Low | Single process, clear stakeholders, proven patterns | Developing practitioners
Medium | Multiple processes, some organizational change, adaptation required | Certified practitioners
High | Cross-functional, significant change management, novel application | Expert practitioners + support

The trap: Assigning complex initiatives to developing practitioners as "stretch opportunities." They fail, damaging both the initiative and the practitioner's confidence.

The balance: Match complexity to capability. Use complexity as a development tool only when adequate support is available.


Dimension 3: Business Domain

Diversification across business domains builds organizational breadth:

  • Operations processes
  • Customer-facing processes
  • Financial processes
  • Administrative processes
  • Decision-support systems

The trap: Concentrating all initiatives in one domain because the first success came there. This creates the perception that the methodology "only works for" that domain.

The balance: Deliberately seed initiatives across domains, especially early. Success breadth builds organizational credibility.


Dimension 4: Lifecycle Stage

The portfolio should include initiatives at different lifecycle stages:

Stage | Characteristics | Resource Need | Value Delivery
Pipeline | Identified but not started | Low (assessment) | Future
Active Development | Assess through Realize | High (execution) | Near-term
Operational | Nurture phase | Moderate (maintenance) | Current
Enhancement | Adding capability to existing | Medium (iteration) | Incremental

The trap: All resources consumed by active development, leaving operational systems undermaintained and pipeline empty.

The balance: Allocate capacity across stages. Typically: 60% active development, 25% operational maintenance, 15% pipeline development.


Portfolio Prioritization

Not all opportunities deserve pursuit. Prioritization applies the Module 2 lens at portfolio level.


Prioritization Criteria

  1. Strategic alignment: Does this opportunity support organizational priorities?
  2. Value magnitude: What is the projected annual value?
  3. Confidence level: How reliable is the value estimate?
  4. Execution complexity: What resources and time does this require?
  5. Dependency status: Does this enable or require other initiatives?
  6. Risk profile: What could go wrong? What's the downside?
  7. Timing sensitivity: Is there a window that closes? Urgency?
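
A lightweight way to apply these criteria consistently is a weighted score. The sketch below is illustrative only: the weights, the 1-5 scoring scale, and the sample scores are assumptions, not values prescribed by the methodology.

```python
# Sketch: scoring an opportunity against the prioritization criteria above.
# Weights, the 1-5 scale, and the sample scores are illustrative assumptions.
WEIGHTS = {
    "strategic_alignment":  0.25,
    "value_magnitude":      0.20,
    "confidence":           0.15,
    "execution_complexity": 0.15,   # scored inversely: simpler = higher score
    "dependencies":         0.10,
    "risk":                 0.10,   # scored inversely: lower risk = higher score
    "timing":               0.05,
}

def priority_score(scores: dict[str, int]) -> float:
    """Weighted sum of 1-5 criterion scores; higher means pursue sooner."""
    return round(sum(WEIGHTS[c] * scores[c] for c in WEIGHTS), 2)

example_opportunity = {"strategic_alignment": 4, "value_magnitude": 4, "confidence": 3,
                       "execution_complexity": 3, "dependencies": 2, "risk": 3, "timing": 2}
print(priority_score(example_opportunity))  # 3.3
```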

Prioritization Matrix

A simple 2x2 often suffices for initial sorting:

              | High Strategic Value | Lower Strategic Value
Lower Effort  | Do First             | Consider
Higher Effort | Plan Carefully       | Defer or Decline

The "Do First" quadrant is obvious. The interesting decisions are:

  • Consider: Lower strategic value but low effort. These can build credibility and develop practitioners. Include some, but don't let them crowd out strategic work.

  • Plan Carefully: High value but high effort. These are the transformational initiatives. Pursue selectively with appropriate resources and oversight.

  • Defer or Decline: Not worth the effort at current value. Revisit if circumstances change.


Forced Ranking

When resources are scarce (they always are), rank opportunities explicitly:

  1. List all viable opportunities
  2. Compare each pair: "If we could only do one, which would it be?"
  3. Produce a ranked list
  4. Draw a line where resources run out
  5. Everything above the line proceeds; below the line waits

This uncomfortable exercise makes trade-offs explicit. Organizations often resist—they want to pursue everything. But pursuing everything means doing everything poorly.
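
Once opportunities have scores, the pairwise exercise and the capacity line can be mirrored in a few lines of code. The sketch below is illustrative; the opportunity names, scores, and effort figures are assumptions.

```python
# Sketch of the forced-ranking exercise: pairwise "if we could only do one"
# comparisons produce an ordering, then a line is drawn where capacity runs out.
# Names, scores, and effort figures are illustrative assumptions.
from functools import cmp_to_key

opportunities = [
    # (name, priority score, effort in practitioner-months)
    ("Opportunity A", 3.3, 8),
    ("Opportunity B", 3.6, 9),
    ("Opportunity C", 2.9, 3),
    ("Opportunity D", 2.4, 12),
]

def prefer(a, b) -> int:
    """Answer 'if we could only do one, which would it be?' Higher score wins."""
    return -1 if a[1] > b[1] else (1 if a[1] < b[1] else 0)

ranked = sorted(opportunities, key=cmp_to_key(prefer))

capacity = 20   # practitioner-months available this planning horizon
committed = 0
for name, score, effort in ranked:
    proceeds = committed + effort <= capacity
    if proceeds:
        committed += effort
    print(f"{name:14s} score={score:.1f} effort={effort:2d}  {'PROCEED' if proceeds else 'WAIT'}")
```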


Portfolio Governance

Portfolio decisions require governance structure:


The Portfolio Review

Regular governance meeting focused on portfolio-level decisions.

Cadence: Monthly for active portfolios; quarterly for stable portfolios.

Participants: Executive sponsor, CoE leader, business unit representatives, finance.

Agenda:

  1. Portfolio status overview (active, pipeline, operational)
  2. Performance against targets (value delivered vs. projected)
  3. Resource utilization and capacity
  4. Proposed additions to portfolio
  5. Proposed changes to priorities
  6. Issues requiring governance decision

Outputs:

  • Decisions on portfolio composition
  • Resource allocation direction
  • Escalation resolutions
  • Updated priorities

Portfolio Metrics

Track aggregate portfolio health, not just individual initiative success:

Metric | What It Measures | Target
Total annual value delivered | Aggregate initiative value | Growth year-over-year
Value pipeline | Projected value in development | 2-3x current delivery capacity
Delivery success rate | % of initiatives meeting projections | >70%
Time to value | Average months from start to value delivery | Decreasing trend
Practitioner utilization | % of practitioner capacity engaged | 70-85% (headroom for quality)
Enhancement ratio | Enhancements vs. new implementations | Increasing over time
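
Several of these metrics can be computed directly from initiative records. The sketch below is illustrative: the field names and the pipeline figure are assumptions, while the quality-documentation values echo the case study.

```python
# Sketch: computing a few of the portfolio health metrics above from initiative
# records. Field names and sample figures are illustrative assumptions.
initiatives = [
    {"name": "Quality documentation",  "stage": "operational", "projected": 280_000, "actual": 340_000},
    {"name": "Inventory optimization", "stage": "active",      "projected": 165_000, "actual": None},
]
pipeline_value = 800_000                 # assessed but not yet started (assumed)
practitioner_months_used = 28
practitioner_months_available = 36

delivered = [i for i in initiatives if i["stage"] == "operational"]
total_value_delivered = sum(i["actual"] for i in delivered)
success_rate = sum(i["actual"] >= i["projected"] for i in delivered) / max(len(delivered), 1)
utilization = practitioner_months_used / practitioner_months_available
pipeline_coverage = pipeline_value / max(total_value_delivered, 1)

print(f"Value delivered:   ${total_value_delivered:,.0f}")
print(f"Success rate:      {success_rate:.0%}   (target > 70%)")
print(f"Utilization:       {utilization:.0%}   (target 70-85%)")
print(f"Pipeline coverage: {pipeline_coverage:.1f}x  (target 2-3x)")
```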

Portfolio Capacity Planning

Resources are finite. Capacity planning ensures commitments don't exceed capability.


Capacity Calculation

Available Practitioner Months = (# Practitioners) × (Months) × (% Available for Initiatives)

Example:
- 4 certified practitioners
- 12-month planning horizon
- 75% time available (remaining for training, admin, maintenance)
- Available: 4 × 12 × 0.75 = 36 practitioner-months

Demand Estimation

Each initiative requires practitioner effort:

Initiative Size | Typical Effort
Quick win | 2-4 practitioner-months
Core initiative | 6-10 practitioner-months
Transformational | 15-24 practitioner-months

Demand vs. capacity:

  • If demand > capacity: prioritize ruthlessly, defer lower-priority work
  • If demand < capacity: accelerate pipeline development, increase ambition
  • If demand ≈ capacity: healthy state, maintain balance
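
A minimal sketch of this capacity check, using approximate midpoints of the effort ranges above (all figures are illustrative assumptions):

```python
# Sketch: compare planned demand against available practitioner-months.
# Effort figures are approximate midpoints of the ranges above; all numbers illustrative.
practitioners = 4
months = 12
availability = 0.75                                  # balance goes to training, admin, maintenance
capacity = practitioners * months * availability     # 36 practitioner-months

EFFORT = {"quick_win": 3, "core": 8, "transformational": 20}   # practitioner-months each

planned = ["core", "core", "core", "quick_win", "quick_win", "transformational"]
demand = sum(EFFORT[kind] for kind in planned)                  # 50 practitioner-months

if demand > capacity:
    print(f"Over-committed by {demand - capacity:.0f} practitioner-months: defer lower-priority work")
elif demand < 0.9 * capacity:
    print("Headroom available: accelerate pipeline development")
else:
    print("Demand roughly matches capacity: maintain balance")
```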

Capacity Constraints

Practitioners aren't interchangeable. Constraints include:

  • Expertise: Some initiatives require specific domain knowledge
  • Geography: Some initiatives require local presence
  • Relationship: Some stakeholders work better with specific practitioners
  • Development: Some initiatives are assigned for practitioner growth

Capacity planning must account for these constraints, not just raw headcount.


Portfolio Dynamics

Portfolios aren't static. They evolve as initiatives progress and new opportunities emerge.


Initiative Flow

Pipeline → Active → Operational → Enhancement/Retirement
    ↓                    ↓
  Defer               Decline

Healthy portfolios have flow at every stage. Warning signs:

  • Pipeline stagnation: No new opportunities identified (methodology becoming irrelevant?)
  • Active bottleneck: Too many initiatives in development (capacity exceeded?)
  • Operational neglect: No maintenance resources (sustainability risk?)
  • Enhancement drought: No improvements to existing systems (ossification?)

Portfolio Rebalancing

Circumstances change. Rebalancing adjusts portfolio composition:

Triggers for rebalancing:

  • Strategic priorities shift
  • Resources increase or decrease
  • Major initiative succeeds or fails
  • New opportunities emerge
  • External environment changes

Rebalancing actions:

  • Accelerate or defer initiatives
  • Add or remove initiatives
  • Shift resources between initiatives
  • Change priority rankings

Thornton Manufacturing Portfolio

Diana's portfolio at month 12 illustrated these principles:

Active Development:

  1. Oak Ridge inventory optimization ($165K projected) - Core, Medium complexity
  2. Riverside procurement analysis ($220K projected) - Core, Medium complexity
  3. Lakeside customer service ($95K projected) - Core, Low complexity

Operational:

  1. Greenville quality documentation ($340K actual) - Maintenance mode

Pipeline:

  • 8 additional opportunities identified, assessed, prioritized
  • Next 4 ready to begin when capacity allows

Portfolio Metrics:

  • Total current value: $340K (operational)
  • Total projected value: $480K (active)
  • Practitioner utilization: 78%
  • Pipeline depth: 8 initiatives (approximately 18 months of work)

Balance Assessment:

  • Size: Good mix (one quick win in Lakeside, two core)
  • Complexity: Appropriate match to practitioner levels
  • Domain: Operations (2), Customer (1), Administrative (0) - gap to address
  • Lifecycle: Healthy distribution across stages

Portfolio Leadership

Managing a portfolio requires leadership distinct from initiative execution:

Strategic thinking: Seeing the forest, not just the trees.

Trade-off navigation: Making explicit choices, accepting that some opportunities won't proceed.

Resource stewardship: Allocating limited capacity to maximum effect.

Stakeholder management: Maintaining support from executives, business leaders, and practitioners.

Long-term perspective: Building sustainable capability, not just delivering this quarter.

Diana's evolution from practitioner to leader was fundamentally about developing these portfolio leadership capabilities—seeing the whole, making strategic choices, and building an engine that would continue producing value beyond any single initiative.


Portfolio management principles draw on project portfolio management literature, particularly the Standard for Portfolio Management (PMI) and portfolio optimization theory from operations research.


Module 7A: LEAD — Theory

O — Observe

Culture and Sustainability: Making the Discipline Stick


Structures and processes are necessary but not sufficient. A Center of Excellence can exist on paper while the methodology withers in practice. Portfolio governance can be established while initiatives stall from organizational resistance.

The difference between organizations that sustain Orchestrated Intelligence and those that don't isn't primarily structural—it's cultural. Culture determines whether the methodology is seen as "how we do things" or "that initiative from a few years ago."

Module 6 addressed sustainability at the initiative level: how to maintain individual systems after deployment. Module 7 addresses sustainability at the organizational level: how to maintain the capability itself.


Cultural Enablers

Certain cultural characteristics make the discipline more likely to take root and persist:


Evidence-Based Decision Making

The A.C.O.R.N. methodology is fundamentally empirical. It requires measuring before and after, testing projections against reality, and adjusting based on what the data shows.

Organizations with strong evidence cultures—where data is expected, scrutinized, and acted upon—absorb this methodology naturally. Organizations where decisions are made by authority, politics, or gut feeling struggle with the methodology's rigor.

Indicators of evidence culture:

  • Leaders ask "what does the data show?" before making decisions
  • Disagreements are resolved with analysis, not hierarchy
  • Forecasts and projections are tracked against actual outcomes
  • Measurement is seen as helpful, not threatening

Building evidence culture:

  • Start with the methodology's own results—demonstrate that measurement matters
  • Celebrate when data changes minds (including leaders' minds)
  • Make measurement accessible and transparent
  • Connect measurement to real decisions

Learning Orientation

The methodology improves through accumulated experience. Each initiative teaches lessons that should inform the next. Organizations with learning orientation capture and apply these lessons.

Indicators of learning orientation:

  • After-action reviews are standard practice
  • Failures are examined without blame
  • Lessons learned actually influence future work
  • People share knowledge across boundaries

Building learning orientation:

  • Institute structured after-action reviews for each initiative
  • Create forums for practitioners to share experiences
  • Document lessons learned and make them accessible
  • Recognize people who identify improvement opportunities

Tolerance for Discomfort

The methodology requires honest assessment of current state, which often reveals uncomfortable truths. Friction inventories surface process problems that someone created or maintains. Business cases challenge assumptions about value. Pilots may fail.

Organizations that punish bearers of bad news, or that favor optimism over realism, can't sustain honest application of the methodology.

Indicators of discomfort tolerance:

  • People raise problems without fear of retribution
  • Leaders thank people for identifying issues early
  • Realistic assessments are valued over optimistic projections
  • Honest feedback is seen as respectful, not hostile

Building discomfort tolerance:

  • Leaders model response to bad news (thanking, not punishing)
  • Distinguish between "problem identification" and "problem causation"
  • Celebrate early problem detection (it's cheaper than late detection)
  • Create psychological safety for honest assessment

Execution Discipline

The methodology requires sustained attention over months. Initiatives pass through multiple phases, each requiring consistent effort. Organizations with execution discipline complete what they start.

Indicators of execution discipline:

  • Projects finish, not just start
  • Accountability is clear and maintained
  • Deadlines are taken seriously
  • Follow-through is expected and recognized

Building execution discipline:

  • Make commitments explicit and track them
  • Conduct regular reviews of initiative progress
  • Address slippage early, before it becomes failure
  • Recognize completion, not just initiation

Cultural Barriers

Certain cultural patterns actively undermine the methodology:


Hero Culture

Organizations that celebrate individual heroics—the person who stayed up all night to fix the crisis, the executive who made the bold call—struggle with systematic improvement. Why invest in methodology when heroes will save the day?

The problem: Hero culture rewards firefighting over fire prevention. It values dramatic intervention over quiet capability-building.

The response: Celebrate capability-building alongside firefighting. Ask: "Why did we need a hero? What would have prevented this crisis?" Shift recognition toward those who prevent problems.


Initiative Fatigue

Organizations that launch too many initiatives create cynicism. Employees have seen "transformational programs" come and go. They've learned to wait initiatives out—do the minimum until leadership moves on to the next shiny thing.

The problem: The methodology is seen as "another initiative" that will pass. Why invest when it won't last?

The response: Be explicit about this concern. Acknowledge the history. Demonstrate commitment through sustained action, not just rhetoric. Build the methodology into permanent structures, not temporary programs.


Risk Aversion

Extremely risk-averse organizations struggle with the methodology's empirical nature. If failure is punished, people won't run experiments. If uncertainty is intolerable, they won't pilot before scaling.

The problem: The methodology requires tolerance for pilot failure as a learning mechanism. Risk aversion eliminates this learning.

The response: Distinguish between smart risk (small pilot, controlled experiment) and reckless risk (large commitment without validation). Create space for the former. Make the pilot phase explicitly about learning, not just succeeding.


Silo Mentality

The methodology often requires cross-functional collaboration. Friction inventories surface problems that span departments. Implementations require coordination across boundaries. Siloed organizations resist this integration.

The problem: Initiatives stall at departmental boundaries. Knowledge doesn't flow across silos. Portfolio optimization is impossible when each silo guards its resources.

The response: Use governance structures that include cross-functional representation. Pursue initiatives that require collaboration—success builds relationships. Frame the methodology as serving organizational goals, not departmental ones.


Sustaining Cultural Change

Cultural change is slow. It happens through accumulation of experiences, not through declaration. The methodology itself can be a vehicle for cultural change if approached intentionally.


Start with Believers

Early initiatives should be staffed by people who are genuinely curious about the methodology, led by sponsors who are committed to its success. Success with believers builds credibility for engaging skeptics later.

Forcing the methodology on skeptical units creates resistance that poisons the broader organization. Let success pull skeptics in rather than pushing them.


Make Success Visible

Each successful initiative should be communicated broadly. Not just the results, but the process—what the methodology contributed, what would have happened without it.

Visibility builds legitimacy. When people across the organization see consistent success, the methodology becomes credible. When success is invisible, it's easy to dismiss as luck or exception.


Connect to Values

Frame the methodology in terms the organization already values. If the organization values innovation, position the methodology as systematic innovation. If it values efficiency, emphasize the cost savings. If it values customer service, highlight customer impact.

The methodology shouldn't compete with organizational values—it should serve them.


Build Community

Practitioners who work in isolation struggle. Practitioners connected to a community of peers—sharing challenges, celebrating wins, learning together—sustain their commitment.

Community-building activities:

  • Regular practitioner meetings (monthly at minimum)
  • Shared communication channel (Slack, Teams, etc.)
  • Annual gathering or summit
  • Mentorship relationships
  • Joint problem-solving sessions

Community provides support during difficulty and accountability during drift.


Institutionalize the Practice

Eventually, the methodology should become "how we do things" rather than "that initiative." Signs of institutionalization:

  • New employees learn the methodology during onboarding
  • The vocabulary enters common usage
  • Standards expect methodology application for relevant decisions
  • Budget cycles include methodology investments
  • Career paths include methodology expertise

Institutionalization takes years, not months. It requires consistent investment after the initial excitement fades.


The Long Game

Diana's eighteen-month mandate was enough to establish the Center of Excellence and demonstrate results. But genuine cultural integration would take longer—three to five years for the methodology to become truly embedded in how Thornton Manufacturing operated.

Year 1: Establish structures, demonstrate early wins, build initial practitioner capability.

Year 2: Expand portfolio, deepen practitioner expertise, integrate with existing processes.

Year 3: Institutionalize training, connect to career development, embed in strategic planning.

Years 4-5: The methodology becomes invisible—"just how we do things"—rather than a named initiative.

This timeline frustrates executives who want immediate transformation. But cultural change at scale doesn't accelerate on command. Sustained investment over time is the only path to lasting change.


Leadership for Cultural Change

Leaders shape culture through what they pay attention to, what they reward, and what they model.

Attention: What leaders ask about signals what matters. Leaders who regularly ask about methodology application, initiative results, and practitioner development signal that the methodology matters.

Rewards: What gets recognized gets repeated. Leaders who celebrate methodology success—not just business results, but the process that delivered them—reinforce the methodology's value.

Modeling: How leaders themselves engage with the methodology demonstrates its importance. Leaders who apply methodology principles to their own decisions, who participate in governance, who invest in understanding—these leaders embed the methodology in culture.

Diana's continued engagement was crucial even as she developed others. Her visible commitment signaled that the methodology wasn't something she was delegating away—it was central to how Thornton Manufacturing would operate.


When Culture Won't Change

Sometimes organizations simply aren't ready for the methodology. The cultural barriers are too strong; the leadership commitment too weak; the organizational history too toxic.

Signs that cultural change is unlikely:

  • Executive sponsors disengage after initial launch
  • Practitioners face active resistance without organizational support
  • Successful initiatives don't lead to additional investment
  • The methodology is treated as overhead rather than capability
  • People are punished for honest assessment

In these cases, the leader faces a choice: continue investing in an organization that won't change, or redirect energy to where it will have impact. This is a difficult judgment, and there's no formula.

But the discipline is explicit: organizations that won't provide the cultural conditions for success won't achieve sustainable results. Structure without culture produces compliance at best, theater at worst.

The measure of mastery is whether others can do it without you. That requires an organization that wants to learn.


Cultural change theory draws on Schein's work on organizational culture, Kotter's research on leading change, and the psychological safety literature pioneered by Edmondson. The integration of cultural factors with methodology implementation reflects lessons from the organizational learning field.



Module 7B: LEAD — Practice

R — Reveal

Introduction: From Theory to Organizational Playbook


Module 7A established the theory of organizational leadership: the five principles that guide the transition from practitioner to leader, the structures that enable scaling, and the cultural conditions that sustain transformation.

Module 7B translates that theory into action. The deliverable is an Orchestration Playbook—a comprehensive document that enables others to execute the A.C.O.R.N. methodology without requiring the original practitioner's direct involvement.


The Playbook Purpose

Diana Okafor's breakthrough at Thornton Manufacturing came when she stopped trying to transfer tacit knowledge through shadowing and started codifying explicit knowledge into usable materials.

The Orchestration Playbook serves three functions:

1. Execution Guide

The playbook provides step-by-step guidance for practitioners executing initiatives. It answers the questions that arise during each phase:

  • What are the specific steps in this phase?
  • What does a good deliverable look like?
  • What are the common mistakes to avoid?
  • When should I escalate versus decide independently?

A practitioner following the playbook can reach 70% of expert capability without expert supervision.

2. Quality Standard

The playbook establishes what "good" looks like. Quality is no longer "what Diana would approve"—it's codified in templates, checklists, and examples that anyone can reference.

This enables quality assurance at scale. Reviewers can evaluate work against documented standards rather than subjective judgment.

3. Training Foundation

The playbook provides the curriculum for practitioner development. New practitioners study the playbook during their foundation phase, then apply it during supervised execution.

Without a playbook, training depends on the teacher's memory and availability. With a playbook, training is consistent and scalable.


Playbook Components

A complete Orchestration Playbook includes:

Component | Purpose | Primary Users
Methodology Overview | Orientation to A.C.O.R.N. and principles | All stakeholders
Phase Guides | Step-by-step execution for each phase | Practitioners
Templates | Standardized deliverable formats | Practitioners
Decision Trees | Guidance for common judgment calls | Practitioners
Quality Checklists | Validation criteria for deliverables | Practitioners, reviewers
Case Examples | Illustrations of methodology in action | Learners
Governance Guide | Decision rights and escalation paths | All
Teaching Guide | Training program structure | CoE leaders

Module 7B develops each component through structured exercises, culminating in a complete playbook assembly.


The R-01 Thread

Throughout Modules 2-6, learners developed R-01: their first A.C.O.R.N. implementation from Assess through Nurture. Module 7 uses R-01 as the foundation for the playbook:

  • Phase guides draw on the experience of executing R-01
  • Templates are refined versions of R-01 deliverables
  • Case examples use R-01 as illustration
  • Quality checklists embed lessons learned from R-01

This approach ensures the playbook is grounded in real experience, not abstract theory.


Who Builds the Playbook

The playbook is built by the practitioner who has successfully executed at least one complete A.C.O.R.N. cycle. This is typically the person transitioning from practitioner to leader—Diana, in the Thornton case.

Building the playbook requires:

  • Deep methodology understanding: You can't codify what you don't understand
  • Execution experience: The playbook must address real situations, not theoretical ones
  • Teaching orientation: The playbook is for others, not for the author
  • Organizational context: The playbook must fit the specific organization

The exercises in Module 7B guide this development process.


Playbook Development Sequence

Exercise | Focus | Output
7.8 | Playbook structure and architecture | Playbook outline and component list
7.9 | Teaching system design | Training program with progression levels
7.10 | Governance framework | Decision rights matrix and escalation paths
7.11 | Portfolio management tools | Prioritization criteria and capacity model
7.12 | Culture assessment | Organizational readiness diagnostic
7.13 | Playbook assembly | Complete integrated playbook

Each exercise builds on previous work. By Exercise 7.13, learners have assembled a complete playbook ready for organizational deployment.


Quality Criteria for the Playbook

The playbook isn't complete until it meets these criteria:

Usability: A practitioner unfamiliar with the methodology can follow the playbook to execute an initiative with supervision.

Completeness: All phases, all deliverables, all common decisions are addressed.

Clarity: Instructions are unambiguous. A reader doesn't need to guess what the author meant.

Accuracy: The playbook reflects what actually works, not what sounds good in theory.

Adaptability: The playbook acknowledges that context varies and provides guidance for adaptation.

Maintainability: The playbook can be updated as learning accumulates without requiring complete rewrite.


The Teaching Challenge

Building the playbook surfaces a fundamental challenge: making tacit knowledge explicit.

Much of what experienced practitioners know is tacit—they feel when an opportunity is worth pursuing, they sense when a stakeholder is resistant, they recognize quality without consciously applying criteria. This tacit knowledge developed through experience but lives below conscious awareness.

Codifying this knowledge requires the practitioner to:

  1. Observe their own practice: Notice what they do, not just do it
  2. Articulate their reasoning: Explain why they make specific choices
  3. Identify patterns: Recognize recurring situations and responses
  4. Abstract principles: Extract general guidance from specific experiences
  5. Test understanding: Verify that others can apply the codified knowledge

This is difficult work. Many experts struggle to explain what they know because they've never had to make it explicit. Module 7B exercises provide structured approaches to this codification challenge.


Organizational Fit

The playbook must fit the organization where it will be used. A playbook developed for a manufacturing company may need adaptation for a financial services firm. A playbook for a large enterprise may be inappropriate for a mid-sized business.

Organizational fit considerations:

  • Vocabulary: Use terms the organization recognizes
  • Examples: Draw from relevant domains
  • Process integration: Connect to existing organizational processes
  • Authority levels: Match the organization's decision-making culture
  • Resource assumptions: Reflect realistic resource availability

The exercises prompt consideration of these fit factors throughout development.


Transition to Exercises

The following sections guide development of each playbook component:

  • 7.8 Playbook Structure: Architecture and component design
  • 7.9 Teaching System: Practitioner development program
  • 7.10 Governance Framework: Decision rights and accountability
  • 7.11 Portfolio Management: Prioritization and resource allocation
  • 7.12 Culture Assessment: Organizational readiness diagnostic
  • 7.13 Playbook Assembly: Integration and validation

Each exercise includes methodology, worked examples, and templates. The cumulative output is a complete Orchestration Playbook.


The playbook development approach draws on knowledge management best practices, particularly the work on explicit knowledge capture and transfer. The structure reflects lessons from organizations that have successfully scaled methodology-based capabilities.


Module 7B: LEAD — Practice

O — Operate

Playbook Structure: Architecture and Component Design


The playbook is a living document that enables others to execute the A.C.O.R.N. methodology independently. This exercise establishes the playbook's architecture—the components, their relationships, and the principles that govern the document's design.


Playbook Architecture Principles

Before designing components, establish the principles that will govern the playbook:


Principle 1: Action-Oriented

Every section should answer "what do I do?" not just "what should I understand?" Practitioners open the playbook when they need to act, not when they want to study.

Application:

  • Lead with steps, follow with explanation
  • Use imperative voice ("Review the friction inventory for completeness")
  • Include specific outputs for each action

Principle 2: Scannable

Practitioners won't read the playbook cover-to-cover. They'll scan for the section relevant to their current situation.

Application:

  • Clear headings and subheadings
  • Consistent structure across sections
  • Visual hierarchy (bold key terms, bullet lists for steps)
  • Table of contents with page numbers

Principle 3: Self-Contained Sections

Each section should be usable without requiring reference to other sections. Cross-references are fine, but a practitioner working on the Calculate phase shouldn't need to flip back to the Assess phase to understand what to do.

Application:

  • Repeat essential context where needed
  • Include phase-specific templates in each phase section
  • Provide complete examples, not partial ones

Principle 4: Judgment-Supporting

The playbook can't anticipate every situation. Where judgment is required, provide decision frameworks rather than pretending decisions are mechanical.

Application:

  • Include decision trees for common choices
  • Provide criteria, not just rules
  • Acknowledge ambiguity where it exists
  • Specify when to escalate

Principle 5: Evolutionarily Stable

The playbook will need updates as learning accumulates. Design for maintainability.

Application:

  • Modular structure (sections can be updated independently)
  • Version tracking
  • Clear ownership for updates
  • Changelog to track evolution

Component Inventory

A complete playbook includes these components:


Front Matter

Component | Purpose | Length
Executive Summary | Orientation for sponsors and stakeholders | 1-2 pages
Methodology Overview | A.C.O.R.N. explanation for context | 2-3 pages
How to Use This Playbook | Navigation and conventions | 1 page
Glossary | Definition of key terms | 2-3 pages

Phase Guides

Component | Purpose | Length
Assess Phase Guide | Step-by-step for opportunity identification | 8-12 pages
Calculate Phase Guide | Step-by-step for business case development | 10-15 pages
Orchestrate Phase Guide | Step-by-step for workflow design | 10-15 pages
Realize Phase Guide | Step-by-step for pilot execution | 8-12 pages
Nurture Phase Guide | Step-by-step for sustainability | 8-12 pages

Each phase guide includes:

  • Phase overview (purpose, inputs, outputs)
  • Step-by-step instructions
  • Phase-specific templates
  • Quality checklist
  • Common mistakes and how to avoid them
  • Decision trees for judgment calls
  • Worked example

Templates

Template | Phase | Purpose
Friction Inventory | Assess | Document friction points systematically
Opportunity Assessment | Assess | Evaluate and prioritize opportunities
Baseline Metrics | Calculate | Capture current-state measurements
Business Case | Calculate | Present ROI analysis and investment case
Workflow Blueprint | Orchestrate | Specify future-state design
Pilot Plan | Realize | Define pilot scope and measurement
Results Report | Realize | Document pilot outcomes
Sustainability Plan | Nurture | Establish ongoing operations
Monitoring Dashboard | Nurture | Track operational metrics

Decision Trees

Decision | When Used | Options
Pursue vs. Defer | After opportunity identification | Pursue now, defer, decline
Scope Selection | During pilot planning | Minimum viable, expanded, comprehensive
Escalation | Any phase | Handle independently, consult, escalate
Iteration vs. Rebuild | During lifecycle management | Iterate, rebuild, retire

Case Examples

Example | Demonstrates | Source
R-01 Complete Case | Full A.C.O.R.N. cycle | Learner's own implementation
Alternative Domain Case | Methodology in different context | From course materials or organization
Failure Case | What goes wrong and why | Composite from common patterns

Governance Guide

Component | Purpose | Length
Decision Rights Matrix | Who decides what | 2-3 pages
Escalation Procedures | When and how to escalate | 1-2 pages
Quality Assurance Process | Review and approval gates | 2-3 pages
Governance Calendar | Regular meetings and reviews | 1 page

Teaching Guide

Component | Purpose | Length
Practitioner Development Path | Foundation → Certification progression | 3-4 pages
Training Curriculum | Content and schedule for each level | 4-6 pages
Assessment Criteria | How to evaluate practitioner readiness | 2-3 pages
Mentorship Guidelines | How to support developing practitioners | 2-3 pages

Playbook Outline Template

Use this structure to organize your playbook:

ORCHESTRATION PLAYBOOK
[Organization Name]
Version [X.X] | [Date]

FRONT MATTER
  Executive Summary
  Methodology Overview
  How to Use This Playbook
  Glossary

PART I: METHODOLOGY EXECUTION
  Chapter 1: Assess Phase
    1.1 Phase Overview
    1.2 Step-by-Step Guide
    1.3 Templates
    1.4 Quality Checklist
    1.5 Common Mistakes
    1.6 Decision Trees
    1.7 Worked Example

  Chapter 2: Calculate Phase
    [Same structure as Chapter 1]

  Chapter 3: Orchestrate Phase
    [Same structure as Chapter 1]

  Chapter 4: Realize Phase
    [Same structure as Chapter 1]

  Chapter 5: Nurture Phase
    [Same structure as Chapter 1]

PART II: ORGANIZATIONAL INFRASTRUCTURE
  Chapter 6: Governance
    6.1 Decision Rights Matrix
    6.2 Escalation Procedures
    6.3 Quality Assurance
    6.4 Governance Calendar

  Chapter 7: Portfolio Management
    7.1 Prioritization Framework
    7.2 Capacity Planning
    7.3 Portfolio Review Process

  Chapter 8: Teaching and Development
    8.1 Practitioner Development Path
    8.2 Training Curriculum
    8.3 Assessment Criteria
    8.4 Mentorship Guidelines

PART III: REFERENCE MATERIALS
  Appendix A: Complete Template Library
  Appendix B: Case Examples
  Appendix C: Decision Tree Library
  Appendix D: Quality Checklists
  Appendix E: Glossary (expanded)

DOCUMENT CONTROL
  Version History
  Change Log
  Review Schedule
  Document Owner

Exercise: Design Your Playbook Architecture

Step 1: Customize the Component List

Review the component inventory above. For your organization:

  • Which components are essential? (Include these)
  • Which components are lower priority? (Include later or abbreviated)
  • What components are missing? (Add these)

Document your customized component list.


Step 2: Establish Naming and Structure Conventions

Decide on conventions that will apply throughout:

  • Heading levels: How many levels? What formatting for each?
  • Numbering: Section numbers, step numbers, figure numbers?
  • Template format: Embedded in text or separate documents?
  • Example format: Inline or boxed? Real names or anonymized?
  • Cross-references: How will you link related sections?

Document your conventions.


Step 3: Draft the Table of Contents

Using the outline template as a starting point, draft a complete table of contents for your playbook. Include:

  • All major sections
  • All subsections
  • Estimated page counts

This becomes your project plan for playbook development.


Step 4: Identify Content Sources

For each section, identify where the content will come from:

Section | Primary Source | Status
Assess Phase Guide | R-01 experience + course materials | To develop
Business Case Template | R-01 deliverable (refined) | Exists, needs polish
Decision Trees | Course materials + experience | To develop
Governance Guide | Organizational input needed | Requires collaboration

This inventory reveals what exists, what needs development, and what requires organizational input.


Step 5: Establish Document Control

Define how the playbook will be maintained:

  • Version numbering: Major.Minor (e.g., 1.0, 1.1, 2.0)
  • Review frequency: Quarterly? After each initiative? Annually?
  • Change approval: Who can modify? Who approves changes?
  • Distribution: Where is the authoritative version stored? How is it accessed?

Document your control procedures.
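
If it helps to keep the control metadata machine-readable alongside the document itself, the following is a minimal sketch in Python. The record and field names (owner, review_frequency, history) are illustrative assumptions, not part of the methodology; adapt them to whatever your document management tooling expects.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeLogEntry:
    version: str        # Major.Minor, e.g. "1.1"
    changed_on: date
    changed_by: str
    summary: str        # what changed and why
    approved_by: str    # per the change-approval rule chosen in this step

@dataclass
class DocumentControl:
    title: str
    owner: str                    # accountable for keeping the playbook current
    review_frequency: str         # e.g. "quarterly"
    authoritative_location: str   # where the single source of truth lives
    history: list[ChangeLogEntry] = field(default_factory=list)

    def current_version(self) -> str:
        return self.history[-1].version if self.history else "0.0"

# Hypothetical example: recording the initial release.
control = DocumentControl(
    title="Orchestration Playbook",
    owner="CoE Leader",
    review_frequency="quarterly",
    authoritative_location="shared document repository",
)
control.history.append(
    ChangeLogEntry("1.0", date(2025, 1, 15), "CoE Leader", "Initial release", "Executive Sponsor")
)
print(control.current_version())   # -> 1.0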


Quality Checklist: Playbook Architecture

Before proceeding to content development, verify:

Completeness:

  • All A.C.O.R.N. phases are covered
  • All essential templates are included
  • Governance components are specified
  • Teaching components are specified

Usability:

  • Structure supports scanning and navigation
  • Sections are self-contained where possible
  • Conventions are clear and consistent
  • Cross-references are planned

Maintainability:

  • Modular structure allows independent updates
  • Version control is established
  • Ownership is clear
  • Review schedule is defined

Organizational Fit:

  • Vocabulary matches organizational usage
  • Structure fits organizational expectations
  • Length is appropriate for audience
  • Format is compatible with organizational tools

Common Mistakes in Playbook Architecture

Mistake 1: Starting with Content Before Structure

Writing content without clear architecture leads to inconsistent organization, redundancy, and gaps.

Prevention: Complete architecture design before writing content.


Mistake 2: Over-Engineering

Creating elaborate structures that practitioners won't use. A 500-page playbook is a reference manual, not a working tool.

Prevention: Start with minimum viable playbook. Add complexity only as demonstrated need emerges.


Mistake 3: Ignoring Maintenance

Creating a playbook without considering how it will stay current. Static playbooks become obsolete and ignored.

Prevention: Build maintenance into initial design. Assign ownership. Schedule reviews.


Mistake 4: Single-Author Syndrome

One person develops the playbook in isolation, producing a document that reflects their perspective but not broader organizational reality.

Prevention: Involve stakeholders in architecture review. Test sections with practitioners before finalizing.


Worked Example: Thornton Manufacturing Playbook Architecture

Diana developed a playbook architecture for Thornton Manufacturing:

Customization Decisions:

  • Removed "Alternative Domain Case" (manufacturing-focused company)
  • Added "Safety and Compliance Considerations" (manufacturing requirement)
  • Abbreviated "Teaching Guide" (small practitioner population initially)
  • Added "Integration with Lean/Six Sigma" (existing methodology context)

Structure Conventions:

  • Three heading levels maximum
  • Templates as embedded tables (no separate files)
  • Examples in gray boxes
  • Cross-references by section number

Estimated Scope:

  • Part I (Methodology): 60 pages
  • Part II (Infrastructure): 25 pages
  • Part III (Reference): 30 pages
  • Total: ~115 pages

Content Sources:

  • 70% from R-01 materials (existed, needed refinement)
  • 20% from course materials (adaptation required)
  • 10% new development (governance, manufacturing-specific)

Document Control:

  • Version: 1.0 (initial release)
  • Review: Quarterly for first year, then semi-annually
  • Ownership: Diana (methodology) + IT (document management)
  • Distribution: SharePoint with controlled access

Output: Playbook Architecture Document

Complete this exercise by producing:

  1. Customized component list with rationale for modifications
  2. Structure conventions document establishing formatting standards
  3. Complete table of contents with estimated page counts
  4. Content source inventory showing what exists and what needs development
  5. Document control specification establishing maintenance procedures

This architecture document guides all subsequent playbook development.


Playbook architecture principles draw on technical documentation best practices, particularly the DITA methodology for structured content and the principles of minimalist documentation design.


Module 7B: LEAD — Practice

O — Operate

Teaching System: Practitioner Development Program Design


Building organizational capability requires developing practitioners who can execute the methodology independently. This exercise designs the teaching system that will take learners from unfamiliarity to certification.

The teaching system isn't a single training event—it's a structured progression that develops knowledge, skill, and judgment over time.


Teaching System Components

A complete teaching system includes:

Component | Purpose | When Used
Learning Path | Progression from novice to certified | Career development planning
Curriculum | Content and activities for each level | Training delivery
Assessment Criteria | Standards for advancement | Level transitions
Mentorship Model | Support structure for learners | Throughout development
Certification Process | Formal validation of capability | End of development

Learning Path Design

The learning path defines the stages practitioners move through:


Level 0: Awareness

Target audience: Stakeholders, sponsors, collaborators who need to understand the methodology without executing it.

Objectives:

  • Understand what the A.C.O.R.N. methodology is and why it matters
  • Recognize opportunities for methodology application
  • Know how to engage with practitioners and the CoE

Duration: 2-4 hours (workshop or self-study)

Content:

  • Methodology overview (30 min)
  • Case study demonstrating value (45 min)
  • How to identify opportunities (30 min)
  • Working with practitioners (30 min)
  • Q&A and discussion (30-60 min)

Assessment: None required. Participation-based completion.


Level 1: Foundation

Target audience: Future practitioners beginning their development journey.

Objectives:

  • Understand the complete A.C.O.R.N. methodology in depth
  • Execute each phase using templates and guidance
  • Recognize common patterns and pitfalls
  • Know when to seek help

Duration: 40-60 hours over 4-6 weeks

Content:

  • Playbook study (self-paced): 20-30 hours
  • Instructor-led sessions: 8-12 hours
    • Methodology deep-dive (4 hours)
    • Template practice (4 hours)
    • Case analysis (2-4 hours)
  • Practice exercises: 12-18 hours
    • Friction inventory on practice case
    • Business case development on practice case
    • Workflow design critique

Assessment:

  • Knowledge quiz (methodology understanding)
  • Template application exercise (can they use the tools correctly?)
  • Readiness interview (are they prepared for supervised execution?)

Advancement criteria:

  • Quiz score ≥ 80%
  • Template exercise meets quality standards
  • Mentor recommendation for advancement

Level 2: Supervised Execution

Target audience: Foundation graduates executing their first real initiative.

Objectives:

  • Execute a complete A.C.O.R.N. cycle on a real opportunity
  • Apply methodology in organizational context
  • Develop judgment through practice with feedback
  • Build relationships with stakeholders

Duration: 4-6 months (one full initiative cycle)

Structure:

  • Practitioner executes each phase
  • Mentor reviews all major deliverables before finalization
  • Weekly check-ins during active execution
  • Explicit feedback on quality, approach, and judgment

Supervision Model:

Phase | Supervision Level | Review Points
Assess | High | Friction inventory, opportunity selection
Calculate | High | Baseline metrics, business case
Orchestrate | Medium-High | Workflow design, pilot plan
Realize | Medium | Pilot launch, weekly results
Nurture | Medium | Sustainability plan, ownership assignment

Assessment:

  • Initiative outcome (did it succeed?)
  • Deliverable quality (did outputs meet standards?)
  • Process adherence (did they follow the methodology?)
  • Stakeholder feedback (how was the collaboration?)
  • Learning demonstration (can they articulate what they learned?)

Advancement criteria:

  • Initiative meets or exceeds projected value
  • All deliverables meet quality standards
  • Stakeholder feedback is positive
  • Mentor attestation of readiness for reduced supervision

Level 3: Supported Independence

Target audience: Practitioners ready for reduced supervision.

Objectives:

  • Execute initiatives with consultation rather than review
  • Handle routine decisions independently
  • Recognize when escalation is appropriate
  • Begin developing others informally

Duration: 4-6 months (one or more additional initiatives)

Structure:

  • Practitioner executes without deliverable-level review
  • Mentor available for consultation (practitioner initiates)
  • Bi-weekly check-ins rather than weekly
  • Focus on strategic guidance, not tactical direction

Consultation triggers (practitioner-initiated):

  • Unfamiliar situations not covered by playbook
  • Significant stakeholder resistance
  • Results diverging significantly from projections
  • Judgment calls with material consequences

Assessment:

  • Initiative outcomes
  • Consultation appropriateness (did they escalate when they should have, and handle things independently when they could?)
  • Independence growth (did consultation frequency decrease?)
  • Quality consistency (did quality maintain without close supervision?)

Advancement criteria:

  • Two or more successful initiatives
  • Demonstrated appropriate judgment in ambiguous situations
  • Stakeholder feedback remains positive
  • Mentor attestation of certification readiness

Level 4: Certified Practitioner

Target audience: Practitioners validated for fully independent execution.

Objectives:

  • Execute any standard initiative independently
  • Mentor developing practitioners
  • Contribute to methodology improvement
  • Represent methodology in organizational discussions

Duration: Ongoing

Certification requirements:

  1. Successful completion of supervised execution
  2. Successful completion of supported independence
  3. Peer review of deliverable quality
  4. Stakeholder endorsements (minimum 3)
  5. Certification interview with CoE leadership

Ongoing expectations:

  • Maintain initiative execution quality
  • Participate in community of practice
  • Mentor at least one developing practitioner per year
  • Contribute to playbook refinement

Recertification: Annual review of continuing competence (initiative outcomes, stakeholder feedback, community participation)


Curriculum Development

For each learning path level, develop detailed curriculum:


Foundation Curriculum Template

Module 1: Methodology Foundations (4 hours)

Topic | Duration | Method | Materials
A.C.O.R.N. Overview | 45 min | Lecture + discussion | Slides, playbook Ch. 1
The Five Leadership Principles | 45 min | Lecture + reflection | Slides, case examples
Thornton Case Study | 60 min | Case analysis | Full case study
Course Objectives and Structure | 30 min | Orientation | Syllabus

Module 2: Assess Phase (6 hours)

Topic | Duration | Method | Materials
Friction Inventory Methodology | 60 min | Lecture + demo | Playbook Ch. 2.1-2.2
Friction Inventory Practice | 90 min | Workshop | Practice case, template
Opportunity Assessment | 60 min | Lecture + demo | Playbook Ch. 2.3-2.4
Prioritization Exercise | 60 min | Workshop | Practice opportunities
Quality Standards Review | 30 min | Discussion | Quality checklist
Common Mistakes | 30 min | Case examples | Failure examples

Module 3: Calculate Phase (6 hours)

Topic | Duration | Method | Materials
ROI Lenses Overview | 45 min | Lecture | Playbook Ch. 3.1
Baseline Measurement | 60 min | Lecture + demo | Playbook Ch. 3.2, template
Business Case Development | 90 min | Workshop | Practice case, template
Assumption Documentation | 45 min | Lecture + practice | Template
Stakeholder Presentation | 60 min | Role play | Business case from workshop
Quality Standards Review | 30 min | Discussion | Quality checklist

Module 4: Orchestrate Phase (6 hours)

Topic | Duration | Method | Materials
Workflow Analysis | 60 min | Lecture + demo | Playbook Ch. 4.1-4.2
Human-AI Collaboration Patterns | 60 min | Lecture | Pattern library
Workflow Design Workshop | 120 min | Workshop | Practice case
Pilot Scoping | 60 min | Lecture + practice | Playbook Ch. 4.4, template
Quality Standards Review | 30 min | Discussion | Quality checklist

Module 5: Realize Phase (4 hours)

Topic | Duration | Method | Materials
Pilot Execution Planning | 60 min | Lecture | Playbook Ch. 5.1-5.2
Measurement and Monitoring | 60 min | Lecture + demo | Dashboard examples
Results Analysis | 60 min | Workshop | Sample pilot data
Iteration and Adjustment | 30 min | Discussion | Case examples
Quality Standards Review | 30 min | Discussion | Quality checklist

Module 6: Nurture Phase (4 hours)

Topic | Duration | Method | Materials
Sustainability Planning | 60 min | Lecture | Playbook Ch. 6.1-6.2
Ownership and Governance | 60 min | Lecture + discussion | RACI examples
Knowledge Management | 45 min | Lecture | Playbook Ch. 6.4
Lifecycle Management | 45 min | Lecture | Decision framework
Quality Standards Review | 30 min | Discussion | Quality checklist

Module 7: Integration (4 hours)

Topic | Duration | Method | Materials
Full-Cycle Case Study | 90 min | Case analysis | Complete case
Decision Trees and Judgment | 60 min | Workshop | Decision scenarios
Governance and Escalation | 45 min | Lecture + role play | Governance guide
Assessment Preparation | 45 min | Review | Assessment criteria

Assessment Design

Assessments validate that practitioners have achieved learning objectives:


Foundation Assessment: Knowledge Quiz

25-30 questions covering:

  • Methodology phases and purposes (5 questions)
  • Template usage and outputs (5 questions)
  • Common patterns and pitfalls (5 questions)
  • Decision criteria and escalation (5 questions)
  • Quality standards (5 questions)

Format: Multiple choice and short answer
Passing score: 80%
Retake policy: One retake after additional study


Foundation Assessment: Template Application

Provide a practice case and require:

  • Complete friction inventory
  • Opportunity prioritization with rationale
  • Draft business case for top opportunity

Evaluation criteria:

  • Template completion (all fields populated appropriately)
  • Methodology adherence (approach follows playbook)
  • Quality of analysis (reasonable conclusions from evidence)
  • Documentation clarity (readable and organized)

Passing standard: Meets quality checklist requirements
Retake policy: Revise and resubmit with mentor guidance


Supervised Execution Assessment: Initiative Review

Evaluate the complete initiative:

Criterion | Weight | Evaluation Method
Initiative outcome | 30% | Actual vs. projected value
Deliverable quality | 25% | Quality checklist scoring
Process adherence | 20% | Mentor observation
Stakeholder feedback | 15% | Stakeholder interviews
Learning demonstration | 10% | Reflection discussion

Passing standard: Weighted average ≥ 70%
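
Because the passing standard is a weighted average, the calculation itself is simple arithmetic. The sketch below shows one way to compute it, assuming each criterion is scored on a 0-100 scale; the sub-scores are hypothetical.

# Weights mirror the evaluation table above.
WEIGHTS = {
    "initiative_outcome": 0.30,
    "deliverable_quality": 0.25,
    "process_adherence": 0.20,
    "stakeholder_feedback": 0.15,
    "learning_demonstration": 0.10,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Weighted average of sub-scores, each on a 0-100 scale."""
    return sum(WEIGHTS[criterion] * scores[criterion] for criterion in WEIGHTS)

example = {                          # illustrative reviewer scores
    "initiative_outcome": 80,
    "deliverable_quality": 80,
    "process_adherence": 70,
    "stakeholder_feedback": 80,
    "learning_demonstration": 70,
}

score = weighted_score(example)
print(f"Weighted average: {score:.1f}")          # Weighted average: 77.0
print("Pass" if score >= 70 else "Does not pass")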


Certification Assessment: Comprehensive Review

Final certification requires:

  1. Portfolio Review: All deliverables from supervised and supported initiatives
  2. Peer Assessment: Quality review by certified practitioner (not the mentor)
  3. Stakeholder Endorsements: Written endorsements from 3+ stakeholders
  4. Certification Interview: 60-minute discussion with CoE leadership covering:
    • Methodology understanding (conceptual questions)
    • Judgment demonstration (scenario-based questions)
    • Reflection on learning journey
    • Commitment to ongoing development

Mentorship Model

Mentorship provides the human support that playbooks and training cannot:


Mentor Responsibilities

Responsibility | Foundation | Supervised | Supported
Weekly check-ins | Yes | Yes | Bi-weekly
Deliverable review | Practice exercises | All major deliverables | On request
Question response | Within 24 hours | Within 24 hours | Within 48 hours
Career guidance | Introduction | Active | Active
Stakeholder support | N/A | Active | On request

Mentor Selection Criteria

  • Certified practitioner for at least 6 months
  • Completed at least 3 initiatives
  • Demonstrated teaching aptitude
  • Available capacity (no more than 2 mentees simultaneously)
  • Commitment to mentorship responsibilities

Mentor Development

Mentors need support too:

  • Mentor orientation (2 hours) on effective mentorship
  • Mentor community of practice (monthly meetings)
  • Escalation path for mentee issues they can't resolve
  • Recognition for successful mentee development

Exercise: Design Your Teaching System

Step 1: Define Learning Path Levels

Using the template above, customize learning path levels for your organization:

  • What modifications are needed to level definitions?
  • What is realistic duration for each level given organizational context?
  • What assessment methods fit organizational culture?

Document your customized learning path.


Step 2: Develop Foundation Curriculum

Create detailed curriculum for Level 1 (Foundation):

  • What content is essential? What can be abbreviated?
  • What instructional methods work in your organization?
  • What materials exist? What needs development?
  • Who will deliver instruction?

Document your curriculum outline with timing and materials.


Step 3: Design Assessment Instruments

Create specific assessment instruments:

  • Knowledge quiz (20-30 questions with answer key)
  • Template application exercise (case and evaluation rubric)
  • Supervised execution evaluation form
  • Certification interview guide

Document complete assessment instruments.


Step 4: Establish Mentorship Structure

Define your mentorship model:

  • How will mentors be selected?
  • What are specific mentor responsibilities at each level?
  • How will mentor capacity be managed?
  • How will mentor effectiveness be evaluated?

Document your mentorship guidelines.
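
If it is useful to make the capacity question concrete, the sketch below shows one simple assignment rule. It assumes the two-mentee cap from the mentor selection criteria; the names and the fewest-mentees rule are illustrative, not prescribed.

MAX_MENTEES = 2   # cap from the mentor selection criteria above

def assign_mentor(assignments: dict[str, list[str]], mentee: str) -> str:
    """Assign the mentee to the mentor who currently has the fewest mentees."""
    mentor = min(assignments, key=lambda m: len(assignments[m]))
    if len(assignments[mentor]) >= MAX_MENTEES:
        raise RuntimeError("All mentors are at capacity; develop or recruit more mentors")
    assignments[mentor].append(mentee)
    return mentor

mentors = {"Mentor A": ["Mentee 1"], "Mentor B": []}
print(assign_mentor(mentors, "Mentee 2"))   # -> Mentor B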


Step 5: Create Development Tracking

Design a system to track practitioner development:

  • What information is tracked for each practitioner?
  • How is progress documented?
  • Who reviews development progress?
  • How are advancement decisions made?

Document your tracking system design.
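
A minimal sketch of such a tracking record is below, assuming a simple Python data model. The fields, levels, and the advancement check are illustrative; they would need to mirror the criteria your learning path actually defines.

from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Level(Enum):
    FOUNDATION = 1
    SUPERVISED = 2
    SUPPORTED = 3
    CERTIFIED = 4

@dataclass
class PractitionerRecord:
    name: str
    mentor: str
    level: Level
    level_entered: date
    initiatives_completed: int = 0
    notes: list[str] = field(default_factory=list)   # review notes, feedback, attestations

    def ready_for_advancement(self, mentor_attests: bool) -> bool:
        """Illustrative check only; real criteria come from the learning path."""
        if self.level is Level.SUPERVISED:
            return self.initiatives_completed >= 1 and mentor_attests
        if self.level is Level.SUPPORTED:
            return self.initiatives_completed >= 2 and mentor_attests
        return False

record = PractitionerRecord("A. Practitioner", "CoE Leader", Level.SUPERVISED,
                            date(2025, 3, 1), initiatives_completed=1)
print(record.ready_for_advancement(mentor_attests=True))   # -> True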


Quality Checklist: Teaching System

Before finalizing your teaching system, verify:

Learning Path:

  • All levels are clearly defined with objectives
  • Advancement criteria are specific and measurable
  • Duration estimates are realistic
  • Path from novice to certification is clear

Curriculum:

  • All methodology phases are covered adequately
  • Instructional methods are varied and appropriate
  • Materials are identified or planned
  • Timing is realistic for content depth

Assessment:

  • Assessments align with learning objectives
  • Criteria are clear and measurable
  • Multiple assessment methods are used
  • Retake and remediation policies exist

Mentorship:

  • Mentor responsibilities are clear
  • Mentor selection criteria are defined
  • Mentor capacity is planned
  • Mentor support is provided

Worked Example: Thornton Manufacturing Teaching System

Diana designed a teaching system for Thornton's plant champions:

Learning Path Customization:

  • Skipped Level 0 (Awareness)—executives already committed
  • Extended Level 2 (Supervised Execution) to 6 months—manufacturing initiatives take longer
  • Combined Levels 3 and 4—smaller practitioner population didn't need fine gradation

Curriculum Adaptation:

  • Added manufacturing-specific examples throughout
  • Reduced Orchestrate content (fewer workflow patterns in manufacturing context)
  • Added safety and compliance module (manufacturing requirement)
  • Total Foundation: 32 hours (condensed from 40-60)

Assessment Modifications:

  • Knowledge quiz: 20 questions (reduced from 25-30)
  • Template application: Manufacturing-specific practice case
  • No formal certification interview (Diana knew all practitioners personally)

Mentorship Model:

  • Diana served as sole mentor for first cohort (4 champions)
  • Peer mentorship encouraged but not formalized
  • Weekly calls during supervised execution, bi-weekly during supported independence

Output: Teaching System Document

Complete this exercise by producing:

  1. Learning Path Definition: Customized levels with objectives, duration, and advancement criteria
  2. Foundation Curriculum: Detailed content, methods, timing, and materials
  3. Assessment Instruments: Complete quiz, exercises, and evaluation forms
  4. Mentorship Guidelines: Selection criteria, responsibilities, and support structure
  5. Development Tracker: System for tracking practitioner progress

This teaching system becomes a chapter in your Orchestration Playbook.


Teaching system design draws on instructional design principles, particularly the ADDIE model (Analysis, Design, Development, Implementation, Evaluation) and competency-based education research. The progression from supervised to independent practice reflects established models for professional skill development.


Module 7B: LEAD — Practice

O — Operate

Governance Framework: Decision Rights and Accountability


Governance determines who can decide what. Without clear governance, practitioners either wait for permission they don't need (bottleneck) or make decisions they shouldn't (risk). Clear governance enables speed by eliminating ambiguity about authority.

This exercise designs the governance framework for your Center of Excellence.


Governance Principles

Before designing specific structures, establish governing principles:


Principle 1: Decisions at the Lowest Appropriate Level

Push decisions down to people closest to the work. Only elevate decisions that genuinely require broader perspective or higher authority.

Application:

  • Practitioners decide execution tactics
  • CoE leadership decides methodology interpretation
  • Governance board decides strategic priorities and major investments

Principle 2: Clear Single Accountability

Every decision has one accountable person. "Shared accountability" means no one is accountable.

Application:

  • RACI assigns exactly one "A" per decision
  • Escalation paths are explicit
  • When accountability is unclear, clarify it immediately

Principle 3: Authority Matches Accountability

Don't hold people accountable for outcomes they can't influence. If someone is accountable, they need authority to act.

Application:

  • Practitioners have authority over execution they're accountable for
  • CoE leadership has authority over methodology they're accountable for
  • Executive sponsors have authority over resources they're accountable for

Principle 4: Transparency Over Permission

Where possible, replace "ask permission" with "inform and proceed." Trust people to make good decisions; provide visibility for oversight.

Application:

  • Standard decisions: practitioner decides, informs stakeholders
  • Significant decisions: practitioner proposes, proceeds unless objection
  • Major decisions: practitioner proposes, requires approval before proceeding

Governance Roles

Define the roles involved in governance:


Role: Executive Sponsor

Aspect | Definition
Purpose | Champion the CoE; ensure organizational support
Authority | Budget approval, organizational barrier removal, strategic direction
Accountability | CoE success at organizational level
Time Commitment | 2-4 hours/month + governance meetings
Typical Position | VP or above; operational or digital leadership

Key Decisions:

  • Annual budget and resource allocation
  • Major initiative approval (threshold: >$100K investment or >6 months)
  • Organizational barrier escalation
  • Strategic priority setting

Role: CoE Leader

Aspect | Definition
Purpose | Manage CoE operations; maintain methodology; develop practitioners
Authority | Methodology interpretation, practitioner assignment, quality standards
Accountability | CoE operational performance; methodology integrity; practitioner development
Time Commitment | 50-100% of role (depending on CoE maturity)
Typical Position | Senior manager or director; process excellence or digital transformation

Key Decisions:

  • Methodology interpretation and evolution
  • Practitioner certification
  • Initiative staffing
  • Quality standard enforcement
  • Portfolio prioritization recommendations

Role: Governance Board

Aspect | Definition
Purpose | Provide oversight; make cross-functional decisions; resolve conflicts
Authority | Portfolio prioritization, cross-functional resource allocation, escalation resolution
Accountability | Portfolio performance; organizational alignment
Time Commitment | 2 hours/month (meeting) + preparation
Typical Composition | Executive sponsor (chair), CoE leader, business unit representatives, finance

Key Decisions:

  • Portfolio composition and priorities
  • Cross-functional resource conflicts
  • Major scope changes
  • Escalated issues
  • Annual planning

Role: Business Unit Sponsor

Aspect | Definition
Purpose | Provide local support; remove unit-level barriers; represent unit interests
Authority | Unit resource allocation, unit stakeholder engagement, local priority setting
Accountability | Initiative success within unit; stakeholder cooperation
Time Commitment | 2-4 hours/month per active initiative
Typical Position | Department head or director within business unit

Key Decisions:

  • Unit participation in initiatives
  • Local resource allocation
  • Stakeholder engagement
  • Operational changes within unit

Role: Practitioner

Aspect | Definition
Purpose | Execute methodology; deliver initiative results
Authority | Execution approach within guidelines; day-to-day decisions
Accountability | Initiative deliverables; methodology adherence; stakeholder relationships
Time Commitment | 50-100% during active initiatives
Typical Position | Analyst, manager, or specialist in relevant domain

Key Decisions:

  • Execution tactics
  • Stakeholder communication content and timing
  • Minor scope adjustments (within guidelines)
  • Escalation timing (when to seek help)

Decision Rights Matrix

Map specific decisions to roles:


Initiative Lifecycle Decisions

Decision | Practitioner | CoE Leader | BU Sponsor | Gov Board | Exec Sponsor
Identify opportunity | R | C | I | I | I
Assess opportunity | R | C | C | I | I
Recommend pursuit | R | R | C | A | I
Approve minor initiative (<$50K) | I | A | R | I | I
Approve major initiative (>$50K) | I | C | R | A | I
Allocate practitioner | C | A | C | I | I
Define scope | R | C | A | I | I
Modify scope (minor) | R | I | A | I | I
Modify scope (major) | R | C | C | A | I
Approve business case | I | C | R | A | I
Approve pilot plan | R | A | C | I | I
Decide continue/stop pilot | R | C | A | C | I
Approve full deployment | I | C | R | A | I
Approve sustainability plan | R | A | C | I | I

R = Responsible (does the work)
A = Accountable (final decision authority)
C = Consulted (input before decision)
I = Informed (notified after decision)
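
If reviewers or simple tooling will also consume the matrix, the letter strings above can be encoded directly. The sketch below covers a few rows and enforces the "exactly one Accountable" rule from Principle 2; the function and structure are illustrative, not part of the methodology.

# Role order matches the column order in the table above.
ROLES = ["Practitioner", "CoE Leader", "BU Sponsor", "Governance Board", "Executive Sponsor"]

MATRIX = {   # a few rows from the table, one letter per role
    "Recommend pursuit":                "RRCAI",
    "Approve minor initiative (<$50K)": "IARII",
    "Approve major initiative (>$50K)": "ICRAI",
    "Define scope":                     "RCAII",
}

def accountable_for(decision: str) -> str:
    """Return the single role holding the 'A' for a decision."""
    codes = MATRIX[decision]
    accountable = [role for role, code in zip(ROLES, codes) if code == "A"]
    if len(accountable) != 1:
        raise ValueError(f"{decision!r} must have exactly one Accountable role")
    return accountable[0]

print(accountable_for("Approve major initiative (>$50K)"))   # -> Governance Board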


Methodology Decisions

Decision | Practitioner | CoE Leader | Gov Board | Exec Sponsor
Apply standard methodology | R | I | I | I
Interpret methodology ambiguity | C | A | I | I
Adapt methodology to context | R | A | I | I
Propose methodology change | R | R | C | I
Approve methodology change | I | A | C | I
Set quality standards | I | A | C | I
Enforce quality standards | C | A | I | I

Resource Decisions

Decision | Practitioner | CoE Leader | BU Sponsor | Gov Board | Exec Sponsor
Request resources | R | C | C | I | I
Allocate within unit | I | C | A | I | I
Allocate across units | I | C | C | A | I
Approve budget (< threshold) | I | A | C | I | I
Approve budget (> threshold) | I | C | C | C | A
Resolve resource conflict | C | C | C | A | I

Practitioner Development Decisions

Decision | Practitioner | CoE Leader | BU Sponsor | Gov Board
Enter training program | R | A | C | I
Complete foundation level | R | A | I | I
Advance to supervised | R | A | I | I
Advance to supported | R | A | I | I
Grant certification | C | A | I | I
Revoke certification | I | A | C | I
Assign mentor | C | A | I | I

Escalation Framework

When issues can't be resolved at one level, they escalate to the next:


Escalation Triggers

Situation | First Escalation | Final Escalation
Stakeholder resistance blocking progress | Practitioner → BU Sponsor | BU Sponsor → Governance Board
Resource conflict between initiatives | Practitioner → CoE Leader | CoE Leader → Governance Board
Scope dispute with stakeholders | Practitioner → BU Sponsor | BU Sponsor → Governance Board
Quality standard disagreement | Practitioner → CoE Leader | CoE Leader → Executive Sponsor
Methodology interpretation dispute | Practitioner → CoE Leader | CoE Leader decision is final
Initiative failing to meet projections | Practitioner → CoE Leader | CoE Leader → Governance Board
Practitioner performance concern | Mentor → CoE Leader | CoE Leader → BU Sponsor

Escalation Protocol

  1. Document the issue: What is the problem? What has been tried? What decision is needed?

  2. Attempt resolution at current level: Has the practitioner genuinely tried to resolve before escalating?

  3. Escalate with recommendation: Don't just present the problem—propose a solution.

  4. Set timeline: Escalations should be resolved within 5 business days.

  5. Document resolution: Record the decision and rationale for future reference.
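
A minimal sketch of an escalation record that captures these steps and computes the five-business-day deadline is shown below; the field names and example dates are illustrative.

from dataclasses import dataclass
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Step forward the given number of business days, skipping weekends."""
    current, remaining = start, days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:   # Monday=0 ... Friday=4
            remaining -= 1
    return current

@dataclass
class Escalation:
    issue: str              # what the problem is and what has been tried
    decision_needed: str
    recommendation: str     # escalate with a proposed solution, not just a problem
    raised_on: date
    raised_to: str
    resolution: str = ""    # filled in when the decision and rationale are recorded

    @property
    def due_by(self) -> date:
        return add_business_days(self.raised_on, 5)   # resolve within 5 business days

esc = Escalation(
    issue="Stakeholder resistance blocking pilot data collection",
    decision_needed="Adjust pilot scope or request sponsor intervention",
    recommendation="Sponsor intervention; a scope change would invalidate the baseline",
    raised_on=date(2025, 6, 2),    # a Monday
    raised_to="BU Sponsor",
)
print(esc.due_by)   # -> 2025-06-09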


Escalation Path Diagram

Practitioner
    ↓
CoE Leader ←→ BU Sponsor
    ↓           ↓
Governance Board
    ↓
Executive Sponsor

Cross-arrows indicate peer consultation before escalation.


Governance Calendar

Regular governance activities maintain accountability:


Weekly

Activity | Participants | Duration | Focus
Practitioner check-in | Practitioner + Mentor/CoE Leader | 30 min | Initiative progress, blockers

Monthly

Activity | Participants | Duration | Focus
Governance Board meeting | Full board | 90 min | Portfolio review, decisions
Practitioner community | All practitioners | 60 min | Knowledge sharing, issues

Quarterly

Activity | Participants | Duration | Focus
Portfolio review | Gov Board + CoE Leader | 2 hours | Strategic assessment, reprioritization
Executive update | Exec Sponsor + CoE Leader | 60 min | Performance, strategy, resources
Practitioner development review | CoE Leader + Mentors | 90 min | Progress, certification pipeline

Annually

Activity | Participants | Duration | Focus
Strategic planning | Exec Sponsor + Gov Board | Half day | Next year priorities, budget
Methodology review | CoE Leader + practitioners | Half day | Playbook updates, lessons learned
Certification review | CoE Leader | 2 hours | Recertification, development gaps

Governance Meeting Template

Standard agenda for monthly governance board meeting:


Governance Board Meeting Agenda

Date: [Date]
Time: [Time] (90 minutes)
Attendees: [List]

1. Portfolio Status (20 min)

  • Active initiatives: status summary
  • Completed since last meeting: results
  • Pipeline: upcoming initiatives

2. Performance Review (20 min)

  • Metrics vs. targets
  • Issues and risks
  • Resource utilization

3. Decisions Required (30 min)

  • [Decision 1]: Background, options, recommendation
  • [Decision 2]: Background, options, recommendation
  • [Escalations if any]

4. Strategic Discussion (15 min)

  • [Topic for board input]

5. Next Steps (5 min)

  • Action items and owners
  • Next meeting date

Exercise: Design Your Governance Framework

Step 1: Customize Governance Roles

Review the role definitions. For your organization:

  • What modifications are needed to fit organizational structure?
  • Who specifically will fill each role?
  • Are additional roles needed?
  • Are any roles unnecessary given your scale?

Document customized role definitions with named individuals where possible.


Step 2: Build Decision Rights Matrix

Using the templates as starting points:

  • What decisions are relevant to your organization?
  • What decisions are missing?
  • What thresholds define "minor" vs. "major"?
  • Who has which authority?

Create your customized decision rights matrix.


Step 3: Define Escalation Framework

Specify escalation for your organization:

  • What situations trigger escalation?
  • What is the escalation path for each situation?
  • What is the expected resolution timeline?
  • How are resolutions documented?

Document your escalation framework.


Step 4: Create Governance Calendar

Design your governance rhythm:

  • What meetings are needed?
  • Who attends each?
  • What is the cadence?
  • What are standard agendas?

Document your governance calendar with specific dates for the coming year.


Step 5: Develop Meeting Templates

Create templates for governance meetings:

  • Standard agendas
  • Status report formats
  • Decision request formats
  • Escalation request formats

Document templates for each regular meeting.


Quality Checklist: Governance Framework

Before finalizing governance, verify:

Roles:

  • All necessary roles are defined
  • Responsibilities are clear and complete
  • Authority matches accountability
  • Time commitments are realistic
  • Individuals are identified for each role

Decision Rights:

  • All significant decisions are mapped
  • Each decision has exactly one Accountable
  • Authority levels are appropriate
  • Thresholds are defined

Escalation:

  • Triggers are specific
  • Paths are clear
  • Timelines are defined
  • Documentation requirements are specified

Cadence:

  • Regular meetings are scheduled
  • Agendas are defined
  • Participants are identified
  • Meeting effectiveness can be evaluated

Worked Example: Thornton Manufacturing Governance

Diana designed governance for Thornton's CoE:

Role Customization:

  • Executive Sponsor: James Mitchell (VP Operations)
  • CoE Leader: Diana Okafor
  • Governance Board: James, Diana, Plant Managers (4), Finance Director
  • BU Sponsors: Plant Managers (dual role; they also sit on the governance board)
  • Practitioners: 4 plant champions

Decision Rights Modification:

  • Major initiative threshold: $75K (lower than template—faster cycle time)
  • Methodology changes: Diana has final authority (small CoE, no need for board approval)
  • Resource allocation: Plant managers have full authority within plants (federated structure)

Escalation Simplification:

  • Most escalations go Practitioner → Diana → James
  • Plant-specific issues go Practitioner → Plant Manager → Diana
  • No formal governance board escalation (board meets monthly; too slow for urgent issues)

Governance Calendar:

  • Weekly: Diana + each champion (30 min each)
  • Monthly: Governance board (60 min—reduced from 90)
  • Quarterly: Portfolio review (expanded to 3 hours given manufacturing complexity)
  • Annually: Strategic planning integrated with plant planning cycles

Output: Governance Framework Document

Complete this exercise by producing:

  1. Role Definitions: Customized roles with named individuals
  2. Decision Rights Matrix: Complete matrix for all significant decisions
  3. Escalation Framework: Triggers, paths, timelines, documentation
  4. Governance Calendar: Complete schedule for coming year
  5. Meeting Templates: Agendas and formats for regular meetings

This governance framework becomes a chapter in your Orchestration Playbook.


Governance framework design draws on organizational governance literature, particularly COBIT (Control Objectives for Information Technologies) and the balanced governance approaches developed in project management professional standards.


Module 7B: LEAD — Practice

O — Operate

Portfolio Management: Prioritization and Resource Allocation Tools


Portfolio management translates strategic intent into resource allocation. This exercise develops the tools needed to prioritize opportunities, plan capacity, and manage the initiative portfolio over time.


Portfolio Management Tools

A complete portfolio management system includes:

Tool | Purpose | Frequency of Use
Opportunity Intake | Capture and initial assessment | As opportunities arise
Prioritization Scorecard | Systematic evaluation and ranking | Quarterly or as needed
Capacity Model | Resource planning and allocation | Monthly
Portfolio Dashboard | Status visibility and tracking | Weekly/monthly
Portfolio Review | Strategic assessment and adjustment | Monthly/quarterly

Tool 1: Opportunity Intake Form

Capture opportunities in a consistent format for evaluation:


Opportunity Intake Form

Section A: Basic Information

Field | Response
Opportunity Name
Submitter
Submission Date
Business Unit
Sponsor (if identified)

Section B: Opportunity Description

Field | Response
Process/Area Affected
Current Problem or Friction
Proposed Solution Concept
Affected Stakeholders

Section C: Initial Value Estimate

Field | Response
Estimated Annual Value | $
Value Basis (time/throughput/focus)
Confidence Level (High/Medium/Low)
Assumptions

Section D: Initial Complexity Estimate

Field | Response
Estimated Duration | months
Stakeholder Complexity | High/Medium/Low
Technical Complexity | High/Medium/Low
Change Management Needs | High/Medium/Low

Section E: Strategic Alignment

Field | Response
Strategic Priority Supported
Urgency (timing driver?)
Dependencies (on or from other initiatives)

Section F: Disposition

Field | Response
Initial Disposition | Accept to Pipeline / Defer / Decline
Disposition Rationale
Disposition Date
Disposition Authority
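For teams that log submissions in a lightweight tool rather than on paper, the intake form maps naturally onto a structured record. The sketch below is a minimal illustration in Python; the field names and Disposition values follow the template above, but the class itself and the sample opportunity are assumptions, not part of the methodology.

from dataclasses import dataclass
from datetime import date
from enum import Enum

class Disposition(Enum):
    ACCEPT_TO_PIPELINE = "Accept to Pipeline"
    DEFER = "Defer"
    DECLINE = "Decline"

@dataclass
class OpportunityIntake:
    # Section A: Basic Information
    opportunity_name: str
    submitter: str
    submission_date: date
    business_unit: str
    sponsor: str = ""                      # blank if not yet identified
    # Section C: Initial Value Estimate (other sections omitted for brevity)
    estimated_annual_value: float = 0.0
    confidence_level: str = "Medium"       # High / Medium / Low
    # Section F: Disposition, completed at review
    disposition: Disposition | None = None
    disposition_rationale: str = ""

# Example submission awaiting review (illustrative data)
intake = OpportunityIntake(
    opportunity_name="Invoice matching automation",
    submitter="A. Rivera",
    submission_date=date.today(),
    business_unit="Procurement",
    estimated_annual_value=120_000,
)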

Tool 2: Prioritization Scorecard

Systematically evaluate opportunities for prioritization:


Prioritization Scorecard

Opportunity: _______________ Evaluation Date: _______________ Evaluator: _______________

Scoring Instructions: Rate each criterion 1-5. Weight reflects relative importance. Final score = sum of (rating × weight).


Strategic Criteria (40% weight)

Criterion | Description | Rating (1-5) | Weight | Score
Strategic Alignment | How directly does this support strategic priorities? | | 0.20 |
Executive Support | How strong is executive sponsorship? | | 0.10 |
Organizational Readiness | How ready is the organization for this change? | | 0.10 |

Value Criteria (35% weight)

Criterion | Description | Rating (1-5) | Weight | Score
Value Magnitude | How significant is the projected annual value? | | 0.15 |
Value Confidence | How reliable is the value estimate? | | 0.10 |
Time to Value | How quickly will value be realized? | | 0.10 |

Feasibility Criteria (25% weight)

Criterion | Description | Rating (1-5) | Weight | Score
Technical Feasibility | How achievable is the technical solution? | | 0.10 |
Resource Availability | Do we have or can we get required resources? | | 0.10 |
Risk Level | How manageable are the risks? | | 0.05 |

Total Weighted Score: ___ / 5.0


Rating Definitions

Rating | Meaning
5 | Exceptional—among the best opportunities we've seen
4 | Strong—clearly above average
3 | Adequate—meets minimum expectations
2 | Weak—below expectations but not disqualifying
1 | Poor—significant concerns

Score Interpretation

Score Range | Interpretation | Typical Disposition
4.0 - 5.0 | High priority | Pursue actively
3.0 - 3.9 | Medium priority | Pursue if capacity allows
2.0 - 2.9 | Low priority | Defer or decline
Below 2.0 | Not recommended | Decline
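Because the final score is nothing more than the sum of rating × weight, the arithmetic is easy to automate once criteria and weights are settled. The following is a minimal sketch assuming the default weights shown above; the function names and sample ratings are illustrative only.

# Criterion weights from the scorecard above (sum to 1.0)
WEIGHTS = {
    "strategic_alignment": 0.20,
    "executive_support": 0.10,
    "organizational_readiness": 0.10,
    "value_magnitude": 0.15,
    "value_confidence": 0.10,
    "time_to_value": 0.10,
    "technical_feasibility": 0.10,
    "resource_availability": 0.10,
    "risk_level": 0.05,
}

def weighted_score(ratings: dict[str, int]) -> float:
    """Final score = sum of (rating x weight), on a 1-5 scale."""
    return sum(ratings[name] * weight for name, weight in WEIGHTS.items())

def disposition(score: float) -> str:
    """Map a weighted score to the interpretation bands above."""
    if score >= 4.0:
        return "High priority - pursue actively"
    if score >= 3.0:
        return "Medium priority - pursue if capacity allows"
    if score >= 2.0:
        return "Low priority - defer or decline"
    return "Not recommended - decline"

ratings = {name: 4 for name in WEIGHTS}      # an opportunity rated 4 on every criterion
score = weighted_score(ratings)              # 4.0
print(f"{score:.1f} -> {disposition(score)}")

If an organization recalibrates the criteria or weights, as the Thornton example later in this exercise does, only the WEIGHTS mapping changes; the scoring and interpretation logic stays the same.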

Tool 3: Capacity Planning Model

Plan resource allocation across the portfolio:


Capacity Planning Worksheet

Planning Period: _______________


Section A: Available Capacity

Resource Type | Headcount | % Available | Monthly Capacity | Period Capacity
Certified Practitioners
Developing Practitioners
CoE Leadership
Total

Calculation:

  • Monthly Capacity = Headcount × % Available
  • Period Capacity = Monthly Capacity × Months in Period
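These two formulas are simple enough to compute directly for each resource type, as in the short sketch below (resource names, percentages, and the planning period are illustrative, not recommendations):

MONTHS_IN_PERIOD = 3  # e.g., a quarterly planning period

# headcount and % of time available for initiative work (illustrative numbers)
resources = {
    "Certified Practitioners": (2, 0.60),
    "Developing Practitioners": (4, 0.30),
    "CoE Leadership": (1, 0.50),
}

for name, (headcount, pct_available) in resources.items():
    monthly = headcount * pct_available      # Monthly Capacity = Headcount x % Available
    period = monthly * MONTHS_IN_PERIOD      # Period Capacity = Monthly Capacity x Months in Period
    print(f"{name}: {monthly:.1f} practitioner-months per month, {period:.1f} for the period")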

Section B: Committed Demand

Initiative | Status | Remaining Effort | Resource Type | Monthly Demand
Total Committed

Section C: Capacity Available for New Initiatives

Resource Type | Period Capacity | Total Committed | Available
Certified Practitioners
Developing Practitioners
CoE Leadership
Total

Section D: Pipeline Assessment

Pipeline Initiative | Priority Score | Est. Effort | Start Date | Resource Match

Notes:

  • Est. Effort in practitioner-months
  • Resource Match: Which practitioners could staff this initiative?

Section E: Capacity Decisions

Based on this analysis:

Decision | Rationale
Initiatives to start
Initiatives to defer
Capacity gaps to address
Resource conflicts to resolve

Tool 4: Portfolio Dashboard

Track portfolio status for governance visibility:


Portfolio Dashboard

As of: _______________


Section A: Portfolio Summary

Metric | Current | Target | Trend
Active Initiatives | | | ↑ ↓ →
Annual Value (Operational) | $ | $ | ↑ ↓ →
Annual Value (In Development) | $ | $ | ↑ ↓ →
Pipeline Depth (months of work) | | | ↑ ↓ →
Practitioner Utilization | % | 70-85% | ↑ ↓ →
Delivery Success Rate | % | >70% | ↑ ↓ →

Section B: Active Initiatives

Initiative | Phase | Health | Value | Target Date | Notes
🟢 🟡 🔴$
🟢 🟡 🔴$
🟢 🟡 🔴$

Health Definitions:

  • 🟢 On track—proceeding as planned
  • 🟡 At risk—issues identified, mitigation in progress
  • 🔴 Off track—significant problems, intervention needed

Section C: Pipeline

Opportunity | Priority | Est. Value | Est. Start | Blocker
High/Med/Low$
High/Med/Low$
High/Med/Low$

Section D: Operational Systems

System | Status | Monthly Value | Last Review | Next Review
🟢 🟡 🔴$
🟢 🟡 🔴$
🟢 🟡 🔴$

Section E: Key Issues and Actions

Issue | Impact | Owner | Action | Due Date

Tool 5: Portfolio Review Template

Structure the periodic portfolio review:


Portfolio Review Agenda

Date: _______________ Participants: _______________


1. Portfolio Health Summary (15 min)

Review dashboard metrics:

  • How are we performing against targets?
  • What trends are emerging?
  • What requires attention?

2. Active Initiative Deep Dive (30 min)

For each active initiative:

  • Current status and phase
  • Progress since last review
  • Issues and risks
  • Support needed
  • Forecast for completion

3. Pipeline Review (20 min)

  • New opportunities since last review
  • Prioritization of pipeline
  • Recommended starts given capacity
  • Opportunities to defer or decline

4. Capacity Assessment (15 min)

  • Current utilization
  • Upcoming capacity changes
  • Resource gaps or conflicts
  • Development pipeline for new practitioners

5. Strategic Alignment Check (10 min)

  • Does the portfolio reflect strategic priorities?
  • Are there priority areas not represented?
  • Should any initiatives be reprioritized?

6. Decisions and Actions (10 min)

  • Decisions made this session
  • Action items and owners
  • Preparation for next review

Exercise: Build Your Portfolio Management System

Step 1: Customize Intake Form

Adapt the opportunity intake form for your organization:

  • What fields are essential?
  • What fields should be added for your context?
  • What fields can be removed?
  • Who completes the form? Who reviews it?

Document your customized intake form.


Step 2: Calibrate Prioritization Scorecard

Customize the scorecard for your organization:

  • Are the criteria correct for your context?
  • Are the weights appropriate?
  • What rating definitions work for your organization?
  • Who conducts scoring?

Document your calibrated scorecard.

Test the scorecard by scoring 3-5 known opportunities. Do the results match intuition? If not, adjust criteria or weights.


Step 3: Build Capacity Model

Create your capacity model:

  • What resources are tracked?
  • What is current capacity?
  • What committed demand exists?
  • What capacity is available for new work?

Complete the capacity worksheet for your current situation.


Step 4: Design Portfolio Dashboard

Customize the dashboard for your needs:

  • What metrics matter most?
  • What is the right level of detail?
  • Who is the primary audience?
  • How often is it updated?

Create your initial dashboard with current data.


Step 5: Establish Review Cadence

Define your portfolio review process:

  • How often are reviews held?
  • Who participates?
  • What is the standard agenda?
  • How are decisions documented?

Document your review process and schedule the first review.


Quality Checklist: Portfolio Management System

Before finalizing, verify:

Intake:

  • Form captures necessary information
  • Submission process is clear
  • Review and disposition process is defined
  • Rejected opportunities receive feedback

Prioritization:

  • Criteria reflect organizational priorities
  • Weights are calibrated to produce sensible results
  • Scoring process is objective and consistent
  • Results are actionable

Capacity:

  • All relevant resources are tracked
  • Capacity calculations are realistic
  • Committed demand is accurately captured
  • Model supports decision-making

Dashboard:

  • Metrics provide meaningful visibility
  • Status is easy to understand
  • Issues are highlighted
  • Updates are timely

Review:

  • Cadence matches portfolio dynamics
  • Right people participate
  • Agenda covers essential topics
  • Decisions are made and documented

Worked Example: Thornton Manufacturing Portfolio Management

Diana implemented portfolio management for Thornton:

Intake Customization:

  • Added "Safety Impact Assessment" (manufacturing requirement)
  • Added "Union Considerations" (labor relations factor)
  • Removed "Technical Complexity" (Diana assessed this separately)
  • Form submitted to Diana directly; reviewed within 5 business days

Prioritization Calibration:

  • Increased weight on "Organizational Readiness" to 0.15 (change management was harder than expected)
  • Reduced weight on "Time to Value" to 0.05 (manufacturing initiatives inherently longer)
  • Added "Multi-Plant Applicability" criterion (0.05 weight) for scalability

Scorecard Test:

Diana scored four known opportunities:

  • Greenville Quality Documentation: 4.2 (validated—this was the successful R-01)
  • Oak Ridge Inventory: 3.8 (matched assessment—medium priority)
  • Riverside Maintenance Scheduling: 2.6 (validated decline decision)
  • Lakeside Customer Service: 3.5 (matched assessment—proceed with support)

Results matched intuition; no calibration changes needed.

Capacity Model:

Resource | Headcount | % Available | Monthly Capacity
Diana (CoE Lead) | 1 | 50% | 0.5
Certified Champions | 0 | N/A | 0
Developing Champions | 4 | 30% | 1.2
Total | | | 1.7

Note: Low capacity drove decision to use hub-and-spoke model with Diana as central support.

Dashboard Design:

  • Simplified to single page (4 plants, manageable portfolio)
  • Added plant-by-plant breakout
  • Monthly update (aligned with governance board)
  • Shared via manufacturing operations SharePoint

Review Cadence:

  • Monthly: Governance board review (combined with general governance)
  • Quarterly: Deep portfolio assessment (separate 2-hour session)
  • Annually: Strategic planning (aligned with manufacturing budget cycle)

Output: Portfolio Management System

Complete this exercise by producing:

  1. Customized Intake Form: Ready for opportunity submission
  2. Calibrated Prioritization Scorecard: With test results validating calibration
  3. Capacity Model: With current state populated
  4. Portfolio Dashboard: With current portfolio status
  5. Review Process Documentation: Cadence, participants, agenda

This portfolio management system becomes a chapter in your Orchestration Playbook.


Portfolio management approaches draw on project portfolio management standards (PMI, AXELOS) and portfolio optimization theory. The prioritization scorecard methodology reflects multi-criteria decision analysis best practices.


Module 7B: LEAD — Practice

O — Operate

Culture Assessment: Organizational Readiness Diagnostic


Culture determines whether structures and processes translate into sustained capability. This exercise provides tools to assess organizational culture, identify barriers, and develop strategies for cultural alignment.


The Culture Assessment Framework

Organizational culture is assessed across four dimensions:

Dimension | Question | Impact on Methodology
Evidence Orientation | How does the organization use data in decisions? | Affects business case credibility, measurement acceptance
Learning Orientation | How does the organization respond to mistakes and new information? | Affects pilot learning, methodology evolution
Change Tolerance | How does the organization respond to process change? | Affects implementation adoption, stakeholder cooperation
Execution Discipline | How consistently does the organization follow through? | Affects sustainability, governance adherence

Assessment Tool: Culture Diagnostic Survey

Administer this survey to stakeholders, practitioners, and leaders to assess cultural readiness:


Culture Diagnostic Survey

Instructions: Rate your organization on each statement using the scale below. Be honest—the assessment is only useful if it reflects reality.

Scale:

  • 5 = Strongly Agree
  • 4 = Agree
  • 3 = Neutral
  • 2 = Disagree
  • 1 = Strongly Disagree

Section A: Evidence Orientation

# | Statement | Rating
A1 | Leaders regularly ask for data to support recommendations
A2 | Disagreements are resolved by examining evidence, not by hierarchy
A3 | Forecasts and projections are tracked against actual results
A4 | People are comfortable presenting data that contradicts leadership assumptions
A5 | Measurement is seen as helpful for improvement, not threatening
A6 | Business cases are scrutinized rather than rubber-stamped

Section A Total: ___ / 30


Section B: Learning Orientation

# | Statement | Rating
B1 | After-action reviews are standard practice for projects
B2 | Failures are examined for learning, not just blame assignment
B3 | Lessons learned from one project influence future projects
B4 | People share knowledge across departmental boundaries
B5 | The organization changes practices when evidence suggests improvement
B6 | People feel safe admitting they don't know something

Section B Total: ___ / 30


Section C: Change Tolerance

# | Statement | Rating
C1 | People are generally open to new ways of doing their work
C2 | Process changes are implemented smoothly when well-communicated
C3 | Resistance to change is addressed constructively, not ignored
C4 | Early adopters are recognized and their input valued
C5 | Change initiatives are given adequate time and resources
C6 | The organization doesn't suffer from "initiative fatigue"

Section C Total: ___ / 30


Section D: Execution Discipline

# | Statement | Rating
D1 | Projects typically finish what they start
D2 | Accountability for outcomes is clear and maintained
D3 | Deadlines are taken seriously and usually met
D4 | Follow-through on commitments is consistently expected
D5 | Governance processes are followed, not circumvented
D6 | Resources allocated to initiatives are protected from reallocation

Section D Total: ___ / 30


Overall Score: ___ / 120


Scoring Interpretation

Section Scores (out of 30)

Score Range | Interpretation | Implication
25-30 | Strong | This dimension supports methodology adoption
18-24 | Adequate | Some work needed but foundation exists
12-17 | Weak | Significant barriers; requires active mitigation
Below 12 | Critical | Major cultural change needed before scaling

Overall Score (out of 120)

Score Range | Interpretation | Recommendation
96-120 | Highly Ready | Proceed with full confidence
72-95 | Ready with Attention | Proceed but address weak dimensions
48-71 | Challenged | Proceed cautiously; invest in culture work
Below 48 | Not Ready | Address cultural foundations before scaling
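Scoring is mechanical once responses are in: sum the six ratings in each section, then sum the four section totals for the overall score and read it against the bands above. A minimal sketch with made-up responses:

# Each section holds six ratings on the 1-5 agreement scale (illustrative data)
responses = {
    "Evidence Orientation":  [4, 4, 4, 4, 4, 3],
    "Learning Orientation":  [3, 3, 3, 3, 3, 4],
    "Change Tolerance":      [4, 4, 3, 3, 4, 4],
    "Execution Discipline":  [4, 4, 5, 5, 4, 4],
}

section_scores = {name: sum(ratings) for name, ratings in responses.items()}   # each out of 30
overall = sum(section_scores.values())                                         # out of 120

if overall >= 96:
    readiness = "Highly Ready"
elif overall >= 72:
    readiness = "Ready with Attention"
elif overall >= 48:
    readiness = "Challenged"
else:
    readiness = "Not Ready"

print(section_scores, overall, readiness)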

Assessment Tool: Cultural Barrier Interview

Complement the survey with targeted interviews to understand barriers in depth:


Cultural Barrier Interview Guide

Introduction: "I'm exploring how our organizational culture might affect our ability to scale the methodology. I'd like your honest perspective on how things really work here."

Duration: 30-45 minutes

Questions:


Evidence Orientation

  1. "When someone presents a recommendation with data, how does leadership typically respond?"

    • Probe: Do they engage with the data? Challenge assumptions? Accept uncritically?
  2. "Can you think of a time when data changed someone's mind about a decision they were inclined to make?"

    • Probe: Was this typical or exceptional?
  3. "What happens when measurement reveals that something isn't working as expected?"

    • Probe: Is this seen as helpful information or threatening criticism?

Learning Orientation

  1. "After a project ends, how thoroughly does the organization examine what worked and what didn't?"

    • Probe: Are lessons documented? Applied to future work?
  2. "What happens when someone makes a mistake here?"

    • Probe: Is the focus on blame or learning?
  3. "How easily does knowledge flow between different departments?"

    • Probe: What enables or blocks this flow?

Change Tolerance

  1. "When leadership announces a new initiative, what's the typical reaction?"

    • Probe: Enthusiasm? Skepticism? Cynicism?
  2. "Can you tell me about a recent change that went smoothly? What made it work?"

    • Probe: What factors contributed to success?
  3. "Can you tell me about a recent change that struggled? What got in the way?"

    • Probe: What barriers emerged?

Execution Discipline

  1. "When a project or initiative starts, how often does it actually complete as planned?"

    • Probe: What causes initiatives to stall or fail?
  2. "How clear is accountability for outcomes in this organization?"

    • Probe: Do people know who's responsible? Are they held accountable?
  3. "What happens when governance processes are inconvenient? Are they followed anyway?"

    • Probe: Is governance respected or circumvented?

Summary Questions

  1. "If we wanted to build a sustained capability for AI-augmented process improvement, what would be our biggest cultural challenges?"

  2. "What has made similar initiatives succeed or fail here in the past?"

  3. "Who are the cultural champions—people who embody the behaviors that would make this work?"


Assessment Tool: Cultural Pattern Analysis

Synthesize survey and interview data to identify patterns:


Cultural Pattern Analysis Template

Assessment Date: _______________ Data Sources: _______________


Section A: Dimension Summary

Dimension | Survey Score | Interview Themes | Overall Assessment
Evidence Orientation | /30 | | Strong/Adequate/Weak/Critical
Learning Orientation | /30 | | Strong/Adequate/Weak/Critical
Change Tolerance | /30 | | Strong/Adequate/Weak/Critical
Execution Discipline | /30 | | Strong/Adequate/Weak/Critical

Section B: Key Barriers Identified

Barrier | Dimension | Evidence | Impact on Methodology

Section C: Cultural Enablers Identified

Enabler | Dimension | Evidence | How to Leverage

Section D: Cultural Champions

Person | Role | Cultural Strength | How to Engage

Section E: Risk Assessment

Risk | Likelihood | Impact | Mitigation Strategy
 | High/Med/Low | High/Med/Low |
 | High/Med/Low | High/Med/Low |
 | High/Med/Low | High/Med/Low |

Mitigation Strategies

For each weak dimension, develop targeted mitigation:


Mitigating Weak Evidence Orientation

Strategy | Actions
Lead with data | Ensure every presentation starts with evidence, not opinion
Celebrate data-driven decisions | Recognize when leaders change their minds based on data
Make measurement safe | Emphasize that measurement is for learning, not punishment
Build data literacy | Train stakeholders on how to interpret metrics
Model transparency | Share methodology results openly, including disappointments

Mitigating Weak Learning Orientation

Strategy | Actions
Institute after-action reviews | Make them mandatory, with documented lessons
Separate failure from blame | Focus post-mortems on system factors, not individuals
Create learning forums | Regular sessions where practitioners share experiences
Document and distribute | Ensure lessons learned reach people who can apply them
Reward learning behavior | Recognize people who identify improvement opportunities

Mitigating Weak Change Tolerance

Strategy | Actions
Start with believers | Engage early adopters; let skeptics see success
Over-communicate | Explain the why, not just the what
Address concerns directly | Don't dismiss resistance; engage with it
Pace appropriately | Don't rush; give people time to adapt
Demonstrate respect | Acknowledge current expertise while introducing change

Mitigating Weak Execution Discipline

Strategy | Actions
Make commitments explicit | Document what's agreed; track against it
Create visible accountability | Clear owners for every initiative and deliverable
Celebrate completion | Recognize finishing, not just starting
Protect initiative resources | Shield allocated time from competing demands
Address non-compliance | Don't ignore when governance is bypassed

Exercise: Assess Your Organizational Culture

Step 1: Administer Survey

  • Identify 10-20 respondents across relevant stakeholder groups
  • Include practitioners, sponsors, business unit leaders, executives
  • Ensure anonymity to encourage honest responses
  • Set 1-week deadline for responses

Step 2: Conduct Interviews

  • Select 5-8 interviewees representing diverse perspectives
  • Include enthusiasts and skeptics
  • Use interview guide; allow conversation to flow naturally
  • Take detailed notes; look for patterns

Step 3: Analyze Data

  • Calculate survey scores by dimension
  • Identify themes from interviews
  • Complete Cultural Pattern Analysis template
  • Note discrepancies between survey and interview findings

Step 4: Identify Priorities

Based on analysis:

  • Which dimensions require immediate attention?
  • Which barriers are most significant?
  • Which enablers can be leveraged?
  • Who are the cultural champions?

Step 5: Develop Mitigation Plan

For each significant barrier:

  • What specific actions will address it?
  • Who is responsible for each action?
  • What is the timeline?
  • How will progress be measured?

Document your mitigation plan.


Quality Checklist: Culture Assessment

Before finalizing assessment, verify:

Data Quality:

  • Survey had adequate response rate (>60%)
  • Respondents represented diverse stakeholder groups
  • Interviews provided depth beyond survey scores
  • Anonymity was maintained for honest responses

Analysis Quality:

  • All four dimensions were assessed
  • Both barriers and enablers were identified
  • Cultural champions were identified
  • Risks were assessed realistically

Mitigation Quality:

  • Strategies address actual barriers (not generic)
  • Actions are specific and actionable
  • Owners are identified
  • Progress measures are defined

Worked Example: Thornton Manufacturing Culture Assessment

Diana conducted cultural assessment at Thornton:

Survey Results:

Dimension | Score | Assessment
Evidence Orientation | 24/30 | Adequate
Learning Orientation | 18/30 | Weak
Change Tolerance | 21/30 | Adequate
Execution Discipline | 26/30 | Strong

Overall: 89/120 (Ready with Attention)

Interview Themes:

  • Evidence: Strong manufacturing culture of measurement; data respected
  • Learning: After-action reviews happen but lessons don't transfer between plants
  • Change: "Initiative fatigue" mentioned by 3 of 6 interviewees; past programs faded
  • Execution: Manufacturing discipline carries over; people finish what they start

Key Barriers:

  1. Cross-plant learning gap: Each plant operates independently; lessons don't travel
  2. Initiative cynicism: Previous programs launched with fanfare, then abandoned
  3. Patricia problem: Critical expertise concentrated in individuals with no backup

Key Enablers:

  1. Measurement culture: Manufacturing understands metrics; business cases credible
  2. Execution discipline: Plant managers follow through on commitments
  3. James's commitment: VP Operations is genuine champion, not just sponsor

Cultural Champions:

  • James Mitchell (VP Operations): Executive visibility, genuine belief
  • Plant Manager at Oak Ridge: Early adopter, well-respected by peers
  • Quality Director: Sees methodology as natural extension of quality culture

Mitigation Plan:

Barrier | Strategy | Actions | Owner | Timeline
Cross-plant learning | Create practitioner community | Monthly all-hands call; shared Slack channel | Diana | Month 1
Initiative cynicism | Demonstrate sustainability | 6-month commitment before "declaring victory" | James | Ongoing
Patricia problem | Knowledge documentation | Require backup for all critical expertise | Diana | Month 3

Output: Culture Assessment Package

Complete this exercise by producing:

  1. Survey Results Summary: Scores by dimension with interpretation
  2. Interview Summary: Key themes and illustrative quotes
  3. Cultural Pattern Analysis: Completed template
  4. Barrier Mitigation Plan: Specific strategies with owners and timelines
  5. Ongoing Monitoring Plan: How cultural health will be tracked

This culture assessment becomes a foundation document for the Orchestration Playbook and informs implementation strategy.


Cultural assessment approaches draw on organizational culture research, particularly Schein's culture model and the competing values framework. The diagnostic tools reflect validated approaches to organizational assessment.


Module 7B: LEAD — Practice

O — Operate

Playbook Assembly: Complete Orchestration Playbook


This exercise integrates all previous Module 7 work into a complete Orchestration Playbook. The playbook is the capstone deliverable of Module 7—a comprehensive document that enables others to execute the A.C.O.R.N. methodology independently.


Assembly Process

Playbook assembly follows five steps:

  1. Compile components from previous exercises
  2. Develop remaining content (phase guides, examples)
  3. Integrate into unified document
  4. Quality review and refinement
  5. Validation with test users

Step 1: Compile Components

Gather outputs from Module 7 exercises:

Component | Source | Status
Playbook Architecture | Exercise 7.8 | ☐ Complete
Teaching System | Exercise 7.9 | ☐ Complete
Governance Framework | Exercise 7.10 | ☐ Complete
Portfolio Management System | Exercise 7.11 | ☐ Complete
Culture Assessment | Exercise 7.12 | ☐ Complete

Gather outputs from Modules 2-6 (R-01 deliverables):

Component | Source | Status
Friction Inventory | Module 2 | ☐ Refined for playbook
Opportunity Assessment | Module 2 | ☐ Refined for playbook
Baseline Metrics | Module 3 | ☐ Refined for playbook
Business Case | Module 3 | ☐ Refined for playbook
Workflow Blueprint | Module 4 | ☐ Refined for playbook
Pilot Plan | Module 5 | ☐ Refined for playbook
Results Report | Module 5 | ☐ Refined for playbook
Sustainability Plan | Module 6 | ☐ Refined for playbook

Step 2: Develop Remaining Content

The following content must be developed for the playbook:


Front Matter

Section | Content Requirements | Word Count
Executive Summary | Purpose, audience, methodology overview, how to use | 500-750
Methodology Overview | A.C.O.R.N. explanation, principles, value proposition | 1,000-1,500
How to Use This Playbook | Navigation, conventions, when to reference | 300-500
Glossary | Key terms and definitions | As needed

Phase Guides

Each phase guide follows a consistent structure:

Phase Guide Template

[PHASE NAME] PHASE GUIDE

1. Phase Overview
   - Purpose: Why this phase exists
   - Inputs: What you need before starting
   - Outputs: What you'll produce
   - Duration: Typical time required
   - Key Success Factors: What makes this phase succeed

2. Step-by-Step Process
   - Step 1: [Action]
     - Purpose: Why this step matters
     - Instructions: Specific guidance
     - Tips: Practical advice
   - Step 2: [Action]
     ...

3. Templates
   - [Template 1] with instructions
   - [Template 2] with instructions

4. Quality Checklist
   - [ ] Checkpoint 1
   - [ ] Checkpoint 2
   ...

5. Common Mistakes
   - Mistake 1: Description and prevention
   - Mistake 2: Description and prevention

6. Decision Trees
   - Decision 1: [Situation] → [Options and criteria]
   - Decision 2: [Situation] → [Options and criteria]

7. Worked Example
   - R-01 illustration of this phase
   - Key decisions and rationale

Phase Guide Development Checklist:

Phase | Draft Complete | Templates Included | Example Included | Quality Checked
Assess
Calculate
Orchestrate
Realize
Nurture

Case Examples

Develop 2-3 complete case examples:

Case | Purpose | Content
R-01 Primary Case | Full methodology illustration | Complete walkthrough of your first implementation
Alternative Domain Case | Demonstrate breadth | Case from different organizational function
Failure Case | Show what goes wrong | Composite example of common failure patterns

Case Structure:

CASE: [Name]

Background
- Organization context
- Problem or opportunity
- Initial situation

The A.C.O.R.N. Journey
- Assess: What was discovered
- Calculate: How value was quantified
- Orchestrate: How the solution was designed
- Realize: How the pilot was executed
- Nurture: How sustainability was established

Results
- Value delivered
- Lessons learned
- What would be done differently

Key Takeaways
- Principles illustrated
- Decisions that mattered

Step 3: Integrate into Unified Document

Assemble all components into the playbook structure:


Playbook Assembly Template

ORCHESTRATION PLAYBOOK
[Organization Name]
Version 1.0 | [Date]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

FRONT MATTER

Executive Summary...................................... [page]
Methodology Overview................................... [page]
How to Use This Playbook............................... [page]
Glossary............................................... [page]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

PART I: METHODOLOGY EXECUTION

Chapter 1: Assess Phase................................ [page]
  1.1 Phase Overview
  1.2 Step-by-Step Process
  1.3 Templates
  1.4 Quality Checklist
  1.5 Common Mistakes
  1.6 Decision Trees
  1.7 Worked Example

Chapter 2: Calculate Phase............................. [page]
  [Same structure]

Chapter 3: Orchestrate Phase........................... [page]
  [Same structure]

Chapter 4: Realize Phase............................... [page]
  [Same structure]

Chapter 5: Nurture Phase............................... [page]
  [Same structure]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

PART II: ORGANIZATIONAL INFRASTRUCTURE

Chapter 6: Governance Framework........................ [page]
  6.1 Roles and Responsibilities
  6.2 Decision Rights Matrix
  6.3 Escalation Framework
  6.4 Governance Calendar
  6.5 Meeting Templates

Chapter 7: Portfolio Management........................ [page]
  7.1 Opportunity Intake
  7.2 Prioritization Scorecard
  7.3 Capacity Planning
  7.4 Portfolio Dashboard
  7.5 Review Process

Chapter 8: Teaching and Development.................... [page]
  8.1 Learning Path
  8.2 Foundation Curriculum
  8.3 Assessment Instruments
  8.4 Mentorship Guidelines
  8.5 Certification Process

Chapter 9: Culture and Sustainability.................. [page]
  9.1 Culture Assessment Framework
  9.2 Barrier Mitigation Strategies
  9.3 Ongoing Culture Monitoring

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

PART III: REFERENCE MATERIALS

Appendix A: Complete Template Library.................. [page]
  - Friction Inventory Template
  - Opportunity Assessment Template
  - Baseline Metrics Template
  - Business Case Template
  - Workflow Blueprint Template
  - Pilot Plan Template
  - Results Report Template
  - Sustainability Plan Template
  - All governance and portfolio templates

Appendix B: Case Examples.............................. [page]
  - R-01 Complete Case
  - Alternative Domain Case
  - Failure Case Analysis

Appendix C: Decision Tree Library...................... [page]
  - Pursue vs. Defer Decision
  - Scope Selection Decision
  - Escalation Decision
  - Iteration vs. Rebuild Decision

Appendix D: Quality Checklists......................... [page]
  - Assess Phase Checklist
  - Calculate Phase Checklist
  - Orchestrate Phase Checklist
  - Realize Phase Checklist
  - Nurture Phase Checklist
  - Playbook Quality Checklist

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

DOCUMENT CONTROL

Version History
Change Log
Review Schedule
Document Owner
Contact Information

Step 4: Quality Review

Review the assembled playbook against quality criteria:


Playbook Quality Checklist

Completeness

  • All A.C.O.R.N. phases are covered with equal depth
  • All templates are included and functional
  • Governance framework is complete
  • Teaching system is fully specified
  • Portfolio management tools are ready for use
  • Culture assessment is actionable

Usability

  • A new practitioner can follow the playbook to execute an initiative
  • Navigation is clear; users can find what they need
  • Instructions are unambiguous
  • Templates are self-explanatory
  • Examples illustrate key concepts effectively

Accuracy

  • Methodology description matches actual practice
  • Templates reflect what's actually used
  • Examples are realistic and instructive
  • Decision trees guide to correct outcomes

Consistency

  • Terminology is used consistently throughout
  • Formatting is consistent across sections
  • Quality standards are consistent across phases
  • Cross-references are accurate

Maintainability

  • Modular structure allows independent section updates
  • Version control system is established
  • Review schedule is defined
  • Ownership is clear

Peer Review Process

Before finalizing, conduct peer review:

  1. Select reviewers: 2-3 people who will use the playbook
  2. Provide review criteria: Share quality checklist above
  3. Allow adequate time: 1-2 weeks for thorough review
  4. Collect feedback: Written comments on specific sections
  5. Revise based on feedback: Address all substantive issues
  6. Confirm with reviewers: Verify revisions are adequate

Step 5: Validation with Test Users

Validate the playbook works in practice:


Validation Approach

Validation Method | Purpose | Participants
Walkthrough | Test navigation and clarity | New practitioner candidate
Template Test | Verify templates work | Experienced practitioner
Teaching Test | Validate curriculum | Foundation learner
Decision Tree Test | Verify guidance quality | Practitioner with real decision

Walkthrough Protocol

Ask a test user unfamiliar with the methodology to:

  1. Find how to start an initiative (tests navigation)
  2. Explain what the Assess phase produces (tests clarity)
  3. Complete a friction inventory template (tests template usability)
  4. Determine when to escalate a stakeholder conflict (tests decision guidance)
  5. Describe what quality looks like for a business case (tests quality criteria)

Capture:

  • Where did they get stuck?
  • What was confusing?
  • What questions arose that weren't answered?
  • How long did each task take?

Template Test Protocol

Ask an experienced practitioner to:

  1. Complete each template using a real or realistic scenario
  2. Note any fields that are unclear
  3. Identify missing fields that would be useful
  4. Rate template usability (1-5 scale)
  5. Suggest improvements

Revision Based on Validation

Finding | Source | Revision Made

Exercise: Assemble Your Playbook

Step 1: Inventory and Gap Analysis

Complete the component compilation checklist above. Identify:

  • What components are complete?
  • What components need development?
  • What is the estimated effort for remaining work?

Step 2: Develop Phase Guides

For each A.C.O.R.N. phase:

  1. Draft the phase guide using the template
  2. Incorporate R-01 deliverables as templates and examples
  3. Apply the quality checklist
  4. Refine based on gaps identified

Estimated effort: 4-8 hours per phase


Step 3: Develop Front Matter

Write:

  • Executive Summary (500-750 words)
  • Methodology Overview (1,000-1,500 words)
  • How to Use This Playbook (300-500 words)
  • Glossary (compile key terms)

Estimated effort: 4-6 hours


Step 4: Develop Case Examples

Create:

  • R-01 Complete Case (primary example)
  • Alternative Domain Case (or adapt from course materials)
  • Failure Case (composite from common patterns)

Estimated effort: 6-10 hours


Step 5: Assemble Document

Integrate all components into unified playbook:

  • Apply consistent formatting
  • Create table of contents
  • Add page numbers
  • Verify cross-references

Estimated effort: 4-6 hours


Step 6: Quality Review

  • Complete quality checklist self-assessment
  • Conduct peer review (1-2 weeks)
  • Revise based on feedback


Step 7: Validation

Conduct validation tests:

  • Walkthrough with new user
  • Template test with experienced practitioner
  • Revise based on findings

Worked Example: Thornton Manufacturing Playbook Summary

Diana assembled Thornton's playbook over six weeks:

Scope:

  • 95 pages total
  • 5 phase guides (~12 pages each)
  • Governance and infrastructure chapters (~25 pages)
  • Appendices with templates and examples (~20 pages)

Key Customizations:

  • Added manufacturing-specific safety and compliance sections
  • Integrated with existing Lean/Six Sigma terminology
  • Simplified governance (smaller organization)
  • Added union consideration guidance

Review Process:

  • Peer review by 2 plant champions
  • Executive review by James Mitchell
  • Validation walkthrough with new operations analyst

Major Revisions After Review:

  • Simplified Calculate phase guide (was too theoretical)
  • Added more decision tree options for stakeholder resistance
  • Clarified escalation paths (original was ambiguous)
  • Added quick-reference card for each phase

Final Deliverable:

  • Version 1.0 approved month 8 of 18-month initiative
  • Distributed via SharePoint with controlled access
  • First external use by Oak Ridge champion

Output: Complete Orchestration Playbook

Complete this exercise by producing:

  1. Front Matter (Executive Summary, Methodology Overview, How to Use, Glossary)
  2. Part I: Phase Guides (Assess, Calculate, Orchestrate, Realize, Nurture)
  3. Part II: Infrastructure (Governance, Portfolio, Teaching, Culture)
  4. Part III: Reference (Templates, Cases, Decision Trees, Checklists)
  5. Document Control (Version, ownership, review schedule)

The complete playbook is the capstone deliverable of Module 7, demonstrating readiness to lead organizational transformation.


Playbook Completion Criteria

The playbook is complete when:

  • All components are present and integrated
  • Quality checklist passes self-assessment
  • Peer review feedback has been addressed
  • Validation tests confirm usability
  • Document control is established
  • Organizational approval obtained (if required)

The Orchestration Playbook is a living document. Version 1.0 represents initial capability; subsequent versions will incorporate learning from use. Plan for at least annual review and update.


Module 7B: LEAD — Practice

T — Test

Measuring Leadership Effectiveness


Module 7's anchor principle—"The measure of mastery is whether others can do it without you"—requires specific metrics. This section defines how to measure leadership effectiveness in building organizational capability.


The Leadership Measurement Framework

Leadership success is measured across three dimensions:

Dimension | Question | Timeframe
Capability Building | Are others developing the ability to execute? | 6-18 months
Portfolio Performance | Is the initiative portfolio delivering value? | Ongoing
Organizational Sustainability | Will the capability persist beyond the leader? | 12-36 months

Dimension 1: Capability Building Metrics

These metrics track whether practitioners are developing competence:


Practitioner Pipeline

Metric | Definition | Target | Measurement
Pipeline depth | Number of practitioners at each development level | Adequate for portfolio demand | Quarterly census
Advancement rate | % of practitioners advancing to next level within expected timeframe | >70% | Per cohort
Certification rate | % of foundation entrants achieving certification | >60% | Per cohort
Time to certification | Average months from foundation start to certification | ≤12 months | Per cohort

Calculation Examples:

Pipeline Depth Assessment:
- Level 0 (Awareness): 25 stakeholders trained
- Level 1 (Foundation): 4 practitioners in training
- Level 2 (Supervised): 3 practitioners executing with support
- Level 3 (Supported): 2 practitioners with reduced supervision
- Level 4 (Certified): 2 practitioners independent

Assessment: Pipeline adequate for current portfolio of 4 active initiatives

Practitioner Performance

Metric | Definition | Target | Measurement
First initiative success rate | % of supervised initiatives meeting value projections | >60% | Per practitioner
Quality scores | Average quality checklist scores across deliverables | >80% | Per initiative
Stakeholder satisfaction | Stakeholder ratings of practitioner collaboration | >4.0/5 | Post-initiative survey
Independence trajectory | Reduction in mentor intervention over time | Decreasing trend | Monthly tracking

Knowledge Transfer Effectiveness

Metric | Definition | Target | Measurement
Playbook utilization | % of practitioners actively using playbook | >80% | Quarterly survey
Decision accuracy | % of escalations that reflect appropriate judgment | >75% | Escalation review
Teaching effectiveness | Trainee assessment scores | >80% average | Per cohort
Knowledge retention | Practitioner performance after mentor reduction | Sustained quality | 6-month follow-up

Dimension 2: Portfolio Performance Metrics

These metrics track whether the initiative portfolio is delivering value:


Value Delivery

Metric | Definition | Target | Measurement
Total annual value (operational) | Sum of value from operational initiatives | Growth YoY | Monthly calculation
Total annual value (in development) | Projected value from active initiatives | 1.5-2x operational | Monthly calculation
Value realization rate | Actual value vs. projected value | >80% | Per initiative
Time to value | Average months from initiative start to value delivery | Decreasing trend | Per initiative

Calculation Example:

Portfolio Value Assessment (Month 12):

Operational Initiatives:
- Greenville Quality Documentation: $340,000/year (validated)
- Oak Ridge Inventory Optimization: $165,000/year (validated)
Total Operational: $505,000/year

In Development:
- Riverside Procurement: $220,000 projected
- Lakeside Customer Service: $95,000 projected
Total In Development: $315,000 projected

Value Realization Rate (YTD):
- Greenville: 100% (exceeded projection)
- Oak Ridge: 95% (slightly below)
Average: 97.5%
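Value realization rate is actual value divided by projected value, averaged across initiatives. The sketch below uses illustrative figures, not the Thornton numbers; the min() cap mirrors the convention in the example above, where the over-delivering initiative is recorded at 100% rather than above it.

initiatives = {
    # name: (projected annual value, validated actual value) -- illustrative figures
    "Initiative A": (300_000, 360_000),   # over-delivered
    "Initiative B": (200_000, 190_000),   # slightly under
}

rates = {
    name: min(actual / projected, 1.0)    # cap at 100% so over-delivery doesn't hide shortfalls
    for name, (projected, actual) in initiatives.items()
}

portfolio_rate = sum(rates.values()) / len(rates)   # 0.975 for these figures
print({name: f"{rate:.0%}" for name, rate in rates.items()}, f"average {portfolio_rate:.1%}")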

Portfolio Health

Metric | Definition | Target | Measurement
Initiative success rate | % of initiatives meeting or exceeding projections | >70% | Per initiative
Pipeline depth | Months of work in approved pipeline | 12-18 months | Quarterly
Portfolio diversity | Distribution across domains, sizes, complexity | Balanced | Quarterly review
Cycle time | Average initiative duration by type | Within targets | Per initiative

Resource Efficiency

Metric | Definition | Target | Measurement
Practitioner utilization | % of practitioner capacity engaged in initiatives | 70-85% | Monthly
Cost per dollar of value | CoE cost / annual value delivered | Decreasing | Annual
Enhancement ratio | Enhancements vs. new implementations | Increasing over time | Annual
Maintenance overhead | % of capacity on operational maintenance | <25% | Monthly

Dimension 3: Organizational Sustainability Metrics

These metrics track whether the capability will persist:


Structural Sustainability

Metric | Definition | Target | Measurement
Governance adherence | % of decisions made through governance | >90% | Governance review
Process documentation currency | % of playbook current within 6 months | >90% | Quarterly audit
Succession readiness | Roles with identified successors | 100% of critical roles | Annual review
Budget stability | Year-over-year budget variance | <10% variance | Annual

Cultural Sustainability

Metric | Definition | Target | Measurement
Methodology reputation | Stakeholder perception of methodology value | Positive trend | Annual survey
Demand growth | Organic opportunity submissions | Growing | Monthly
Retention rate | Practitioner retention | >80% | Annual
Cultural assessment scores | Scores on culture diagnostic dimensions | Improving | Annual

Leader Independence

Metric | Definition | Target | Measurement
Decision independence | % of decisions made without CoE leader | Increasing | Monthly
Escalation rate | Escalations to CoE leader per initiative | Decreasing | Monthly
Leader absence resilience | Performance during leader absence | No degradation | Test periods
Knowledge distribution | Single points of expertise | Decreasing | Annual audit

Leading vs. Lagging Indicators

Leading Indicators (predict future success):

Indicator | What It Predicts | Action Trigger
Practitioner satisfaction | Retention, performance | Score <3.5/5
Stakeholder engagement | Initiative adoption | Engagement declining
Pipeline quality | Future value delivery | Pipeline thinning
Training completion rates | Future capability | Rates declining
Governance participation | Sustainability | Attendance dropping

Lagging Indicators (confirm past performance):

Indicator | What It Confirms | Review Frequency
Annual value delivered | Portfolio effectiveness | Annual
Certification completions | Capability building success | Semi-annual
Initiative success rate | Methodology effectiveness | Quarterly
Cost per value dollar | Resource efficiency | Annual

Dashboard Template: Leadership Effectiveness

Leadership Effectiveness Dashboard

Period: _______________


Capability Building

Metric | Current | Target | Trend | Status
Certified practitioners | | | ↑↓→ | 🟢🟡🔴
Pipeline depth | | | ↑↓→ | 🟢🟡🔴
First initiative success rate | | >60% | ↑↓→ | 🟢🟡🔴
Time to certification | | ≤12 mo | ↑↓→ | 🟢🟡🔴

Portfolio Performance

Metric | Current | Target | Trend | Status
Annual value (operational) | $ | $ | ↑↓→ | 🟢🟡🔴
Value realization rate | | >80% | ↑↓→ | 🟢🟡🔴
Initiative success rate | | >70% | ↑↓→ | 🟢🟡🔴
Practitioner utilization | | 70-85% | ↑↓→ | 🟢🟡🔴

Organizational Sustainability

Metric | Current | Target | Trend | Status
Governance adherence | | >90% | ↑↓→ | 🟢🟡🔴
Decision independence | | Increasing | ↑↓→ | 🟢🟡🔴
Cultural assessment score | /120 | Improving | ↑↓→ | 🟢🟡🔴
Succession readiness | | 100% | ↑↓→ | 🟢🟡🔴

Key Issues

Issue | Dimension | Action Required
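The status column in the dashboard can be derived from current-versus-target rather than assigned by hand, at least for metrics with a simple "at or above target" goal. The following is a hedged sketch: the 90% amber band and the sample values are assumptions, and range or trend targets (such as 70-85% utilization or "Increasing") would need their own rules.

def status(current: float, target: float, amber_band: float = 0.9) -> str:
    """Green if at or above target, amber if within the band just below it, red otherwise."""
    if current >= target:
        return "green"
    if current >= target * amber_band:
        return "amber"
    return "red"

metrics = {
    # metric: (current, target) -- illustrative values only
    "Value realization rate": (0.97, 0.80),
    "Initiative success rate": (0.65, 0.70),
    "Governance adherence":   (0.75, 0.90),
}

for name, (current, target) in metrics.items():
    print(f"{name}: {status(current, target)}")
# Value realization rate: green, Initiative success rate: amber, Governance adherence: red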

Measurement Implementation

Frequency:

Metric Category | Collection Frequency | Review Frequency
Capability metrics | Monthly | Quarterly
Portfolio metrics | Monthly | Monthly
Sustainability metrics | Quarterly | Quarterly
Leading indicators | Monthly | Monthly
Lagging indicators | Per event | Quarterly

Responsibility:

Activity | Owner
Data collection | CoE Leader or designated analyst
Dashboard production | CoE Leader
Dashboard review | Governance Board
Corrective action | CoE Leader with executive sponsor support

Worked Example: Thornton Manufacturing Leadership Metrics (Month 18)

Capability Building:

Metric | Result | Assessment
Certified practitioners | 3 | Target: 4; slightly behind
Pipeline depth | 4 in development | Adequate for demand
First initiative success rate | 75% (3/4) | Exceeds target
Time to certification | 10 months average | Ahead of target

Portfolio Performance:

Metric | Result | Assessment
Annual value (operational) | $680,000 | Doubled from baseline
Value realization rate | 97% | Exceeds target
Initiative success rate | 80% | Exceeds target
Practitioner utilization | 78% | Within target range

Organizational Sustainability:

Metric | Result | Assessment
Governance adherence | 95% | Exceeds target
Decision independence | 65% (up from 40%) | Improving trend
Cultural assessment score | 94/120 (up from 89) | Improving
Succession readiness | 80% | Gap: need Diana's successor

Overall Assessment: Strong progress on capability building and portfolio performance. Primary risk is Diana as single point of leadership—succession planning is priority for year 2.


Quality Checklist: Leadership Measurement

Before finalizing measurement system, verify:

Completeness:

  • All three dimensions are covered
  • Both leading and lagging indicators are included
  • Metrics are actionable (can drive decisions)

Feasibility:

  • Data is collectible with reasonable effort
  • Collection frequency is sustainable
  • Responsibility is assigned

Validity:

  • Metrics measure what they claim to measure
  • Targets are realistic and meaningful
  • Trends are interpretable

Utility:

  • Dashboard is usable by intended audience
  • Issues trigger clear actions
  • Metrics connect to strategic objectives

Leadership effectiveness metrics draw on balanced scorecard methodology and organizational capability maturity models. The three-dimension framework reflects the integration of practitioner development, portfolio performance, and organizational sustainability.


Module 7B: LEAD — Practice

S — Share

Reflection Prompts and Peer Exercises


Module 7 addresses the transition from practitioner to leader—a shift that requires both skill development and mindset change. These exercises support consolidation of learning through individual reflection and peer engagement.


Individual Reflection Prompts

Complete these reflections as you work through Module 7:


Reflection 1: The Practitioner-Leader Transition

Complete after reading the Thornton Manufacturing case study

Consider your own journey:

  1. What aspects of practitioner work do you find most satisfying? (Direct execution, visible results, personal control, recognized expertise)

  2. Which of these would you need to surrender to become an effective leader?

  3. Diana experienced resistance to "letting go" of direct execution. Where might you experience similar resistance?

  4. The case describes Diana's shift from "what I produce" to "what others produce without me." How does this shift feel to you—exciting, threatening, both?

Write 200-300 words reflecting on your readiness for the practitioner-to-leader transition.


Reflection 2: Your Teaching Challenge

Complete after the teaching system exercise

Teaching requires making tacit knowledge explicit. Reflect on your own expertise:

  1. What do you know how to do that you've never had to explain to someone else?

  2. If you had to teach someone to do what you do, what would be the hardest part to transfer?

  3. Think of a decision you made during your R-01 implementation. Could you write a decision tree that would help someone else make a similar decision? What would be missing?

  4. What expertise do you have that you developed through experience that can't be captured in documentation?

Write 200-300 words about the specific challenges of codifying your own expertise.


Reflection 3: Governance and Control

Complete after the governance framework exercise

Governance replaces personal control with structural control. Consider:

  1. In your current organization, how are decisions about process improvement typically made? By hierarchy? By consensus? By whoever acts first?

  2. What would change if decision rights were explicit as Module 7 describes?

  3. Where might you resist having your decisions subject to governance? Where might you welcome it?

  4. Diana's governance gave her "final authority" on methodology interpretation. What would you want final authority over? What would you be willing to share?

Write 200-300 words exploring your relationship with formal governance structures.


Reflection 4: Cultural Reality

Complete after the culture assessment exercise

Apply the culture assessment framework to your own organization:

  1. Without administering the formal survey, estimate your organization's scores on the four dimensions (Evidence Orientation, Learning Orientation, Change Tolerance, Execution Discipline).

  2. What's the biggest cultural barrier to scaling the methodology in your organization?

  3. Who are the cultural champions—people who embody the behaviors that would make this succeed?

  4. If you were Diana at Thornton, what cultural mitigation would be your first priority?

Write 200-300 words assessing your organization's cultural readiness.


Reflection 5: The Playbook Test

Complete after playbook assembly

The playbook's purpose is enabling others to execute without you. Test this mentally:

  1. Imagine giving your playbook to a competent professional who has never met you. What questions would they have that the playbook doesn't answer?

  2. What decisions in your playbook still require "judgment" that you haven't fully codified?

  3. If you disappeared tomorrow, what would your playbook's users struggle with most?

  4. What would you add to Version 2.0 of your playbook that you couldn't include in Version 1.0?

Write 200-300 words honestly assessing your playbook's completeness.


Peer Learning Exercises

Complete these exercises with learning partners or cohort members:


Exercise 1: Leadership Philosophy Exchange

Pairs or groups of 3; 30 minutes

Setup: Each participant completes this statement in advance:

"The most important thing I've learned about leadership from Module 7 is _____________ because _____________."

Process:

  1. Share (5 min each): Each participant shares their statement and explains their reasoning.

  2. Probe (5 min each): Partners ask clarifying questions:

    • What experience led you to this conclusion?
    • How does this differ from what you believed before Module 7?
    • How will this change how you approach leading?
  3. Synthesize (5 min): Group identifies common themes and differences across perspectives.

Output: Each participant refines their leadership philosophy statement based on the discussion.


Exercise 2: Playbook Peer Review

Pairs; 60-90 minutes

Setup: Exchange playbooks with a peer at least one week before the exercise.

Review Focus:

Each reviewer evaluates the partner's playbook against these criteria:

Criterion | Rating (1-5) | Specific Feedback
Could a new practitioner follow this?
Are templates complete and usable?
Are decision trees helpful and accurate?
Is governance clear and appropriate?
Is the teaching system realistic?
Is anything critical missing?

Process:

  1. Written feedback (pre-work): Complete evaluation form for partner's playbook.

  2. Share feedback (15 min each): Discuss strengths, gaps, and suggestions.

  3. Clarifying discussion (15 min each): Reviewer explains ratings; author asks questions.

  4. Improvement planning (15 min each): Author identifies 3-5 specific improvements based on feedback.

Output: Each participant has specific improvement list and revised playbook plan.


Exercise 3: Culture Assessment Calibration

Groups of 3-4; 45 minutes

Setup: Each participant completes cultural assessment of their own organization before the exercise.

Process:

  1. Share assessments (5 min each): Each participant presents their organization's scores and key findings.

  2. Calibration discussion (15 min): Group discusses:

    • How did different participants interpret the assessment questions?
    • What standards did each apply for "strong" vs. "weak"?
    • Are some assessments more optimistic or pessimistic than warranted?
  3. Barrier comparison (10 min): Group identifies:

    • Which barriers appear across multiple organizations?
    • Which barriers are organization-specific?
    • What mitigation strategies worked for similar barriers elsewhere?
  4. Action exchange (10 min): Each participant shares one mitigation action they'll take; others offer suggestions.

Output: Calibrated cultural assessments and peer-informed mitigation plans.


Exercise 4: Teaching Simulation

Pairs; 45 minutes

Setup: Each participant selects one A.C.O.R.N. phase to "teach" to their partner.

Process:

  1. Teach (10 min): One participant teaches their selected phase to their partner, using only their playbook as reference. The "student" has not read the teacher's playbook coverage of this phase and asks genuine questions.

  2. Feedback (5 min): Student provides feedback:

    • What was clear?
    • What was confusing?
    • What questions weren't answered?
  3. Switch and repeat (15 min): Reverse roles with a different phase.

  4. Debrief (15 min): Both participants discuss:

    • What was harder to teach than expected?
    • Where did the playbook support teaching? Where did it fall short?
    • What would you change about your teaching approach?

Output: Insights on teaching effectiveness and playbook gaps.


Exercise 5: Succession Planning Discussion

Groups of 3-4; 30 minutes

Setup: Each participant identifies one critical role in their planned Center of Excellence (could be their own role).

Process:

  1. Present the role (3 min each): Describe the role and why it's critical.

  2. Identify succession risks (2 min each): What happens if this person leaves? What knowledge would be lost?

  3. Group problem-solving (15 min): For each role, the group brainstorms:

    • How could knowledge be distributed?
    • Who could be developed as a successor?
    • What documentation would help?
    • What governance changes would reduce dependency?

Output: Succession risk assessment and mitigation ideas for critical roles.


Discussion Questions

For group discussion in workshops or learning cohorts:


Discussion 1: The Hero Problem

Diana's first approach created dependency on her. Many organizational heroes resist building systems that would make them less essential.

  • What incentives maintain hero culture?
  • How can leaders build capability without undermining their own position?
  • What signals tell you someone is a "hero" versus a genuine capability builder?

Discussion 2: Governance Overhead

Clear governance prevents chaos, but governance also creates overhead. Some organizations have too little governance (chaos); others have too much (bureaucracy).

  • How do you know when governance is helping versus hindering?
  • What are warning signs that governance has become bureaucratic?
  • How should governance evolve as the CoE matures?

Discussion 3: Cultural Barriers

Module 7 argues that culture determines whether structures produce results.

  • Is culture change possible, or must leaders work within existing culture?
  • What's the appropriate scope for a CoE leader's cultural ambition?
  • When should a leader conclude that cultural barriers are insurmountable?

Discussion 4: Scaling Trade-offs

Scaling requires accepting variance. Diana's trainees didn't execute as well as Diana would have.

  • What quality variance is acceptable in exchange for scale?
  • How do you maintain standards while building capability?
  • When does the drive for consistency become counterproductive?

Discussion 5: The Measure of Mastery

The anchor principle states: "The measure of mastery is whether others can do it without you."

  • Do you agree with this definition of mastery?
  • What's lost when you shift from "doing" to "enabling others to do"?
  • How do you find satisfaction in leadership if your contribution becomes less visible?

Consolidation Exercise

Complete at the end of Module 7

The Leadership Letter

Write a letter to yourself, to be opened in one year, addressing:

  1. My commitment: What am I committing to do differently as a result of Module 7?

  2. My concern: What aspect of the practitioner-to-leader transition am I most worried about?

  3. My measure: How will I know if I've succeeded in building organizational capability?

  4. My support: Who will help me when leadership gets difficult?

  5. My reminder: What insight from Module 7 do I most need to remember?

Seal the letter (physically or digitally) with a calendar reminder to open it in 12 months.


Module 7 Completion Reflection

Before proceeding to the Capstone, complete this final reflection:

The Full Journey

You've now completed all seven modules of the discipline:

  • Module 1: Foundation—capability without clarity is dangerous
  • Module 2: Assess—the map is not the territory
  • Module 3: Calculate—proof isn't about being right, it's about being checkable
  • Module 4: Orchestrate—design for the person doing the work
  • Module 5: Realize—one visible win earns the right to continue
  • Module 6: Nurture—systems don't maintain themselves
  • Module 7: Lead—the measure of mastery is whether others can do it without you

Reflect:

  1. Which module's principle was most challenging for you? Why?

  2. Which module's deliverable will have the most impact in your organization?

  3. How has your understanding of AI-augmented operations evolved since Module 1?

  4. What would you tell someone just starting Module 1 that you wish you had known?

Write 300-400 words capturing your journey through the discipline.


These exercises support learning consolidation through reflection, peer engagement, and practical application. They are designed to be completed alongside Module 7 content, not as an add-on afterward.