Three tracks. Two weeks. Everything you need to move from thesis to traction. LearnAIR is one piece of this. The thesis is bigger.
You've got a strong thesis, a real network, and a window right now where the right moves compound. Three tracks running in parallel. All three matter. None of them wait on anyone.
Today proved you can build rapport and hold a room. Justin said the coffee was "critical." That's your strength. The next step is pairing that presence with the depth to walk someone through the thesis cold, on your feet, in your own words.
Start with these three sections from this page:
Then go deeper:
By end of week 2, the bar is: someone asks you "why veterans and AI, and what have you found?" and you give a 5-minute answer that hits the competitive gap, the funding landscape, the employer demand, and your network advantage. Your words, your conviction.
What from this list do you already feel strongest on? That's where you spend the least time. Go deep on the gaps.
Your Rolodex is real. You've named Honor Foundation, BreakLine, Deloitte Veterans, BetterUp, Mission 22. Those aren't cold leads. Those are people who know you or are one degree away.
Two purposes here. First, these conversations sharpen the thesis. You learn what they're seeing, what's working, what's not. Second, some of these lead somewhere: a partnership, a role, a paid engagement. Both outcomes are good.
Not pitching. Not "we're doing a veteran program." Just: "I'm working on veterans and AI upskilling. Curious what you're seeing in your world." Let the conversation breathe.
Suggested first two:
Put feelers out for roles. Your background (Marine officer, Deloitte consulting, therapy/coaching, Forest Service) positions you for:
LinkedIn, your network, direct outreach. The goal isn't to land something in two weeks. The goal is to have motion so you're not waiting by the phone.
These conversations are yours regardless of what happens with LearnAIR. If they say yes, you walk in with fresh network intel and external validation. If they say no or stall, you already have momentum.
Which two from your network feel like the most natural first calls? Go with your gut there.
LearnAIR is one vehicle for the veterans + AI thesis. It might be the right one. It might not. Either way, the thesis gets sharper by working it.
By end of week 2, the bar is: you've talked to at least one person at WorkSource or Mission 22, you've floated the idea to someone in your F3 network, and you've built or walked through at least one agent workflow yourself.
LearnAIR is on standby until next week. Here's the honest picture:
| Week 1 Action | Track |
|---|---|
| Read Claude Code docs | Learn |
| Subscribe to Every newsletter | Learn |
| Read Business Model + Oregon sections | Learn |
| Reach out to BreakLine | Conversations |
| Float the idea to 1-2 guys in F3 network | Thesis |
| Week 2 Action | Track |
|---|---|
| Two more network conversations | Conversations |
| Start job search: LinkedIn, direct outreach | Conversations |
| Build or walk through first agent workflow | Thesis |
| Follow up with Mission 22 | Thesis |
| Read Curriculum + Scripts sections | Learn |
| LearnAIR decision expected | LearnAIR |
You have a strong thesis, a real network, and three months to make something happen. The worst thing you can do right now is wait. The best thing you can do is be in motion on all three tracks so that when the next conversation happens, whether it's with Justin, with Honor Foundation, or with a hiring manager, you're not presenting a deck. You're telling them what you've already started doing.
What feels like the highest energy move for you right now? Start there.
Stay moving.
Strategic clarity. Read this first.
Employers will pay for veterans trained in AI operations supervision.
Not "veterans can learn AI." Not "mission command maps to AI supervision." Not "there's a gap in the competitive landscape." Those are supporting arguments. The load-bearing question is whether someone writes a check.
Three employer discovery conversations. The question isn't "would you hire an AI operations supervisor?" It's:
If three different employers describe a problem that sounds like "my people are using AI but nobody's governing the outputs," we've validated demand. If they describe something different, the thesis adapts.
We are not employees. We are not volunteers. We are a consultancy that owns its IP.
| Person | What They Bring | Open Question |
|---|---|---|
| Mike Deal | Marine vet, Deloitte, USFS facilitator, BreakLine network | Technical depth (ramping) |
| Justin Coats | LearnAIR brand, SDVOSB, OpenAI partnership, 100+ clients | Is the vet program his priority or Teresa's? |
| Teresa Coats | Operations, sport science, passion for vet program | The execution gate. More aligned than Justin. |
| Trisha Sargent | L&D (Consumer Cellular), Easter Seals HVRP | Advisor/guide role, not operator |
| Bobby Napiltonia | AppExchange creator, Twilio first CRO, enterprise GTM | Not engaged directly yet. Massive if activated. |
23 competitors mapped. The gap is real. The window is 12-18 months.
Every veteran AI program in 2026 does one of two things: it teaches veterans to use AI tools (Google certs, OpenAI sprints, VetsinTech workshops), or it helps veterans get hired (Hiring Our Heroes, BreakLine, FourBlock, ACP). None trains veterans to supervise AI-enabled workflows.
This is the exact skill that enterprise AI adoption demands and that military service uniquely prepares people to deliver. Veterans already operate in mission command environments: decentralized execution under commander's intent, where human judgment governs semi-autonomous systems under uncertainty.
The market has AI literacy programs and veteran career programs. It has no veteran AI leadership program. That's the gap.
Corporate Fellowship at Google, Microsoft, Salesforce, Amazon. Some rotations touch AI/data. Strong brand, weak AI depth. AI exposure is incidental to placements.
Free remote immersive sprints. Hiring partners include Anduril, Meta, Google, Palantir. Zero AI technical depth. Career accelerator, not a training program.
Dedicated "Vets in AI" program launched 2025. ML, data analytics, AI ethics. Events with Nvidia, Microsoft, Google. Closest veteran-specific competitor on AI depth. But focused on making veterans into AI practitioners, NOT training them to supervise workflows.
Pilot: 4-hour hands-on AI sprint (Mar 2026, 100 cap). Free ChatGPT Plus for transitioning veterans. Tool literacy, not workflow supervision. Still in pilot phase, tiny scale.
New national initiative targeting Officers and Senior Enlisted. LearnVantage platform with "agentic AI" courses. White House AI Education Taskforce participant. But: it's a corporate hiring funnel, not an independent training program. Veterans become Accenture employees, not broadly skilled AI supervisors.
Coursera-hosted, ~$200 total. Free for veterans via Google Launchpad. Covers AI fundamentals and prompt engineering. No workflow management or supervision framework.
AI + Fullstack Engineering bootcamp. VA-approved. Strongest current competitor combining veteran focus + AI content + government funding. But trains coders who use AI, not managers who supervise AI workflows.
Reauthorized Dec 2024, funded from Oct 2025. Still not operational. VA has not approved any providers yet. ~4,000 spots/year expected. Applications may open June 2026. When it reopens, approved providers will have a massive distribution advantage.
Two axes: veteran-specificity (horizontal) and AI depth (vertical).
| Risk | Severity | Mitigation |
|---|---|---|
| Hiring Our Heroes adds "AI Management" track | HIGH | Move fast. Establish "mission command" as the recognized vocabulary. Build curriculum IP (case studies, simulations) that can't be replicated by bolting a webinar onto a fellowship. |
| Accenture scales its veteran AI initiative | HIGH | Accenture funnels to Accenture. Emphasize independence: "We don't train you for one employer, we train you to lead AI operations anywhere." |
| VET TEC 2.0 approval gatekeeping | MED | File VET TEC 2.0 provider application immediately. Simultaneously pursue GI Bill approval as parallel path. |
| "AI supervisor" role doesn't materialize as distinct job | MED | Frame credential as additive, not replacement. "AI Operations Leadership" is a skill overlay on existing management. |
| Google/OpenAI deepen free veteran AI training | MED | Depth beats breadth. Position as post-certification: "You got your Google AI cert. Now learn to lead AI operations." |
The window is 12-18 months. Accenture's veteran initiative and VET TEC 2.0 will reshape the landscape by late 2026 or early 2027. The advantage is being first to name the role (AI operations supervisor), first to connect it to a military framework (mission command), and first to build a credentialing program around it. Speed and curriculum depth are the moat.
Real people at real companies. The contact phase starts here.
Brian Buch, VP Information and Digital Services
35-year Les Schwab veteran who owns IT and digital strategy. Recently earned all 12 AWS certifications including AI Practitioner. Actively thinking about AI/data strategy at enterprise scale. HQ Bend, OR. ~8,000 employees, 500+ stores. Warmest lead.
Rebecca Berry, MBA, VP & Chief Human Resources Officer
At St. Charles since 2007. Led their workforce turnaround (reducing reliance on traveling nurses). AI workforce readiness lands squarely in her domain. HQ Bend, OR. Largest healthcare provider in Central Oregon, ~3,800 employees.
Mark Nordstrom, EVP Operations
15+ years driving operational excellence at a company that builds cooling infrastructure for hyperscale data centers (Meta, Apple). Based in Bend. HQ Redmond, OR.
Jared Holum, CPA, President
Largest locally-owned accounting firm in Oregon, 165+ employees. Professional services firms face the sharpest gap between individual AI usage and firm-wide coordination. Portland, OR.
Venki Krishnababu, Chief Technology and Information Officer
Joined Dec 2024 from lululemon (7 years as CTO). Nearly 30 years of enterprise technology leadership. A new CTIO building out tech strategy at a fast-growing company. 900+ locations. Grants Pass, OR (NYSE: BROS).
"I'm building a program that trains veterans to manage AI operations. I'm looking for employers feeling the gap between individual AI productivity and organizational coordination. Would you be open to a 15-minute call?"
Don't send all 5 at once. Start with the warmest (Brian Buch at Les Schwab) and iterate based on response.
Three scripts for three audiences. 25-35 minutes each. Signal extraction, not selling.
Target: VP of Operations, Program Manager, or HR Director. Prioritize orgs with existing veteran hiring commitments.
"Thanks for making time. We're exploring a workforce training concept focused on veterans transitioning into AI-adjacent roles. Not building AI models, but supervising AI systems in operational settings: reviewing outputs, catching errors, managing workflows where AI is doing the first pass and a human makes the final call. We're early. We haven't built anything yet. And we're talking to people like you to understand whether this solves a real hiring problem. Nothing to buy, no pitch. I just want to learn from your experience."
Questions:
Commitment test: "If we ran a pilot cohort of 10-15 veterans this summer, trained specifically on AI workflow supervision with a capstone project, would you be willing to interview the top graduates? No obligation to hire, just review their work and give us feedback on readiness."
Target: Program Director or Executive at Hiring Our Heroes, BreakLine, a state workforce board, a Vet Center, or a VSO with a career transition program.
"Appreciate you taking this. We're designing a training program that would prepare transitioning service members for AI workflow supervision roles. The jobs where a person reviews, validates, and manages AI system outputs in fields like healthcare operations, government services, and logistics. We're not trying to turn veterans into software engineers. We're focused on the oversight and judgment layer. Before we build anything, we want to learn from organizations like yours."
Questions:
Commitment test: "If we designed a 4-6 week pilot this summer and needed 10-15 veteran participants, would your organization help us recruit them? Even informally? And would you be open to a follow-up conversation once we have a curriculum outline?"
Target: E-5 to O-4, 6-15 years of service, within 12-24 months of transition. Any branch, especially operations, intelligence, logistics, medical, or maintenance backgrounds.
"Thanks for your time. I know transition is a busy period. We're building a training program for veterans that focuses on AI workflow supervision. That means: jobs where AI does a first draft or a first pass, and a human reviews it, catches mistakes, and makes the final call. Think quality assurance for AI systems, not programming them. It maps pretty directly to the kind of process discipline and judgment you've been using in the military."
Questions:
Commitment test: "We're planning a pilot cohort this summer. If you qualified, would you apply? And, separate question, can you think of two or three people who'd also be interested?"
15-20 validation conversations across employers, VSOs, and veterans. Pattern analysis. Draft v0.1 curriculum outline. Identify pilot partners. Research funding (WIOA, SkillBridge, state workforce board).
Stop and reassess if:
10-15 participants, 4 weeks, remote-first. Screen applicants. Onboard. Teach. Collect feedback after every session. Adjust in real time. Submit first funding application.
Capstone projects presented to employer panel. Compile results. Write the Pilot Results Brief (your primary credibility asset). Go/no-go on Cohort 2.
| Signal | Strong (Scale) | Mixed (Iterate) | Weak (Pause) |
|---|---|---|---|
| Completion rate | 80%+ | 60-79% | <60% |
| Employer "would hire" | 3+ employers | 1-2 | 0 |
| Participant NPS | 50+ | 20-49 | <20 |
| 90-day placement | 50%+ | 25-49% | <25% |
| Funding for next cohort | Secured | Pending | No leads |
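The numeric thresholds above can be written as a simple scoring rule. This is a sketch, not decided policy: the metric names and the "overall verdict equals the weakest signal" aggregation are assumptions.

```python
def grade(value, strong, mixed_low):
    """Grade one numeric signal against the scale/iterate/pause thresholds."""
    if value >= strong:
        return "scale"
    if value >= mixed_low:
        return "iterate"
    return "pause"

def assess_pilot(completion_pct, employer_hires, nps, placement_pct):
    """Apply the signal table; overall verdict is the weakest signal (assumed rule)."""
    grades = {
        "completion": grade(completion_pct, 80, 60),
        "employer_would_hire": grade(employer_hires, 3, 1),
        "nps": grade(nps, 50, 20),
        "placement_90d": grade(placement_pct, 50, 25),
    }
    order = ["pause", "iterate", "scale"]
    overall = min(grades.values(), key=order.index)
    return grades, overall

grades, verdict = assess_pilot(85, 2, 55, 40)
# completion and NPS clear the bar, but 2 employers and 40% placement
# are mixed, so the weakest-signal rule yields "iterate"
```

The funding signal is left out because "Secured/Pending/No leads" is categorical, not numeric; it would slot in as a fifth grade the same way.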
Veterans, AI, employers, and funding in your backyard.
| Company | Industry | Why |
|---|---|---|
| St. Charles Health System | Healthcare, ~4,500 emp | Largest employer in region. Clinical AI + compliance. |
| BASX Solutions | Mfg (data center cooling), ~500+ emp | Hyperscale clients demand process rigor. |
| Les Schwab | Auto services, 500+ locations | Already bought ChatGPT Team. Warmest lead. |
| Health Elements AI | AI/Healthcare, 10-30 emp | Actual AI company. Would hire veterans directly. |
| Dutchie | Cannabis tech, 100-300 emp | Fast-growing SaaS, heavy AI users. |
| Meta (Prineville) | Data center, ~150 permanent | 3.2M sq ft, $2B+ invested. |
| Apple (Prineville) | Data center, ~150 permanent | 3.2M sq ft campus. |
| Company | Industry | Why |
|---|---|---|
| Intel (Hillsboro) | Semiconductor/AI, 22,300 emp | Largest private employer in OR. AI at massive scale. |
| Nike (Beaverton) | Sportswear/tech, ~12,000 HQ | AI across design, supply chain, marketing. |
| Daimler Trucks NA | Manufacturing, 34,000 emp | Heavy manufacturing + AI in logistics. |
| Northrop Grumman | Defense R&D, 500+ emp | Defense contractor, hires veterans. |
| Dutch Bros Coffee | F&B, 16,500 emp | 900+ locations. Same distributed challenge as Les Schwab. |
| Perkins & Co | Accounting, 200+ emp | Professional services = acute AI pain. |
| Source | Amount | Details |
|---|---|---|
| ODVA Veteran Services Grant | $972K (2025-27) | Competitive grants to nonprofits. Contact: Brenna Bandstra, brenna.bandstra@odva.oregon.gov |
| JVSG (Jobs for Veterans State Grants) | $2.4M | 100% federal (DOL-VETS). Funds LVERs/DVOPs. |
| Oregon WIOA Title I | Part of ~$2.9B national | Priority of service for veterans. Administered by OR Employment Dept. |
| Oregon-NVIDIA AI Partnership | $10M state | AI Ambassador Program, campus AI integration. |
| Source | Amount | Details |
|---|---|---|
| Future Ready Oregon | $200M total (nearly spent) | Must spend by end of 2026. Veterans explicitly listed as target. |
Employer pain tiers, the consultancy flywheel, and veteran-to-role matching.
Individual workers bought ChatGPT and got 10x more productive at their tasks. But the organization didn't change its processes, workflows, approval chains, or quality controls. Now you have powerful individuals operating in an organizational vacuum. Someone needs to supervise the AI ecosystem. That's the AI Ecosystem Supervisor.
Key stats: 28% of workers currently use generative AI at work. More than half use AI without employer approval. 70% have never received AI training from their employer. 87% of healthcare workers say their employer lacks clear AI policies. 64% of workers have presented AI-generated work as their own.
Companies with ChatGPT Team/Enterprise licenses but no governance. SMBs that did a workshop and now ask "now what?" Professional services firms with partners using AI for drafts, junior staff using it unsupervised. Healthcare organizations (87% lack AI policies. HIPAA + AI = liability).
Manufacturing (quality control, safety documentation). Healthcare systems (clinical documentation, billing). Construction (permitting, compliance). Logistics (distributed operations).
Tech startups (small teams using AI heavily but no dedicated AI ops role). Data center adjacent companies. E-commerce/DTC brands.
Central Oregon trades (HVAC, plumbing, electrical). Professional practices (dental, veterinary, small law). Agriculture/ranching. Hospitality/tourism.
Every consulting engagement generates demand for a veteran placement. Every veteran placement demonstrates the model works, generating referrals for more consulting.
| Phase | What | Price |
|---|---|---|
| Phase 1: AI Readiness Assessment | Audit AI usage, map workflows, interview staff. 1-2 weeks. | $5,000-10,000 |
| Phase 2: Ecosystem Build-Out | Governance policies, workflow templates, monitoring, training. 4-8 weeks. | $15,000-30,000 |
| Phase 3: Ongoing Retainer | Monthly health check, new use cases, staff training, quarterly exec briefing. | $3,000-5,000/mo |
| Phase 4: Veteran Placement | Source from program, 90-day onboarding support. | 15-20% first-year salary |
| Tier | Military Rank | Role | Salary Range |
|---|---|---|---|
| Tier 1 | E4-E6 (Tactical NCOs) | AI Operations Coordinator | $55,000-75,000 |
| Tier 2 | E7-E9 + O1-O3 | AI Ecosystem Manager | $80,000-120,000 |
| Tier 3 | O4+ (Senior Officers) | Chief AI Officer / VP AI Ops | $130,000-200,000+ |
The naming advantage: No one has standardized what to call this role. The program that names it owns the category. The military rank mapping gives employers an instant credibility signal.
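As a back-of-envelope illustration of how the engagement phases and the placement fee combine, here is one full-cycle client. All prices come from the two tables above; the six-month retainer length and the $100k Tier 2 salary are assumptions for the example, not quoted terms.

```python
def engagement_revenue(salary, retainer_months=6):
    """Low/high revenue for one full-cycle client, per the phase pricing table.

    Assessment $5-10k + build-out $15-30k + retainer $3-5k/mo
    + placement fee at 15-20% of first-year salary.
    The retainer length and salary are illustrative assumptions.
    """
    low = 5_000 + 15_000 + 3_000 * retainer_months + 0.15 * salary
    high = 10_000 + 30_000 + 5_000 * retainer_months + 0.20 * salary
    return low, high

low, high = engagement_revenue(100_000)  # a Tier 2 "AI Ecosystem Manager" hire
# low:  5k + 15k + 18k + 15k = 53k
# high: 10k + 30k + 30k + 20k = 90k
```

Even at the low end, one engagement that runs through placement is worth roughly $50k+, which is why the flywheel framing (consulting generates placements, placements generate consulting referrals) matters.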
Why Alpha School's 2.3x model matters. LearnAIR's gaps. What gets built on top.
Alpha School puts students in the top 0.1% on standardized tests (97th-99th percentile MAP, 1535 average SAT) with ~2 hours of AI-personalized academics per day. The rest is project-based learning. Results: 2.3x-2.6x national average growth rate.
Alpha's homeschool version uses the same software platform but without guides, incentives, or peers. It produces only 1x learning velocity, no acceleration at all. The load-bearing elements are: guide relationships + incentive system + peer effects + physical environment. The AI handles content delivery. The humans handle motivation, accountability, and judgment.
For adults, the case is even stronger:
LearnAIR's Foundation Series is 3 live sessions x 1.5 hours each. Groups of 5-20. It covers ChatGPT basics, interface, prompting (DIRECT framework), personas, Custom GPTs, and agent mode. It produces real results: "John completed 4-6 months of work in one week."
| LearnAIR Teaches | We Add |
|---|---|
| Build a digital employee | Manage a team of digital employees |
| Personal productivity | Organizational productivity |
| ChatGPT-specific skills | Model-agnostic principles |
| GUI workflows | CLI + voice-first workflows |
| Individual personas | System-level governance |
| "Here's what AI can do" | "Here's how to be responsible for what AI does" |
| 18 minutes of hands-on | Weeks of applied experiential learning |
| Certificate of completion | Job placement tied to outcomes |
Compress instruction. Expand practice. Never remove the guide.
Based on Alpha's architecture, applied to a veteran learning AI supervision:
| Time | Activity |
|---|---|
| 0:00-0:05 | Daily Dash: review progress, today's targets, adaptive path |
| 0:05-0:30 | Concept Module: AI-personalized lesson, mastery-gated |
| 0:30-0:50 | Hands-On Lab: supervised practice in sandboxed AI environment |
| 0:50-1:00 | Break |
| 1:00-1:20 | Scenario Drill: branching decision trees, judgment checks |
| 1:20-1:35 | Spaced Review: quick-fire on material at risk of being forgotten |
| 1:35-1:50 | Mentor Check-in (2x/wk) or Peer Discussion (3x/wk) |
| 1:50-2:00 | Reflection + Tomorrow Preview |
After the 2-hour instruction block: hands-on practice with a guide present. Ratio: 1:2 or 1:3. For every hour of AI-delivered instruction, 2-3 hours of supervised practice. This maps to the 70-20-10 model: 70% experiential, 20% social, 10% formal.
Don't build a 40-hour/week classroom program. Build a 10-hour/week AI-personalized instruction program with 20-30 hours/week of supervised hands-on practice. That's the Alpha model for adults.