Dr. Tennyson Johnson

CLNA evidence that actually holds up

Most CLNAs are written for compliance and read like it. Here's what evidence that actually holds up looks like — three layers, a practical table, and what to stop doing immediately.


Most CLNAs are written for compliance and read like it. Forty pages of survey screenshots, a state labor market PDF, a list of advisory committee attendees, and language like "stakeholders were consulted to inform program improvement decisions."

Nobody — not your superintendent, not your board, not the state monitor who actually reviews your work — believes a word of it. They sign off because the form is filled out, not because the document told them anything.

This isn't a problem with Perkins V. The statute is reasonably clear about what it asks for. It's a problem with how districts treat evidence: as documentation that something happened, rather than as the connective tissue between data and decisions.

Here's what evidence that actually holds up looks like, and what to stop doing immediately.

The framing problem

Perkins V requires a Comprehensive Local Needs Assessment at least every two years, covering six elements: student performance on the core indicators, program alignment with labor market needs, size/scope/quality, programs of study implementation, educator recruitment/retention/training, and equity for special populations.

Each of those is a question. The CLNA is the answer. The mistake almost everyone makes is treating evidence as proof — proof that data was reviewed, proof that stakeholders showed up, proof that an analysis happened.

That's not what a credible CLNA is for. A credible CLNA shows what the data means for the program's next two years of decisions. Proof of process is the floor. Anyone who's read a few CLNAs can tell within five minutes whether they're holding a binder of screenshots or a planning document. State monitors can tell faster.

If your CLNA can't survive being handed to a successor on day one — if a new director can't read it and know what to keep, what to change, and what to defend at the next board meeting — it isn't evidence. It's paperwork.

Three layers, in order

Useful CLNA evidence has three layers. They're cumulative — Layer 3 doesn't exist without Layer 2, which doesn't exist without Layer 1.

Layer 1: data that's actually local

Most CLNAs over-rely on this layer because it's the easiest to assemble. State labor market projections, BLS occupational data, district enrollment counts, completer rates, credential pass rates. All of it should be in the document. None of it proves anything about your program by itself.

The failure mode: pasting state-level data into a regional CLNA. The state says healthcare is growing. Your county says the regional hospital system just merged and froze hiring for eighteen months. You used the state number anyway because it's easier to download. Now your CLNA argues for expanding a healthcare program your local employers can't absorb.

If the data isn't local — county, regional workforce board, employer-specific — it's context, not evidence.

Layer 2: stakeholder voice that says something

"Stakeholders were consulted" is the laziest sentence in the entire CLNA literature. Perkins V explicitly requires consultation with employers, labor organizations, parents, students, special populations representatives, and postsecondary partners. Consultation is doing the work in that sentence, and most CLNAs treat it as synonymous with attendance.

What stakeholder evidence should look like:

  • Advisory committee minutes that show actual disagreement, with named participants
  • Employer interviews documented as outcomes, not just dates
  • Student focus group summaries with anonymized quotes and clear themes
  • Postsecondary articulation conversations where someone said no to something

What it usually looks like: an attendance log and a generic claim of consultation.

The test is simple. Can you point to one thing a stakeholder said that surprised you? If the answer is no, you didn't consult — you presented. There's a difference, and the people reading your CLNA can tell.

Layer 3: decisions you can actually trace

This is the layer almost no CLNA builds intentionally, and the one that turns the document from a binder into a planning instrument.

Across two decades in CTE — teaching, building curriculum, sitting on committees, running programs — I've watched the same pattern repeat. A committee spends three meetings reviewing labor market data and stakeholder feedback before anyone asks the question that actually matters: what are we going to stop doing because of this? Until someone says it out loud, the data review is just a ritual.

Decision linkage is one question, applied to every piece of evidence: what changed because of this?

Labor market data shows welding demand declining → decision to phase the program down to one section → reflected in the next local application budget → reviewed in two years to confirm the call was right.

Employer feedback shows graduates lack troubleshooting skills → decision to add problem-based learning in semester two → curriculum map updated → instructor PD scheduled → measured in next year's employer satisfaction survey.

Evidence that doesn't drive a documented decision is just data sitting in a document. Programs that build decision linkage end up with CLNAs their boards take seriously. Programs that skip it end up with CLNAs nobody references between submission cycles.
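
If it helps to make decision linkage concrete, here is a minimal sketch of an evidence record as a data structure, with a check for evidence that never drove a decision. The field names are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EvidenceItem:
    """One piece of CLNA evidence and the decision it drove."""
    finding: str                       # what the data or stakeholder input showed
    decision: Optional[str] = None     # what changed because of it
    budget_line: Optional[str] = None  # where it lands in the local application
    review_date: Optional[str] = None  # when the call gets re-checked

def untraced(items: list[EvidenceItem]) -> list[EvidenceItem]:
    """Return evidence that never drove a documented decision."""
    return [item for item in items if not item.decision]

# The welding chain from above, expressed as one traceable record.
welding = EvidenceItem(
    finding="Labor market data shows welding demand declining",
    decision="Phase the program down to one section",
    budget_line="Next local application budget, instructional staffing",
    review_date="Two years out, to confirm the call was right",
)
```

Anything untraced() returns is Layer 1 or Layer 2 material that never became Layer 3.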

A practical evidence table

Here is the structure that works. It's deliberately simple — a table, not a system.

Perkins element | Local data (Layer 1) | Stakeholder input (Layer 2) | Decision driven (Layer 3) | Owner | Review date
Core indicator: 2S1 | District ELA scores by program | Instructor focus group, March 2026 | Add literacy support block to two programs | Director of CTE | Sep 2026
Program-LMI alignment: IT pathway | County five-year IT projections + employer survey | Tech employer roundtable, April 2026 | Sunset legacy A+ track, build cybersecurity strand | Pathway lead | Jan 2027

Populated honestly across all six required elements, this table will outperform a forty-page narrative on every dimension that matters: clarity, defensibility, usefulness for the people who execute the plan.
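
If the table lives in a spreadsheet, a small audit script can keep it honest. This is a sketch under two assumptions: the table is exported as CSV with the column names below, and the element column uses the shorthand labels from this post rather than any official wording:

```python
import csv

# Shorthand for the six required CLNA elements, as summarized earlier.
REQUIRED_ELEMENTS = {
    "student performance",
    "program-lmi alignment",
    "size/scope/quality",
    "programs of study",
    "educator recruitment/retention/training",
    "special populations equity",
}

COLUMNS = ["element", "local_data", "stakeholder_input", "decision", "owner", "review_date"]

def audit(path: str) -> list[str]:
    """Flag rows missing a layer, owner, or date, and elements with no rows."""
    problems = []
    covered = set()
    with open(path, newline="") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            covered.add((row.get("element") or "").strip().lower())
            for col in COLUMNS:
                if not (row.get(col) or "").strip():
                    problems.append(f"row {line_no}: missing {col}")
    for element in sorted(REQUIRED_ELEMENTS - covered):
        problems.append(f"no evidence row for element: {element}")
    return problems
```

An empty list back means every required element has at least one fully traced row.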

What state monitors actually look for

State monitors aren't trying to catch you. They're trying to find out whether the CLNA was real — whether it actually shaped how money was spent and how programs evolved. The questions they ask, in different forms:

  • Can you point to a decision in last year's local application that traces back to a specific CLNA finding?
  • Who, by name, did you consult, and what did they say?
  • What did you stop doing as a result of the CLNA?

That last one is the tell. A CLNA that resulted in zero stops — no programs sunset, no pathways consolidated, no investments redirected — is a CLNA that wasn't really used. Monitors know this. So do superintendents. So do the employers on your advisory committee who watched you nod through their feedback and then change nothing.

What to drop

Six things that show up in most CLNAs and add nothing:

  1. State-level data presented as if it were local. Replace with regional or strip out entirely.
  2. Attendance lists framed as consultation evidence. They're proof of a meeting, not proof of input.
  3. Generic equity language unhooked from data. "We are committed to serving special populations" without disaggregated outcomes is filler.
  4. Multi-page literature reviews of CTE best practices. Nobody reads them. Cut to your local conclusions.
  5. Aspirational statements about programs you haven't started yet. Save them for the local application; the CLNA documents what is, not what you wish were true.
  6. Hedged language designed to avoid commitment. "May consider exploring" doesn't drive a decision. Either you decided something or you didn't. (A quick way to catch this in a draft is sketched below.)
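
On that last point, non-committal phrasing is mechanical enough to catch with a script. A minimal sketch; the phrase list is an illustrative starting point, not a vetted taxonomy:

```python
import re

# Illustrative non-committal phrases; extend with your district's favorites.
HEDGES = [
    r"may consider",
    r"explore the possibility",
    r"as appropriate",
    r"continue to monitor",
    r"will look into",
]

def flag_hedges(draft: str) -> list[tuple[int, str]]:
    """Return (line number, phrase) for each hedge found in a draft narrative."""
    pattern = re.compile("|".join(HEDGES), re.IGNORECASE)
    return [
        (n, match.group(0))
        for n, line in enumerate(draft.splitlines(), start=1)
        for match in pattern.finditer(line)
    ]
```

Run it over the draft before stakeholder review; every hit is either a decision you haven't made or one you're avoiding writing down.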

The two-year cycle that actually works

The CLNA is a two-year instrument, but the evidence work isn't a two-year project. Programs that do this well treat evidence collection as continuous:

  • Year 1, Q1–Q2: Refresh local labor market data; schedule employer interviews
  • Year 1, Q3–Q4: Run advisory committee deep dives on each program area
  • Year 2, Q1: Compile evidence table; identify gaps
  • Year 2, Q2: Draft narrative tied to the evidence table
  • Year 2, Q3: Stakeholder review and revision
  • Year 2, Q4: Submit, then immediately begin Year 1 of the next cycle

Programs that wait until the submission window to assemble evidence end up with the binder-of-screenshots version. Programs that build the evidence table continuously end up with a document the next director can pick up on day one.
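
If you want calendar dates on that cadence, working backward from the submission deadline is straightforward. A sketch, assuming a 24-month cycle with submission at the end of Year 2, Q4; the offsets below are approximate quarter starts, so adjust them to your state's window:

```python
from datetime import date

# Months before submission at which each phase should begin,
# assuming submission lands at the end of Year 2, Q4.
PHASES = [
    (24, "Refresh local labor market data; schedule employer interviews"),
    (18, "Advisory committee deep dives on each program area"),
    (12, "Compile evidence table; identify gaps"),
    (9,  "Draft narrative tied to the evidence table"),
    (6,  "Stakeholder review and revision"),
    (3,  "Final assembly; submit, then begin the next cycle"),
]

def schedule(submission: date) -> list[tuple[date, str]]:
    """Work backward from the submission date to a start date per phase."""
    plan = []
    for months_before, task in PHASES:
        total = submission.year * 12 + (submission.month - 1) - months_before
        plan.append((date(total // 12, total % 12 + 1, 1), task))
    return plan

for start, task in schedule(date(2027, 6, 30)):
    print(start.isoformat(), task)
```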

What this is really about

Evidence isn't really about Perkins. It's about whether your CTE program can defend its decisions to an outside reader two years from now. That's the same skill that wins board approvals, attracts employer partners, and survives superintendent transitions.

A CLNA built around an honest evidence table is a CLNA you can hand to anyone — your board chair, a new director, a state monitor, a skeptical employer — and have them come away knowing what you decided and why.

That's worth more than the binder.


If you're building or refreshing a CLNA right now, our Perkins V and CLNA Evidence Planning course walks through the evidence table approach in detail, with templates for each of the six required elements.

