FixturFab

How to Write a Test Specification for Your PCBA

A test specification defines what gets tested, how, and what counts as a pass. Learn how to write one your CM, fixture vendor, and test operators can execute.

A test specification is the engineering document that defines what gets tested on your PCBA, how each measurement is performed, and what constitutes a pass or fail. It is the foundation of your manufacturing test strategy. Every production board needs one. Without a written spec, your CM guesses at requirements, test operators make inconsistent judgment calls, and fixture vendors build to assumptions instead of your actual needs.

Most engineering programs never cover how to write a functional test specification. This guide walks through the structure, the decisions each section forces, and the mistakes that cause rework.

What a test specification covers

A test spec is a single document that answers six questions about your board:

  1. What gets tested — which nets, components, and functional behaviors
  2. How it gets tested — measurement methods, whether that's functional testing, ICT, boundary scan, or a combination
  3. What passes — explicit numerical limits for every measurement
  4. What equipment is needed — instruments, fixtures, and software
  5. Under what conditions — environmental requirements, power sequencing, warm-up times
  6. How many units — sample plans and lot acceptance criteria
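The six answers above map naturally onto a structured record. Here is a minimal sketch in Python of a spec skeleton; the field names and example values are illustrative assumptions, not a standard format.

```python
from dataclasses import dataclass

@dataclass
class TestSpec:
    """Minimal skeleton of a PCBA test specification.

    Field names are illustrative, not an industry-standard schema.
    """
    product: str        # what board this spec applies to
    revision: str       # part number and revision under test
    coverage: dict      # what gets tested, e.g. {"nets": 0.95, "power_rails": 1.0}
    methods: list       # how: ordered test methods
    limits: dict        # what passes: measurement -> (low, high, units)
    equipment: list     # instruments and fixtures with minimum specs
    conditions: dict    # environment, power sequencing, warm-up
    sample_plan: str    # how many units, e.g. "100%" or a sampling plan

spec = TestSpec(
    product="Widget Controller PCBA",
    revision="WC-100-A Rev C",
    coverage={"nets": 0.95, "power_rails": 1.0},
    methods=["visual", "ICT", "functional", "final inspection"],
    limits={"3.3V rail (TP5-TP1)": (3.135, 3.465, "V")},
    equipment=["DMM, 6.5 digit, 0.1% accuracy"],
    conditions={"ambient_c": 23},
    sample_plan="100%",
)
```

Even if the real document lives in a spreadsheet or word processor, thinking of it as one record with these eight fields keeps any single question from being skipped.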

The spec does not need to be exhaustive on day one. But it does need to exist before you order a fixture, negotiate with a CM, or hand a board to a test operator. A vague spec produces vague results — no PCB test plan template can fix that.

When you need a test specification

You can get away without a formal spec when you're testing five prototypes on your bench with a multimeter. That stops working when any of these happen:

  • You're handing boards to a CM. Your CM needs to know what "tested" means. Without a spec, "we tested it" could mean anything from "we powered it up and nothing smoked" to "we verified every net against schematic."
  • You're scaling past prototype volumes. At 50+ boards per run, consistency matters more than coverage. A test operator running the same board 200 times needs clear pass/fail criteria, not engineering judgment.
  • You're entering a regulated market. Medical, automotive, and aerospace all require documented test procedures. The spec is how you prove what you tested and why. For teams navigating compliance requirements, see our approach to regulated industries.
  • You're onboarding a new product. Even if your test process is mature, every new board design needs its own spec. Reusing a spec from a different product is how you miss the requirements that actually differ.

The cost of testing without a spec shows up later: inconsistent results across shifts, finger-pointing between your team and the CM when failures escape, and fixture revision cycles that could have been avoided by getting the requirements right up front.

Anatomy of a PCBA test specification

If you search for an electronic assembly test plan template, you'll find spreadsheets with blank columns. A test specification is more than a test plan template for electronics — it's a set of engineering decisions, documented clearly enough that someone who didn't make those decisions can execute them. Here's what each section accomplishes and what it forces you to think through.

Scope and test coverage goals

Start with identification and boundaries. What product, what revision, what does "tested" mean for this board? This section captures the core PCBA test requirements that every downstream decision depends on.

Product: Widget Controller PCBA
Part Number: WC-100-A, Rev C
Applicable to: All production units
Test coverage target: 95% of nets, 100% of power rails
Supersedes: WC-100 Test Spec Rev 2

Include revision control. Specifications change as products evolve — you need to trace which version was active for any production run.

The coverage target forces a real conversation: are you testing every component, or accepting gaps on low-risk passives? Test coverage strategies affect fixture complexity, test time, and cost. State the target explicitly so everyone knows what's in scope and what isn't.
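Checking a stated coverage target against the actual netlist and test point list is a short calculation. A sketch, assuming you can export net names from your CAD tool and fixture drawing; the net names below are hypothetical.

```python
def net_coverage(all_nets, probed_nets):
    """Fraction of the board's nets reachable by the fixture's probes.

    Both arguments are collections of net names exported from CAD
    and the fixture drawing; names here are hypothetical.
    """
    all_nets = set(all_nets)
    covered = all_nets & set(probed_nets)
    return len(covered) / len(all_nets)

nets = ["VCC_3V3", "VCC_5V0", "GND", "SDA", "SCL", "MISO", "MOSI", "SCK"]
probed = ["VCC_3V3", "VCC_5V0", "GND", "SDA", "SCL", "MISO"]
print(f"{net_coverage(nets, probed):.0%}")  # 6 of 8 nets -> 75%
```

Running this against a 95% target immediately tells you which nets need added test points or an explicit waiver.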

Test methods and sequence

Define the order of operations and what each step catches. A typical sequence:

  1. Visual inspection (workmanship, component presence, polarity)
  2. In-circuit test (component values, shorts, opens)
  3. Functional test (power rails, signal behavior, communication interfaces)
  4. Final inspection (labeling, conformal coat if applicable)

The sequence matters. ICT before functional test catches assembly defects before you power the board — saving time on diagnosis and protecting sensitive components from damage caused by shorted power rails.
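The stop-on-first-failure behavior that makes this ordering valuable can be encoded directly in test software. A minimal sketch; the stage names and the board dictionary are placeholder assumptions standing in for real test routines.

```python
def run_sequence(board, stages):
    """Run ordered test stages; halt at the first failure.

    Each stage is a (name, callable) pair returning True on pass.
    Returns (passed, results) where results maps stage name -> bool.
    Stages after a failure never run, so an ICT short stops the
    sequence before the functional stage powers the board.
    """
    results = {}
    for name, test in stages:
        ok = test(board)
        results[name] = ok
        if not ok:
            return False, results
    return True, results

stages = [
    ("visual", lambda b: b["components_present"]),
    ("ict", lambda b: not b["shorts"]),
    ("functional", lambda b: b["rails_in_spec"]),
]
board = {"components_present": True, "shorts": True, "rails_in_spec": True}
passed, results = run_sequence(board, stages)
# passed is False and "functional" never ran: ICT caught the short first
```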

Pass/fail criteria and measurement tolerances

This is where vague specs cause the most damage. Every measurement needs explicit numerical limits.

Test                    | Lower Limit | Upper Limit | Units | Method
3.3V rail (TP5-TP1)     | 3.135       | 3.465       | V     | DC voltage, DMM
5.0V rail (TP6-TP1)     | 4.75        | 5.25        | V     | DC voltage, DMM
Input current (idle)    |             | 150         | mA    | In-line, power input
Input current (active)  | 200         | 500         | mA    | In-line, power input
Clock frequency (TP12)  | 7.99        | 8.01        | MHz   | Frequency counter

Notice the specifics: "DC voltage between TP5 and TP1" is testable. "Measure the 3.3V rail" is not — where, exactly? Specify the test points, the reference, and the conditions (loaded or unloaded, after how much settling time).

Limits from bench testing one prototype are not production limits. A single unit tells you what one board measured, not what production variation looks like. Set limits based on design analysis and component tolerances, with margin for manufacturing variation. Too tight and you fail good boards. Too loose and you ship bad ones.
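Deriving limits from the design tolerance budget instead of a bench unit is straightforward arithmetic. A sketch; the guard-band parameter and the values fed into it are illustrative assumptions.

```python
def production_limits(nominal, tolerance, guard=0.0):
    """Symmetric pass/fail limits from a design tolerance budget.

    tolerance: fractional design tolerance (0.05 for +/-5%).
    guard: optional fractional guard band subtracted from the window
           to keep marginal boards out; the value is a design choice.
    """
    span = nominal * (tolerance - guard)
    return round(nominal - span, 3), round(nominal + span, 3)

# 3.3 V rail with a +/-5% budget and no guard band
# reproduces the limits in the table above
print(production_limits(3.3, 0.05))  # (3.135, 3.465)
```

The point is traceability: anyone reading the spec can see the limits came from the tolerance analysis, not from whatever one prototype happened to measure.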

Equipment and fixture requirements

List the instruments and their minimum specifications:

Equipment     | Minimum Spec                        | Example
DMM           | 6.5 digit, 0.1% accuracy            | Keysight 34465A or equivalent
Power supply  | 0-30V, 0-3A, programmable           | Keysight E36312A or equivalent
Oscilloscope  | 100MHz BW, 4 channel                | Rigol DHO804 or equivalent
Test fixture  | Bed-of-nails, matching PCBA outline | Per fixture drawing

"Or equivalent" provides flexibility while setting a floor. But don't stop at listing equipment — address the choice between development and production instrumentation. A bench DMM is fine for prototyping. A production test station at 200 boards per day needs programmable instruments with remote control capability and automated data logging. State which tier your spec targets.

When selecting between development and production instrumentation, weigh three factors: throughput, automation capability, and calibration cost. Development instruments — bench DMMs, manual oscilloscopes, benchtop supplies — work for engineering validation where an operator records results by hand. Production instruments need SCPI or similar remote control, faster measurement cycles, and rack-mount form factors for automated test stations. Specifying this in your test spec prevents a common failure: building a fixture around bench equipment that cannot sustain the measurement rate your production schedule demands.
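Remote control in practice means the test software sends text commands to the instrument. `MEAS:VOLT:DC?` is a standard SCPI measurement query; the sketch below only builds the command string, since actually sending it would go through an instrument I/O library (e.g. PyVISA) and a connected instrument.

```python
def dc_voltage_query(range_v=None, resolution=None):
    """Build a SCPI DC-voltage measurement query string.

    MEAS:VOLT:DC? is standard SCPI; the optional arguments follow
    the common "MEASure:VOLTage:DC? <range>,<resolution>" form.
    Transmitting the string to an instrument is not shown here.
    """
    cmd = "MEAS:VOLT:DC?"
    if range_v is not None:
        cmd += f" {range_v}"
        if resolution is not None:
            cmd += f",{resolution}"
    return cmd

print(dc_voltage_query())          # MEAS:VOLT:DC?
print(dc_voltage_query(10, 1e-4))  # MEAS:VOLT:DC? 10,0.0001
```

If an instrument in your spec cannot accept commands like this over USB, LAN, or GPIB, it belongs in the development tier, not on a production station.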

Your equipment and test point access requirements drive fixture design directly. A spec that calls for 4-wire resistance measurement requires Kelvin probing — two contacts per measurement point instead of one. That doubles the probe count for those nets and changes the fixture cost. Document these requirements before requesting a fixture quote, not after.
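The Kelvin-probing arithmetic is worth making explicit when estimating fixture cost. A sketch; the net counts are hypothetical.

```python
def probe_count(two_wire_nets, kelvin_nets):
    """Probes needed for a bed-of-nails fixture.

    Standard nets take one probe each; 4-wire (Kelvin) nets take two,
    a force and a sense contact at each measurement point.
    """
    return two_wire_nets + 2 * kelvin_nets

# 60 ordinary nets plus 8 nets needing 4-wire resistance measurement
print(probe_count(60, 8))  # 76
```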

Sample plan and lot acceptance

Define how many boards you test per lot and what failure rate triggers a lot hold:

  • 100% testing: Every board gets tested. Standard for medical, aerospace, and any application where field failure cost is high.
  • Statistical sampling: Test a defined sample size per lot (e.g., AQL 0.65, inspection level II per ANSI/ASQ Z1.4). Common for high-volume consumer electronics where test time per unit matters.
  • First article inspection: Full test on the first N units of a production run to verify process stability before releasing the lot.

State the plan explicitly. "We test everything" is a valid answer. "We'll figure it out" is not.
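Once the plan is stated, the lot disposition rule is mechanical. A sketch; the acceptance number below is a placeholder, not a real table lookup — for a statistical plan it would come from your sampling standard's tables (e.g. ANSI/ASQ Z1.4) for the given lot size, inspection level, and AQL.

```python
def lot_disposition(defects_found, acceptance_number):
    """Accept or hold a lot under a sampling plan.

    acceptance_number (Ac) comes from the sampling standard's tables
    for the lot size, level, and AQL; the value used in the example
    below is a placeholder, not a table value.
    """
    return "accept" if defects_found <= acceptance_number else "hold"

print(lot_disposition(defects_found=1, acceptance_number=2))  # accept
print(lot_disposition(defects_found=3, acceptance_number=2))  # hold
```

Writing the rule down, whatever its source, is what removes the judgment call from the production floor.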

Worked example: from schematic to spec

Consider a sensor interface board with a 3.3V LDO, an I2C temperature sensor, and an SPI flash memory. Walking through the schematic forces specific spec decisions:

  • Power section: The LDO datasheet specifies 3.3V ±2% output. Your pass/fail limits should be no wider than the datasheet tolerance, or regulator faults slip through, but not so tight that normal component variation fails good boards. Adopting the datasheet window gives 3.234V to 3.366V; tighten further only if your design analysis supports it.
  • I2C sensor: You need to verify the sensor responds to its address and returns a reading within a sane range (not 0xFF, not -40°C in a room-temperature factory). This is a functional test, not a precision calibration.
  • SPI flash: Write a known pattern, read it back, compare. Pass/fail is binary — the data matches or it doesn't.

Each section of the schematic generates specific entries in your pass/fail criteria table. The spec writes itself when you work through the board systematically.
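The SPI flash entry above reduces to write, read back, compare. A sketch in which a plain byte buffer stands in for the real device; the `write`/`read` callables, address, and pattern are all placeholder assumptions for whatever SPI driver your test software actually uses.

```python
def flash_pattern_test(write, read, address=0x0000, pattern=b"\xa5\x5a\xa5\x5a"):
    """Binary pass/fail: write a known pattern, read it back, compare.

    write/read are callables standing in for the real SPI flash driver;
    the address and pattern are arbitrary test values.
    """
    write(address, pattern)
    return read(address, len(pattern)) == pattern

# Simulated flash: a byte buffer stands in for the device
memory = bytearray(256)
def write(addr, data): memory[addr:addr + len(data)] = data
def read(addr, n): return bytes(memory[addr:addr + n])

print(flash_pattern_test(write, read))  # True
```

Note there are no numerical limits here: the spec entry for this test is simply "pattern readback matches," which is exactly how a binary test should be written.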

Common mistakes in test specifications

Over-specifying tolerances. Limits derived from measuring one prototype catch production boards that are perfectly functional but slightly different. Use design analysis and worst-case component tolerances, not bench measurements.

Omitting fixture requirements. "We'll figure out the fixture later" means the fixture vendor has to guess at probe count, test point locations, and board support needs. Your spec should include enough mechanical detail — board outline, keep-out zones, test point coordinates — for a fixture quote. Design for test guidelines help here.

Writing specs that assume specific equipment. "Program the Keysight 34465A to measure..." ties your spec to one instrument. Specify what you need to measure and to what accuracy. Let the test station design determine which instrument fills that role.

Ignoring environmental conditions. A board that passes at 23°C may fail at the edges of its operating range. If your product ships to environments outside a climate-controlled factory, state whether your spec covers room-temperature testing only or includes temperature range testing. Not specifying this is itself a decision — make it deliberately.

From specification to test system

A test specification is a document. A test system is the physical implementation of that document — the fixture, instruments, software, and wiring that turn your spec into repeatable pass/fail results.

The spec drives every downstream decision:

  • Probe count and type come from your measurement point list and accuracy requirements
  • Instrument selection comes from your measurement methods and throughput targets
  • Fixture mechanical design comes from your board outline, test point locations, and equipment requirements
  • Test software comes from your procedure sequence and pass/fail criteria

Teams that skip the spec and jump straight to fixture design end up in revision cycles. The fixture arrives, someone realizes it's missing probes for three test points that were never documented, and the fixture goes back for modification. Writing the spec first is faster than redesigning the fixture later.

Next steps

Your situation determines the next step:

Check your board's testability

Upload your design files and get a free testability analysis. See which nets have test access, which don't, and what to change.

Last updated: March 8, 2026