NVDA and JAWS Testing in Accessibility Audits: How to Verify Both Are Included

To confirm an accessibility audit includes both NVDA and JAWS evaluation, ask the vendor in writing which screen readers are used, request a sample report showing screen reader observations, and verify the auditor’s methodology lists each tool by name. A credible vendor will name the screen readers, the browser pairings, and the operating system used during evaluation. If a vendor cannot answer these questions directly, the audit likely does not cover both.

NVDA and JAWS behave differently with the same code. An audit that only uses one will miss issues the other would catch. Verifying coverage upfront prevents surprises after delivery.

How to Verify NVDA and JAWS Coverage in an Accessibility Audit
- Methodology document: names NVDA and JAWS by version, paired with specific browsers.
- Sample audit report: issue findings reference screen reader behavior observed during evaluation.
- Auditor credentials: the evaluator has documented experience with both screen readers.
- Scope confirmation: the statement of work lists the screen readers used as part of the evaluation environment.
- Operating system pairing: NVDA and JAWS run on Windows; macOS-only audits do not cover them.

Why both NVDA and JAWS matter in an audit

JAWS is the most widely used commercial screen reader. NVDA is the most widely used free screen reader. They are built on different engines and announce content differently, especially with ARIA, dynamic updates, and complex widgets.

A button labeled correctly for one may be silent or confusing in the other. Tables, forms, and live regions are common areas where the two diverge. An audit covering only one screen reader produces an incomplete picture of how users actually experience the product.
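Custom widgets built from generic elements are a common divergence point. A minimal illustrative sketch (the class name is hypothetical):

```html
<!-- A native button: both NVDA and JAWS announce "Save, button" reliably. -->
<button type="submit">Save</button>

<!-- A custom "button" built from a div: its role and accessible name come
     entirely from ARIA. Differences in how each screen reader resolves
     aria-label, focus, and keyboard handling make patterns like this a
     frequent source of NVDA/JAWS divergence. -->
<div class="btn" role="button" tabindex="0" aria-label="Save">Save</div>
```

The native element works consistently because its role and name are built in; the ARIA version depends on each screen reader's interpretation, which is exactly what hands-on testing with both tools surfaces.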

What does a thorough audit methodology look like?

A credible methodology document lists the tools used during the evaluation. For screen readers, that means naming NVDA and JAWS, the version of each, and the browser pairings. JAWS is typically paired with Chrome or Edge. NVDA is typically paired with Firefox or Chrome.

The methodology should also confirm the operating system. NVDA and JAWS are Windows-only. If a vendor’s environment is macOS-based, the audit cannot cover either. VoiceOver is a separate tool and does not substitute for NVDA or JAWS.

How can you tell from a sample report?

Ask for a sample audit report before signing. Look at how issues are written. A report grounded in real screen reader use will reference what was announced, what was missing, or how navigation behaved.

If the issue descriptions are generic and never mention screen reader output, the audit was likely scan-driven or limited to code review. Scans only flag approximately 25% of issues and cannot evaluate screen reader behavior. Real screen reader evaluation produces specific, contextual findings.

Questions to ask the vendor before signing

Direct questions get direct answers. Vendors that cover both screen readers will respond confidently. Vendors that do not will hedge or redirect.

- Which screen readers do you use during the audit?
- What versions of NVDA and JAWS are part of your standard environment?
- Which browsers do you pair with each screen reader?
- Can you provide a sample report that shows screen reader-specific findings?
- Is screen reader evaluation included in the base price or an add-on?

The last question matters. Some vendors quote a low base price and treat screen reader evaluation as an upgrade. Confirm what is included before you compare quotes.

What about mobile screen readers?

NVDA and JAWS are desktop tools. If your product has a mobile app or a responsive site that users access on phones, mobile screen reader evaluation is a separate consideration. VoiceOver on iOS and TalkBack on Android cover that environment.

For a website audit, confirm desktop coverage with NVDA and JAWS first. For a mobile app, ask about VoiceOver and TalkBack. A full evaluation across desktop and mobile uses all four where relevant.

Frequently asked questions

Should I pay extra for an audit that includes both NVDA and JAWS?

No. Both should be standard for any thorough website or web app evaluation. If a vendor positions screen reader coverage as a premium add-on, that is a sign the base offering is limited. A credible auditor includes both as part of the core methodology.

Can a vendor evaluate WCAG conformance without using screen readers?

Several WCAG success criteria depend on assistive technology behavior to evaluate accurately. Name, Role, Value (4.1.2), Info and Relationships (1.3.1), and Status Messages (4.1.3) are examples where screen reader output is the most reliable way to confirm conformance. Code review alone leaves gaps.
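Status Messages (4.1.3) illustrates why: whether an update is actually announced depends on assistive technology behavior, not just on the markup being present. A minimal sketch of a live region:

```html
<!-- A status live region: when its text content changes, screen readers
     should announce the update without moving focus. The timing and
     verbosity of that announcement can differ between NVDA and JAWS, so
     confirming 4.1.3 reliably means listening to the output in both. -->
<div role="status" aria-live="polite">
  <!-- updated via script, e.g. to "3 results found" -->
</div>
```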

What if my vendor only uses NVDA?

NVDA-only coverage is better than no screen reader coverage, but JAWS users represent a large portion of the screen reader audience. Differences in announcement behavior between the two mean issues affecting JAWS users can go undetected. Ask whether JAWS can be added before the audit begins.

How do I know the auditor is qualified to use these tools?

Ask about the evaluator’s background. Auditors with DHS Trusted Tester certification, IAAP credentials, or documented years of hands-on screen reader use are reasonable signals. A qualified auditor can describe specific differences between NVDA and JAWS without prompting.

Verifying screen reader coverage takes one email. Skipping that step can leave a category of issues entirely unaddressed in the final report.

Looking for vetted accessibility auditors who cover both NVDA and JAWS? Contact Accessibility Base to browse the directory.
