VPAT Quality: How to Spot a VPAT That Wasn’t Properly Tested

Karl Groves - 07/04/2026

Introduction

A VPAT that claims “Supports” across the board is a red flag, not a success. I’ve reviewed scores of VPATs (Voluntary Product Accessibility Templates) over the years, and the pattern is unmistakable: many are filed without substantive testing. Organizations create them to satisfy procurement requirements, not to honestly assess conformance. The damage extends beyond dishonest reporting. A weak VPAT puts your organization at legal risk, wastes procurement teams’ time, and undermines trust when discrepancies surface later. Knowing how to evaluate a VPAT’s quality, and what signals indicate that actual testing occurred, separates credible conformance reporting from checkbox exercises.

Note: Strictly speaking, the VPAT is the template format. The finalized document that contains the conformance information is known as an ACR (Accessibility Conformance Report). We use VPAT in this post simply because that’s the more recognizable term, especially for those who may be newer to the topic.

The Problem with Unrealistic Conformance Claims

A VPAT stating 100% “Supports” for every criterion is virtually impossible. Even the most mature, accessibility-focused applications carry partial or conditional support based on implementation context. I’ve audited best-in-class products and found edge cases, environmental limitations, or user configuration scenarios where strict conformance breaks down. When a VPAT claims universal support without qualification, one of two things is true: either the testing was superficial, or the assessment was written to appease stakeholders rather than reflect reality.

The VPAT format requires one of four status options: Supports, Partially Supports, Does Not Support, or Not Applicable. A responsible assessment uses all four, each when appropriate and accurate. When reviewing a VPAT, look for the distribution. If there are zero “Partially Supports” entries across a complex product, demand specific information about the scenarios where support might degrade. A VPAT that paints an optimistic picture of the product deserves more scrutiny, not less.
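To make the distribution check concrete, here is a minimal Python sketch. The status strings follow the standard VPAT vocabulary, but the input list, function name, and flagging rule are illustrative assumptions, not part of any official VPAT tooling:

```python
from collections import Counter

# Hypothetical input: one conformance status per criterion, as might be
# transcribed from a VPAT/ACR table for a product under review.
statuses = [
    "Supports", "Supports", "Partially Supports", "Supports",
    "Not Applicable", "Supports", "Does Not Support", "Supports",
]

def status_distribution(statuses):
    """Count each conformance status and flag a suspiciously uniform report.

    A report is flagged when every entry is "Supports" or when a complex
    product shows zero "Partially Supports" entries -- both patterns the
    article calls out as warranting extra scrutiny.
    """
    counts = Counter(statuses)
    all_supports = counts.get("Supports", 0) == len(statuses)
    no_partial = counts.get("Partially Supports", 0) == 0
    return counts, (all_supports or no_partial)

counts, suspicious = status_distribution(statuses)
print(dict(counts))
print("Needs extra scrutiny:", suspicious)
```

A real review would read the statuses from the ACR document itself; the point of the sketch is only that the distribution, not any single row, is the first signal to examine.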

Red Flags in Remarks and Descriptions

The Remarks section of a VPAT reveals everything about testing quality and whether the people who did the testing actually understood accessibility. A thorough assessment includes specific, actionable information: which features were tested, under what conditions, what percentage of users the support applies to, and what workarounds exist for limitations. A weak VPAT uses generic language or leaves remarks blank.

Compare these two VPAT entries for the same criterion:

Weak: “Supports – fully compliant with standard”

Strong: “Supports – keyboard navigation implemented across primary application workflows; verified with NVDA/Chrome and VoiceOver/Safari on current releases. Third-party embedded widgets require fallback keyboard instructions in documentation.”

The second entry demonstrates actual testing. It specifies scope, identifies conditional limitations, names tools and environments, and acknowledges incomplete assessment areas. This transparency signals competence. The first entry is a placeholder.

Watch for these common red flags in remarks:

  • Blank or missing remarks for “Partially Supports” entries
  • Vague language: “generally compliant”, “no known issues”
  • No mention of testing methodology, browser versions, or assistive technologies
  • Absence of identified limitations or edge cases
  • Boilerplate text that repeats across unrelated criteria
  • No dates or version information
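The red flags above can be screened for mechanically before a human review. This is a minimal sketch assuming the VPAT rows have been transcribed as (criterion, status, remark) tuples; the vague-phrase list and the sample rows are illustrative, not exhaustive:

```python
# Illustrative phrases that signal boilerplate rather than testing.
VAGUE_PHRASES = ("generally compliant", "no known issues", "fully compliant")

# Hypothetical rows transcribed from a VPAT's WCAG table.
rows = [
    ("1.1.1 Non-text Content", "Partially Supports", ""),
    ("2.1.1 Keyboard", "Supports",
     "Keyboard navigation implemented across primary workflows."),
    ("1.4.3 Contrast", "Supports", "Generally compliant"),
]

def remark_red_flags(rows):
    """Return (criterion, reason) pairs worth questioning in review."""
    flags = []
    seen = {}  # remark text -> first criterion it appeared under
    for criterion, status, remark in rows:
        text = remark.strip().lower()
        if status in ("Partially Supports", "Does Not Support") and not text:
            flags.append((criterion, "blank remark for a non-Supports status"))
        elif any(phrase in text for phrase in VAGUE_PHRASES):
            flags.append((criterion, "vague boilerplate language"))
        elif text and text in seen:
            flags.append((criterion, f"remark repeated from {seen[text]}"))
        if text:
            seen.setdefault(text, criterion)
    return flags

for criterion, reason in remark_red_flags(rows):
    print(criterion, "->", reason)
```

A scan like this only surfaces candidates; whether a remark actually demonstrates testing still requires human judgment.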

Evaluating Testing Methodology

A VPAT backed by genuine testing includes documentation of methodology. What assistive technologies were used? JAWS? NVDA? VoiceOver? Did testing include mobile screen readers (TalkBack on Android, VoiceOver on iOS)? What about voice control, switch access, or magnification tools?

The scope matters. A “full” test means evaluating all core user workflows. A responsible VPAT specifies what was tested and, critically, what wasn’t. Many VPATs test happy-path scenarios and ignore edge cases, like advanced search functionality or permission-based workflows.

Our experience shows that thorough VPAT assessments of commercial products typically take around 100 hours of testing, depending on product complexity. If the document was assembled in a week or two, corners were cut. Request the testing timeline and resource allocation. A rushed VPAT is an unreliable VPAT.

Look for testing across relevant platforms. A SaaS application used on desktop and mobile should have Android and iOS testing documented. Desktop software shouldn’t claim mobile conformance without that testing. Specificity about what’s not applicable matters more than breadth of support claims.

The Revision Date Problem

VPAT documents often carry old dates. A VPAT from 2022 claiming current conformance is inherently suspect. Web and desktop products change constantly. New features, updated dependencies, design system modifications, and framework upgrades all introduce new accessibility risks. A responsible organization tests VPAT claims annually or when significant updates to the UI occur.

Check the revision date against the product’s release timeline. If the product had a major redesign six months after the VPAT was completed, the conformance picture has almost certainly shifted. The further the gap between testing and present day, the higher your uncertainty. For rapidly evolving products (which most digital products are), a VPAT older than 12 months requires updated testing before relying on its claims.
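The 12-month rule of thumb above is easy to apply mechanically. A minimal sketch, assuming the revision date has been read off the VPAT’s title page; the function names and the threshold default are illustrative, and the cutoff mirrors the article’s guidance rather than any formal requirement:

```python
from datetime import date

def months_since(revision: date, today: date) -> int:
    """Whole calendar months between the VPAT revision date and today."""
    return (today.year - revision.year) * 12 + (today.month - revision.month)

def is_stale(revision: date, today: date, max_months: int = 12) -> bool:
    """Flag a VPAT whose revision date exceeds the staleness threshold."""
    return months_since(revision, today) > max_months

# A VPAT revised in March 2022, reviewed in January 2024: ~22 months old.
print(is_stale(date(2022, 3, 1), date(2024, 1, 15)))
```

As the article notes, the date check is necessary but not sufficient: a recent date on a VPAT that predates a major redesign is just as unreliable.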

Verification Methods for Procurement Teams

When you receive a VPAT in procurement, follow this checklist:

  1. Check revision date. Anything older than 12 months deserves updated testing.
  2. Review density of remarks. When we produce a VPAT, every cell in the Remarks column contains content that justifies exactly how the product Supports, Partially Supports, or Does Not Support the criterion, and if we mark an item Not Applicable, we explain why. This leaves no ambiguity about how the answer was determined. That said, we recognize that others aren’t so thorough. A good rule of thumb is to expect detailed remarks on anything marked Does Not Support or Partially Supports.
  3. Verify testing scope. Request documentation of platforms, browsers, assistive technologies, and workflows tested. For high-dollar, high-risk procurements, ask for more than just the VPAT: request the supporting documentation on which the VPAT is based.
  4. Ask about limitations. What did the vendor exclude, and why? Are those exclusions acceptable for your implementation context? Are they relevant to the version and configuration that will be implemented at your organization? If so, ask for a VPAT that covers how you’ll be using the product.
  5. Request remediation examples. How does the vendor recommend addressing partial support? What documentation exists? What support is provided until remediation is complete?
  6. Confirm evaluation currency. Was this tested on current product versions or legacy ones?

A vendor willing to stand behind their VPAT will provide this evidence without resistance. Hesitation is a signal to dig deeper.

Conclusion

VPAT quality separates vendors who take accessibility seriously from those chasing compliance theater. A strong VPAT shows evidence of rigorous testing, honest limitation disclosure, and specific, actionable information about support conditions. Weak VPATs hide behind vague language and unrealistic conformance claims. When evaluating products in procurement, demand substantive remarks, specific methodology documentation, and current testing dates. Your procurement team and your end users—particularly disabled employees and customers—benefit when you hold vendors accountable for honest conformance reporting.

Next Steps

Build VPAT quality evaluation into your procurement process. Don’t accept template language. Require specificity. Our team regularly conducts independent conformance assessments and VPAT reviews for organizations making significant vendor decisions. We’ve identified critical gaps in vendor claims and provided procurement teams with actionable guidance on actual product accessibility. If your organization is evaluating vendors or needs to assess the quality of existing conformance documentation, let’s discuss how we can clarify the real accessibility picture.

Learn more about conformance reporting and VPAT evaluation services
