Accessibility Conformance Testing and Technology Lifecycle
The Assessment included questions regarding outcome-based results to gauge whether policies, practices, and procedures – a dimension that received higher maturity governmentwide – translated into Section 508 conformant ICT. The average conformance index value across all reporting entities was 1.79 (out of 5), or Low, emphasizing that inputs are not translating into conformant ICT.
ICT Testing Outlook
As expected, the majority of reporting entities used a combination of automated and manual tools for ICT testing. While it is not feasible to test all ICT manually, strategic employment of automated tools coupled with manual testing allows reporting entities to achieve broad scope and targeted depth of testing.
One question asked respondents what manual or hybrid testing methodology they used. 194 respondents reported using one or more of the manual or hybrid ICT accessibility test methodologies for web content shown in Table 6 below:
| Methodology | Number of Reporting Entities Using Specified Methodology (of the 194 Reporting Entities) |
| --- | --- |
| Manual Testing with Guided Developer Tools | 119 reporting entities (61%) |
| Assistive Technology | 94 reporting entities (48%) |
| Manual Code Inspection | 79 reporting entities (41%) |
| Trusted Tester 5.x | 75 reporting entities (39%) |
| Reporting Entity-Specific Test Methodology | 56 reporting entities (29%) |
Similarly, the majority of respondents (153 reporting entities or 61%) reported using at least one automated accessibility testing tool for comprehensive, large-scale monitoring of web content. Of those reporting entities, 103 (67%) responded that personnel who use the tool and interpret the results received training on the tool.
For the preceding two figures, the average percentage of reporting entities with automated testing tools was calculated within each bracket, with respect to both maturity and conformance. That is, the percentages for the overall categories sharing the same conformance bracket (i.e., Very Low-Very Low, Low-Very Low, or Moderate-Very Low) were averaged and then charted.19 Again, a similar trend was seen: generally, the higher the conformance or maturity, the higher the percentage of reporting entities with automated testing tools.
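The two calculation approaches (the per-category averaging used in the figures and the pooled alternative described in footnote 19) can be illustrated with a minimal sketch. The DataFrame layout, column names, and values below are hypothetical, since the Assessment's underlying data is not published in this form.

```python
# Minimal sketch of the two bracket-aggregation approaches described above.
# Column names and values are hypothetical illustrations only.
import pandas as pd

df = pd.DataFrame({
    "category": ["A", "A", "B", "B", "C", "C"],  # overall category
    "conformance_bracket": ["Very Low", "Very Low", "Very Low",
                            "Low", "Low", "Low"],
    "has_automated_tools": [1, 0, 1, 1, 0, 1],   # 1 = entity has tools
})

# Method used in the figures: percentage of entities with automated tools
# per overall category, then averaged within each conformance bracket.
per_category = (
    df.groupby(["conformance_bracket", "category"])["has_automated_tools"]
      .mean()
)
bracket_avg = per_category.groupby("conformance_bracket").mean()

# Alternative from footnote 19: pool all entities in a bracket directly.
bracket_pooled = df.groupby("conformance_bracket")["has_automated_tools"].mean()

print(bracket_avg, bracket_pooled, sep="\n")
```

The two methods diverge only when categories within a bracket contain different numbers of reporting entities, which is consistent with the report's observation that both produced similar trends.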
Additionally, when asked how often they conduct web content user testing with people with disabilities prior to deployment to address all applicable Section 508 standards, reporting entities overwhelmingly (217 respondents or 87%) reported they never or only sometimes conduct user testing with people with disabilities (see Figure 18 below).
Some reporting entities integrate accessibility throughout the technology lifecycle, which may have influenced lower results for comprehensive testing prior to deployment. To gauge the level of integration, the Assessment included a question asking the extent to which ICT accessibility is integrated throughout a reporting entity’s technology development lifecycle activities:
- Just over half of reporting entities (125 reporting entities or 51%) reported ICT accessibility requirements are regularly, frequently, or almost always integrated throughout technology lifecycle activities, leaving just under half of reporting entities unsure whether requirements are included, never including them, or including them only on an ad hoc basis.
- About half of all reporting entities (131 reporting entities or 52%) also reported that accessibility reviews are not formally integrated into the publication process or never occur, with review generally conducted on an ad hoc basis.
- Half of all reporting entities reported they have a formal policy requiring inclusion of Section 508 requirements and regularly, frequently, or almost always include Section 508 requirements in ICT governance processes.20
The Assessment included several questions specifically related to electronic documents and communications and found:
- Just over half of reporting entities (142 reporting entities or 57%) reported they regularly, frequently, or almost always test electronic documents before posting, suggesting reporting entities have more bandwidth for comprehensive electronic document testing than for web testing.
- Approximately 44% of reporting entities reported they have formal processes to ensure formal communications (internal, external, and in response to an emergency) are Section 508 conformant, and that they regularly, frequently, or almost always follow these processes.
Conformance Relationships: Regression Deep Dive
Through regression analysis, electronic document conformance emerged as a critical dependent variable. Regression 29 delved into a more complex analysis by examining the predictors of Section 508 conformance of electronic documents (Q80). It considered three independent variables: the status of the Section 508 Program (Q22), Section 508 Program resources and staffing (Q29), and Section 508 awareness training (Q59).21 Most notably, it showed a positive association between the status of a Section 508 Program (Q22) and electronic document conformance (Q80): a one-point change in Section 508 Program status (Q22) significantly predicted a 0.11 change in electronic document conformance (Q80). In contrast, a positive one-point change in Section 508 awareness training (Q59) significantly predicted a -0.042 change in electronic document conformance (Q80). Despite our initial expectation that increased training efforts would correspond to fewer accessibility issues within electronic documents, the empirical findings indicated a more nuanced relationship: there may be a limit to how much training can improve electronic document conformance, or reporting entities with more conformance issues may be the ones pushing for more awareness training. In this model, resources and staffing (Q29) was not a significant predictor of electronic document conformance (Q80).
Regression 35 mirrored the structure of Regression 29, exchanging Section 508 awareness training (Q59) for ICT accessibility-related training (Q60) as an independent variable.22 Like Regression 29, it showed a positive association between Section 508 Program status (Q22) and electronic document conformance (Q80): a one-point change in Section 508 Program status (Q22) significantly predicted a 0.12 change in electronic document conformance (Q80). In contrast, a positive one-point change in ICT accessibility-related training (Q60) significantly predicted a -0.057 change in electronic document conformance (Q80). Like Section 508 awareness training (Q59) in Regression 29, ICT accessibility-related training (Q60) was negatively correlated with electronic document conformance (Q80), and we speculate Regression 29 and Regression 35 share underlying reasons for this negative correlation. In this model as well, resources and staffing (Q29) was not a significant predictor of electronic document conformance (Q80).
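For readers interested in the structure of these models, the following is a minimal sketch of Regressions 29 and 35 as ordinary least squares fits. The question-numbered column names are our shorthand, and the synthetic data is seeded from the coefficients in footnote 21 purely to make the example runnable; it is not the Assessment's data.

```python
# Minimal sketch of Regressions 29 and 35 (synthetic data, not the
# Assessment's actual responses).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 245  # roughly the number of reporting entities

responses = pd.DataFrame({
    "q22": rng.integers(0, 5, n),  # Section 508 Program status
    "q29": rng.integers(0, 5, n),  # resources and staffing
    "q59": rng.integers(0, 5, n),  # Section 508 awareness training
    "q60": rng.integers(0, 5, n),  # ICT accessibility-related training
})
# Synthetic outcome seeded from the footnote 21 equation, plus noise.
responses["q80"] = (
    0.084 + 0.11 * responses["q22"] + 0.015 * responses["q29"]
    - 0.042 * responses["q59"] + rng.normal(0, 0.3, n)
)

# Regression 29: program status, resources/staffing, awareness training.
reg29 = smf.ols("q80 ~ q22 + q29 + q59", data=responses).fit()
# Regression 35: swap awareness training (q59) for ICT training (q60).
reg35 = smf.ols("q80 ~ q22 + q29 + q60", data=responses).fit()

print(reg29.params)   # slopes near 0.11 (q22), 0.015 (q29), -0.042 (q59)
print(reg29.pvalues)  # which predictors are statistically significant
print(reg35.params)
```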
Furthermore, equivalent regression analyses replacing electronic document conformance with intranet, public internet, and video conformance did not yield statistically meaningful results. The reasons behind this difference remain uncertain; improved data quality or year-over-year analysis may shed more light on this matter. For now, regression suggests electronic document conformance serves as a better indicator of Section 508 Program maturity.
Regression analysis also investigated the relationships between reporting entity size, as provided by OPM's publicly available FedScope datasets, and Section 508 conformance of intranet web pages, public internet web pages, public electronic documents, and videos (Q61, Q71, Q78, Q79, Q80, Q81).23 The results consistently showed that reporting entity size, on its own, does not have a meaningful impact on ICT conformance. While size itself may not be a good indicator, a strong department-level or parent agency that offers resources to its component reporting entities may lead to higher conformance among those components. Additionally, the parent-component dynamic has implications for size: we expect the department as a whole to be relatively large while individual components are much smaller. For FY23, the criteria did not include tailored questions to pinpoint reporting entities that utilize parent-agency-level resources, so no such correlation could be determined. We intend to hone the questions in FY24 to find correlations between parent and component reporting entities.
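A minimal sketch of these size regressions follows. In practice the employee counts would come from a FedScope extract; here both the sizes and the conformance outcomes are synthetic, and the question-numbered column names are our shorthand.

```python
# Minimal sketch of the size regressions (Regressions 37-42); synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 245
merged = pd.DataFrame({
    "employee_count": rng.integers(50, 200_000, n),  # FedScope-style sizes
    # Conformance outcomes drawn independently of size.
    **{q: rng.integers(0, 5, n) for q in
       ["q61", "q71", "q78", "q79", "q80", "q81"]},
})

# One simple regression per conformance outcome, size as the sole predictor.
for outcome in ["q61", "q71", "q78", "q79", "q80", "q81"]:
    fit = smf.ols(f"{outcome} ~ employee_count", data=merged).fit()
    # Consistently non-significant slopes would mirror the report's finding
    # that entity size alone does not predict conformance.
    print(outcome, round(float(fit.pvalues["employee_count"]), 3))
```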
Non-Conformance Tracking and Remediation
As demonstrated by low conformance governmentwide, non-conformant ICT is prevalent. To understand how reporting entities track and prioritize remediation efforts, the Assessment included several questions about the methodologies used and found that most respondents did not report robust accessibility remediation tracking and prioritization processes:
- Just over half of all respondents (128 reporting entities or 51%) reported they do not track non-conformant digital content, or they do track it but only sometimes take action to remediate. Figure 20 shows a further breakdown of responses.
- 79 respondents (40%) who engage in technology lifecycle activities said they do not identify or prioritize the risk of Section 508 non-conformant ICT throughout the technology development life cycle, or they only sometimes utilize a risk assessment.24
- Specifically related to web remediation, 120 reporting entities (48%) said they never remediate Section 508 conformance issues after deployment, or they do so only sometimes, mostly on an ad hoc basis.
Conclusion
While reporting entities showed moderate maturity in Policies, Procedures, and Practices (government average of 2.54 out of 5), efficacy is lacking as conformance is relatively low at 1.79 out of 5. It may be that policies and procedures are not being thoroughly enforced or followed within reporting entities. Alternatively, if followed, reporting entities’ policies and procedures may not be effective or specific enough to ensure Section 508 conformant ICT is produced. Additionally, although respondents reported integrating accessibility into Technology Lifecycle Activities, implementing test methodologies and processes, and testing large swaths of ICT, the ICT that is deployed typically has Section 508 defects.
While reporting entities are testing ICT, they are not prioritizing remediation or following processes they already have in place. Some non-conformance issues may be possible to identify and overcome in the acquisition and procurement process, for example by holding vendors and contractors accountable for producing Section 508 conformant ICT. Some defects may be platform related, such as when a small error in a template outside of a reporting entity's control causes failures across all intranet pages. However, monitoring web content via automated tools should be coupled with a strategic manual testing and remediation plan to track, fix, and prevent the perpetuation of accessibility defects.
Footnotes
19. Absolute values of reporting entities within each bracket (conformance or maturity) were also calculated, with similar results. That is, instead of averaging the percentage (%) of reporting entities with automated testing tools across overall categories with like conformance (or maturity) brackets, the total number of reporting entities that reported having access to automated testing tools within a conformance bracket (i.e., Very Low Conformance or Low Conformance) was divided by the total number of reporting entities that fell within that bracket. Again, similar trends were observed. ↩
20. Governance processes include milestone reviews, publication and deployment decisions, and change control reviews. ↩
21. Taken together, the independent variables yielded the regression equation Q80 = 0.084 + 0.11(Q22) + 0.015(Q29) - 0.042(Q59). The overall relationship was highly statistically significant (***), and the independent variables (Q22, Q29, and Q59) explained 12% of the variation in Q80. Both Q22 and Q59 significantly predicted Q80 (*** and **, respectively). Asterisks refer to statistical significance at the following levels: *** p-value ≤ 0.01; ** 0.01 < p-value ≤ 0.05; * 0.05 < p-value ≤ 0.1. ↩
22. The regression equation Q80 = 0.093 + 0.12(Q22) + 0.022(Q29) - 0.057(Q60) resembled that of Regression 29. Again, the overall relationship was highly statistically significant (***), and the independent variables (Q22, Q29, and Q60) explained 12% of the variation in Q80. Both Q22 and Q60 significantly predicted Q80 (*** and **, respectively). Asterisks refer to statistical significance at the same levels as in footnote 21. ↩
23. Regressions 37 to 42. ↩
24. 51 reporting entities (20%) noted they do not engage in technology lifecycle activities and were removed from the calculation of the overall percentage. ↩
Reviewed/Updated: December 2023