Australian university website accessibility revisited
Dey Alexander, Scott Rippon
WANAU Forums 2007 - Canberra
Introduction
• University website accessibility benchmarked in 2003
• “How accessible are Australian university websites?”
1. Do Australian university websites meet the basic standards for web accessibility?
» Looked only at WCAG 1.0 priority 1 checkpoints
2. Are there any areas in which the basic standards are poorly understood/implemented?
• Looked at 4 key pages from 45 university websites
Key findings in 2003
• 98 per cent of sites failed to meet Level-A conformance
– Only 1 site had its 4 pages pass
– That site would have failed if more pages were included
• Only 27 of the 180 pages were Level-A conformant
– 153 pages checked failed on at least 1 priority 1 checkpoint
• Most problems (138 pages, 44 sites) were in meeting checkpoint 1.1 – provide text alternatives for non-text elements
What has changed since 2003?
• Many contextual (cultural, standards) changes:
– Standards-based design now popularised
– Disability standards for education (August 2005)
– AVCC guidelines on information access for students with disabilities (Nov 2004)
– AVCC guidelines on students with disabilities (May 2006)
– New standards for web content accessibility (WCAG 2.0) forthcoming
• Has accessibility of Australian university websites improved?
Current study
• All sites evaluated between early January and mid March 2007
– Some sites may have been redesigned since
– Some pages may have been updated
• Now doing data analysis
– Evaluations done by two people, so need to be checked for consistency
– Later WANAU presentations will provide more detailed findings and statistics
Methodology: scope
• Audit based on priority 1 checkpoints in WCAG 1.0
– Did not evaluate checkpoint 14.1 (clearest and simplest language)
• 41 universities
• 4 pages from each university site
– Home page
– Main prospective students page
– Orientation page (or alternative)
– Accommodation page (or alternative)
Methodology: study style and tools
• Similar to last study
– Manual and automated testing
– No user testing
• Key tools
– HERA
– Web accessibility toolbar from Vision Australia
– The WAVE 3.5
Methodology: process used
• Visual inspection in Internet Explorer version 6
– Captured screen
• Ran HERA to generate a report
– Worked through checkpoints individually, using other tools and updating the report. Checks included:
» Text alternatives for non-text elements
» Whether the page worked without scripts
» Rendering with CSS turned off
– Captured screens for text-only view, CSS-off view, WAVE report
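The tools named above (HERA, the Vision Australia toolbar, WAVE) are interactive, but the checkpoint 1.1 test they automate can be sketched in a few lines. Below is a minimal illustration in Python, using only the standard library's `html.parser`; the page fragment and file names are hypothetical, not taken from any audited site.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Flag <img> elements with no alt attribute at all -- one of the
    WCAG 1.0 checkpoint 1.1 failures the audit looked for."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if "alt" not in a:
                self.missing_alt.append(a.get("src", "?"))

# Hypothetical page fragment for illustration only
page = '<img src="logo.gif" alt="Example University"><img src="photo.jpg">'
checker = AltChecker()
checker.feed(page)
print(checker.missing_alt)
```

An automated pass like this only finds images where the `alt` attribute is absent; judging whether a present text alternative is actually *equivalent* still requires the manual inspection described above.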
Preliminary findings
• Few, if any, sites are Level-A conformant
• Most problems are still with checkpoint 1.1 (text alternatives for non-text elements)
1. Images – similar type and amount of problems
2. PDF documents – increased usage since 2003
3. Flash – increased usage since 2003
HREOC’s advice on PDF
“The Commission's view is that organisations who distribute content only in PDF format, and who do not also make this content available in another format such as RTF, HTML, or plain text, are liable for complaints under the DDA.”
See: HREOC’s DDA Advisory Notes (Section 2.3)
HREOC’s comments on Flash
“While some positive progress has been made, it will be a considerable time before most users will benefit, and even then, Flash may be accessible only in certain specific circumstances.”
See: HREOC’s DDA Advisory Notes (Section 2.4)
Problems with images (1)
• Content images
– Not equivalent (example 1, example 2)
– Unnecessary data included (example)
– Blank ALT attribute (example 1, example 2)
– No ALT attribute (example 1, example 2)
– Background image (via CSS) with no text alternative (example 1, example 2)
– Background image (via CSS) with non-equivalent text alternative (example)
Problems with images (2)
• Decorative images
– Unnecessary data included (example 1, example 2)
– No ALT attribute (example 1, example 2)
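The distinction the two slides above draw between a blank ALT attribute and no ALT attribute matters: `alt=""` is correct for decorative images but a failure on content images, while a missing attribute fails in both cases. A minimal sketch of that triage in Python (standard library only; the markup and file names are hypothetical):

```python
from html.parser import HTMLParser

class AltTriage(HTMLParser):
    """Classify each <img> by its alt attribute: missing, blank, or present.
    Under WCAG 1.0 checkpoint 1.1, blank (alt="") suits decorative images
    only; content images need an equivalent text alternative."""
    def __init__(self):
        super().__init__()
        self.report = {}

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if "alt" not in a:
            status = "missing"   # fails for content and decorative images alike
        elif a["alt"].strip() == "":
            status = "blank"     # correct for decorative images only
        else:
            status = "present"   # equivalence must still be judged manually
        self.report[a.get("src", "?")] = status

# Hypothetical fragment for illustration
page = ('<img src="chart.gif" alt="Enrolments rose 5% in 2006">'
        '<img src="spacer.gif" alt="">'
        '<img src="banner.jpg">')
t = AltTriage()
t.feed(page)
print(t.report)
```

Note that background images inserted via CSS never carry an `alt` attribute, so no markup-level check can see them; that is why the audit also inspected pages with CSS turned off.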