
Web Accessibility Testing for AIR


Page 1: Web Accessibility  Testing for AIR

Web Accessibility Testing for AIR

AIR Austin, February 2012

Jim Thatcher, http://jimthatcher.com, [email protected], Accessibility Consulting, Austin, Texas

Jim Allan, http://tsbvi.edu, [email protected], TSBVI, Austin, Texas

Page 2: Web Accessibility  Testing for AIR

Resources

Resources and slides at:

Resources list: http://jimthatcher.com/testing

PowerPoint: testingForAIR.ppt

Most important for us and for testing are the Web Accessibility Toolbar, http://tinyurl.com/wattoolbar, and Jim’s Favelets, http://jimthatcher.com/favelets

Testing for accessibility 2

Page 3: Web Accessibility  Testing for AIR

Testing for Web Accessibility

Web Accessibility:

Web resources are accessible if people with disabilities can use them as effectively as non-disabled people

(UT Accessibility Institute)

So does testing mean user testing with both disabled and non-disabled people?

Instead there are “standards and guidelines” against which we can test

Testing for accessibility 3

Page 4: Web Accessibility  Testing for AIR

Testability – an example

Contrast – 1999 (WCAG 1)

Ensure that foreground and background color combinations provide sufficient contrast when viewed by someone having color deficits or when viewed on a black and white screen.

Contrast – 2008 (WCAG 2)

Text and images of text have a contrast ratio of at least 4.5:1, or at least 3:1 for larger text (18pt, or 14pt bold) (Level AA)

A simple tool to measure contrast ratio: the Colour Contrast Analyser Tool
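
For reference (this definition comes from WCAG 2 itself, not from the slides), the contrast ratio compares the relative luminance of the lighter colour (L1) and the darker colour (L2):

    contrast ratio = (L1 + 0.05) / (L2 + 0.05)

Values range from 1:1 (identical colours) up to 21:1 (black on white), which is what the Colour Contrast Analyser reports.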

Testing for accessibility 4

Page 5: Web Accessibility  Testing for AIR

The Colour Contrast Analyser Tool

Testing for accessibility 5

Page 6: Web Accessibility  Testing for AIR

Testing Tools

Colour Contrast Analyser: http://tinyurl.com/colourcontrast

Web Accessibility Toolbar: http://tinyurl.com/wattoolbar

Jim T’s Favelets: http://jimthatcher.com/favelets/

Toolbar for Firefox: http://tinyurl.com/cptoolbar

WAVE: http://wave.webaim.org

FireEyes: http://tinyurl.com/jfireeyes

Lists of tools:
http://www.w3.org/WAI/ER/tools/complete
http://www.colostate.edu/Dept/ATRC/tools.htm
http://www.webaim.org/articles/tools/

Testing for accessibility 6

Page 7: Web Accessibility  Testing for AIR

Computer tests vs. Human Review

Testable by a computer vs. requiring human judgment

Only ~25% of accessibility errors can be detected by computers

Many claim to be compliant/accessible because no errors are reported by a testing tool

Testing for all of the Section 508 standards is discussed at http://jimthatcher.com/testing1.htm

Testing for accessibility 7

Page 8: Web Accessibility  Testing for AIR

Computer tests (about 12 of them)

(3) Missing alt on <img>, <area>, or <input> with type=“image”

(3) Empty alt on <img> in <a> without text, same for <button> or <input> with type=“image”

(1) Form control with no title or label (or empty)

(2) No title on frame or iframe

(1) Server-side image map (image link with ismap)

(1) Missing lang attribute for page

(1) No HTML headings (h1, h2, …) on page
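
As an illustration only (the markup and file names below are hypothetical, not from the slides), a fragment like this would trip several of the computer-detectable checks:

    <img src="logo.gif">                                  <!-- missing alt on <img> -->
    <a href="home.htm"><img src="home.gif" alt=""></a>    <!-- empty alt on an image link with no link text -->
    <input type="text" name="q">                          <!-- form control with no label or title -->
    <iframe src="news.htm"></iframe>                      <!-- no title on iframe -->

The missing lang attribute and the absence of any headings are page-level checks rather than per-element ones.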

Testing for accessibility 8

Page 9: Web Accessibility  Testing for AIR

The Issues from the AIR Judging Form

Judging form: http://www.knowbility.org/v/air-dev-resources/AIR-Austin/34/

Core Accessibility: 240 points (Basic Accessibility: 70 points)

Advanced Accessibility: 120 points

Testing for accessibility 9

Page 10: Web Accessibility  Testing for AIR

Core Accessibility – Inactive images

1. Inactive images (20 points). Use the alt attribute to provide a text alternative for non-linked (inactive) images. If the image is decorative or redundant, use alt=””; if the image conveys information the alt-text should convey the same information. … If the image consists of text, the alt-text should be the same. Each image without appropriate alt-text is an error.

Testing – Use WAT (images → show images) or, a little simpler, Jim’s Favelets (Formatting images and larger images).
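
A minimal sketch of the two cases (image names and content are hypothetical):

    <!-- decorative or redundant image: empty alt so screen readers skip it -->
    <img src="divider.gif" alt="">

    <!-- informative image: the alt-text conveys the same information -->
    <img src="award-ribbon.gif" alt="Winner, 2011 AIR Austin award">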

Testing for accessibility 10

Page 11: Web Accessibility  Testing for AIR

Core Accessibility – Active images

2. Active images (20 points). Use the alt attribute to provide a text alternative for every active image, including image links, image map hot spots, or input elements of type image. The text alternative must convey the function of the active element, the target of the link. If the image consists of text, the alt-text should be the same. Each active image without appropriate alt-text is an error.

Testing – Use WAT (images → show images; mouse over to see whether the image is active) or the Favelets (Active images).
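
A sketch of active images with function-oriented alt-text (file names are hypothetical):

    <!-- image link: the alt names the target of the link, not the picture -->
    <a href="index.htm"><img src="logo.gif" alt="Home"></a>

    <!-- image submit button: the alt names its function -->
    <input type="image" src="go.gif" alt="Search">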

Testing for accessibility 11

Page 12: Web Accessibility  Testing for AIR

Core Accessibility - Links

3. Hypertext Links (20 points). Use descriptive link text to ensure that each link makes sense when it is read out of context. Make sure that links with the same text on the same page go to the same place. Each anchor with inadequate link text is an error.

Testing – Use WAT (Doc Info → List links) or inspection. Watch out for “More”, “Click Here”, etc.
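
For example (hypothetical link target), compare:

    <!-- weak: meaningless when read out of context -->
    To read the judging criteria, <a href="criteria.htm">click here</a>.

    <!-- better: the link text stands on its own -->
    Read the <a href="criteria.htm">AIR judging criteria</a>.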

Testing for accessibility 12

Page 13: Web Accessibility  Testing for AIR

Core Accessibility – Semantic markup

4. Correct Markup/Parsing (20 points). Use semantic markup (block quotes, headings, lists, etc.) to properly represent the structure of the document. … Each instance of a structural tag used for formatting, or of content that should use semantic markup or that is not structured to specifications, is an error.

Testing – Use WAT (Structure → Headings) and (Structure → List Items) and (Structure → Q/Blockquote).
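
A sketch of the idea: use real structural elements rather than styled text (the content is hypothetical):

    <!-- formatting pretending to be structure: avoid -->
    <p><b><font size="5">Schedule</font></b></p>

    <!-- semantic markup: a real heading and a real list -->
    <h2>Schedule</h2>
    <ul>
      <li>9:00 – Kickoff</li>
      <li>10:00 – Team work session</li>
    </ul>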

Testing for accessibility 13

Page 14: Web Accessibility  Testing for AIR

Core Accessibility  - Skip links

5. Skip links (10 points). Provide at least 1 and no more than 3 links at the top of the page which are either visible all the time or become visible on focus. These should jump to the main content or main navigation sections of the page. They are intended for keyboard users, so be sure to test them with the keyboard. Each page that does not meet these requirements is an error.

Testing – Use the keyboard to test, simply:

1. Tab to the skip link

2. Press Enter to follow it

3. Tab again; focus should be on the first link after the target of the skip

Or Jim’s Favelet – Skip links
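
A minimal skip-link sketch (the id and class names are hypothetical); the link must be focusable and its target must exist:

    <a href="#main" class="skip-link">Skip to main content</a>

    <!-- banner and navigation here -->

    <div id="main" role="main">
      <!-- main content starts here -->
    </div>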

Testing for accessibility 14

Page 15: Web Accessibility  Testing for AIR

Core Accessibility - Headings

6. Headings for navigation (10 points). There should be a heading at the top of each major section of a page; use only one h1 and heading levels should not skip (h2 then h4). Each page that does not meet these requirements is an error.

Testing – Use WAT (Structure → Headings) and (Structure → Headings Structure) or Jim’s Favelet, Headings

Watch out for skipping levels (like h2 to h4) – Don’t do it.
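
A sketch of a heading outline that passes (one h1, no skipped levels); the heading text is hypothetical:

    <h1>Team Acme – Project site</h1>
      <h2>About the project</h2>
        <h3>Goals</h3>
        <h3>Timeline</h3>
      <h2>Contact</h2>
    <!-- jumping from h2 straight to h4 would be flagged -->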

Testing for accessibility 15

Page 16: Web Accessibility  Testing for AIR

Core Accessibility - Landmarks

7. Landmark roles (10 points). Use ARIA landmark roles to label key sections that don’t easily accommodate headings, like role="main", role="contentinfo", role="navigation" (use aria-label if there is more than one navigation area) and role="search". Don’t overdo it; time is taken announcing these. Each page that does not meet these requirements is an error.

Testing – Use JAWS (; or Ctrl+Ins+; to list). Watch for correct labeling:

aria-labelledby – points to the id of an on-screen element

aria-label – supplies the label text directly

Or Jim’s favelet, aria, to display the ARIA markup
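
A landmark sketch (section contents omitted; the aria-label values are hypothetical). Note that the two navigation regions need distinguishing labels:

    <div role="banner"><!-- site header --></div>
    <div role="navigation" aria-label="Main menu"><!-- primary links --></div>
    <div role="main"><!-- page content --></div>
    <div role="search"><!-- search form --></div>
    <div role="navigation" aria-label="Footer links"><!-- secondary links --></div>
    <div role="contentinfo"><!-- copyright, contact --></div>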

Testing for accessibility 16

Page 17: Web Accessibility  Testing for AIR

Core Accessibility – Info in presentation

8. Information in presentation layer (20 points). Ensure that information conveyed through presentation (font, color) is also available in text. Any instance of information only available through presentation is an error.

Testing – Inspection!

Testing for accessibility 17

Page 18: Web Accessibility  Testing for AIR

Core Accessibility - Contrast

9. Contrast (20 points). The visual presentation of text and images of text has a contrast ratio of at least 4.5:1, except for the following:

Large text (14pt bold, or 18pt or larger) must have a contrast ratio of at least 3:1.

Each instance that does not follow these guidelines is an error.

Testing – Use the Colour Contrast Analyser tool: WAT (Colour → Contrast Analyser application)

Testing for accessibility 18

Page 19: Web Accessibility  Testing for AIR

Core Accessibility - Keyboard

10. Keyboard (20 points). All functionality must be accessible with the keyboard. Keyboard functionality should not require the use of mouse keys, the JAWS cursor, or anything similar. For any instance of an operation or a control that requires the mouse, unless a keyboard equivalent is readily available, the score is 0.

Testing – Use the keyboard to be sure everything is available without a mouse. Combine with testing for #14 – focus indication.

Testing for accessibility 19

Page 20: Web Accessibility  Testing for AIR

Core Accessibility - Language

11. Language (20 points). Identify the natural language of each page. Any failure is a score of 0.

Testing: Use WAT – (Doc Info → Show lang attributes)
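
A minimal sketch:

    <!-- declare the natural language of the page -->
    <html lang="en">

    <!-- a passage in another language gets its own lang attribute -->
    <p lang="es">Bienvenidos al sitio.</p>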

Testing for accessibility 20

Page 21: Web Accessibility  Testing for AIR

Core Accessibility - Forms

12. Forms (20 points). Label all form controls. For example, use the label element when the on-screen text prompt is adequate. Use the title attribute when the on-screen text prompt is not adequate or is dispersed. Use fieldset/legend to group radio buttons and check boxes. Use aria-required for required fields. Use ARIA markup for error handling, including aria-invalid on fields which have errors, and use alerts or alert dialogs for announcing the errors. Each form control with inadequate labeling is an error.

Testing: Use WAT – (Structure → Fieldset/Labels) and/or use Jim’s Forms favelet and ARIA favelet
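
A sketch covering the main cases (the field names are hypothetical):

    <!-- visible prompt: tie it to the control with label/for -->
    <label for="fname">First name</label>
    <input type="text" id="fname" name="fname" aria-required="true">

    <!-- no adequate on-screen prompt: use title -->
    <input type="text" name="mi" size="1" title="Middle initial">

    <!-- group radio buttons with fieldset/legend -->
    <fieldset>
      <legend>Preferred contact method</legend>
      <label><input type="radio" name="contact" value="email"> Email</label>
      <label><input type="radio" name="contact" value="phone"> Phone</label>
    </fieldset>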

Testing for accessibility 21

Page 22: Web Accessibility  Testing for AIR

Core Accessibility – Data Tables

13. Data Tables (10 points). For tabular data, use the caption element and/or the summary attribute. Use the th element, or use either th or td elements with the scope attribute, to unambiguously identify all row and column headers. For complex data tables, associate data cells with the appropriate headers using either headers/id or scope. Each instance of missing or incorrect table markup is an error.

Testing: Use WAT – (Tables → Show Data Tables) and/or use Jim’s Data tables favelet
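
A simple data-table sketch (the data is hypothetical):

    <table summary="Ticket prices by age group and day of week">
      <caption>Ticket prices</caption>
      <tr>
        <th scope="col">Age group</th>
        <th scope="col">Weekday</th>
        <th scope="col">Weekend</th>
      </tr>
      <tr>
        <th scope="row">Adult</th>
        <td>$10</td>
        <td>$12</td>
      </tr>
      <tr>
        <th scope="row">Child</th>
        <td>$5</td>
        <td>$6</td>
      </tr>
    </table>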

Testing for accessibility 22

Page 23: Web Accessibility  Testing for AIR

Core Accessibility – Focus Indication

14. Focus Indication (20 points). There must be a clear visual indication (highlight, dotted rectangle, change in color) when an object receives keyboard focus. Each instance of an object with no focus indication is an error.

Testing: Use the keyboard and tab through every active object on each page.
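
A sketch of keeping (or restoring) a visible focus indicator with CSS (the colour is arbitrary):

    <style>
      /* never remove the browser outline without providing a replacement */
      a:focus, button:focus, input:focus {
        outline: 2px solid #1a4f8a;
      }
    </style>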

Testing for accessibility 23

Page 24: Web Accessibility  Testing for AIR

Advanced Accessibility Flash

32. Flash (20 points). If Flash content is non-essential to the meaning of the page, then assistive technology must be able to either tab through or bypass the Flash object. If the content conveys information or responds to user input, assistive technology must be able to access the information and functionality. Each instance of missing information or function is an error.

Testing: Inspection. Use the keyboard and test with JAWS.

Testing for accessibility 24

Page 25: Web Accessibility  Testing for AIR

Advanced Accessibility Video

33. Video (20 points). For video with a soundtrack, provide synchronized captions. Provide an HTML text description for video without sound. In addition, provide synchronized audio description if the video cannot be understood from the soundtrack alone. If not synchronized, audio description must be provided as text with the video link. Each instance of inadequate accommodation is an error.

Testing: Inspection.
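
If the video happens to be delivered with the HTML5 video element (one of several delivery options; the file names are hypothetical), captions can be attached with a track element:

    <video controls>
      <source src="campus-tour.mp4" type="video/mp4">
      <track kind="captions" src="campus-tour-en.vtt" srclang="en" label="English captions">
    </video>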

Testing for accessibility 25

Page 26: Web Accessibility  Testing for AIR

Advanced Accessibility Audio

34. Audio (10 points). Provide HTML text transcripts for audio files. Speakers must be identified. Each place where the text transcript substantially differs from the audio is an error.

Testing: Inspection.

Testing for accessibility 26

Page 27: Web Accessibility  Testing for AIR

Advanced Accessibility Style Sheets

35. Alternative Style Sheets (10 points). Provide at least two style sheets, in addition to the default, that can be conveniently selected by users, for example to change font size, contrast, color schemes, printing, or displaying on small devices. Any page where the layout fails with any alternate style sheet, or where the selection process fails, is an error.

Testing: Inspection.
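
A sketch of offering alternate style sheets via link elements (the file names and titles are hypothetical); a site style switcher or the browser's View menu can then select them:

    <link rel="stylesheet" href="default.css" title="Default">
    <link rel="alternate stylesheet" href="high-contrast.css" title="High contrast">
    <link rel="alternate stylesheet" href="large-text.css" title="Large text">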

Testing for accessibility 27

Page 28: Web Accessibility  Testing for AIR

Advanced Accessibility Scripting

36. Client Side Scripting (20 points). All scripted functionality is available from the keyboard or there is a readily available keyboard accessible alternative. Scripted controls must identify themselves (name, role, state, value and properties) to assistive technology. Make sure that screen readers are aware of important content exposed with scripting. Each instance violating these requirements is an error. No points awarded if scripting is for switching style sheets, or scripting is peripheral to the purpose of the site.

Testing: Inspection, keyboard, JAWS.
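
A sketch of a scripted custom control exposing name, role and state (the control and id are hypothetical; the page script, not shown, must update aria-pressed and react to Enter/Space as well as clicks):

    <span id="mute" role="button" tabindex="0" aria-pressed="false">Mute audio</span>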

Testing for accessibility 28

Page 29: Web Accessibility  Testing for AIR

Advanced Accessibility ARIA Widgets

37. ARIA Widgets (20 points). Use ARIA Best Practices to code special widgets including tab panels, accordion menus and modal windows.

Testing: Jim’s ARIA favelet, inspection, keyboard, JAWS.
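
A skeleton tab panel following the ARIA pattern (labels and ids are hypothetical; the script that moves focus with the arrow keys and swaps aria-selected/tabindex is assumed, not shown):

    <ul role="tablist">
      <li role="tab" id="tab1" aria-controls="panel1" aria-selected="true" tabindex="0">Overview</li>
      <li role="tab" id="tab2" aria-controls="panel2" aria-selected="false" tabindex="-1">Schedule</li>
    </ul>
    <div role="tabpanel" id="panel1" aria-labelledby="tab1">Overview content</div>
    <div role="tabpanel" id="panel2" aria-labelledby="tab2" style="display:none">Schedule content</div>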

Testing for accessibility 29

Page 30: Web Accessibility  Testing for AIR

Advanced Accessibility Fluid layout

38. Fluid Layout (20 points). Using browser magnification in at least two browsers, increase the magnification level to 200%. If content reflows so that a left-right scroll bar is not needed to access content, and all content is available: 20 points. If the content does not reflow: 0 points.

Testing: inspection.
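
A fluid-layout sketch: percentage widths with a max-width so text reflows when the page is magnified (the selectors are hypothetical):

    <style>
      #content { width: 90%; max-width: 60em; margin: 0 auto; }
      img { max-width: 100%; height: auto; }
    </style>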

Testing for accessibility 30