Tyler Technologies' Open Data Platform is widely used across local and state governments, enabling users to “use easy-to-interpret, cloud-hosted data to tell stories and connect with [their] communities.” Data publishers on the Open Data Platform can create data stories that combine embedded data visualizations with narrative about their data.
The Open Data Platform has no mechanism for checking data stories for accessibility issues; the data story tool relies on the users developing a story to ensure that their content is accessible. Because not all users are aware of accessibility best practices, this lack of support leads to data stories that are not accessible.
To add a feature to the data story tool that allows users to check for accessibility issues and resolve them.
I created a conceptual design for a new feature for Tyler Technologies' Open Data Platform: an accessibility checker for data stories that allows users to run an automated accessibility scan, conduct manual accessibility testing, and resolve accessibility issues before publishing their data story.
February - March 2023
End-to-end UX/UI designer. Responsible for user research, prototyping, and testing
Figma, Miro, Axe DevTools, Chartability
Research users of the data story tool to understand their needs, goals, and pain points
I interviewed five users of the data story tool to understand their current practices, needs, and challenges in making their content accessible. The takeaways from the interviews are summarized below.
Based on the interview findings, I developed two user personas: the experienced Open Data Platform user and the new, less experienced user.
The journey maps below illustrate the experience of the new and experienced users when they encounter the accessibility checker. These maps focus on how users can integrate the accessibility checker into their process of developing a data story on the Open Data Platform.
As part of my research, I reviewed tools that allow users to check the accessibility of their content, including Axe DevTools, Lighthouse, Color Oracle, Accessibility Insights for Web, the WAVE Web Accessibility Evaluation Tool, and the Microsoft Office Accessibility Checker.
Based on my analysis, I identified six key features of successful accessibility checking tools. I determined that an accessibility checker should:
To understand what accessibility issues could currently be present in data stories, I conducted an accessibility assessment of an existing data story page on the Connecticut Open Data Portal. I used the automated accessibility checks from the Axe DevTools Chrome extension. I also conducted additional manual testing using the data visualization-specific accessibility heuristics from Chartability.
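For context, the Axe DevTools extension is built on the open-source axe-core engine, so the same kind of automated scan can be reproduced in a short script. The sketch below assumes the axe-core engine via the @axe-core/puppeteer package and shows roughly how a published data story page could be scanned outside the browser extension; the scanDataStory function and the example URL are illustrative placeholders, not part of the Open Data Platform.

```typescript
// Minimal sketch of an automated accessibility scan using the open-source
// axe-core engine (the same engine behind the Axe DevTools extension).
// The function name and URL below are placeholders for illustration only.
import puppeteer from "puppeteer";
import { AxePuppeteer } from "@axe-core/puppeteer";

async function scanDataStory(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });

  // Run the full axe-core rule set against the rendered page.
  const results = await new AxePuppeteer(page).analyze();

  // Report each violation with its impact level and the elements it affects,
  // similar to the issue list shown in the DevTools panel.
  for (const violation of results.violations) {
    console.log(`${violation.impact ?? "unknown"}: ${violation.help} (${violation.id})`);
    for (const node of violation.nodes) {
      console.log(`  affected element: ${node.target.join(" ")}`);
    }
  }

  await browser.close();
}

// Placeholder URL: any public data story page could be scanned this way.
scanDataStory("https://example.gov/stories/s/example-data-story").catch(console.error);
```

A script like this only covers the automated checks; the Chartability heuristics still require manual review of each visualization.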
Together, the automated Axe DevTools scan and the manual Chartability checks identified several accessibility issues that needed to be addressed in the data story, including:
Screenshot of Axe DevTools accessibility test results for data story page
Users of the Open Data Platform need an easy way to check the accessibility of their data stories because most of them are not familiar with accessibility issues.
I began by brainstorming features that could be included in an accessibility checker based on my comparative analysis and user interviews. I then prioritized the features to determine which were essential, which would be nice to have, and which could come later.
I designed a user flow for checking accessibility issues in a data story before publication. I also designed two task flows: one for running an automated accessibility scan and one for conducting manual accessibility testing.
I conducted moderated, remote usability tests with users of the data story tool to understand:
The usability tests revealed aspects of the design that worked well and areas for improvement. I addressed the usability issues identified through testing in my final prototype.
I used the results of my usability tests to determine my priority revisions, including the three detailed below.
Overwhelming initial results from the automated test.
I added a transition screen before the test results, and I collapsed the accordion menu on the first screen of the test results.
Need to make manual testing more approachable.
I clearly highlighted each manual check, and I added resources from the Web Accessibility Initiative for each check to help users complete the checks.
Need a clearer introduction to the accessibility tools.
I added an introduction to the accessibility checker in the publishing panel and clearly explained the accessibility requirements before publication.
After working through several iterations of my design and testing a prototype with potential users, I developed the final prototype of the accessibility checker for the Open Data Platform.
Automated accessibility testing process in the data story tool