Team Members: Zach Breger, Kim Di Camillo, Skye Du, Vedant Iyer, Nishka Muzumdar, Myla Semanision, Nitish Vijai, Joe Wentzel, Peter Zhang
Web accessibility ensures that anyone can use your website, including users with disabilities (auditory, visual, etc.), users on mobile devices, and users with slow Internet or limited network bandwidth. Recommendations that improve accessibility help lower barriers to usage, promote social inclusion, and improve the overall user experience.
Currently, many sites do not account for web accessibility, resulting in a subpar experience for many Internet users. For example, a site might not be optimized for screen readers, might work poorly on a smartphone, or might not work at all. Improving these websites from an accessibility standpoint would benefit all the users who depend on them.
To get a better understanding of this problem, we decided to diagnose several University of Michigan LSA department homepages for web accessibility problems. We found that while some sites did well on this front (Sociology, Music), others could use some work (e.g., the Slavic department).
Most of the research and applications in web accessibility have centered on guidelines and evaluation tools. The most prominent resource is the W3C guidelines, which provide a standardized set of recommendations, describe various disabilities, and explain how to build accessible websites. There are also many existing web accessibility evaluation tools. To add something new to the field, we decided to use these guidelines and tools to analyze University of Michigan websites. Ideally, this would surface flaws that need to be addressed and illustrate that even seemingly good websites still have accessibility problems.
What We Did
We obtained the web accessibility evaluation data by scraping evaluation-tool websites (accessi.org, https://www.digitalsales.com/alt-tag-checker, etc.). We used Selenium to perform the scraping because these tools require a URL to be entered interactively, so their results weren't retrievable with a simple request. In our script, we ran the LSA sites through three tests: an image alt text checker (alt attributes are important for screen readers), a color contrast checker, and an overall HTML test. Once we ran the UM-LSA websites through our Selenium script, we compiled the results into a data frame, cleaned the data, and output it to a CSV. Finally, we loaded the CSV into a notebook and created graphs to analyze the cleaned data. Two decisions we made when cleaning the data were omitting low-risk accessibility faults and using a golf-like scoring system, where a lower score is better.
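The cleaning and scoring steps can be sketched as follows. This is a minimal illustration, not our actual pipeline: the fault records, column names, and risk labels below are hypothetical stand-ins for what the scraper collects.

```python
import pandas as pd

# Hypothetical fault records scraped from the evaluation tools.
# Each row is one accessibility fault found on one site.
faults = pd.DataFrame({
    "site": ["sociology", "sociology", "slavic", "slavic", "slavic"],
    "test": ["color_contrast", "html", "alt_text", "color_contrast", "html"],
    "risk": ["low", "high", "high", "medium", "high"],
})

def golf_scores(df: pd.DataFrame) -> pd.Series:
    """Golf-like scoring: drop low-risk faults, then count what remains
    per site. As in golf, a lower score is better."""
    kept = df[df["risk"] != "low"]
    return kept.groupby("site").size().sort_values()

scores = golf_scores(faults)
scores.to_csv("accessibility_scores.csv")  # cleaned data out to CSV
```

With the toy data above, Sociology's one low-risk fault is dropped, leaving it a score of 1, while all three Slavic faults survive the filter.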
What We Found
Figure 1: Our distribution of LSA sites after running the color contrast test. A majority of the sites could improve in this regard.
Figures 2-4: Pairwise scatterplots of the three test scores: HTML vs. color contrast (2), image alt vs. color contrast (3), and image alt vs. HTML (4).
In our data set, we did not find a significant correlation among the different types of missing accessibility features. As figures 2-4 show, a site's HTML, color contrast, and image alt tag scores did not predict one another.
Figure 5: Our final combined accessibility score on all LSA sites.
Figure 6: Summary statistics of all three tests.
Our final combined accessibility score was computed by adding a site's scores on the image alt text test, the color contrast test, and the HTML tag test. A histogram of these combined scores is displayed in figure 5, and summary statistics are in figure 6.
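The combined score is just a row-wise sum of the three per-test scores. A short sketch, using made-up numbers rather than our actual results:

```python
import pandas as pd

# Illustrative golf-style scores (lower is better); these values are
# invented for the example, not taken from our data set.
scores = pd.DataFrame({
    "site": ["sociology", "music", "slavic"],
    "alt_text": [1, 0, 7],
    "color_contrast": [2, 1, 9],
    "html": [3, 2, 11],
})

# Combined accessibility score = sum of the three test scores per site.
scores["combined"] = scores[["alt_text", "color_contrast", "html"]].sum(axis=1)
```

Because each component uses golf-style scoring, the sum preserves the interpretation: the site with the highest combined score has the most accessibility work to do.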
Our data showed that, in general, University of Michigan departments have significant work to do on accessibility. None of their homepages were at acceptable levels, and often half of the necessary accessibility features were missing. This is unacceptable, as many students have disabilities. The Slavic department has by far the most work to do.
We did this project to compare UMich websites and identify which ones most urgently need updates to be accessible to users with disabilities. While not every website can be perfect, we aimed to locate the few with the worst accessibility. The implication of our findings is that we now know which websites need the most work to reach a respectable score relative to other UMich websites. Based on our finding that the Slavic department's site scored worst, we hope to either present this to ITS, who can update LSA-Slavic's website, or to the Michigan Daily, who can publish our findings and bring attention to web accessibility as an important problem that needs to be addressed.