Most research and tooling in web accessibility centers on guidelines and automated evaluation. The most prominent resource is the W3C's Web Content Accessibility Guidelines (WCAG), which provide a standardized set of criteria, describe a range of disabilities, and explain how to build accessible websites. Many web accessibility evaluation tools already exist as well. To contribute something new, we applied these guidelines and tools to analyze University of Michigan websites, aiming to surface flaws that need to be addressed and to illustrate that even seemingly well-built websites still have accessibility problems.
What We Did
What We Found
Figure 1: Our distribution of LSA sites after running the color contrast test. A majority of the sites could improve in this regard.
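The color contrast test is based on WCAG 2's contrast-ratio formula: each color is converted to a relative luminance, and the ratio of the lighter to the darker luminance (each offset by 0.05) must reach 4.5:1 for normal text at the AA level. A minimal sketch in Python (function names are our own, not from any particular tool):

```python
def relative_luminance(rgb):
    """WCAG 2 relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; WCAG AA requires >= 4.5 for normal text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum possible ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

A site fails this check when any of its text/background color pairs falls below the 4.5:1 threshold.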
In our data set, we did not find a significant correlation among missing web accessibility features: as Figures 2-4 show, scores on the HTML, color contrast, and image alternative-text tests were not correlated with one another.
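The image alternative-text test amounts to scanning a page's markup for `<img>` elements that lack a meaningful `alt` attribute. A minimal sketch using only Python's standard-library HTML parser (the class name and sample page are illustrative, not part of our actual pipeline):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Counts <img> tags and flags those missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            alt = dict(attrs).get("alt")
            if not alt or not alt.strip():
                self.missing += 1

# Hypothetical page fragment: one image with alt text, one without, one empty.
page = '<img src="a.png" alt="logo"><img src="b.png"><img src="c.png" alt="">'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing, "of", checker.total, "images lack alt text")  # → 2 of 3
```

A site's score on this test is then the fraction of images that carry usable alternative text.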
Figure 5: Our final combined accessibility score on all LSA sites.
Figure 6: Summary statistics of all three tests.
We undertook this project to compare UMich websites and identify which ones most need updates to be accessible to people with disabilities. While no website will be perfect, we aimed to locate the few with the worst accessibility. The implication of our findings is that we now know which websites need the most work to reach a respectable score relative to other UMich websites. Based on our finding that the LSA-Slavic site scored lowest, we hope either to present this to ITS, which can update the site, or to the Michigan Daily, which can publish our findings and draw attention to web accessibility as an important problem that needs to be addressed.