Non-Visual Accessibility Assessment of Videos

Ali Selman Aydin et al. Proc ACM Int Conf Inf Knowl Manag. 2021 Oct:2021:58-67.
doi: 10.1145/3459637.3482457. Epub 2021 Oct 30.

Abstract

Video accessibility is crucial for blind screen-reader users as online videos are increasingly playing an essential role in education, employment, and entertainment. While there exist quite a few techniques and guidelines that focus on creating accessible videos, there is a dearth of research that attempts to characterize the accessibility of existing videos. Therefore, in this paper, we define and investigate a diverse set of video- and audio-based accessibility features in an effort to characterize accessible and inaccessible videos. As a ground truth for our investigation, we built a custom dataset of 600 videos, in which each video was assigned an accessibility score based on the number of its wins in a Swiss-system tournament, where human annotators performed pairwise accessibility comparisons of videos. In contrast to existing accessibility research where the assessments are typically done by blind users, we recruited sighted users for our effort, since videos comprise a special case where sight could be required to better judge whether any particular scene in a video is presently accessible or not. Subsequently, by examining the extent of association between the accessibility features and the accessibility scores, we could determine the features that significantly (positively or negatively) impact video accessibility and therefore serve as good indicators for assessing the accessibility of videos. Using the custom dataset, we also trained machine learning models that leveraged our handcrafted features to either classify an arbitrary video as accessible/inaccessible or predict an accessibility score for the video. Evaluation of our models yielded an F1 score of 0.675 for binary classification and a mean absolute error of 0.53 for score prediction, thereby demonstrating their potential in video accessibility assessment while also illuminating their current limitations and the need for further research in this area.
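The Swiss-system scoring described above can be sketched in a few lines: in each round, videos with equal (or nearest) scores are paired, an annotator judgment awards a win, and ties split the point. The function and signature below are illustrative assumptions, not code from the paper.

```python
from collections import defaultdict

def swiss_accessibility_scores(videos, rounds, compare):
    """Assign each video a score equal to its number of wins in a
    Swiss-system tournament of pairwise accessibility comparisons.

    `compare(a, b)` stands in for the human annotator and returns
    "a", "b", or "equal" (hypothetical interface, for illustration).
    """
    scores = defaultdict(float)
    for _ in range(rounds):
        # Swiss pairing: sort by current score so videos with the same
        # (or nearest) score meet in the next round, as in Figure 2.
        ordered = sorted(videos, key=lambda v: scores[v], reverse=True)
        for a, b in zip(ordered[::2], ordered[1::2]):
            verdict = compare(a, b)
            if verdict == "a":
                scores[a] += 1.0
            elif verdict == "b":
                scores[b] += 1.0
            else:  # a tie awards half a win to each video
                scores[a] += 0.5
                scores[b] += 0.5
    return dict(scores)
```

With a deterministic judge (e.g. one that always prefers the higher-quality video), the best video accumulates one win per round, which matches the half-point ties and fractional scores (such as 0.5) seen in the dataset figures.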

Keywords: non-visual accessibility; video accessibility.


Figures

Figure 1:
Accessibility analysis using handcrafted features. From left to right: (i) Two main sources of information, namely the video and the audio. (ii) Handcrafted feature computation. (iii) Use of features for predicting accessibility scores.

Figure 2:
Annotation pipeline. The participants are shown pairs of videos chosen from the dataset, for which they provide one of three options (A, B, or equal). Videos with the same scores are paired together in the next round.

Figure 3:
Number of samples with respect to accessibility scores.

Figure 4:
Sample videos from the dataset. (a) and (b): sample videos with high ratings, both having a 4/4 score. (c) and (d): sample videos with low ratings, both having a 0.5 score.
