A wide range of tools can help you find accessibility issues, from browser extensions and website scanners to JavaScript libraries that analyze your codebase. Each tool has its own strengths and weaknesses when it comes to catching accessibility issues. Sparkbox’s goal is to research a variety of accessibility tools and empower anyone to run accessibility audits. In this article, I will review the Silktide Accessibility Checker Browser Extension by using it on a testing site that has planned accessibility issues. This review will rate the extension based on the following areas:
- Ease of Setup
- Ease of Use
- Reputable Sources
- Accuracy of Feedback
- Clarity of Feedback
- Cost
- Strengths and Limitations
In order to fully vet accessibility tools like this one, Sparkbox created a demo site with intentional errors. This site is not related to or connected with any live content or organization.
Ease of Setup - Outstanding (4/4)
Installing the Silktide browser extension was easy, with one click to install, and two clicks to run. They also provide a comprehensive installation and usage guide on their website that is well-formatted and includes screenshots and video.
The extension is available for Chrome and Edge. To install, follow the install link on the Silktide website, which will bring you to its official Chrome Web Store page. From there, click the “Add to Chrome” button. Chrome will then prompt you to confirm adding the extension, with a security warning so you can review the permissions it needs to operate.
To run, navigate to the website that you want to scan, and click the extension icon. A sidebar will appear where you can click “Accessibility Checker” to run a scan.
Ease of Use - Meets Expectations (2/4)
While the tool is fairly straightforward and usable for the most part, I had to dock some points for focus and accessibility issues I noticed in the tool itself. This includes a few issues for those using keyboard navigation and screen readers.
Accessibility checker
For the issues list within each check, there were some problems with indicating focus. When navigating with the tab key, the focus indicator disappears after leaving the “Show more” control, when I would expect it to highlight the accordion items:
I also ran into an issue selecting text in some of the toolbar’s UI. When expanding an issue, I could not select the text underneath to copy it, and the cursor remains a link pointer on hover. This seems like it may be a bug in the plugin’s expand-and-collapse UI, and it could make sharing and documenting results a little tricky.
When clicking on the code block displayed under an issue, typically the page will scroll to the element in question and give it a nice red outline to help identify where it is on the page. But on another page I tried the scanner on, I encountered some results for “field labels” which did not do this.
There does not appear to be a way to easily re-scan the page, or to set the default testing level. It would be helpful to have a way to re-run the checker if the state of the page has changed dynamically with JavaScript, or if changes were introduced in the browser’s inspector. Currently, the only way I see to re-run it is by closing the extension entirely and opening it again.
Additional tools
This extension provides a bunch of other useful tools, including a tab focus order visualizer and a color contrast checker. The color contrast checker has a color dropper that can sample from the page, with a zoomed-in view around the cursor. There are also tools that give a focused look at landmarks, headings, and alt text.
I can see myself using the color blindness simulator in the future. It provides several presets which change the colors on the page:
Reputable Sources - Outstanding (4/4)
When running a scan, the tool provides a filter to choose which version of the WCAG guidelines to check against. It includes WCAG versions 2.0, 2.1, and 2.2—the most current, finalized recommendation. Each level of the guideline’s success criteria can be selected as well: A, AA, and AAA.
When you view each scan result, it will list the relevant guideline underneath, for example, “WCAG 2.0 A 1.3.1.” Clicking on an issue provides a short explanation, along with a more detailed one after clicking “Show more.” One thing I would like to see here is a link to the official guideline being referenced.
Accuracy of Feedback - Meets Expectations (2/4)
As part of our test site, we have a list of bugs that we think a robot would find, and ones that we expect a human to find. So far, no tools have been perfect in finding all of the issues. Overall, Silktide’s scanner did a decent job of surfacing the issues that we would expect from an automated tool.
As we’re nearing the end of our series of Automated Accessibility Tool Reviews in this format, it’s worth reiterating what Dustin brought up in the most recent review:
“Before diving into the evaluation, we have to recognize our own limitations as reviewers. The testing rubric that we use for these reviews is three years old at this point, so it does not account for new success criteria from WCAG 2.2, and it also makes some assumptions about what automated tools can catch that may need to be reevaluated.”
The checker did miss a few issues that we expected to be computer-discoverable. The two HTML markup issues that I would expect to be detected were the unordered list item missing its containing <ul> tag, and a landmark inside of a landmark. It may not be doing the sort of markup validation that would call out the <ul> tag.
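As an illustration, the two markup problems described above might look something like this (the content is hypothetical; the test site’s actual markup may differ):

```html
<!-- Issue 1: list items with no containing <ul> (invalid HTML) -->
<li>First item</li>
<li>Second item</li>

<!-- Corrected: list items wrapped in a <ul> -->
<ul>
  <li>First item</li>
  <li>Second item</li>
</ul>

<!-- Issue 2: a landmark nested inside another landmark,
     e.g. a <main> inside a <main>, which is invalid -->
<main>
  <main>Page content</main>
</main>
```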
It did not detect the missing “skip-to-content” link on our testing site either. Though oddly, I did see this issue detected when I briefly tested the checker on another website. So, it is being looked for, but for some reason was not flagged in this situation.
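For reference, a skip-to-content link is typically the first focusable element on the page and is visually hidden until focused. A minimal sketch (the id and class names here are my own assumptions):

```html
<body>
  <!-- Visually hidden until it receives keyboard focus;
       lets keyboard users jump past repeated navigation -->
  <a class="skip-link" href="#main-content">Skip to content</a>

  <nav><!-- site navigation --></nav>

  <main id="main-content"><!-- page content --></main>
</body>
```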
The incorrect heading hierarchy was not explicitly detected, though the tool does include a blanket assisted item that could cover it: a manual check of “Meaningful sequence (HTML).”
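For context, an incorrect heading hierarchy usually means skipped levels, something like this hypothetical sketch:

```html
<h1>Page title</h1>
<!-- Skips from h1 straight to h3; an h2 is expected here -->
<h3>Subsection</h3>
<h4>Details</h4>
```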
Some bonus points should be given for finding the “funky tab order” issue that we have categorized as human-discoverable. It lists “Focus Order” as an assisted check, with a note that an element with a non-standard tab order is present and that “this is almost always a mistake.”
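A non-standard tab order of the kind that check describes usually comes from positive tabindex values, which override the natural DOM focus order. A hypothetical example:

```html
<!-- Positive tabindex values make focus jump around the page,
     regardless of where the elements appear in the DOM -->
<label>Name <input type="text" tabindex="3"></label>
<label>Email <input type="email" tabindex="1"></label>
<button tabindex="2">Submit</button>

<!-- Preferred: omit tabindex (or use tabindex="0") so the
     DOM order defines a logical focus sequence -->
```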
To see all the test site issues that the Silktide Accessibility Checker found or missed, skip to Accessibility Issues Found.
Clarity of Feedback - Exceeds Expectations (3/4)
Overall, the checker does a good job of listing issues, and allowing you to drill in to get more information about them. I did notice some lack of clarity on “assisted” items and categorization within the UI.
Assisted checks
“Unavoidable animation” is listed as a warning, but I didn’t see any animation, and the tool doesn’t point out where the warning may be originating from. I would be directionless if I were trying to resolve this.
The tool flags all images with alt text as “Appropriate alt text,” with the guidance, “Ensure that alternative text serves the same purpose and presents the same information as the media it describes.” It appears to be listing every image on the page rather than flagging actual problems it found. The UI could be clearer that this is an exhaustive list intended for manually checking each image.
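As a rough sketch of what that manual review involves (the file names and alt text here are hypothetical):

```html
<!-- Decorative image: empty alt so screen readers skip it -->
<img src="divider.png" alt="">

<!-- Meaningful image: alt conveys the same information as the image -->
<img src="sales-chart.png" alt="Bar chart: sales grew 40% from 2022 to 2023">

<!-- Unhelpful alt text that only a human reviewer will catch -->
<img src="sales-chart.png" alt="image">
```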
UI clarity
Some results are marked with a hexagon containing an “X,” an exclamation point in a box, or an exclamation point in a triangle, without any information about the difference between these icons. Hovering does not reveal anything more either.
The exclamation in a box uses white on yellow, which fails WCAG AA contrast ratio requirements. The colors should be adjusted to ensure readability.
In the “Filter by type” options, “All checks” is a bit of a misnomer. It really means “all issues and all assisted issues” and does not include “passed” checks. This is a minor quibble, but I initially expected to see all the passed checks here, since “Passed” is listed among the filter types.
Cost - Outstanding (4/4)
This tool is offered for free, with few barriers in the way of installing and using it. I was worried that there would be significant limitations or upsells that got in the way, but I thankfully did not encounter any. There is only a small banner at the bottom of the main sidebar with a link to take a tour of Silktide’s full platform. It’s unobtrusively built into the design and does not interfere with using the tool.
Strengths and Limitations - Exceeds Expectations (3/4)
The strengths of the accessibility checker within this tool are its simplicity, ability to quickly run an audit against the different WCAG guidelines, and the details provided for the results.
Besides the usability and clarity issues discussed previously, another limitation is the inability to export the results. They can only be viewed within the single-panel sidebar, and I did not see any way to resize the sidebar to increase its width. This means a lot of clicking to drill in and out of every result, which can make reviewing a long list of findings tedious.
Another big strength of this add-on is its wealth of extra tools, such as the color contrast checker, color blindness simulator, and tab focus order visualizer. Having all of these in one place could be beneficial for auditing and testing.
Conclusion - 🏆 Recommend (22/28)
The Silktide Accessibility Checker has many strengths and extra tools that are impressive for a free extension without any installation or usage barriers. Personally, I see it as an okay option for small and medium projects, and for those getting into accessibility.
With other, more robust accessibility tools out there, I don’t see the scanner becoming part of my regular workflow yet. Some of the additional tools such as the tab order visualizer and color blindness simulator may make it worthwhile to keep handy, however. If some of the ease of use and clarity issues can be addressed, it would help elevate the scanner’s usefulness in more of a regular development workflow.
It’s always a good idea to check with more than one automated tool to try to catch as many issues as possible. Automated tooling is still not a substitute for manual testing, or for involving people who regularly use assistive technologies in the testing process. As highlighted by some of our example test site problems that “people should find,” there are many inaccessible aspects of a website, involving usability and content, that tools alone will have trouble finding.
Accessibility Issues Found
Test Site Issue | Tools Should Find | People Should Find | Silktide Found |
---|---|---|---|
Insufficient contrast in large text | ✅ | — | ✅ |
Insufficient contrast in small text | ✅ | — | ✅ |
Form labels not associated with inputs | ✅ | — | ✅ |
Missing alt attribute | ✅ | — | ✅ |
Missing lang attribute on html element | ✅ | — | ✅ |
Missing title element | ✅ | — | ✅ |
Landmark inside of a landmark | ✅ | — | ❌ |
Heading hierarchy is incorrect | ✅ | — | ❌ |
Unordered list missing containing ul tag | ✅ | — | ❌ |
ID attributes with the same name | ✅ | — | ✅ |
Target for click area is less than 44px by 44px | ✅ | — | ✅ |
Duplicate link text (lots of “Click Here”s or “Learn More”s) | ✅ | — | ✅ |
div soup | ✅ | — | ❌ |
Missing skip-to-content link | ✅ | — | ❌ |
Funky tab order | — | ✅ | ✅ |
Using alt text on decorative images that don’t need it | — | ✅ | ❌ |
Alt text with unhelpful text, not relevant, or no text when needed | — | ✅ | ❌ |
Page title is not relative to the content on the page (missing) | — | ✅ | ❌ |
Has technical jargon | — | ✅ | ❌ |
Using only color to show error and success messages | — | ✅ | ❌ |
Removed focus (either on certain objects or the entire site) | — | ✅ | ❌ |
Form helper text not associated with inputs | — | ✅ | ❌ |
Pop-up that doesn’t close when you tab through it | — | ✅ | ❌ |