Apr 05, 2018
 

Our fourth and final SEO experiment for MDN, optimizing internal links within the open web documentation, is now finished. Optimizing internal links means ensuring that each page (in particular, each page whose search engine results page (SERP) position we want to improve) is easy to find.

This is done by ensuring that each page is linked to from as many topically relevant pages as possible. In addition, it should in turn link outward to other relevant pages. The more quality links we have among related pages, the better our position is likely to be. The object, from a user’s perspective, is to ensure that even if the first page they find doesn’t answer their question, it will link to a page that does (or at least might help them find the right one).

Creating links on MDN is technically pretty easy. There are several ways to do it, including:

  • Selecting the text to turn into a link and using the toolbar’s “add link” button
  • Using the “add link” hotkey (Ctrl-K or Cmd-K)
  • Any one of a large number of macros that generate properly-formatted links automatically, such as the domxref macro, which creates a link to a page within the MDN API reference. For example, {{domxref("RTCPeerConnection.createOffer()")}} creates a link to https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection/createOffer, which looks like this: RTCPeerConnection.createOffer(). Many of the macros offer customization options, but the default output is usually acceptable and is almost always better than hand-making the links. (A few more examples follow this list.)
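
For instance, here's a rough sketch of how a few of these macros can be used in page source. The domxref call matches the example above; the other macro names are ones I believe exist on MDN, so double-check them against the macro documentation before relying on them:

    {{HTMLElement("table")}} creates a link to the <table> element's reference page
    {{cssxref("background-color")}} creates a link to the CSS background-color property page
    {{domxref("Document.forms")}} creates a link to the Document.forms API reference
    {{SVGElement("circle")}} creates a link to the SVG <circle> element's reference page

Each call expands into a properly formatted link with consistent styling, so the links keep working even if we change how URLs are structured.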

Our style guide talks about where links should be used. We even have a guide to creating links on MDN that covers the most common ways to do so. Start with these guidelines.

The content updates

Ten pages were selected for the internal link optimization experiment.

Changes made to the selected pages

In general, the changes made were limited to adding links to pages; sometimes content had to be added to accomplish this, but the changes were kept relatively small.

  • Existing text that should have been a link but wasn't (such as mentions of terms that need definitions, or of concepts for which relevant pages already exist) was turned into links.
  • Uses of API names, element names, attributes, CSS properties, SVG element names, and so forth were turned into links when first used in a section, or when a number of paragraphs had elapsed since they were last linked. While repeated links to the same page don't count for SEO purposes, this is good practice for usability.
  • Any phrase for which a more in-depth explanation is available was turned into a link.
  • Links to related concepts or topics were added where appropriate; for example, on the article about HTMLFormElement.elements, a note is provided with a link to the related Document.forms property.
  • Links to related functions, HTML elements, and other relevant items were added.
  • The “See also” section was reviewed and updated to include appropriate related content.

Changes made to targeted pages

Targeted pages—pages to which links were added—in some cases got smaller changes made, such as the addition of a link back to the original page, and in some cases new links were added to other relevant content if the pages were particularly in need of help.

Pages to be updated

The pages selected to be updated for this experiment are the ten pages listed in the results table below.

The results

The initial data was taken during the four weeks from December 29, 2017 through January 25th, 2018. The “after” data was taken just over a month after the work was completed, covering the period of March 6 through April 2, 2018.

The results from this experiment were fascinating. Of all of the SEO experiments we’ve done, the results of this one were the most consistently positive.

Landing Page Unique Pageviews Organic Searches Entrances Hits
/en-us/docs/web/api/document_object_model/locating_dom_elements_using_selectors +46.38% +23.68% +29.33% +59.35%
/en-us/docs/web/api/htmlcollection/item +52.89% +35.71% +53.60% +38.56%
/en-us/docs/web/api/htmlformelement/elements +69.14% +57.83% +69.30% +70.74%
/en-us/docs/web/api/mediadevices/enumeratedevices +23.55% +14.53% +16.71% +15.67%
/en-us/docs/web/api/xmlhttprequest/html_in_xmlhttprequest +49.93% -3.50% +24.67% +59.63%
/en-us/docs/web/api/xmlserializer +36.24% +46.94% +31.50% +37.66%
/en-us/docs/web/css/all +46.15% +16.52% +23.51% +48.28%
/en-us/docs/web/css/inherit +22.55% +27.16% +20.58% +17.12%
/en-us/docs/web/css/object-position +102.78% +119.01% +105.56% +405.52%
/en-us/docs/web/css/unset +24.60% +18.45% +19.20% +35.01%

These results are amazing. Every single value is up, with the sole exception of a small decline in organic searches (that is, appearances in Google search result lists) for the article “HTML in XMLHttpRequest.” Most values are up substantially, and several are impressively improved.

Note: The data in the table above was updated on April 12, 2018 after realizing the “before” data set was inadvertently one day shorter than the “after” set. This reduced the improvements marginally, but did not affect the overall results.

Uncertainties

Due to the implementation of the experiment and certain timing issues, there are uncertainties surrounding these results. Those include:

  • Ideally, much more time would have elapsed between completing the changes and collecting final data.
  • The experiment began during the winter holiday season, when overall site usage is at a low point.
  • There was overall site growth of MDN traffic over the time this experiment was underway.

Decisions

Certain conclusions can be reached:

  1. The degree to which internal link improvements benefited traffic to these pages can’t be ignored, even after factoring in the uncertainties. This is easily the most benefit we got from any experiment, and on top of that, the amount of work required was often much lower. This should be a high-priority part of our SEO plans.
  2. The MDN meta-documentation will be further improved to enhance recommendations around linking among pages on MDN.
  3. We should consider enhancements to macros used to build links to make them easier to use, especially in cases where we commonly have to override default behavior to get the desired output. Simplifying the use of macros to create links will make linking faster and easier and therefore more common.
  4. We’ll re-evaluate the data after more time has passed to ensure the results are correct.

Discussion?

If you’d like to comment on this, or ask questions about the results or the work involved in making these changes, please feel free to follow up or comment on the thread I’ve created on our Discourse forum.

 Posted at 10:08 AM
Mar 23, 2018
 

The next SEO experiment I’d like to discuss results for is the MDN “Competitive Content Analysis” experiment. This experiment, performed from December into early January, involved selecting two of the top search terms that result in MDN being included in search results—one for which MDN is highly placed but not at #1, and one for which MDN is listed far down in the results despite having good content available.

The result is a comparison of the quality of our content and our SEO against other sites that document these technology areas. With that information in hand, we can look at the competition’s content and make decisions as to what changes to make to MDN to help bring us up in the search rankings.

The two keywords we selected:

  • “tr”: For this term, we were the #2 result on Google, behind W3Schools.
  • “html colors”: For this keyword, we were in 27th place. That’s terrible!

These are terms we should be highly placed for. We have a number of pages that should serve as good destinations for these keywords. The job of this experiment: to try to make that the case.

The content updates

For each of the two keywords, the goal of the experiment was to improve our page rank for the keywords in question; at least one MDN page should be near or at the top of the results list. This means that for each keyword, we need to choose a preferred “optimum” destination as well as any other pages that might make sense for that keyword (especially if it’s combined with other keywords).

Accomplishing that involves updating the content of each of those pages to make sure it’s as good as possible, and also improving the content of the pages that link to the ones we want to show up in search results. The goal is to improve the relevant pages’ visibility to search engines as well as their content quality, in order to improve their position in the search results.

Things to look for

So, for each page that links to the target pages, as well as for the target pages themselves, these things need to be evaluated and improved as needed:

  • Add appropriate links back and forth between each page and the target pages.
  • Make sure the content is clear and thorough.
  • Make sure there’s interactive content, such as new interactive examples.
  • Ensure the page’s layout and content hierarchy are up-to-date with our current content structure guidelines.
  • Examine web analytics data to determine what improvements, beyond these items, the data suggests should be made.

Pages reviewed and/or updated for “tr”

The primary page, obviously, is the <tr> element’s page in the HTML element reference.

A number of closely related pages were also reviewed and in most cases updated (sometimes extensively) as part of the experiment.

A secondary group of pages, which I felt to be a lower priority to change, still wound up being reviewed and in many cases updated.

Pages reviewed and/or updated for “html colors”

This one is interesting in that “html colors” doesn’t map as directly onto a specific page. However, we selected a set of pages to update and test; they’re listed in the results table below.

The problem with this keyword, “html colors”, is that generally what people really want is CSS colors, so you have to try to encourage Google to route people to stuff in the CSS documentation instead of elsewhere. This involves ensuring that you refer to HTML specifically in each page in appropriate ways.

For reference purposes, I’ve opted to treat the CSS <color> value page as the primary destination for this keyword, while the article “Applying color” is a new page I created to serve as a landing page for all things color-related and to route people to useful guide pages.

The results

As was the case with previous experiments, we only allowed about 60 days for Google to pick up and fully internalize the changes, and for user reactions to affect the outcome, despite the fact that 90 days is usually the minimum time to run these tests, with six months being preferred. However, we have had to compress our schedule for the experiments. We will, as before, continue to watch the results over time.

Results for the “tr” keyword

The pages updated to improve their placement when the “tr” keyword is used in Google search, as well as the amount of change over time seen for each, are shown in the table below. These are the pages which were updated and which appeared in the search results analysis for the selected period of time.

Address Change Impressions (%) Change Clicks (%) Change Position (%) Change CTR (%)
HTML/Element/tr -43.22% 124.57% 2.58% 285.71%
HTML/Element/table 26.68% 27.02% -2.90% 0.00%
HTML/Element/template 27.02% 9.21% -15.45% -14.05%
API/HTMLTableRowElement
API/HTMLTableRowElement/insertCell -2.78% -23.91% -2.16% -21.77%
API/HTMLTableRowElement/rowIndex
HTML/Element/thead 38.82% 19.70% 0.00% -13.67%
HTML/Element/tbody 42.72% 100.52% 14.19% 40.68%
HTML/Element/tfoot 8.90% 11.29% 2.64% 2.18%
HTML/Element/th -50.32% 3.43% 0.39% 106.25%
HTML/Element/td 20.05% 40.27% -8.04% 17.01%
API/HTMLTableElement/rows

The data is interesting. Impression counts are generally up, as are clicks and search engine results page (SERP) position. Interestingly, the main <tr> page, the primary page for this keyword, has lost impressions yet gained clicks, with the CTR skyrocketing by a sizeable 285%. This means that people are seeing better search results when searching just for “tr”, and getting right to that page more often than before we began.
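
To see how that can happen, here's a quick back-of-the-envelope illustration using made-up round numbers (not our actual analytics figures):

    CTR = clicks ÷ impressions
    Before: 200 clicks ÷ 10,000 impressions = 2.0% CTR
    After:  450 clicks ÷ 5,700 impressions ≈ 7.9% CTR

In this hypothetical, impressions fall by about 43% while clicks more than double, so the CTR roughly quadruples. That's the same shape as the <tr> page's numbers: fewer but better-targeted impressions that convert to clicks far more often.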

Results for the “html colors” keyword

The table below shows the pages updated for the “html colors” keyword and the amount of change seen in the Google activity for each page.

Page Change Impressions (%) Change Clicks (%) Change Position (Absolute) Change Position (%) Change CTR (%)
https://developer.mozilla.org/en-US/docs/Learn/Accessibility/CSS_and_JavaScript +28.61% +19.88% -0.99 -4.54% -6.78%
https://developer.mozilla.org/en-US/docs/Learn/CSS/Styling_boxes/Backgrounds -43.88% -38.17% -2.71 -21.20% +10.17%
https://developer.mozilla.org/en-US/docs/Learn/CSS/Styling_boxes/Borders +51.59% +33.33% +3.28 +48.62% -12.04%
https://developer.mozilla.org/en-US/docs/Learn/CSS/Styling_text/Fundamentals +34.87% +29.55% -1.35 -11.34% -3.94%
https://developer.mozilla.org/en-US/docs/Web/CSS/background-color +9.03% +19.26% -0.17 -2.46% +9.38%
https://developer.mozilla.org/en-US/docs/Web/CSS/border-color +36.02% +36.98% -0.09 -1.38% +0.71%
https://developer.mozilla.org/en-US/docs/Web/CSS/color +23.04% +23.42% +0.03 +0.34% +0.31%
https://developer.mozilla.org/en-US/docs/Web/CSS/color_value +14.95% +34.09% -1.21 -10.26% +16.65%
https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_Colors/Color_picker_tool -10.78% +6.68% +1.76 +24.78% +19.56%
https://developer.mozilla.org/en-US/docs/Web/CSS/outline-color +830.70% +773.91% -0.97 -12.42% -6.10%
https://developer.mozilla.org/en-US/docs/Web/CSS/text-decoration-color +3254.57% +3429.41% -1.45 -21.98% +5.21%
https://developer.mozilla.org/en-US/docs/Web/HTML/Applying_color +50.32% +45.21% -0.56 -4.83% -3.40%
https://developer.mozilla.org/en-US/docs/Web/HTML/Element/input/color +31.15% +25.57% -0.44 -4.44% -4.25%

These results are also quite promising, especially since time did not permit me to make as many changes to this content as I’d have liked. The changes for the color value type page are good; a nearly 15% increase in impressions and a very good 34% rise in clicks mean a healthy boost to CTR. Ironically, though, our position in search results dropped by nearly 1.25 points, or about 10%.

The approximately 23% increase in both impressions and clicks on the CSS color property is quite good, and I’m pleased by the 10% gain in CTR for the Learning Area article on styling box backgrounds.

Almost every page sees significant gains in both impressions and clicks (take a look at text-decoration-color, in particular, with over 3000% growth!).

The sea of negative position changes is worrisome at first glance, but I think what’s happening here is that, because of the improvements in impression counts (that is, how often users see these pages in Google results), people are reaching the page they really want more quickly. Note which pages are the ones with a positive change in click-through rate (CTR), which is the ratio of clicks to impressions. The list below is ordered from the highest change in CTR to the lowest:

  1. https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_Colors/Color_picker_tool
  2. https://developer.mozilla.org/en-US/docs/Web/CSS/color_value
  3. https://developer.mozilla.org/en-US/docs/Learn/CSS/Styling_boxes/Backgrounds
  4. https://developer.mozilla.org/en-US/docs/Web/CSS/background-color
  5. https://developer.mozilla.org/en-US/docs/Web/CSS/text-decoration-color
  6. https://developer.mozilla.org/en-US/docs/Web/CSS/border-color
  7. https://developer.mozilla.org/en-US/docs/Web/CSS/color

What I believe we’re seeing is this: due to the improvements to SEO (and potentially other factors), all of the color-related pages are getting more traffic. However, the ones in the list above are the ones seeing the most benefit; they’re less prone to showing up at inappropriate times and more likely to be clicked when they are presented to the user. This is a good sign.

Over time, I would hope to improve the SEO further to help bring the search results positions up for these pages, but that takes a lot more time than we’ve given these pages so far.

Uncertainties

For this experiment, the known uncertainties (an oxymoron, but we’ll go with that term anyway) include:

  • As before, the elapsed time was far too short to get viable data for this experiment. We will examine the data again in a few months to see how things are progressing.
  • This project had additional time constraints that led me not to make as many changes as I might have preferred, especially for the “html colors” keyword. The results may have been significantly different had more time been available, but that’s going to be common in real-world work anyway.
  • Overall site growth during the time we ran this experiment also likely inflated the results somewhat.

Decisions

After sharing these results with Kadir and Chris, we came to the following initial conclusions:

  • This is promising, and should be pursued for pages which already have low-to-moderate traffic.
  • Regardless of when we begin general work to perform competitive content analysis and make changes based on it, we should immediately update MDN’s contributor guides to incorporate the recommended changes.
  • The results suggest that content analysis should be a high-priority part of our SEO toolbox. Increasing our internal link coverage and making documents relate to each other creates a better environment for search engine crawlers to accumulate good data.
  • We’ll re-evaluate the results in a few months after more data has accumulated.

If you have questions or comments about this experiment or its results, please feel free to post to this topic on Mozilla’s Discourse forum.

 Posted at 4:46 PM
Mar 21, 2018
 

Following in the footsteps of MDN’s “Thin Pages” SEO experiment done in the autumn of 2017, we completed a study to test the effectiveness and process behind making changes to correct cases in which pages are perceived as “duplicates” by search engines. In SEO parlance, “duplicate” is a fuzzy thing. It doesn’t mean the pages are identical—this is actually pretty rare on MDN in particular—but that the pages are similar enough that they are not easily differentiated by the search engine’s crawling technology.

This can happen when two pages are relatively short but are about a similar topic, or on reference pages which are structurally and content-wise quite similar to one another. From a search engine’s standpoint, if you create articles about the background-position and background-origin CSS properties, you have two pages based on the same template (a CSS reference page) whose contents are prone to being nearly identical. This is especially true given how common it is to start a new page by copying content from another, similar page and making the needed changes, sometimes making only the barest minimum of changes.
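
As a purely illustrative sketch (these snippets are not taken from the actual pages), differentiating two such sibling pages often comes down to giving each one examples and prose that emphasize what's unique about its property:

    /* On the background-position page: focus on placement keywords and offsets */
    .banner {
      background-image: url("hero.png");
      background-repeat: no-repeat;
      background-position: right 20px bottom 10px;
    }

    /* On the background-origin page: focus on which box the background is painted from */
    .card {
      background-image: url("texture.png");
      background-repeat: no-repeat;
      background-origin: content-box;
      padding: 12px;
      border: 6px dashed gray;
    }

The more each page's examples, descriptions, and in-content links reflect its own property's quirks, the easier it is for a crawler (and for a human skimming search results) to tell the two pages apart.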

All that means that from an SEO perspective, we actually have tons of so-called “duplicate” pages on MDN. As before, we selected a number of pages that were identified as duplicates by our SEO contractor and made appropriate changes to them per their recommendations, which we’ll get into briefly in a moment.

Once the changes were made, we waited a while, then compared the before and after analytics to see what the effects were.

The content updates

We wound up choosing nine pages to update for this experiment. We actually started with ten, but one of them was removed from the set after the fact when I realized it wasn’t a usable candidate. Unsurprisingly, essentially all duplicate pages were found in reference documentation; guides and tutorials were, with almost no exceptions, immune to the phenomenon.

The pages were scattered through various areas of the open web reference documentation under https://developer.mozilla.org/docs/Web.

Things to fix or improve

Each page was altered as much as possible in an attempt to differentiate it from the pages which were found to be similar. Tactics to accomplish this included:

  • Make sure the page is complete. If it’s a stub, write the page’s contents up completely, including all relevant information, sections, etc.
  • Ensure all content on the page is accurate. Look out for places where copy-and-pasted information wasn’t corrected as well as any other errors.
  • Make sure the page has appropriate in-content (as opposed to sidebar or “See also”) links to other pages on MDN. Feel free to also include links in the “See also” section, but in-content links are a must. At least the first mention of any term, API, element, property, attribute, technology, or idea should have a link to relevant documentation (or to an entry in MDN’s Glossary). Sidebar links are not generally counted when indexing web sites, so relying on them entirely for navigation will wreak utter havoc on SEO value.
  • If there are places in which the content can be expanded upon in a sensible way, do so. Are there details not mentioned or not covered in as much depth as they could be? Think about the content from the perspective of the first-time reader, or a newcomer to the technology in question. Will they be able to answer all possible questions they might have by reading this page or pages that the page links to through direct, in-content links?
  • What does the article assume the reader knows that they might not yet? Add the missing information. Anything that’s skimmed over, leaving out details and assuming the reader can figure them out from context, needs to be fleshed out with more information.
  • Ensure that all of the API’s features are covered by examples. Ideally, every attribute on an element, keyword for a CSS property’s value, parameter for a method, and so forth should be used in at least one example.
  • Ensure that each example is accompanied by descriptive text. There should be an introduction explaining what the example does and what feature or features of the technology or API are demonstrated, as well as any text needed to explain how the example works. For example, on an HTML element reference page, simply listing the attributes and then providing an example that only uses a subset of them isn’t enough. Add more examples that cover the other attributes, or at least the ones likely to be used by anyone who isn’t a deeply advanced user.
  • Do not simply add repetitive or unhelpful text. Beefing up the word count just to try to differentiate the page from other content is actually going to do more harm than good.
  • It’s frequently helpful to also change the pages which are considered to be duplicate pages. By making different changes to each of the similar pages, you can create more variations than by changing one page alone.

Pages to be updated

The pages we selected to update:

  • https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/fillText
  • https://developer.mozilla.org/en-US/docs/Web/API/Window/pageYOffset
  • https://developer.mozilla.org/en-US/docs/Web/API/MouseEvent/pageX
  • https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/getUniformLocation
  • https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/enableVertexAttribArray
  • https://developer.mozilla.org/en-US/docs/Web/API/GlobalEventHandlers/onpointerdown
  • https://developer.mozilla.org/en-US/docs/Web/SVG/Attribute/x1
  • https://developer.mozilla.org/en-US/docs/Web/CSS/max-block-size
  • https://developer.mozilla.org/en-US/docs/Web/API/MediaTrackSupportedConstraints/frameRate

In most if not all of these cases, the pages which are similar are obvious.

The results

After making appropriate changes to the pages listed above as well as certain other pages which were similar to them, we allowed 60 days to pass. This is less than the ideal 90 days, or, better, six months, but time was short. We will check the data again in a few months to see how things change given more time.

The changes made were not always as extensive as would normally be preferred, again due to time constraints created by the one-person experimental model. When we do this work on a larger scale, it will be done by a larger community of contributors. In addition, much of this work will be done from the beginning of the writing process as we will be revising our contributor guide to incorporate the recommendations as appropriate after these experiments are complete.

Pre-experiment unvisited pages

As was the case with the “thin pages” experiment, pages which were getting literally zero—or close to zero—traffic before the changes continued to not get much traffic afterward. Those pages were:

  • https://developer.mozilla.org/en-US/docs/Web/API/GlobalEventHandlers/onpointerdown
  • https://developer.mozilla.org/en-US/docs/Web/SVG/Attribute/x1
  • https://developer.mozilla.org/en-US/docs/Web/CSS/max-block-size
  • https://developer.mozilla.org/en-US/docs/Web/API/MediaTrackSupportedConstraints/frameRate

For what it’s worth, we did learn from this eventually: experiments begun after the “thin pages” results were in no longer included pages which got no traffic during the period leading up to the start of the experiment. This experiment, however, had already begun running by then.

Post-experiment unvisited pages

There were also two pages which had a small amount of traffic before the changes were made but no traffic afterward. This is likely a statistical anomaly or fluctuation, so we’re discounting these pages for now:

  • https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/getUniformLocation
  • https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/enableVertexAttribArray

The outcome

The remaining three pages had useful data. This is a small data set, but it’s what we have, so let’s take a look.

Page URL Clicks (Sept. 16–Oct. 16) Impressions (Sept. 16–Oct. 16) Clicks (Nov. 18–Dec. 18) Impressions (Nov. 18–Dec. 18) Clicks Chg. (%) Impressions Chg. (%)
https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/fillText 678 8,751 862 13,435 27.14% 34.86%
https://developer.mozilla.org/en-US/docs/Web/API/Window/pageYOffset 1,193 3,724 1,395 7,700 14.48% 51.64%
https://developer.mozilla.org/en-US/docs/Web/API/MouseEvent/pageX 447 2,961 706 7,906 36.69% 62.55%

For each of these three pages, the results are promising. Both click and impression counts are up for each of them, with click counts increasing by anywhere from 14% to 36% and impression counts increasing by between 34% and 62% (yes, I rounded down for each of these values, since this is pretty rough data anyway). We’ll check again soon to see whether the results have changed further.

Uncertainties

Because of certain implementation specifics of this experiment, there are obviously some uncertainties in the results:

  • We didn’t allow as much time as is recommended for the changes to fully take effect in search data before measuring the results. This was due to time constraints for the experiment being performed, but, as previously mentioned, we’ll look at the data again later to double-check our results.
  • The set of pages with useful results was very small, and even the original set of pages was fairly small.
  • There was substantial overall site growth during this period, so it’s likely the results are affected by this. However, the size of the improvements seen here suggests that even with that in mind, the outcome was significant.

Decisions

After a team review of these results, we came to some conclusions. We’ll revisit them later, of course, if we decide that a review of the data later suggests changes be made.

  1. The amount of improvement seen strongly suggests we should prioritize fixing duplicate pages, at least in cases where at least one of the pages which are considered duplicates of one another is getting at least a low-to-moderate amount of traffic.
  2. The MDN meta-documentation, in particular the writer’s guide and the guides to writing each of the types of reference content, will be updated to incorporate the recommendations into the general guidelines for contributing to MDN. Additionally, the article on MDN about writing SEO-friendly content will be updated to include this information.
  3. It turns out that many of the changes needed to fix “thin” pages also apply when fixing “duplicate” pages.
  4. We’ll re-evaluate prioritization after reviewing the latest data after more time has passed.

The most interesting thing I’m learning about SEO, I think, is that it’s really about making great content. If the content is really top-notch, SEO practically attends to itself. It’s all about having thorough, accurate content with solid internal and external links.

Discussion?

If you have comments or questions about this experiment or the changes we’ll be making, please feel free to follow up or comment on this thread on Mozilla’s Discourse site.

 Posted at 7:02 PM