Over my time in search, I’ve changed my perspective on certain ranking factors. For instance, after coming to Go Fish Digital and working on internal linking initiatives, I came to realize the power of internal links. By implementing internal links at scale, we were able to see consistent success.
Freshness is another one of these factors. After working with a news organization and testing those learnings on other sites, I started to see the immense power that content refreshes can produce. As a result, I think the entire SEO community has underrated this concept for quite some time. Let’s dig into why.
Reviewing news sites
This all started when we began to work with a large news publisher who was having trouble getting in Google’s Top Stories for highly competitive keywords. They were consistently finding that their content wasn’t able to get inclusion in this feature, and wanted to know why.
Inclusion in “Top stories”
We began to perform a lot of research around news outlets that seemed quite adept at getting included in Top Stories. This immediately turned our attention to CNN, the site that is by far the most skilled in acquiring coveted Top Stories positions.
By diving into their strategies, one consistent trend we noticed was that they would always create a brand new URL the day they wanted to be included in the Top Stories carousel:
As an example, here you can see that they create a unique URL for their rolling coverage of the Russia-Ukraine war. Since they know that Google will show Top Stories results daily for queries around this, they create brand new URLs every single day:
This flies in the face of traditional SEO advice, which says site owners should keep URLs consistent so that link equity isn’t diluted and keywords aren’t cannibalized. But to be eligible for Top Stories, a “fresh” URL needs to be indexed for the content to qualify.
After we started implementing the strategy of creating unique URLs every day, we saw much more consistent inclusion for this news outlet in Top Stories for their primary keywords.
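As a sketch, the daily-URL pattern is straightforward to automate: generate a dated slug from the topic and the publish date. The `daily_story_slug` helper and its path format below are hypothetical, modeled loosely on the kind of dated URLs we observed; adapt the pattern to your own CMS routing.

```python
from datetime import date

def daily_story_slug(topic: str, day: date) -> str:
    """Build a dated slug so each day's rolling coverage gets a new URL.

    The path pattern here is hypothetical -- swap in whatever format
    your CMS uses for daily live-coverage pages.
    """
    return f"/{topic}-{day.strftime('%m-%d-%y')}/"

print(daily_story_slug("russia-ukraine-war-news", date(2022, 5, 25)))
# -> /russia-ukraine-war-news-05-25-22/
```

Because the slug is derived from the date, publishing a new page each morning requires no manual URL decisions.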
However, the next question we wanted to address was not just how to get included in this feature, but also how to maintain strong ranking positions once there.
Ranking in “Top stories”
Next, we looked at how frequently competitors were updating their stories once in the Top Stories carousel, and we were surprised at how often top news outlets refresh their content.
We found that competitors were aggressively updating their timestamps. For one query, when reviewing three articles over a four-hour period, we found the average time between updates for major outlets:
USA Today: Every 8 minutes
New York Times: Every 27 minutes
CNN: Every 28 minutes
For this particular query, USA Today was literally updating their page every 8 minutes and maintaining the #1 ranking position for Top Stories. Clearly, they were putting a lot of effort into the freshness of their content.
But what about the rest of us?
Of course, it’s obvious how this applies to news sites. There is certainly no other vertical where the concept of “freshness” carries more weight in the algorithm. However, this got us thinking about how valuable the concept might be across the broader web. Are other sites doing this, and would it be possible to see SEO success by updating content more frequently?
Fortunately, we were able to perform even more research in this area. Our news client also had many non-news sections on their site. These sections contain “evergreen” articles where more traditional SEO norms should apply. One section contains “reviews” content, where they identify the best products in a given category.
When reviewing articles for these topics, we also noticed patterns around freshness. In general, high-ranking articles in competitive product areas (electronics, bedding, appliances) would aggressively update their timestamps on a monthly (sometimes weekly) cadence.
For example, as of the date of this writing (May 25th, 2022), I can see that all of the top three articles for “best mattress” have been updated within the last 7 days.
Looking at the term “best robot vacuum”, it looks like all of the articles have been updated in the last month (as of May 2022):
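One way to check this yourself is to read the `dateModified` field out of a page’s JSON-LD markup and compute how stale the article is. The sample page below is invented, and real pages often embed their structured data differently (or omit it), so treat this as a rough sketch rather than a robust scraper.

```python
import json
import re
from datetime import date

def article_age_days(html: str, today: date) -> int:
    """Pull dateModified out of a page's JSON-LD block and return its
    age in days. Assumes a single schema.org Article script tag; real
    pages may nest or split their structured data differently.
    """
    match = re.search(
        r'<script type="application/ld\+json">(.*?)</script>', html, re.S
    )
    data = json.loads(match.group(1))
    modified = date.fromisoformat(data["dateModified"][:10])
    return (today - modified).days

# Invented example page for illustration
page = """<script type="application/ld+json">
{"@type": "Article", "headline": "The Best Robot Vacuums",
 "dateModified": "2022-05-18"}
</script>"""

print(article_age_days(page, date(2022, 5, 25)))  # -> 7
```

Running a check like this across the top results for a target query gives a quick read on the refresh cadence you would be competing against.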
Even though these articles are more “evergreen” and not tied to the news cycle, it’s obvious that these sites are placing a high emphasis on freshness with frequent article updates. This indicated to us that there might be more benefits to freshness than just news story results.
Performing a test
We decided to start testing the concept of freshness on our own blog to see what the impact of these updates could be. We had an article on automotive SEO that used to perform quite well for “automotive seo” queries. However, in recent years, this page lost a lot of organic traffic:
The article still contained evergreen information, but it hadn’t been updated since 2016:
It was the perfect candidate for our test. To perform this test, we made only three changes to the article:
Updated the content to ensure it was all current. This changed less than 5% of the text.
Added “2022” to the title tag.
Updated the timestamp.
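Those last two changes can be sketched in a few lines. The `refresh_metadata` helper and the article title below are hypothetical; the field names loosely follow schema.org’s Article properties, and your CMS will expose its own equivalents.

```python
from datetime import date

def refresh_metadata(title: str, today: date) -> dict:
    """Append the current year to the title tag and bump the timestamp.

    A minimal sketch of the title and timestamp updates; "dateModified"
    follows schema.org Article naming, but the structure is invented.
    """
    year = str(today.year)
    if year not in title:
        title = f"{title} ({year})"
    return {"title": title, "dateModified": today.isoformat()}

print(refresh_metadata("The Ultimate Guide to Automotive SEO", date(2022, 5, 25)))
# -> {'title': 'The Ultimate Guide to Automotive SEO (2022)',
#     'dateModified': '2022-05-25'}
```

The year check keeps the function idempotent, so re-running a refresh later in the same year won’t stack duplicate “(2022)” suffixes onto the title.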
Immediately, we saw rankings improve for the keyword “automotive seo”. We moved from ranking on the third page to the first page the day after we updated the content:
To verify these results, we tested the concept on another page. For this next article, we updated only the timestamp and title tag, with no changes to the on-page content. While we normally wouldn’t recommend doing this, it was the only way to isolate whether “freshness” itself, rather than the content adjustments, was driving the change.
However, after making these two updates, we could clearly see an immediate improvement to the traffic of the second page:
These two experiments combined with other tests we’ve performed are showing us that Google places value on the recency of content. This value extends beyond just articles tied to the news cycle.
Why does Google care?
Thinking about this more holistically, Google’s use of freshness makes sense in light of its E-A-T initiatives. The whole concept of E-A-T is that Google wants to rank content it can trust (written by experts, citing facts) above other search results. Google has a borderline public responsibility to ensure that the content it serves is accurate, so it’s in the search giant’s best interest to surface content it believes it can trust.
So how does freshness play into this? Well, if Google thinks content is outdated, how is it supposed to trust that the information is accurate? If the search engine sees that your article hasn’t been updated in five years while competitors have more recent content, that might be a signal that their content is more trustworthy than yours.
For example, for the term “best camera phones”, would you want to read an article last updated two years ago? For that matter, would you even want an article last updated six months ago?
As we can see, Google is only ranking pages that have been updated within the last one or two months. That’s because the technology changes so rapidly in this space that, unless you’re updating your articles every couple of months or so, you’re dramatically behind the curve.
The concept of freshness also makes sense from a competitive perspective. One of the biggest weaknesses of an indexation engine is that it’s inherently hard to serve real-time results. To find when content changes, a search engine needs time to recrawl and reindex content. When combined with the demands of crawling the web at scale, this becomes extremely difficult.
On the other hand, social media sites like Twitter don’t have this issue and are made to serve real-time content. The platform isn’t tasked with indexing results, and engagement metrics can help quickly surface content that’s gaining traction. As a result, Twitter does a much better job of surfacing trending content.
Thinking about the web from a platform-based perspective, it makes sense that most users would choose Twitter over Google when looking for real-time information. This poses a big threat to Google, as it gives users a reason to migrate off the ecosystem, presenting fewer opportunities to serve ads.
Recently, you’ll notice a lot more “live blog posts” in Top Stories. These articles utilize LiveBlogPosting structured data, which signals to Google that the content is being updated in real time. While looking for real-time URLs across the entire web is daunting, this structured data type helps Google narrow in on the content it needs to crawl and index more frequently.
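For publishers who want to experiment with this, the markup itself is small. Here’s a minimal sketch that assembles `LiveBlogPosting` JSON-LD with one `BlogPosting` entry per update; the property names come from schema.org’s LiveBlogPosting type, while the headline and update payloads are invented for illustration.

```python
import json

def live_blog_jsonld(headline: str, updates: list) -> str:
    """Assemble LiveBlogPosting JSON-LD with one BlogPosting per update.

    Property names follow schema.org's LiveBlogPosting type; the
    example content passed in below is invented.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "LiveBlogPosting",
        "headline": headline,
        "coverageStartTime": updates[0]["datePublished"],
        "liveBlogUpdate": [
            {"@type": "BlogPosting", **u} for u in updates
        ],
    }, indent=2)

updates = [
    {"headline": "Coverage begins", "datePublished": "2022-05-25T09:00:00Z"},
    {"headline": "New development", "datePublished": "2022-05-25T09:28:00Z"},
]
print(live_blog_jsonld("Live updates: example story", updates))
```

Each new update appended to `liveBlogUpdate` (with the script re-rendered into the page) is the signal that the URL is being refreshed in real time.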
Google seems to be aggressively pushing these live blogs, as they often see strong visibility in Top Stories results:
This might be a strategic move to encourage publishers to create real-time content. The goal here could be increased adoption of content that’s updated in real-time with the end result of showcasing to users that they can get this type of content on Google, not just Twitter.
Utilizing these concepts moving forward
I think as an industry, there’s sometimes room for us to be more creative when thinking about our on-page optimizations. When looking at pages that have lost traffic and positions over time, it’s worth checking whether that content has also gone stale. Through testing and experimentation, you can see whether refreshing your content has a noticeable positive impact on rankings.