Recorded at SearchLove London in October, in the wake of three Google updates, this Whiteboard Friday sees Tom present a different take on core updates.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Happy Friday, Moz fans. So I’m here at SearchLove London recording this Whiteboard Friday. I don’t know when it will reach you, but this is a bit of a different take on how to think about core updates. Obviously, I’m filming this in October, and we’ve just had three updates back-to-back.
I think it’s quite interesting that we had three updates and they were described in very different ways: the helpful content update, a core update, and the product reviews update. It’s interesting that sometimes Google talks about updates very specifically. The best examples are things like HTTPS, Core Web Vitals, or the page experience update, where they’re very concrete about what they’re going to do, how they’re going to do it, how they’re going to measure it, and how it’s going to impact the algorithm.
Then you have core updates, where they do say things, but they tend to say the same thing every time. With every single core update, they’ve said: make good content, work on your expertise, authoritativeness, and trustworthiness. That isn’t very concrete. It’s not very specific about what they’ve changed this particular time.
Indeed, if you’re a site that’s affected by these updates, it can feel quite random. It can feel like you’re just going upwards or downwards, with no particular rhyme or reason. So how can that be? I want to give you some different ways to think about that.
So there are two different ways I’d like to focus on. One of them is this concept of a refresh. Google used to talk a lot about algorithm refreshes, up until about 2012. What they meant was something different from an algorithm update. It wasn’t called an update; it was called a refresh, as something distinct. They were trying to say that this would be a kind of mini reset of how the algorithm was thinking about certain things.
If you look at how they talk about core updates in their documentation, they say things like, “your site might not recover until the next core update.” So you’d have situations like this: your rankings are in blue, and your competitor’s rankings are in red. Your competitor has gradually improved their site over time. They’ve not been recognized for it, and then a core update comes along and suddenly they go up. Your position goes down, and you’re left thinking, “Oh, that was a little bit random.” But, of course, it wasn’t random. They were just suddenly being recognized for things they’d worked on gradually over time.
The other concept I’d like to talk about is the extent to which Google is iteratively testing over time. Again, they talk about this in their own documentation. There’s an article from back in 2018 that I’ve come back to quite a few times (it’ll probably be linked beneath), where they invited some journalists to a meeting of their search engineering team. In that meeting, they talked about some changes they were making to the SERPs, and about how they were going to run things as a test, look at certain metrics, and see whether those improved. So it’s important to remember that Google has their own metrics that they’re iterating towards. They’re not necessarily saying, “Oh, your site is bad,” or, “There’s something wrong with your site.” They might be saying, “Well, what we’re aiming for at this point could benefit some sites more than others.” And ultimately, if one site goes up, another has to go down.
Indeed, this shows up in MozCast data. There have now been 12 core updates since Medic (technically there were some core updates before Medic, but I think the industry has been very focused on them since Medic). If we look at the sites that were affected by at least 4 of those 12 updates, the vast majority in MozCast data had both some major positive movements and some major negative movements. This tiny green slice represents the sites that only saw positive movements, and this red slice the sites that only saw negative movements. So it’s incredibly unusual to have mono-directional movement, which just shows that people are winning and losing as Google tests different things. It’s not that some sites are simply better suited to core updates and win every time. That’s very, very rare.
I also want to talk a little bit about the longer term. I think it’s important, when we think about these updates, to zoom out, because the short-term effects can seem more random, harder to explain, and harder to predict. So I’ve looked at a lot of sites in the MozCast data over time and how they’ve been impacted by each update.
So this is an example, and obviously it’s drawn on a whiteboard, so it’s not super precise. But the example I’ve attempted to illustrate here is actually Reuters, the news organization. I’ve chosen them because this is a site that obviously produces a lot of original content and is very authoritative. It’s hard to criticize on the criteria that Google likes to talk about in its core update discussions and announcements. These bars represent how it was affected by each core update over a period of time: some big negative hits, and not many serious gains. So this doesn’t look very good. But if you track how their visibility grew within MozCast over time, it looks a little bit different: it grows gradually. So even though they sometimes took big hits on the days of the specific updates (comparing the week before with the week after), there are long periods of time between these updates, so they might still be able to grow.
So say there might be three months between these bars. Even though they took a big hit here, they’re growing over the next three months. Over here, they take this big hit, but they’ve more than recovered it by the time they get to the next update; they take this hit, and more than recover it by the time they get to the next one. That could be their SEO team working some magic behind the scenes, but this is quite a consistent trend; it happens to a lot of sites. What I suspect is actually happening is that when Google launches a core update, they’re, to some degree, resetting certain things: looking at things afresh, valuing different metrics. Then, over time, whatever historically was making that site perform creeps back in and starts to be considered again.
So I hope that was interesting. Those are just a few different ways to think about core updates besides the usual messaging we get, which is very consistently just E-A-T and good content. I’m not saying you shouldn’t do those things; they’re important, and the longer-term trend you see with a lot of sites that do do those things shows how important they are. But when you look at individual updates, keep in mind that it’s not necessarily that Google is suddenly optimizing for those things more. They’re just iterating over time. That’s all from me. Thanks.