What marketers need to know about Google Assistant and Google Home

Voice search is growing, and Google is investing heavily in products that utilize this technology. Contributor Joe Youngblood discusses these products and their implications for marketers.

The full post appeared first on Search Engine Land.

Google’s recent SERP changes and tests: everything you need to know

Over the past couple of weeks, Google has been spotted introducing some interesting changes to the look and layout of its search results pages.

The first, as we reported in our round-up of key search marketing news on Friday, is that Google appears to have increased the length limit for title tags in the SERPs to around 69-70 characters.

Title tags on mobile have lengthened even further, some now clocking in at a whopping 78 characters, giving rise to a bit of a dilemma for SEOs wanting to optimise for both desktop and mobile.
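To make the dilemma concrete, a hypothetical helper could flag titles that fit one limit but not the other. The 70- and 78-character limits below are assumptions taken from the observations reported in this article, not values published by Google, and real truncation is measured in pixels rather than characters:

```python
# Assumed limits based on the lengths reported in this article
# (~70 characters on desktop, ~78 on mobile). Illustrative only.
DESKTOP_LIMIT = 70
MOBILE_LIMIT = 78

def title_fit(title: str) -> dict:
    """Report whether a title fits the assumed desktop and mobile limits."""
    return {
        "length": len(title),
        "fits_desktop": len(title) <= DESKTOP_LIMIT,
        "fits_mobile": len(title) <= MOBILE_LIMIT,
    }

# A 71-character title would display in full on mobile but truncate on desktop:
print(title_fit("Autonomous sensory meridian response - Wikipedia, the free encyclopedia"))
```

A site owner torn between the two limits could run their key pages' titles through a check like this before deciding which audience to optimise for.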

Featured snippets have also grown in size, while descriptions now seem to be able to fit a larger number of characters onto one line, although the overall length of descriptions appears to be staying the same for the moment.

Meanwhile, Google is also conducting some A/B testing of different link colours, in a move reminiscent of its infamous “fifty shades of blue” test in 2009. Some users have noticed Google trialling different shades of blue in their search results, while others are protesting a change in link colour from blue to black.

So what do we know so far about the new changes, and what impact could they have on SEO if they’re here to stay?

Extended title tags, descriptions and more

The longer titles in Google search results were first spotted by the sharp-eyed Ross Hudgens, who reported on Twitter that he was seeing title tags of 69 and 70 characters on Google.

While it’s still not confirmed whether the longer titles are a test or here to stay, at the time of writing (nearly a fortnight on from Hudgens’ tweet), I’m still seeing title tags as long as 68 and 69 characters in search results on Google.co.uk.

70-character title tags were once the norm on Google, until Google increased the font size of titles from 16 pixels to 18 pixels in 2014.

After that, the optimum length for title tags was between 55 and 60 characters. But since Google did away with right-hand-side ads on the SERP earlier this year (possibly with the eventual goal of widening search results in mind), it has had a bit more width to play with.

Then and now: the old, narrower style of search results (left), with result titles cut off after eight or nine words, versus the new, wider style (right), with title tags displayed in full.

The SEM Post reports that altogether, the width of Google search results has increased from 500 to about 600 pixels wide, decreasing the space between results and the Knowledge Panel on the right-hand side by 5 pixels to make room.

It’s important to note that Google actually measures search results and title length in pixels, not characters. Characters are simply a more practical proxy, which is why character limits can only ever be approximate: a character like ‘w’ takes up much more pixel space than an ‘i’ or an ‘l’.
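A toy sketch makes the point: assign rough per-glyph pixel widths and two strings with an identical character count come out at very different widths. The widths below are illustrative guesses for an ~18px font, not Google's actual font metrics:

```python
# Invented per-glyph widths, purely for illustration.
NARROW = set("iljft.,'|!")   # ~5 px each
WIDE = set("mwMW")           # ~16 px each
AVG_PX = 9                   # everything else, ~9 px

def estimated_px_width(text: str) -> int:
    """Estimate the rendered width of a title in pixels."""
    width = 0
    for ch in text:
        if ch in NARROW:
            width += 5
        elif ch in WIDE:
            width += 16
        else:
            width += AVG_PX
    return width

# Same character count, very different estimated pixel widths:
print(estimated_px_width("wwwwwwwwww"))  # ten wide glyphs
print(estimated_px_width("iiiiiiiiii"))  # ten narrow glyphs
```

This is why two 65-character titles can behave differently in the SERP: one may fit while the other truncates.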

It’s not just title tags that have been affected by the widening of search results: meta descriptions are also being given more room on each line (about 100 characters), and in some cases are displaying three lines of description instead of the usual two, which is a huge change.

However, Google is still cutting many of them off at two lines and around 150 characters, meaning it’s probably too early to start rewording your meta descriptions just yet.

The width of featured snippets has also increased from 556 pixels wide to about 645 pixels, while the height has decreased by about 30 pixels, leaving the same amount of text within the box but also giving more room for title tags.

Finally, the three-pack of search results that appear at the top of the page when you make a local search has also widened to the same width as featured snippets while keeping the same height, allowing Google to add in a little more detail about the locations of local businesses.

Left: the old ‘map pack’ of local search results, showing branch name, street address, phone number and opening hours; right: the new, wider map pack, which adds distance details (e.g. 1.3 miles).

Meanwhile, on mobile…

Google’s SERP changes don’t just apply to desktop. The length of mobile title tags has reportedly increased to as much as 78 characters, putting them at a couple of keywords longer than the new title tags on desktop.

The change on mobile was first reported by Jennifer Slegg on the SEM Post a couple of days ago, although interestingly, I have a screenshot from April showing a search result with a 76-character title tag – which might indicate that I was part of one of Google’s famous “1%” tests in which it trials a new change with just 1% of its users.

A search results screenshot taken on 25th April, showing a mobile title tag 76 characters long (“Does any one else get this? : I have felt shivers up and down my spine story ...”).

The three-pack of local search results on mobile also seems to have changed a little, at least since this screenshot that I took back in February, with Google choosing to include ‘distance from you’ information even at the expense of showing address details, possibly to provide consistency with the level of detail that desktop search now features.

Two mobile searches for ‘Marks and Spencer near me’, one from 29th February (left) and a more recent one (right): both show brief address details and opening hours, but the newer screenshot also includes distance details (e.g. 0.1 miles).

What could it all mean for SEO?

I should stress first of all that none of these changes are confirmed to be permanent, and could revert back or change further over the coming weeks and months as Google refines its plans for the SERP.

It’s definitely too early to start altering your strategy based on what has been reported so far, especially as there are a lot of different changes being reported to different extents by different users.

But there’s no harm in considering how they might affect SEO, especially if these changes are any indication of what Google has in store for the future.

The most immediately obvious benefit of longer title tags and descriptions is more room to fit in keywords. While this is true, it’s also 2016, which means we can do a lot more interesting things with the extra characters – after all, it’s not just about keywords any more.

Longer title tags and descriptions would give SEOs more leeway to use long-tail keywords. (Image: a long-tailed widowbird, by Dr. Ron Matson, available via CC0, public domain)

The first is that longer titles and descriptions would allow more room for long-tail keywords, three or four-word phrases that tend to indicate that searchers are closer to taking action or making a purchase, making them extremely valuable for marketers.

Similarly, the increased room in search results would give website owners more opportunity to use natural language in their titles and descriptions.

Natural language queries are search queries that use full, everyday language instead of short, disjointed keywords. They are becoming more common as voice search is on the rise and search engines are increasingly able to interpret specific, multi-part queries.

In theory, SEOs should adapt to this trend in order to give their websites the best chance of ranking for natural language search queries, but with a limit on how much will fit on the search results page, this is easier said than done.

Extending the length of titles and descriptions might be Google’s way of acknowledging this and adjusting the SERP to allow for it.

Finally, another benefit of longer title tags is that in some cases, it can make room for brand names at the end of a title, which is useful for making sure your brand stamp is on a search result without necessarily having to sacrifice any valuable content.

The increased width of featured snippets probably won’t make a big difference as long as the amount of text within them stays the same, but they’re still an excellent way to leapfrog your search competition if you know how.

Meanwhile, the wider map pack could be great news for local SEO, as it gives enough room to display short descriptions plus precise address details, distance (at least on Google.co.uk – I’ve noticed screenshots from Google.com don’t appear to have this feature) and phone number, all without truncating.

In the new, wider three-pack of local results, each business gets a short description, an exact location (e.g. Kings Mall Shopping Centre), a distance (e.g. 0.2 miles), a street address (e.g. 43 King Street) and a phone number, with opening hours below.

The new length difference between mobile and desktop title tags probably presents the most interesting challenge for SEOs. There has always been a bit of a difference in the way that title tags display on desktop versus mobile, with mobile title tags displaying across two lines instead of one line on desktop.

But if the longer title tags that we’ve been seeing on desktop and mobile both become hard changes, then many site owners will have to make a call: optimise for desktop, or for mobile? Or try to find a middle ground?

Ultimately it will depend on context: the type of business, the information that needs to be included in the title tag, and the types of customers that the business most wants to cater towards. A business with primarily mobile traffic might decide that it’s worth sacrificing a bit of readability on desktop in order to improve their chances on mobile.

There’s also the possibility that more website owners will choose to spin off their mobile presence into a separate site – m.example.com – instead of simply employing a responsive design for their main site, in order to have the best of both worlds.

Paint it Black

Last but not least, we have the news that Google has been testing different link colours on some of its users, with searchers reporting seeing a different shade of blue in their search results as well as, much more outrageously, all-black links.

Instead of the usual purplish colour to indicate a previously visited link, the black search results are reported to turn a lighter grey when clicked on.

Setting aside the depressing visual impact of an almost-all-black results page, Google’s colour change experiments have implications for search accessibility, particularly for users who are colour-blind or have low vision.

Back in 2014, when Google removed the underlines from hyperlinks in its search results, leaving colour as the only visual clue as to which text was a link, Adrian Roselli wrote that “Google misses the mark [for accessibility] in that the blue hyperlinks don’t have sufficient contrast with the rest of the text on the page.”

The darker blue text that Google is trialling in some places is an improvement on this, but the black link text that has caused so much upset comes with an additional set of problems.

With the exception of URLs just below the link text, which are still green, all of the text on Google’s SERPs is the same colour, with no other visual cues to distinguish links from regular text. Roselli explains why Google’s changes to the SERP fail accessibility guidelines on multiple levels:

“By dropping underlines, Google already ran afoul of Success Criterion 1.4.1 (which is only Level A) which requires something other than color to be used to convey information (such as the presence of a link).

By making the links the same color as the surrounding text, it now pushes that failure even further by not providing a 3:1 contrast ratio between the links and the surrounding text (which is outlined in Techniques for WCAG 2.0 as item G183).

Again, please don’t follow Google’s lead.”
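The 3:1 ratio Roselli cites comes from WCAG 2.0's contrast-ratio formula, which can be computed directly from two sRGB colours. The link colour below is the classic Google link blue (#1a0dab); the calculation shows why identically-coloured links and text automatically fail the guideline:

```python
# WCAG 2.0 contrast-ratio calculation (the formula behind technique G183).
def relative_luminance(rgb):
    """Relative luminance of an sRGB colour given as 0-255 channels."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """Contrast ratio between two colours, ranging from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

BLACK_TEXT = (0, 0, 0)
GOOGLE_LINK_BLUE = (26, 13, 171)  # the classic #1a0dab link colour

# Black links on black body text: a ratio of exactly 1:1, an automatic
# failure of the 3:1 guideline when colour is the only cue for a link.
print(round(contrast_ratio(BLACK_TEXT, BLACK_TEXT), 2))
print(round(contrast_ratio(GOOGLE_LINK_BLUE, BLACK_TEXT), 2))
```

Notably, even the familiar blue-on-black pairing comes in under 3:1, which is consistent with Roselli's original complaint about the blue links.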

The impact of this change, at least on the black SERPs that Google is trialling, could manifest in decreased click-through rates from users who find the new colour scheme visually confusing – or who find the new black links too depressing to click on.

View full post on Search Engine Watch

Why you NEED to raise organic CTR (and 4 ways to do it)

Does organic click-through rate (CTR) data impact page rankings on Google? This has been a huge topic of speculation for years within the search industry.

Why is there such a debate? Well, often people get hung up on details and semantics (are we talking about a direct or indirect ranking factor?), Google patents (which may or may not even be in use), and competing theories (everyone’s got an opinion based off something they heard or read). To make matters more confusing, Google is less than forthcoming about the secrets of their algorithm.

But if CTR truly does impact Google’s organic search rankings, shouldn’t we be able to measure it? Yes!

In this post, I’ll share some intriguing data on the relationship between Google CTR and rankings. I’ll also share four tips for making sure your Google click-through rates on the organic SERPs are where they need to be.

[Image: four organic CTR hacks]

To be clear: my goal with this post is to provide just a brief background and some actionable insights about the topic of organic click-through rates on Google. We won’t dissect every tweet or quote ever made by anyone at Google, dive deep into patents, or refute all the SEO theories about whether CTR is or isn’t a ranking factor. I’m sharing my own theory based on what I’ve seen, and my recommendations on how to act on it.

Google CTR & rankings: Yes! No! Who bloody knows!

Eric Enge of Stone Temple Consulting recently published a post with a headline stating that CTR isn’t a ranking factor. He clarifies within that post that Google doesn’t use CTR as a direct ranking factor.

What’s the difference between a direct and indirect ranking factor? Well, I suggest you watch Rand Fishkin’s awesome video on this very topic.

Basically, we know certain things directly impact rankings (I got a link from a reputable website, hooray!), but there are many other things that don’t have a direct impact, but nevertheless do impact ranking (some big-time influencer tweeted about my company and now tons of people are searching for us and checking out our site, awesome!).

It’s essentially the same issue as last touch attribution, which assigns all the credit to the last interaction. But in reality, multiple channels (PPC, organic, social, email, affiliates, etc.) can play important roles in the path to conversion.

The same is true with ranking. Many factors influence ranking.

So here’s my response: Direct, indirect, who cares? CTR might not be a “direct core ranking signal,” but if it impacts rank (and I believe it does), then it matters. Further, even if it doesn’t impact rank, you should still care!

But don’t take my word for it that Google has the technology. Check out these slides from Google engineer Paul Haahr, who spoke at SMX:

[Slide: how does Google use click data?]

[Slide: Google CTR live experiments]

Also, AJ Kohn put together a good post about Google click-through rate as a ranking signal last year. He included a couple of eye-opening quotes that I’ll share here because they are important. The first is from Edmond Lau, a former Google engineer:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. Infrequently clicked results should drop toward the bottom because they’re less relevant, and frequently clicked results bubble toward the top. Building a feedback loop is a fairly obvious step forward in quality for both search and recommendations systems, and a smart search engine would incorporate the data.”

The second is from Marissa Mayer in 2007, talking about how Google used CTR to determine when to display a OneBox:

“We hold them to a very high click-through rate expectation and if they don’t meet that click-through rate, the OneBox gets turned off on that particular query. We have an automated system that looks at click-through rates per OneBox presentation per query. So it might be that news is performing really well on Bush today but it’s not performing very well on another term, it ultimately gets turned off due to lack of click-through rates. We are authorizing it in a way that’s scalable and does a pretty good job enforcing relevance.”

Also, check out this amazing excerpt from an FTC document that was obtained by the WSJ:

“In addition, click data (the website links on which a user actually clicks) is important for evaluating the quality of the search results page. As Google’s former chief of search quality Udi Manber testified:

‘The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure out, well, probably Result 2 is the one people want. So we’ll switch it.’

Testimony from Sergey Brin and Eric Schmidt confirms that click data is important for many purposes, including, most importantly, providing ‘feedback’ on whether Google’s search algorithms are offering its users high quality results.”
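The feedback loop Manber describes can be sketched as a toy reranker: when a result consistently attracts far more clicks than the one above it, swap them. The threshold and click shares below are invented for illustration; Google's real system is obviously far more sophisticated:

```python
# Toy sketch of a click-feedback reranking loop. Data and threshold are made up.
def rerank_by_clicks(results, click_share, ratio=2.0):
    """Promote a result above its neighbour when its click share is
    more than `ratio` times higher."""
    ranked = list(results)
    changed = True
    while changed:  # bubble better-clicked results upward until stable
        changed = False
        for i in range(len(ranked) - 1):
            a, b = ranked[i], ranked[i + 1]
            if click_share.get(b, 0) > ratio * click_share.get(a, 0):
                ranked[i], ranked[i + 1] = b, a
                changed = True
    return ranked

# Manber's hypothetical: 10% of people click result 1, 80% click result 2.
shares = {"result-1": 0.10, "result-2": 0.80, "result-3": 0.05}
print(rerank_by_clicks(["result-1", "result-2", "result-3"], shares))
```

In this toy version, result 2 is promoted above result 1, exactly the "so we'll switch it" behaviour Manber describes.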

Why organic Google CTR matters

If you have great positions in the SERPs, that’s awesome. But even high rankings don’t guarantee visits to your site.

What really matters is how many people are clicking on your listing (and not bouncing back immediately). You want to attract more visitors who are likely to stick around and then convert.

In 2009, the head of Google’s webspam team at the time, Matt Cutts, was asked about the importance of maximizing your organic CTR. Here’s a key quote that says it all:

“It doesn’t really matter how often you show up. It matters how often you get clicked on and then how often you … convert those to whatever you really want (sales, purchases, subscriptions)… Do spend some time looking at your title, your URL, and your snippet that Google generates, and see if you can find ways to improve that and make it better for users because then they’re more likely to click. You’ll get more visitors, you’ll get better return on your investment.”

In another video, he talked about the importance of titles, especially on your important web pages: “you want to make something that people will actually click on when they see it in the search results – something that lets them know you’re gonna have the answer they’re looking for.”

Bottom line: Google cares a lot about overall user engagement with the results they show in the SERPs. So if Google is testing your page for relevancy to a particular keyword search, and you want that test to go your way, you better have a great CTR (and great content and great task completion rates). Otherwise, you’ll fail the quality test and someone else will get chosen.

Testing the real impact of organic CTR on Google

Rand Fishkin conducted one of the most popular tests of the influence of CTR on Google’s search results. He asked people to do a specific search and click on the link to his blog (which was in 7th position). This impacted the rankings for a short period of time, moving the post up to 1st position.

[Image: IMEC Lab Google CTR ranking test results]

But these are all temporary changes. The rankings don’t persist, because the inflated CTRs aren’t natural.

It’s like how you can’t increase your AdWords Quality Scores simply by clicking on your own ads a few times. This is the oldest trick in the book and it doesn’t work. (Sorry.)

Isn’t CTR too easy to game?

The results of another experiment appeared on Search Engine Land last August and concluded that CTR isn’t a ranking factor. But this test had a pretty significant flaw: it relied on bots artificially inflating CTRs and search volume (and the test covered only a single two-word keyword: “negative SEO”). So essentially, this test was the organic search equivalent of click fraud.

I’ve seen a lot of people saying Google will never use CTR in organic rankings because “it’s too easy to game” or “too easy to fake.” I disagree. Google AdWords has been fighting click fraud for 15 years and they can easily apply these learnings to organic search. There are plenty of ways to detect unnatural clicking. What did I just say about old tricks?

Before we look at the data, a final “disclaimer.” I don’t know if what this data reveals is due to RankBrain, or another machine-learning-based ranking signal that’s already part of the core Google algorithm. Regardless, there’s something here – and I can most certainly say with confidence that CTR is impacting rank.

NEW DATA: Does organic CTR impact SEO rankings?

Google has said that RankBrain is being tested on long-tail terms, which makes sense. Google wants to start testing its machine-learning system on searches it has little to no data on – and 99% of pages have zero external links pointing to them.

How is Google able to tell which pages should rank in these cases?

By examining engagement and relevance. CTR is one of the best indicators of both.

High-volume head terms, as far as we know, aren’t being exposed to RankBrain right now. So by observing the differences between the organic search CTRs of long-tail terms versus head terms, we should be able to spot the difference:

[Chart: Google organic CTR versus search position, head terms versus long-tail terms]

So here’s what we did: We looked at 1,000 keywords in the same keyword niche (to isolate external factors like Google shopping and other SERP features that can alter CTR characteristics). The keywords are all from my own website: wordstream.com.

I compared CTR versus rank for one- or two-word search terms, and did the same thing for long-tail keywords (search terms of between 4 and 10 words).

Notice how the long-tail terms get much higher average CTRs for a given position. For example, in this data set, the head term in position 1 got an average CTR of 17.5%, whereas the long-tail term in position 1 had a remarkably high CTR, at an average of 33%.
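The bucketing behind this comparison can be sketched as follows. The sample rows are invented to echo the figures above; in practice the data would come from a Search Console export:

```python
from collections import defaultdict

def bucket(query):
    """Classify a query as head (1-2 words) or long-tail (4-10 words)."""
    n = len(query.split())
    if n <= 2:
        return "head"
    if 4 <= n <= 10:
        return "long-tail"
    return "other"

def avg_ctr_by_position(rows):
    """rows: (query, position, clicks, impressions) tuples.
    Returns average CTR keyed by (bucket, position)."""
    totals = defaultdict(lambda: [0, 0])  # (bucket, position) -> [clicks, impressions]
    for query, pos, clicks, imps in rows:
        key = (bucket(query), pos)
        totals[key][0] += clicks
        totals[key][1] += imps
    return {k: c / i for k, (c, i) in totals.items() if i}

# Invented sample data, chosen to mirror the article's 17.5% vs 33% figures:
rows = [
    ("ppc", 1, 175, 1000),
    ("how to lower ppc costs", 1, 330, 1000),
    ("adwords", 2, 120, 1000),
    ("what is a good quality score", 2, 200, 1000),
]
print(avg_ctr_by_position(rows))
```

Running the same aggregation on your own query data lets you draw the equivalent head-versus-long-tail CTR curves for your niche.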

You’re probably thinking: “Well, that makes sense. You’d expect long-tail terms to have stronger query intent, thus higher CTRs.” That’s true, actually.

But why is it that long-tail terms with high CTRs are so much more likely to be in top positions than in bottom-of-page organic positions? That’s a little weird, right?

OK, let’s do an analysis of paid search queries in the same niche. We use organic search to come up with paid search keyword ideas and vice versa, so we’re looking at the same keywords in many cases.

[Chart: Google CTR versus position in paid search]

Long-tail terms in this same vertical get higher CTRs than head terms. However, the difference between long-tail and head term CTR is very small in positions 1–2, and becomes huge as you go out to lower positions.

So in summary, something unusual is happening:

  • In paid search, long-tail and head terms get roughly the same CTR in high ad spots (1–2), with huge differences in CTR for lower spots (3–7).
  • But in organic search, long-tail and head terms in spots 1–2 have huge differences in CTR, with very little difference as you go down the page.

Why are the same keywords behaving so differently in organic versus paid?

The difference (we think) is that pages with higher organic click-through rates are getting a search ranking boost.

How to beat the expected organic search CTR

CTR and ranking are codependent variables. There’s obviously a relationship between the two, but which is causing what? In order to get to the bottom of this “chicken versus egg” situation, we’re going to have to do a bit more analysis.

The following graph takes the difference between an observed organic search CTR minus the expected CTR, to figure out if your page is beating — or being beaten by — the expected average CTR for a given organic position.

By only looking at the extent by which a keyword beats or is beaten by the predicted CTR, you are essentially isolating the natural relationship between CTR and ranking in order to get a better picture of what’s going on.

[Chart: rewards and penalties for beating or missing the expected CTR for a given position]

We found that, on average, if you beat the expected CTR, then you’re far more likely to rank in more prominent positions. Failing to beat the expected CTR makes it more likely you’ll appear in positions 6–10.

So, based on our example of long-tail search terms for this niche, if a page:

  • Beats the expected CTR for its position by 20 percent, it’s likely to appear in position 1.
  • Beats the expected CTR for its position by 12 percent, it’s likely to appear in position 2.
  • Falls below the expected CTR for its position by 6 percent, it’s likely to appear in position 10.

And so on.
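The observed-minus-expected calculation itself is simple to reproduce for your own keywords. The baseline CTR for each position below is an invented stand-in for the niche-average curve discussed above, not real data:

```python
# Invented position-average CTR baseline, for illustration only.
EXPECTED_CTR = {1: 0.31, 2: 0.14, 3: 0.10, 4: 0.07, 5: 0.055,
                6: 0.045, 7: 0.04, 8: 0.035, 9: 0.03, 10: 0.028}

def ctr_delta(position: int, observed_ctr: float) -> float:
    """Positive delta means beating the expected CTR for that position."""
    return observed_ctr - EXPECTED_CTR[position]

print(ctr_delta(1, 0.51))   # well above the position-1 baseline
print(ctr_delta(10, 0.01))  # below the position-10 baseline
```

A keyword's sign and size of delta is then the quantity to track, rather than its raw CTR, since raw CTR is dominated by position.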

Here’s a greatly simplified rule of thumb:

The more your pages beat the expected organic CTR for a given position, the more likely you are to appear in prominent organic positions.

If your pages fall below the expected organic Google search CTR, then you’ll find your pages in lower organic positions on the SERP.

Want to move up by one position in Google’s rankings? Increase your CTR by 3%. Want to move up another spot? Increase your CTR by another 3%.

If you can’t beat the expected click-through rate for a given position, you’re unlikely to appear in positions 1–5.

Essentially, you can think of all of this as though Google is giving bonus points to pages that have high click-through rates. The fact that it looks punitive is just a natural side effect.

If Google gives “high CTR bonus points” to other websites, then your relative performance will decline. It’s not that you got penalized; it’s just that you didn’t get the rewards.

Four crucial ways to raise your Google CTRs

Many “expert” SEOs will tell you not to waste time trying to maximize your CTRs since it’s supposedly “not a direct ranking signal.” “Let’s build more links and make more infographics,” they say.

I couldn’t disagree more. If you want to rank better, you need to get more people to your website. (And getting people to your website is the whole point of ranking anyway!)

AdWords and many other technologies look at user engagement signals to determine page quality and relevance. We’ve already seen evidence that CTR is important to Google.

So how do you raise your Google CTRs – not just for a few days, but in a sustained way? You should focus your efforts in four key areas:

  1. Optimize pages with low “organic Quality Scores.” Download all of your query data from the Google Search Console. Sort your data, figure out which of your pages have below-average CTRs, and prioritize those. Don’t risk turning one of your unicorn pages with an awesome CTR into a donkey with a terrible CTR! It’s far less risky turning a donkey into a unicorn!
  2. Combine your SEO keywords with emotional triggers to create irresistible headlines. Emotions like anger, disgust, affirmation, and fear are proven to increase click-through rates and conversion rates. If everyone who you want to beat already has crafted optimized title tags, then packing an emotional wallop will give you the edge you need and make your listing stand out.
  3. Work to improve other user engagement metrics. Like click-through rate, we believe you need to have better-than-expected engagement metrics (e.g. time on site and bounce rate). This is a critical relevance signal! Google has enough data to know the expected conversion and engagement rates based on a variety of factors (e.g. industry, query, location, time of day, device type). If your content performs well, you’re likely going to get a rankings boost. If your content does poorly, there’s not necessarily a penalty, but you definitely won’t get any bonus points.
  4. Use social media ads and remarketing to increase search volume and CTR. Paid social ads and remarketing display ads can generate serious awareness and exposure for a reasonable cost (no more than $50 a day). If people aren’t familiar with your brand, bombard your target audience with Facebook and Twitter ads. People who are familiar with your brand are 2x more likely to click through and to convert!
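The first tip above can be sketched as a short script over a Search Console export. The field names and the baseline CTR curve here are assumptions for illustration, not the actual export schema:

```python
# Invented position-average baseline; real values would come from your own data.
EXPECTED_CTR = {1: 0.31, 2: 0.14, 3: 0.10, 4: 0.07, 5: 0.055}

def pages_to_optimise(rows):
    """rows: dicts with assumed keys 'page', 'position', 'ctr'.
    Returns underperforming pages, biggest CTR shortfall first."""
    scored = []
    for row in rows:
        baseline = EXPECTED_CTR.get(round(row["position"]))
        if baseline is None:
            continue  # no baseline for this position in the sketch
        scored.append((row["ctr"] - baseline, row["page"]))
    scored.sort()  # most negative delta (worst shortfall) first
    return [page for delta, page in scored if delta < 0]

# Invented example rows:
rows = [
    {"page": "/unicorn-guide", "position": 1, "ctr": 0.40},  # beating the baseline
    {"page": "/donkey-post", "position": 2, "ctr": 0.05},    # well below 0.14
    {"page": "/meh-article", "position": 3, "ctr": 0.09},    # slightly below 0.10
]
print(pages_to_optimise(rows))
```

The output lists the donkeys first, so your rewriting effort goes where the gap between actual and expected CTR is widest.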

Just say no to low Google CTRs!

You want to make sure your pages get as many organic search clicks as possible. Doing so means more people are visiting your site, which will send important signals to Google that your page is relevant and awesome.

Our research also shows that above-expected user engagement metrics result in better organic rankings, which results in even more clicks to your site.

Don’t settle for average CTRs. Be a unicorn in a sea of donkeys! Raise your CTRs and engagement rates! Get optimizing now!

This article was originally published on the WordStream blog and is reprinted with permission.

View full post on Search Engine Watch

Why you NEED to raise organic CTR’s (and how to do it)

Does organic click-through rate (CTR) data impact page rankings on Google? This has been a huge topic of speculation for years within the search industry.

Why is there such a debate? Well, often people get hung up on details and semantics (are we talking about a direct or indirect ranking factor?), Google patents (which may or may not even be in use), and competing theories (everyone’s got an opinion based off something they heard or read). To make matters more confusing, Google is less than forthcoming about the secrets of their algorithm.

But if CTR truly does impact Google’s organic search rankings, shouldn’t we be able to measure it? Yes!

In this post, I’ll share some intriguing data on the relationship between Google CTR and rankings. I’ll also share four tips for making sure your Google click-through rates on the organic SERPs are where they need to be.

four-organic-ctr-hacks

To be clear: my goal with this post is to provide just a brief background and some actionable insights about the topic of organic click-through rates on Google. We won’t dissect every tweet or quote ever made by anyone at Google, dive deep into patents, or refute all the SEO theories about whether CTR is or isn’t a ranking factor. I’m sharing my own theory based on what I’ve seen, and my recommendations on how to act on it.

Google CTR & rankings: Yes! No! Who bloody knows!

Eric Enge of Stone Temple Consulting recently published a post with a headline stating that CTR isn’t a ranking factor. He clarifies within that post that Google doesn’t use CTR as a direct ranking factor.

What’s the difference between a direct and indirect ranking factor? Well, I suggest you watch Rand Fishkin’s awesome video on this very topic.

Basically, we know certain things directly impact rankings (I got a link from a reputable website, hooray!), but there are many other things that don’t have a direct impact, but nevertheless do impact ranking (some big-time influencer tweeted about my company and now tons of people are searching for us and checking out our site, awesome!).

It’s essentially the same issue as last touch attribution, which assigns all the credit to the last interaction. But in reality, multiple channels (PPC, organic, social, email, affiliates, etc.) can play important roles in the path to conversion.

The same is true with ranking. Many factors influence ranking.

So here’s my response: Direct, indirect, who cares? CTR might not be a “direct core ranking signal,” but if it impacts rank (and I believe it does), then it matters. Further, even if it doesn’t impact rank, you should still care!

But don’t take my word for it that Google has the technology. Check out these slides from Google engineer Paul Haahr, who spoke at SMX:

[Slide: How does Google use click data?]

[Slide: Google CTR live experiments]

Also, AJ Kohn put together a good post about Google click-through rate as a ranking signal last year. He included a couple of eye-opening quotes that I’ll share here because they are important. The first is from Edmond Lau, a former Google engineer:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. Infrequently clicked results should drop toward the bottom because they’re less relevant, and frequently clicked results bubble toward the top. Building a feedback loop is a fairly obvious step forward in quality for both search and recommendations systems, and a smart search engine would incorporate the data.”

The second is from Marissa Mayer in 2007, talking about how Google used CTR to determine when to display a OneBox:

“We hold them to a very high click-through rate expectation and if they don’t meet that click-through rate, the OneBox gets turned off on that particular query. We have an automated system that looks at click-through rates per OneBox presentation per query. So it might be that news is performing really well on Bush today but it’s not performing very well on another term, it ultimately gets turned off due to lack of click-through rates. We are authorizing it in a way that’s scalable and does a pretty good job enforcing relevance.”

Also, check out this amazing excerpt from an FTC document that was obtained by the WSJ:

“In addition, click data (the website links on which a user actually clicks) is important for evaluating the quality of the search results page. As Google’s former chief of search quality Udi Manber testified:

‘The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure out, well, probably Result 2 is the one people want. So we’ll switch it.’

Testimony from Sergey Brin and Eric Schmidt confirms that click data is important for many purposes, including, most importantly, providing ‘feedback’ on whether Google’s search algorithms are offering its users high quality results.”

Why organic Google CTR matters

If you have great positions in the SERPs, that’s awesome. But even high rankings don’t guarantee visits to your site.

What really matters is how many people are clicking on your listing (and not bouncing back immediately). You want to attract more visitors who are likely to stick around and then convert.

In 2009, the head of Google’s webspam team at the time, Matt Cutts, was asked about the importance of maximizing your organic CTR. Here’s a key quote that says it all:

“It doesn’t really matter how often you show up. It matters how often you get clicked on and then how often you … convert those to whatever you really want (sales, purchases, subscriptions)… Do spend some time looking at your title, your URL, and your snippet that Google generates, and see if you can find ways to improve that and make it better for users because then they’re more likely to click. You’ll get more visitors, you’ll get better return on your investment.”

In another video, he talked about the importance of titles, especially on your important web pages: “you want to make something that people will actually click on when they see it in the search results – something that lets them know you’re gonna have the answer they’re looking for.”

Bottom line: Google cares a lot about overall user engagement with the results they show in the SERPs. So if Google is testing your page for relevancy to a particular keyword search, and you want that test to go your way, you better have a great CTR (and great content and great task completion rates). Otherwise, you’ll fail the quality test and someone else will get chosen.

Testing the real impact of organic CTR on Google

Rand Fishkin conducted one of the most popular tests of the influence of CTR on Google’s search results. He asked people to do a specific search and click on the link to his blog (which was in 7th position). This impacted the rankings for a short period of time, moving the post up to 1st position.

[Image: IMEC Lab Google CTR test results]

But these are all temporary changes. The rankings don’t persist because the inflated CTRs aren’t natural.

It’s like how you can’t increase your AdWords Quality Scores simply by clicking on your own ads a few times. This is the oldest trick in the book and it doesn’t work. (Sorry.)

Isn’t CTR too easy to game?

The results of another experiment appeared on Search Engine Land last August and concluded that CTR isn’t a ranking factor. But this test had a pretty significant flaw: it relied on bots artificially inflating CTRs and search volume (and it covered only a single two-word keyword: “negative SEO”). So essentially, this test was the organic search equivalent of click fraud.

I’ve seen a lot of people saying Google will never use CTR in organic rankings because “it’s too easy to game” or “too easy to fake.” I disagree. Google AdWords has been fighting click fraud for 15 years and they can easily apply these learnings to organic search. There are plenty of ways to detect unnatural clicking. What did I just say about old tricks?

Before we look at the data, a final “disclaimer.” I don’t know if what this data reveals is due to RankBrain, or another machine-learning-based ranking signal that’s already part of the core Google algorithm. Regardless, there’s something here – and I can most certainly say with confidence that CTR is impacting rank.

NEW DATA: Does organic CTR impact SEO rankings?

Google has said that RankBrain is being tested on long-tail terms, which makes sense. Google wants to start testing its machine-learning system with searches it has little to no data on – and 99% of pages have zero external links pointing to them.

How is Google able to tell which pages should rank in these cases?

By examining engagement and relevance. CTR is one of the best indicators of both.

High-volume head terms, as far as we know, aren’t being exposed to RankBrain right now. So by observing the differences between the organic search CTRs of long-tail terms versus head terms, we should be able to spot the difference:

[Chart: Google CTR vs. organic search position]

So here’s what we did: We looked at 1,000 keywords in the same keyword niche (to isolate external factors like Google shopping and other SERP features that can alter CTR characteristics). The keywords are all from my own website: wordstream.com.

I compared CTR versus rank for one- and two-word search terms, and did the same thing for long-tail keywords (search terms of 4 to 10 words).
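As a rough sketch of how this kind of segmentation can be done, here's some Python that splits query-level data (of the kind you can export from Google Search Console) into head and long-tail buckets and averages CTR by rounded position. The sample rows and thresholds are illustrative, not the actual WordStream data:

```python
# Sketch: segment query data into head vs. long-tail terms and
# compute the aggregate CTR at each rounded ranking position.
# The sample rows below are illustrative, not real data.
from collections import defaultdict

rows = [
    # (query, avg_position, impressions, clicks)
    ("ppc", 1.2, 10000, 1700),
    ("adwords tips", 2.4, 5000, 600),
    ("how to lower cost per click in adwords", 1.1, 800, 270),
    ("what is a good ctr for google ads", 2.8, 600, 130),
]

def segment(query):
    """Classify a query as head (1-2 words) or long-tail (4-10 words)."""
    words = len(query.split())
    if words <= 2:
        return "head"
    if 4 <= words <= 10:
        return "long-tail"
    return "other"

# Aggregate clicks and impressions per (segment, rounded position).
totals = defaultdict(lambda: [0, 0])  # key -> [clicks, impressions]
for query, pos, impressions, clicks in rows:
    key = (segment(query), round(pos))
    totals[key][0] += clicks
    totals[key][1] += impressions

for (seg, pos), (clicks, impressions) in sorted(totals.items()):
    print(f"{seg:9s} pos {pos}: CTR {clicks / impressions:.1%}")
```

With real export data in `rows`, the printout gives the per-position CTR curves for each segment that the chart above compares.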

Notice how the long-tail terms get much higher average CTRs for a given position. For example, in this data set, the head term in position 1 got an average CTR of 17.5%, whereas the long-tail term in position 1 had a remarkably high CTR, at an average of 33%.

You’re probably thinking: “Well, that makes sense. You’d expect long-tail terms to have stronger query intent, thus higher CTRs.” That’s true, actually.

But why is it that long-tail terms with high CTRs are so much more likely to be in top positions rather than bottom-of-page organic positions? That’s a little weird, right?

OK, let’s do an analysis of paid search queries in the same niche. We use organic search to come up with paid search keyword ideas and vice versa, so we’re looking at the same keywords in many cases.

[Chart: Google CTR vs. position, paid search]

Long-tail terms in this same vertical get higher CTRs than head terms. However, the difference between long-tail and head term CTR is very small in positions 1–2, and becomes huge as you go out to lower positions.

So in summary, something unusual is happening:

  • In paid search, long-tail and head terms have roughly the same CTR in high ad spots (1–2) and show huge differences in CTR in lower spots (3–7).
  • But in organic search, the long-tail and head terms in spots 1–2 have huge differences in CTR and very little difference as you go down the page.

Why are the same keywords behaving so differently in organic versus paid?

The difference (we think) is that pages with higher organic click-through rates are getting a search ranking boost.

How to beat the expected organic search CTR

CTR and ranking are codependent variables. There’s obviously a relationship between the two, but which is causing what? In order to get to the bottom of this “chicken versus egg” situation, we’re going to have to do a bit more analysis.

The following graph subtracts the expected CTR for a given organic position from the observed organic search CTR, to determine whether your page is beating, or being beaten by, the expected average CTR for that position.

By looking only at the extent to which a keyword beats or is beaten by the predicted CTR, you essentially control for the natural relationship between CTR and ranking, giving a clearer picture of what’s going on.
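This "observed minus expected" calculation can be sketched in a few lines of Python. The expected-CTR curve below is made up for illustration; in practice you'd estimate it from your own niche's data:

```python
# Sketch: compare a keyword's observed CTR against the expected CTR
# for its ranking position. The expected-CTR curve is hypothetical.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.02}

def ctr_delta(position, clicks, impressions):
    """Observed CTR minus expected CTR for the given position.

    Positive -> beating the average for that spot;
    negative -> being beaten by it.
    """
    observed = clicks / impressions
    return observed - EXPECTED_CTR[round(position)]

# A page in position 2 with a 27% CTR beats expectations by 12 points.
delta = ctr_delta(position=2, clicks=270, impressions=1000)
print(f"{delta:+.0%}")  # +12%
```

Plotting these deltas against ranking position is what produces the reward/penalty pattern discussed here.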

[Chart: Google CTR RankBrain rewards and penalties]

We found that, on average, if you beat the expected CTR, then you’re far more likely to rank in more prominent positions. Failing to beat the expected CTR makes it more likely you’ll appear in positions 6–10.

So, based on our example of long-tail search terms for this niche, if a page:

  • Beats the expected CTR for a given position by 20 percent, it’s likely to appear in position 1.
  • Beats the expected CTR by 12 percent, it’s likely to appear in position 2.
  • Falls below the expected CTR by 6 percent, it’s likely to appear in position 10.

And so on.

Here’s a greatly simplified rule of thumb:

The more your pages beat the expected organic CTR for a given position, the more likely you are to appear in prominent organic positions.

If your pages fall below the expected organic Google search CTR, then you’ll find your pages in lower organic positions on the SERP.

Want to move up by one position in Google’s rankings? Increase your CTR by 3%. Want to move up another spot? Increase your CTR by another 3%.

If you can’t beat the expected click-through rate for a given position, you’re unlikely to appear in positions 1–5.

Essentially, you can think of all of this as though Google is giving bonus points to pages that have high click-through rates. The fact that it looks punitive is just a natural side effect.

If Google gives “high CTR bonus points” to other websites, then your relative performance will decline. It’s not that you got penalized; it’s just that you didn’t get the rewards.

Four crucial ways to raise your Google CTRs

Many “expert” SEOs will tell you not to waste time trying to maximize your CTRs since it’s supposedly “not a direct ranking signal.” “Let’s build more links and make more infographics,” they say.

I couldn’t disagree more. If you want to rank better, you need to get more people to your website. (And getting people to your website is the whole point of ranking anyway!)

AdWords and many other technologies look at user engagement signals to determine page quality and relevance. We’ve already seen evidence that CTR is important to Google.

So how do you raise your Google CTRs – not just for a few days, but in a sustained way? You should focus your efforts in four key areas:

  1. Optimize pages with low “organic Quality Scores.” Download all of your query data from Google Search Console. Sort your data, figure out which of your pages have below-average CTRs, and prioritize those. Don’t risk turning one of your unicorn pages with an awesome CTR into a donkey with a terrible CTR! It’s far less risky to turn a donkey into a unicorn!
  2. Combine your SEO keywords with emotional triggers to create irresistible headlines. Emotions like anger, disgust, affirmation, and fear are proven to increase click-through rates and conversion rates. If everyone who you want to beat already has crafted optimized title tags, then packing an emotional wallop will give you the edge you need and make your listing stand out.
  3. Work to improve other user engagement metrics. Like click-through rate, we believe you need to have better-than-expected engagement metrics (e.g. time on site and bounce rate). This is a critical relevance signal! Google has enough data to know the expected conversion and engagement rates based on a variety of factors (e.g. industry, query, location, time of day, device type). If your content performs well, you’re likely going to get a rankings boost. If your content does poorly, there’s not necessarily a penalty, but you definitely won’t get any bonus points.
  4. Use social media ads and remarketing to increase search volume and CTR. Paid social ads and remarketing display ads can generate serious awareness and exposure for a reasonable cost (no more than $50 a day). If people aren’t familiar with your brand, bombard your target audience with Facebook and Twitter ads. People who are familiar with your brand are 2x more likely to click through and to convert!
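The first tip above can be sketched as a simple triage script, assuming you've exported page-level position, impression and click data from Google Search Console. The per-position averages, sample pages and field layout are all illustrative:

```python
# Sketch for tip 1: flag pages whose CTR falls below the average CTR
# observed at their ranking position, ranked by estimated lost clicks.
# The data and the per-position averages are illustrative only.
AVG_CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

pages = [
    # (url, avg_position, impressions, clicks)
    ("/unicorn-guide", 1.0, 20000, 7200),  # beats the average: leave it
    ("/donkey-post", 2.0, 15000, 1200),    # below average: fix first
    ("/meh-article", 4.0, 3000, 150),
]

def triage(pages):
    """Return under-performing pages sorted by estimated clicks lost."""
    flagged = []
    for url, pos, impressions, clicks in pages:
        expected = AVG_CTR_BY_POSITION[round(pos)] * impressions
        if clicks < expected:
            flagged.append((url, expected - clicks))
    # Biggest click deficit first: these are the donkeys to optimize.
    return sorted(flagged, key=lambda item: item[1], reverse=True)

for url, lost in triage(pages):
    print(f"{url}: ~{lost:.0f} clicks below expectation")
```

Sorting by the click deficit rather than raw CTR keeps the focus on high-impression pages, where a title rewrite pays off most.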

Just say no to low Google CTRs!

You want to make sure your pages get as many organic search clicks as possible. Doing so means more people are visiting your site, which will send important signals to Google that your page is relevant and awesome.

Our research also shows that above-expected user engagement metrics result in better organic rankings, which results in even more clicks to your site.

Don’t settle for average CTRs. Be a unicorn in a sea of donkeys! Raise your CTRs and engagement rates! Get optimizing now!

This article was originally published on the WordStream blog, reprinted with permission.

View full post on Search Engine Watch

How Google fights webspam and what you need to learn from this

Google has this week revealed its annual report on how it has policed the internet over the last 12 months. Or at least how it policed the vast chunk of the internet it allows on its results pages.

Although it’s self-congratulatory stuff, and as much as you can rightfully argue with some of Google’s recent penalties, you do need to understand what Google is punishing in terms of ‘bad quality’ internet experiences so you can avoid the same mistakes.

It’s important to remember that Google for some people IS the internet, or at least the ‘front door’ to it (sorry Reddit), but it’s equally important to remember that Google is still a product; one that needs to make money to survive and (theoretically) provide the best possible experience for its users, or else it is off to DuckDuckGo they… uh… go.

Google therefore has to ensure the results it serves on its SERPs (search engine results pages) are of the highest quality possible. Algorithms are built and manual reviews by actual human beings are carried out to ensure crappy websites with stolen/thin/manipulative/harmful content stay hidden.

Here’s how Google is currently kicking ass and taking names… and how you can avoid ending up in its crosshairs.

[Image: Google webspam]

How Google fought webspam

According to Google, an algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.

The remaining spam was tackled manually. Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.

Following this, Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.” It’s unclear whether the remaining sites are still in the process of appealing, or have been booted off the face of the internet.

Who watches the watchmen?

More than 400,000 spam reports were manually submitted by Google users around the world. Google acted on 65% of them, and considered 80% of those acted upon to be spam.

Hacking

There was a huge 180% increase in websites being hacked in 2015, compared to the previous year. Hacking can take a number of guises, whether it’s website spam or malware, but the result will be the same: you’ll be placed ‘in quarantine’ and your site will be flagged or removed.

Google has a number of official guidelines on how to help avoid being hacked. These include:

  • Strengthen your account security with lengthy, difficult-to-guess passwords, and avoid reusing those passwords across platforms.
  • Keep your site’s software updated, including its CMS and various plug-ins.
  • Research how your hosting provider handles security issues and check its policy when it comes to cleaning up hacked sites. Will it offer live support if your site is compromised?
  • Use tools to stay informed of potential hacked content on your site. Signing up to Search Console is a must, as it’s Google’s way of communicating any site issues with you.

[Image: Google spam fighting]

Thin, low quality content

Google saw an increase in the number of sites with thin, low quality content, a substantial amount likely to be provided by scraper sites.

Unfortunately there is very little you can do if your site is being scraped, as Google has discontinued its reporting tool and believes this problem to be your own fault. You just have to be confident that your own site’s authority, architecture and remaining content are enough to ensure it ranks higher than a scraper site.

If you have been served a manual penalty for ‘thin content with little or no added value’ there are things you can do to rectify it, which can mostly be boiled down to ‘stop making crappy content, duh’.

1) Start by checking your site for the following:

  • Auto-generated content: automatically generated content that reads like it was written by a piece of software because it probably was.
  • Thin content pages with affiliate links: affiliate links in quality articles are fine, but pages where the affiliate content is just descriptions or reviews copied directly from the original retailer, with no original content added, are bad. As a rule, affiliates should form only a small part of your site’s content.
  • Scraped content: if you’re a site that automatically scrapes and republishes entire articles from other websites without permission then you should just flick the off-switch right away.
  • Doorway pages: these are pages which can appear multiple times in a particular query’s search results but ultimately lead users to the same destination. The purpose of doorway pages is purely to manipulate rankings.

2) Chuck them all in the bin.

3) If after all that you’re 100% sure your site somehow offers value, then you can resubmit to Google for reconsideration.

For more information on Google’s fight against webspam, read its official blog-post.

And finally, I’ll leave you with this terrifying vision of things to come…

[Image: robots and people]

