Posts tagged they

Facebook To Give Users More Control Over The Ads They See by @mattsouthern

In an announcement made today, Facebook said it is taking a cue from its users and making two major changes to improve ads. As the first step, Facebook will be introducing interest-based advertising to users in the US: When we ask people about our ads, one of the top things they tell us is that they want to see ads that are more relevant to their interests. Today, we learn about your interests primarily from the things you do on Facebook, such as Pages you like. So, for example, if you’re in the market for a new TV and start shopping […]

The post Facebook To Give Users More Control Over The Ads They See by @mattsouthern appeared first on Search Engine Journal.

View full post on Search Engine Journal

SEO and Content Marketing, and How They Work Hand in Hand – iMedia Connection (blog)

SEO and Content Marketing, and How They Work Hand in Hand
iMedia Connection (blog)
When it comes to online marketing, it is important to grab the attention of people and search engines. In order for businesses to do that right, it takes a concerted focus on both search engine optimization (SEO) and content marketing, and it is …


View full post on SEO – Google News

PPC & SEO: How They Work Together to Maximize Results – Dealer Marketing Magazine

PPC & SEO: How They Work Together to Maximize Results
Dealer Marketing Magazine
With the ongoing battle to beat the competition and climb to the top of Google's first page, a common question we hear is: “Which is more important for my dealership, PPC or SEO?” To answer this question, it's important to understand PPC and SEO have …
Adwords/PPC Specialist – Bizcommunity.com
Partial PPC Management is an Epidemic and it's Killing Your Businesses Revenue – Business 2 Community


View full post on SEO – Google News

5 Googley SEO Terms – Do They Mean What You Think They Mean? – Search Engine Watch

5 Googley SEO Terms – Do They Mean What You Think They Mean?
Search Engine Watch
Many terms are used within the search industry. These words all have very specific meanings when it comes to search engine optimization (SEO) and how you implement your strategy. However, many of these terms are often used incorrectly. Sometimes this …

View full post on SEO – Google News

5 Googley SEO Terms – Do They Mean What You Think They Mean?

Robots.txt. Google DNS. Penguin, Panda, and penalties. Duplicate content filter. PageRank. You’ve heard all these terms, but these words have very specific meanings and are among the most commonly misunderstood terms when it comes to Google and SEO.

View full post on Search Engine Watch – Latest

Link Building ≠ Content Marketing. But Here’s How They Fit Together

Link building has gone through a lot in the past 18 months — and when content marketing became the buzzword of the day, there were troves of articles saying that content marketing is the new link building or that content marketing has replaced link building. That couldn’t be more false….



Please visit Search Engine Land for the full article.

View full post on Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Penguin Penalties: Do Webmasters Respond the Way They Should?

Posted by russvirante

Penalization has become a regular part of the search engine optimization experience. Hell, it has changed the entire business model of Virante to building tools and services around penalty recovery and not just optimization. While penalties used to be a crude badge of honor worn by those leaning towards the black-hat side of the SEO arts, it is now a regular occurrence that seems to impact those with the best intentions. At Virante, we have learned a lot about penalties over the last few years—discerning between manual and algorithmic, Panda and Penguin, recovery methodologies and risk mitigation—but not much study has been done on the general response of websites to penalizations. We have focused more on what webmasters ought to do without studying what webmasters actually do in response to various penalties.

How webmasters respond matters

As much as we often feel a communion among other SEOs in our resistance to Google, the reality is that we are engaged in a competitive industry where we fight for customers in a very direct manner. This duality of competition—with Google and with each other—plays out in a very unique way when Google penalizes a competitor. We learn a great deal in the following months about the competition, such as the sophistication of their team (how quickly they respond, how many links they remove, how quickly they recover), their financial strength (do they increase ad spend, how much and on what terms), and whether they eventually recover.

It is also important from a wider perspective of understanding Google’s justifications for particular types of penalties that seem sweeping and inconsistent. Conspiracy theories abound regarding Penguin updates; I can’t count how many times I have heard someone say that penalties are placed to encourage webmasters to switch to AdWords.

So, I decided to investigate the behavior of webmasters post-Penguin from a macro perspective to determine what kinds of responses we are likely to see, and perhaps even answer some questions about Google’s motivations in the process.

The methodology

  1. Collect examples: I collected a list of 100 domains that were penalized by Penguin 2.0 last year and confirmed their penalization through SEMRush.
  2. Establish controls: For each penalized site, I identified one website that ranked in the top 10 for their primary keyword and was not penalized.
  3. Get rankings and AdWords data: For each site (both penalized and control), we grabbed their historical rankings and AdWords spend from SEMRush for the months leading up to and following Penguin 2.0.
  4. Get historical link data: For each site (both penalized and control), we grabbed their historical link data from Majestic SEO for the months leading up to and following Penguin 2.0.
  5. Analyze results: Using simple regression models, we identified patterns among penalized sites that differed significantly from the control sites (a minimal sketch of this comparison follows the list).
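
For readers who want to replicate this kind of comparison on their own data, here is a minimal sketch in Python. The file name and column names (domain, group, rld_pre, rld_post) are hypothetical placeholders for flattened SEMRush and Majestic exports, not the actual Virante dataset.

```python
# Minimal sketch of the penalized-vs-control comparison described above.
# Assumes the SEMRush and Majestic exports have been flattened into one CSV
# with hypothetical columns: domain, group ("penalized" or "control"),
# rld_pre (root linking domains three months before Penguin 2.0), and
# rld_post (root linking domains three months after).
import pandas as pd

df = pd.read_csv("penguin_dataset.csv")

# Per-site change in root linking domains, absolute and proportional
df["rld_delta"] = df["rld_post"] - df["rld_pre"]
df["rld_pct_change"] = df["rld_delta"] / df["rld_pre"]

# Compare the two groups: average change and share of sites that grew links
summary = df.groupby("group").agg(
    mean_delta=("rld_delta", "mean"),
    share_growing=("rld_delta", lambda s: (s > 0).mean()),
)
print(summary)
```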

Do webmasters remove bad links?

After a Penguin 2.0 update, it is imperative to identify and remove bad links or, at minimum, disavow them. While we can’t measure disavow data, we can measure link acquisition data quite easily. So, do webmasters in general follow the expectations of link removal following a penalty?

Aggressive link removal: It appears that aggressive link removal is a common response to Penguin, as expected. However, we have to be careful with the statistics to make sure we correctly examine the degree and frequency with which link removal is employed. The control group on average increased their root linking domains by 41 following Penguin 2.0, but that could best be explained by a few larger sites increasing their links. When looking at an average of link proportions, only about 22% of the control sites actually saw an increase in links in the three months post-Penguin. The sites that were penalized saw an average drop of 578 root linking domains. However, once again, this statistic is skewed by the link graph size of the individual penalized sites. 15% of those penalized still saw an increase in links in the three months following Penguin.

So, approximately 22% of domains not impacted by Penguin 2.0 had more root linking domains three months after the penalty, while only 15% of those penalized had more root linking domains post-Penguin. Notice how small the discrepancy is here: webmasters’ responses differed by only 7 percentage points depending on whether or not they were penalized. While those penalized certainly removed more links, the practice of link building in general was very similarly affected. In the three months following Penguin, 78% of the control websites either dropped links or at least stopped link building and lost links through attrition. This is remarkable. There appears to be a deadening effect related to Penguin that impacts all sites—not just those that are penalized. While many of us expected Penguin to have a profound impact on link growth as webmasters respond to fears of future penalties, it is still amazing to see it borne out in the numbers.

Deadening Link Growth

What I find more interesting is the variation in webmaster responses to Penguin 2.0. Some penalized webmasters actually doubled down on link building, likely attributing their rankings loss to having too few links, rather than being penalized. We can tease this type of behavior out of the numbers by looking at the variances in percentage link change over time.

The variance among link fluctuations for sites that were not penalized was 0.08, but the variance among sites that were penalized was 0.38. This means that the behavior of websites after being penalized was far more erratic than that of sites which were not. Some penalized sites made the poor decision to greatly increase their links, although more sites made the decision to greatly decrease their links. If all webmasters responded uniformly to penalties, one would not expect to see such an increase in variance.
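
To make the variance comparison concrete, here is a small illustration; the per-site percentage changes below are invented for the example and are not the study’s data.

```python
# Illustrative only: hypothetical per-site percentage changes in root
# linking domains for each group (not the study's actual data).
import statistics

control_pct_change = [0.02, -0.05, 0.01, -0.03, 0.04, -0.01]
penalized_pct_change = [0.60, -0.70, -0.45, 0.35, -0.80, 0.50]

# A larger variance in the penalized group reflects more erratic responses:
# some sites cut links sharply while others doubled down on link building.
print(statistics.variance(control_pct_change))    # small (on the order of 0.001)
print(statistics.variance(penalized_pct_change))  # much larger (roughly 0.4)
```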

As SEOs, we clearly have our work cut out for us in teaching webmasters that the appropriate response to a penalty is very much NOT adding more and more links to your profile, because this behavior is actually more common than link removal post-penalty. It is worth pointing out that webmasters may have disavowed links rather than removing them. We do not have access to disavow data, so we cannot be certain on that point. It is possible that some webmasters chose to disavow while others removed links, and that the net impact on link value was identical, which would make the variance comparison misleading.

Do webmasters increase their ad spend?

I’ll admit, I had my fingers crossed on this one. Honestly, who doesn’t want to show that Google is just penalizing webmasters because it helps their bottom line? Wouldn’t it be great to catch the search quality team not being honest with us about their fiduciary independence?

Well, unfortunately it just doesn’t bear out. The evidence is fairly clear that there is no reason to believe that webmasters increase ad-spend following a Penguin 2.0 penalty. Let’s look at the numbers.

Ad Traffic Increase

First, across our data set, no one who was an advertiser prior to Penguin 2.0 stopped advertising in AdWords in the three months after. Of the sites that were not advertisers prior to Penguin 2.0, 10% of those not penalized ended up becoming advertisers in AdWords, while only 4% of those penalized became advertisers. Sites that weren’t penalized were far more likely to join the AdWords program than those that were.

It wasn’t only true that those unaffected by Penguin 2.0 were more likely to sign up for AdWords; they increased their average ad spend, too. There was a 78% greater increase in ad spend by those unaffected by Penguin 2.0 than by those who were. Moreover, bidding shifts for those not impacted by Penguin remained similar across multiple randomly selected three-month windows, meaning that there appeared to be no Penguin-related impact whatsoever.
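
As a quick worked example of what a “78% greater increase” means, here is a tiny illustration; the dollar figures are invented, not the study’s data.

```python
# Hypothetical illustration of the relative difference in ad-spend increases;
# the dollar figures are invented, not the study's data.
mean_increase_unaffected = 1780.0  # average ad-spend increase, unaffected sites
mean_increase_penalized = 1000.0   # average ad-spend increase, penalized sites

relative_difference = mean_increase_unaffected / mean_increase_penalized - 1
print(f"{relative_difference:.0%} greater increase for unaffected sites")  # 78%
```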

We can safely conclude from this that there does not appear to be a direct, causal relationship between Penguin penalties and increased AdWords spending. Now, one could of course make the argument that better search results might increase ad revenue in the future as Google attracts more users to a better search engine, but accusations of a fiduciary motivation for releasing updates like Penguin 2.0 cannot be substantiated with this data.

Do they recover?

By the 5th month, approximately 24% of sites that were penalized were at or above their pre-Penguin 2.0 traffic. This is an exciting outcome because it does show recovery from Penguin is possible. Perhaps most important, sites that were penalized and removed links on average recovered 28% more traffic in the five months after Penguin than those that did not remove links. We have good evidence to suggest at least a correlation between post-penalty link removal and traffic recovery. Of course, we do have to take this with a grain of salt for a number of reasons:

  • Sites that removed links may have been more likely to use the disavow tool as well.
  • Sites that removed links may have been more SEO-savvy in general and fixed on-site issues.
  • Sites that did not remove links may have had more intractable penalties, thus their lack of removal was a conscious decision related to the futility of a removal campaign.

These types of alternate explanations should always be entertained when using correlative statistics. What we do have good evidence of is that traffic recovery is possible for sites hit by Penguin, although it is by no means guaranteed or universal. Penguin 2.0 needn’t be a death sentence.

Takeaways

So, in a few weeks, we are likely to see another Penguin update, assuming Google keeps to its late-spring release schedule. When Penguin hits, be ready—even if you aren’t going to be penalized. Here are some things you should be doing…

  1. Know your bad links already. There is no reason to wait to be prepared for removal or disavowal. While I personally think that preemptive disavowal is likely the best practice, there is no excuse to just wait (see the disavow-file sketch after this list).
  2. Don’t worry about AdWords. There is no statistical evidence that your competition will surge post-Penguin in any meaningful fashion. The competitors who might come to depend more on AdWords also have less organic revenue to invest in the first place. At best, these even out.
  3. Don’t double down. While we can’t be certain that link removal gets you out of penalties (it is merely correlated), we can be certain that not even a correlation exists between adding more links and recovering from a Penguin penalty.
  4. Never assume. The behavior of your competitors and of Google itself is far more complex than off-the-cuff assumptions like “Google just penalizes sites to force people into AdWords” or that your business will intuitively know to remove or disavow links post-Penguin.
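
As a starting point for the first takeaway, here is a small sketch that writes a list of known-bad links into the plain-text format Google’s disavow tool accepts. The domains and URLs are hypothetical; you would substitute the results of your own link audit.

```python
# Minimal sketch: write known-bad links to a file in the format Google's
# disavow tool accepts (one full URL or "domain:example.com" entry per line;
# lines starting with "#" are comments). The entries below are hypothetical.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Low-quality links identified ahead of the next Penguin update\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
    for url in bad_urls:
        f.write(url + "\n")
```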

Hopefully, this time around we will all be more prepared for the appropriate response to Google’s next big update—whether we are hit or not.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

View full post on Moz Blog

LinkedIn Announces They Have Reached 300 Million Members Worldwide

LinkedIn announced today that they have reached the milestone of over 300 million members worldwide, more than half of […]

Author information

Matt Southern

Matt Southern is a marketing, communications and public relations professional. He provides strategic digital marketing services at an agency called Bureau in Ontario, Canada. He has a bachelor’s degree in communication and an unparalleled passion for helping businesses get their message out.

The post LinkedIn Announces They Have Reached 300 Million Members Worldwide appeared first on Search Engine Journal.

View full post on Search Engine Journal

6 Changes We Always Thought Google Would Make to SEO that They Haven’t Yet – Whiteboard Friday

Posted by randfish

From Google’s interpretation of rel=”canonical” to the specificity of anchor text within a link, there are several areas where we thought Google would make a move and are still waiting for it to happen. In today’s Whiteboard Friday, Rand details six of those areas. Let us know where you think things are going in the comments!

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today, I’m going to tackle a subject around some of these changes that a lot of us in the marketing and SEO fields thought Google would be making, but weirdly they haven’t.

This comes up because I talk to a lot of people in the industry. You know, I’ve been on the road the last few weeks at a number of conferences – Boston for SearchLove and SMX Munich, both of which were great events – and I’m going to be heading to a bunch more soon. People have this idea that Google must be doing these things, must have made these advancements over the years. It turns out, in actuality, they haven’t made them. Some of them, there are probably really good reasons behind it, and some of them it might just be because they’re really hard to do.

But let’s talk through a few of these, and in the comments we can get into some discussion about whether, when, or if they might be doing some of these.

So number one, a lot of people in the SEO field, and even outside the field, think that it must be the case that if links really matter for SEO, then on-topic links matter more than off-topic links. So, for example, say I’m linking to two websites about gardening resources, A and B, and one of those links comes from a botany site while the other comes from a site about mobile gaming. Well, all other things being true, it must be that the link from the botany site is going to provide a stronger boost. That’s just got to be the case.

And yet, we cannot seem to prove this. There doesn’t seem to be data to support it. Anyone who’s analyzed this problem in depth, which a number of SEOs have over the years — a lot of very advanced people have gone through the process of classifying links and all this kind of stuff — seems to come to the same conclusion, which is that Google seems to think about links from a more subject/context-agnostic perspective.

I think this might be one of those times where they have the technology to do it. They just don’t want to. My guess is what they’ve found is if they bias to these sorts of things, they get a very insular view on what’s kind of popular and important on the Web, and if they have this more broad view, they can actually get better results. It turns out that maybe it is the case that the gardening resources site that botanists love is not the one with mass appeal, is not the one that everyone is going to find useful and valuable, and isn’t representing the entirety of what the Web thinks about who should be ranking for gardening resources. So they’ve kind of biased against this.

That is my guess. But from every observable input we’ve been able to run, every test I’ve ever seen from anybody else, it seems to be the case that if there’s any bias, it’s extremely slight, almost unnoticeable. Fascinating.

Number two, I’m actually in this camp. I still think that someday it’s coming, that the influence of anchor text will eventually decline. Yet it seems that, while other signals have certainly risen in importance, specific, descriptive anchor text inside a link still carries far more weight than generic anchor text.

Getting specific, targeting something like “gardening supplies” when I link to A, as opposed to on the same page saying something like, “Oh, this is also a good resource for gardening supplies,” but all I linked with was the text “a good resource” over to B, that A is going to get a lot more ranking power. Again, all other things being equal, A will rank much higher than B, because this anchor text is still pretty influential. It has a fairly substantive effect.

I think this is one of those cases where a lot of SEOs said, “Hey, anchor text is where a lot of manipulation and abuse is happening. It’s where a lot of Web spam happens. Clearly Google’s going to take some action against this.”

My guess, again, is that they’ve seen that the results just aren’t as good without it. This speaks to the power of being able to generate good anchor text. A lot of that, especially when you’re doing content marketing kinds of things for SEO, depends on nomenclature, naming, and branding practices. It’s really about what you call things and what you can get the community and your world to call things. Hummingbird has made advancements in how Google does a lot of this text recognition, but for these tough phrases, anchor text is still strong.

Number three, 302s. So 302s have been one of these sort of long-standing kind of messes of the Web, where a 302 was originally intended as a temporary redirect, but many, many websites and types of servers default to 302s for all kinds of pages that are moving.

So A 301-redirects to B, versus C 302-redirecting to D. Is it really the case that the people who run C plan to change where the redirect points in the future, and is it really the case that they do so more than A does with B?

Well, a lot of the time, probably not. But it still is the case, and you can see plenty of examples of this happening out in the search results and out on the Web, that Google interprets this 301 as being a permanent redirect. All the link juice from A is going to pass right over to B.

With C and D, it appears that with big brands, when the redirect’s been in place for a long time and Google has some trust in it, and maybe they see some other signals, some other links pointing over here, then yes, some of the link value does pass over, but it is not nearly what happens with a 301. The 301 is a directive, while the 302 is sort of a nudge or a hint. It just seems to be important to still get those 301s, those right kinds of redirects, right.

By the way, there are also a lot of other kinds of 30X status codes that can be issued on the Web and that servers might fire. So be careful. You see a 305, a 307, 309, something weird, you probably want a 301 if you’re trying to do a permanent redirect. So be cautious of that.
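
To make the 301-versus-302 distinction concrete, here is a minimal sketch using Python’s standard library; the paths, port, and target URLs are hypothetical and are not part of the original video.

```python
# Minimal sketch of issuing a permanent (301) vs. temporary (302) redirect
# with Python's standard library. Paths, port, and targets are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # Permanent move: the kind of redirect expected to pass link equity
            self.send_response(301)
            self.send_header("Location", "https://www.example.com/new-page")
        else:
            # Temporary move: treated more like a hint than a directive
            self.send_response(302)
            self.send_header("Location", "https://www.example.com/")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```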

Number four, speaking of nudges and hints versus directives, rel=”canonical” has been an interesting one. When rel=”canonical” first launched, what Google said about it was that rel=”canonical” is a hint to us, but we won’t necessarily take it as gospel.

Yet, every test we saw, even from those early launch days, was, man, they are taking it as gospel. You throw a rel=”canonical” on a trusted site accidentally on every page and point it back to the homepage, Google suddenly doesn’t index anything but the homepage. It’s crazy.

You know what? The tests that we’ve seen run and mistakes — oftentimes, sadly, it’s mistakes that are our examples here — that have been made around rel=”canonical” have shown us that Google still has this pretty harsh interpretation that a rel=”canonical” means that the page at A is now at B, and they’re not looking tremendously at whether the content here is super similar. Sometimes they are, especially for manipulative kinds of things. But you’ve got to be careful, when you’re implementing rel=”canonical”, that you’re doing it properly, because you can de-index a lot of pages accidentally.

So this is an area of caution. It seems like Google still has not progressed on this front, and they’re taking that as a pretty basic directive.
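
Because an accidental site-wide canonical can quietly de-index pages, an automated spot check is worth having. Here is a minimal sketch, not from the original video; it assumes the requests and beautifulsoup4 packages and a hypothetical list of URLs to audit.

```python
# Minimal sketch: flag pages whose rel="canonical" points somewhere other than
# the page itself (e.g., every page accidentally canonicalized to the homepage).
# Requires the requests and beautifulsoup4 packages; the URL list is hypothetical.
import requests
from bs4 import BeautifulSoup

urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/products/widgets/",
]

for url in urls_to_check:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    if canonical and canonical.rstrip("/") != url.rstrip("/"):
        print(f"{url} canonicalizes to {canonical} - confirm this is intentional")
```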

Number five, I think, for a long time, a lot of us have thought, hey, the social web is rising. Social is where a lot of the great content is being shared, where people are pointing to important things, and where endorsements are happening, potentially more so than in the link graph. The social web and the social graph have sort of become the common man’s link graph.

And yet, with the exception of the two years when Google had a very direct partnership with Twitter, when tweets, indexation, and all that kind of stuff were heavily influential in Google search results, we haven’t seen that again from Google since that partnership broke up. They’ve actually sort of backtracked on social, and they’ve kind of said, “Hey, you know, tweets, Facebook shares, likes, that kind of stuff, it doesn’t directly impact rankings for everyone.”

Google+ being sort of an exception, especially in the personalized results. But even the tests we’ve done with Google+ for non-personalized results have appeared to do nothing, as yet.

So these shares that are happening all over social, I think what’s really happening here is that Google is taking a look and saying, “Hey, yes, lots of social sharing is going on.” But the good social sharing, the stuff that sticks around, the stuff that people really feel is important is still, later on at some point, earning a citation, earning a link, a mention, something that they can truly interpret and use in their ranking algorithm.

So they’re relying on the fact that social can be a tip-off or a tipping point for a piece of content or a website or a brand or a product, whatever it is, to achieve some popularity, but that will eventually be reflected in the link graph. They can wait until that happens rather than using social signals, which, to be fair, carry some potential for manipulation that I think they’re worried about exposing themselves to. There’s also, of course, the fact that they no longer have direct access: they don’t have API-level access and partnerships with Facebook and Twitter anymore, and so that could be causing some of that too.

Number six, last one. Google talked about cleaning up web spam for a long time, and from ’06 or ’07 to about 2011 or 2012, certain corners of the results were pretty sketchy. It was tough.

When they did start cleaning up web spam, I think a lot of us thought, “Well, eventually they’re going to get to PPC too.” I don’t mean pay-per-click. I mean porn, pills, and casino.

But it turns out, as Matt Brown from Moz wisely and recently pointed out in his SearchLove presentation in Boston, that, yes, if you look at the search results around these categories, whatever it is — Buy Cialis online, Texas hold-’em no limit poker, the third removed for content, because Whiteboard Friday is family-friendly, folks — whatever the search is that you’re performing in these spheres, this is actually kind of the early-warning SERPs of the SEO world.

You can see a lot of the changes that Google’s making around spam and authority and signal interpretation. One of the most interesting ones that you probably observed, if you study this space, is a lot of those hacked .edu pages, or barnacle SEO that was happening on sub-domains of more trusted sites that had gotten a bunch of links, that kind of stuff, that is ending a little bit. We’re seeing a little bit more of the rise, again, of like the exact match domains and some of the affiliate sites and getting links from more creative places, because it does seem like Google’s gotten quite a bit better at which links they consider and in how they judge the authoritativeness of pages that might be hanging on or clinging onto a domain, but aren’t well linked to internally on some of those more trusted sites.

So, that said, I’m looking forward to some fascinating comments. I’m sure we’re going to have some great discussions around these. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


View full post on Moz Blog

Google Responds To Misattributed Content Saying They May Show The Canonicalized URL

A few weeks ago, we reported on cases where Google was misattributing content for select news publishers. We showed examples of Google showing URLs of large news publishers, but those URLs were not the source URLs and thus redirected from Google’s search results to a completely…



Please visit Search Engine Land for the full article.

View full post on Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
