Posts tagged Avoid

Common PPC mistakes and how to avoid them

Columnist Laura Collins lays out five mistakes she commonly sees in paid search and explains how to avoid them.

The post Common PPC mistakes and how to avoid them appeared first on Search Engine Land.

View full post on Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Five common keyword research mistakes you need to avoid

Before you dive into keyword research for your site, you should know about these common mistakes that many businesses and SEO firms make.

Avoiding these mistakes can save you time, help you re-think your marketing strategy, and drive the right customers to your site.

1) Picking keywords that are irrelevant to your customers

People often pick keywords with high search volumes in their field but don’t pay enough attention to how relevant those keywords are to their target customers. You need to choose keywords that match your customers’ concerns.

For example, if you’re targeting affluent families who are searching for good schools for their kids, you shouldn’t pick keywords like “low cost public schools ny” or “affordable schools ny”. These families aren’t searching for those terms. Instead, you should optimize for keywords such as “best schools ny” and “elite boarding schools ny”.

Your target customers have different needs and concerns, and they use different words when they search. You need to understand your customers and the language they use. Remember, every searcher has an intent and is looking for something; your page needs to provide the answer.

2) Focusing on overly specific keywords

If you have a large site with many possible keyword combinations, you might be tempted to optimize for every little combination you can (type, color, price, size and so on) in an attempt to cover them all. Do the math: this can lead to an almost unlimited number of possible keywords.

Many of these lengthy combinations have low search volumes; some are never searched for at all. Targeting too many keywords can also distract you from your most important ones.

Focus on the keywords that have good search volumes and the potential to drive business. Keyword quality is more important than keyword quantity.
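
As a rough illustration of that filtering step (the keyword list and volumes below are hypothetical), a few lines of Python can shortlist keywords by search volume:

    # Hypothetical keyword list with estimated monthly search volumes.
    keywords = [
        ("art class for kids", 1900),
        ("best schools ny", 2400),
        ("red medium kids art smock size 6", 0),
        ("affordable tuesday evening art class brooklyn", 10),
    ]

    MIN_MONTHLY_SEARCHES = 50  # hypothetical cut-off; tune it to your market

    shortlist = [(kw, vol) for kw, vol in keywords if vol >= MIN_MONTHLY_SEARCHES]
    print(shortlist)  # [('art class for kids', 1900), ('best schools ny', 2400)]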

[Image: keyword research]

Don’t aim at overly generic keywords, or at overly specific ones. It’s best to start with the niche keywords people actually use to search for and buy your products or services.

These “low-hanging fruit” keywords lead to the customers who are easiest to close first. What are the highly specific, long-tail keywords that pertain to your industry? Google calls this the “I want to buy” moment.

Once you have your best keyword groups, you can always expand them to target broader groups with various search intents.

[Image: art class search page]

Example of a niche keyword: “art class for kids”

3) Selecting only a few big keywords

Another mistake that large websites often make is focusing on only a few top keywords. This kind of approach usually appears in black-hat SEO pitches like “get top rankings for 30 big keywords”, because black-hat tactics (such as link networks) are built to push rankings for a handful of keywords at a time.

[Image: leather womens shoes PPC]

Trying to compete for “leather womens shoes” would be a waste of time for many businesses

An online marketplace with over 100,000 pages of content once asked us to do SEO for their list of 30 keywords. That strategy just doesn’t make sense: 100,000 pages should be optimized for roughly 300,000–500,000 keywords in order to drive significant traffic and grow the business.

We often follow a simple rule of thumb: each page should be optimized for 3-5 keywords, so the total number of target keywords scales roughly with the amount of content.

4) Finding keywords based on existing site structure

When beginning keyword research, most people look at the main pages and major sections of their website, and then start to look for keywords for those pages.

They then optimize those same pages for the keywords they found. The problem is that you can miss out on a lot of great keywords that the current site structure and content don’t cover.

The purpose of good keyword research is to find all the keywords your prospective customers are using to find you, and that has nothing to do with your site structure.

Your customers might be looking for content highly relevant to your business that isn’t on your website at all! Doing SEO well often means modifying your site structure and creating entirely new sections and pages that are better optimized for the right keyword groups.

For a school consulting website, we found many strong keywords for a specific audience, like “boarding school for boys” and “boarding school for girls”, which were not on the site at all.

We created new sections and pages for these important keywords. Had we relied on the existing site structure, we’d have missed out on many of these valuable keywords.

[Image: keyword allocation]

5) Putting the wrong keywords on the wrong pages

Once you’ve grouped your keywords, you need to figure out where to place them on your website. This process is called keyword allocation, and it’s a critical step in the keyword research process.

A common mistake is adding irrelevant keywords to pages whose content doesn’t match the keywords, or pages that don’t match the search intent.

For example, users who search for “boarding schools in usa” are usually from overseas. Therefore, the page optimized for that keyword should indicate the value for international families who want to send their kids to US schools.

On the flip side, users who search for “top private schools in upper east side nyc” usually already know the neighborhood, so the page content should address their different needs. Likewise, the keyword “boarding schools for girls” should be allocated to a page specifically about girls’ schools.

It’s not simply about putting keywords on a page; it’s about matching each keyword with search intent and web page copy.
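
One way to keep allocation disciplined is to record, for each keyword, the page it is assigned to and the intent it serves. A minimal Python sketch (the URLs and intent notes are hypothetical, based on the keywords discussed above):

    # Hypothetical keyword-allocation map: each keyword gets one page
    # whose content matches the searcher's intent.
    keyword_allocation = {
        "boarding schools in usa": {
            "page": "/international-families/",
            "intent": "overseas families sending their kids to US schools",
        },
        "top private schools in upper east side nyc": {
            "page": "/nyc/upper-east-side-private-schools/",
            "intent": "local parents who already know the neighborhood",
        },
        "boarding schools for girls": {
            "page": "/girls-boarding-schools/",
            "intent": "parents researching girls' schools",
        },
    }

    def page_for(keyword: str) -> str:
        """Return the allocated page, or flag a gap in the site structure."""
        entry = keyword_allocation.get(keyword)
        return entry["page"] if entry else "UNALLOCATED - create or assign a page"

    print(page_for("boarding schools for girls"))  # /girls-boarding-schools/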

Mike Le is the Co-Founder and COO of CB/I Digital, a digital agency in New York that offers digital marketing and digital product services for clients. You can connect with Mike on LinkedIn and Twitter.

View full post on Search Engine Watch

Five common website redesign and rebranding mistakes to avoid

Far too often, I see brands migrating to a new web design or a new domain name without considering their current SEO standing, completely undermining the previous efforts that made them an authority in their industry.

A lot of pre-planning and execution is needed from an SEO perspective to ensure a website retains the keyword rankings and organic traffic it has built up.

Using the SearchMetrics compare tool, you can see over time how successful a redesign or rebrand has been. A poorly executed domain switch looks like this:

[Image: SearchMetrics compare tool]

(brand name excluded, as I don’t want to name & shame!)

This is not what you want: this brand launched on a new domain as part of their rebrand and, as you can see, did not maintain all of their keyword rankings.

Ideally, when rebranding or changing the structure of the website, you want to maintain or even grow your search visibility. A good example of this is:

[Image: SearchMetrics compare tool]

To prevent keyword losses and drops in organic traffic, see below for common mistakes to avoid when rebranding or redesigning your website.

Alternatively, download our checklist from the Zazzle Media website (registration required) to help you with your redesign/rebrand.

[Image: Zazzle Media tool]

1) Not benchmarking

First things first: how are you going to measure the success of your redesign/rebrand if you don’t benchmark your current performance in search? I would recommend gathering the following data before launching:

  • Average organic sessions over the past 12 months
  • Average organic users over the past 12 months
  • Average organic bounce rate over the past 12 months
  • Average organic order value over the past 12 months if applicable
  • Average organic revenue over the past 12 months if applicable
  • Current rankings within the top 3, 10 & 20 using SEMRush
  • Current search visibility score using SearchMetrics

You can then compare these figures in the months after launch to track performance accurately.
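
As a rough sketch of the number-crunching, assuming you have exported 12 months of daily organic data from Google Analytics to a CSV (the file and column names here are assumptions, not part of any GA API):

    import pandas as pd

    # Hypothetical CSV export of daily organic traffic from Google Analytics,
    # with columns: date, sessions, users, bounce_rate, revenue.
    df = pd.read_csv("organic_last_12_months.csv", parse_dates=["date"])
    monthly = df.set_index("date").resample("M").sum()

    benchmark = {
        "avg_monthly_sessions": monthly["sessions"].mean(),
        "avg_monthly_users": monthly["users"].mean(),
        "avg_bounce_rate": df["bounce_rate"].mean(),  # a rate: average the dailies
        "avg_monthly_revenue": monthly["revenue"].mean(),
    }
    print(benchmark)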

Bonus tips:

  • Grab a crawl of the current website using Screaming Frog.
  • If you are moving to a new domain, get a landing page live on it and make it crawlable for Google.
  • Set up Search Console for all of the domains and their variations, for example: example.com, www.example.com, https://www.example.com, https://example.com.

2) Removing sections/categories from the website

It is important to have a clear structure for the new website early on, so you can estimate whether you will lose any organic traffic from the landing pages you plan to remove, and so there are no surprises when you relaunch.

If you are planning to remove any sections of your website in the redesign/rebrand, you can see how much traffic you could potentially lose by looking in Google Analytics:

Behaviour > Site Content > Landing Pages

[Image: behaviour report in Google Analytics]

Take the average sessions and users over 12 months for the section you are going to remove and subtract them from your earlier benchmark figures; this gives you a forecast of organic traffic for launch.
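
For example (all numbers hypothetical): if the whole site averages 50,000 organic sessions a month and the section being removed accounts for 4,000 of them, the launch forecast is simply the difference:

    avg_monthly_sessions = 50_000      # 12-month benchmark for the whole site
    removed_section_sessions = 4_000   # 12-month average for the section being dropped

    forecast = avg_monthly_sessions - removed_section_sessions
    print(f"Forecast organic sessions after relaunch: {forecast}")  # 46000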

3) Messing up redirects

Once you have a clear view of how the new website will look, you need to map the old website’s URLs to the new ones. A clear way to do this is to set it out in an Excel spreadsheet.

In our checklist, we have created an Apache server redirect file template, which generates the code to put in your .htaccess file when launching: all you need to do is input the old URL and where it should redirect to, and the code is generated in the end column.

Important: do not redirect everything to the homepage. Make each redirect as relevant as possible; this will pass previous PageRank and rankings more effectively.

[Image: Zazzle redirect tool]
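
If you would rather script the mapping than use a spreadsheet, here is a minimal Python sketch of the same idea (not the checklist’s actual template; the file and column names are assumptions) that turns a two-column mapping into Apache Redirect 301 directives:

    import csv

    # redirect_map.csv is a hypothetical two-column file: old_path,new_url
    # e.g. /old-category/widgets/,https://www.example.com/widgets/
    with open("redirect_map.csv", newline="") as src, \
         open("redirects.htaccess", "w") as out:
        for row in csv.DictReader(src):
            # One page-to-page rule per line; never blanket-redirect to the homepage.
            out.write(f"Redirect 301 {row['old_path']} {row['new_url']}\n")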

Avoid redirect chains. When you crawl the website using Screaming Frog, download the redirect chains report to find them:

[Image: Screaming Frog redirect chains report]

This allows you to redirect each URL directly to the correct place, so Google doesn’t spend time crawling through all of the intermediate URLs to find the final one.
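
One way to eliminate chains before launch is to flatten the mapping so every old URL points straight at its final destination. A minimal Python sketch (the example mapping is hypothetical):

    # Each old URL initially points at its next hop; chains like /a/ -> /b/ -> /c/
    # waste crawl budget, so resolve every URL to its final target.
    redirects = {
        "/a/": "/b/",
        "/b/": "/c/",
        "/c/": "/final-page/",
    }

    def final_target(url: str) -> str:
        seen = set()
        while url in redirects and url not in seen:  # guard against loops
            seen.add(url)
            url = redirects[url]
        return url

    flattened = {old: final_target(old) for old in redirects}
    print(flattened)  # every old URL now points straight at /final-page/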

4) Removing keyword optimisation

Another problem that often occurs when a brand launches a new website is that all of the previous keyword optimisation in the title tags, header tags, content, alt tags and meta descriptions is removed.

Ensure you have a clear keyword mapping strategy in place when relaunching: you should know which pages you want to rank for which keywords.

You can use your earlier benchmarking export from SEMRush to help you understand your keyword priorities.

Then you can use your earlier Screaming Frog export to see all of the metadata that needs to be transferred to the new pages.
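
As a rough sketch of that carry-over, assuming the old crawl and the URL mapping were both exported to CSV (the column names “Address”, “Title 1” and “Meta Description 1” follow Screaming Frog’s usual export headers, but verify them against your own files):

    import pandas as pd

    # Hypothetical exports: the pre-launch crawl and the old-to-new URL mapping.
    old_crawl = pd.read_csv("old_site_crawl.csv")  # Address, Title 1, Meta Description 1
    url_mapping = pd.read_csv("url_mapping.csv")   # old_url, new_url

    # Attach each old page's optimised metadata to the new URL it redirects to,
    # so titles and descriptions aren't lost when pages are rebuilt.
    carried_over = url_mapping.merge(
        old_crawl, left_on="old_url", right_on="Address", how="left"
    )
    print(carried_over[["new_url", "Title 1", "Meta Description 1"]].head())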

5) Don’t remove or change your Google Analytics code!

Use the same GA code on your new website, as you will need to compare data after launching. Instead of creating a new Google Analytics profile, update the address when you launch by going to Admin > Property Settings:

[Image: property settings]

What to do when launching!

So you have a redirect and keyword strategy in place for the new website, you have done development testing, and everything is good to go. Now it is time to launch! Here are some tasks to ensure nothing falls through the cracks:

First of all, run the old Screaming Frog crawl to make sure all of the redirects are working as expected.
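
Alternatively, a quick scripted spot-check with Python’s requests library (reusing the hypothetical url_mapping.csv from earlier) can confirm that every old URL lands where you intended:

    import csv
    import requests

    with open("url_mapping.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Follow the redirect and compare the final URL with the planned target.
            resp = requests.head(row["old_url"], allow_redirects=True, timeout=10)
            status = "OK" if resp.url == row["new_url"] else "CHECK"
            print(f"{status}: {row['old_url']} -> {resp.url} ({resp.status_code})")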

Then run a crawl on the live website to find any internal 404 errors that need fixing. You can do this by going to: Bulk Export > Response Codes > Client Error (4XX) Inlinks

[Image: Screaming Frog inlinks]

To ensure Google can crawl the website correctly, I would recommend doing a fetch and render on each type of page on your website: for example, the homepage, a category page, a subcategory page and a product page.

You do this by going to Search Console > Crawl > Fetch As Google

[Image: Fetch as Google]

Then click into the result to see how Googlebot and visitors see the page, and make sure you are not blocking any important JavaScript or CSS.

[Image: Googlebot rendering]

You can check the Fetching tab for more information on the rendering; Google will tell you if there are any obvious issues, such as a noindex tag or x-robots tag blocking your website.

[Image: Fetching tab]

Once the website is launched, you can pick up errors straight away by using Google Analytics Real-Time to find the 404 errors your users are hitting. You do this by going to:

Real Time > Content > Page Views (Last 30 mins)

[Image: Real-Time page views in Google Analytics]

Next, submit the new sitemap in Search Console by going to:

Crawl > Sitemaps > Add/Test Sitemap

[Image: Add/Test Sitemap]
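
If you still need to generate that sitemap, a bare-bones Python sketch producing the standard sitemaps.org format (the URLs are hypothetical) looks like this:

    # Minimal sitemap generator; swap in your real post-launch URLs.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/widgets/",
    ]

    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        f.write(entries + "\n</urlset>\n")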

Once you are happy with the launch and have fixed all of the initial bugs that are bound to happen, you should submit a Change of Address in Google Search Console: https://support.google.com/webmasters/answer/83106?hl=en

[Image: request address change]

Finally, add an annotation in Google Analytics when you launch, so you can easily see when you switched over to the new domain or design.

[Image: sessions in Google Analytics]

Summary

Taking these steps will give you the best chance of retaining your rankings and organic traffic when changing your website or domain name. I hope this has given you some useful tips for relaunching a website; download our checklist for further help.

View full post on Search Engine Watch

Google reiterates suggested 33-character limit in ETA headlines to avoid truncation

For those writing headlines for the new expanded text ad, the most coveted letter in the alphabet will be “i.”
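
Truncation ultimately depends on rendered pixel width rather than raw character count (hence the quip about the narrow letter “i”), but a simple pre-flight check against the suggested limit is easy to script. A rough Python sketch, not an AdWords API feature:

    SUGGESTED_LIMIT = 33  # characters, per the guidance Google reiterated

    def check_headline(headline: str) -> str:
        over = len(headline) - SUGGESTED_LIMIT
        return "OK" if over <= 0 else f"may truncate ({over} character(s) over)"

    print(check_headline("Incredible Widgets, Shipped Free"))  # 32 chars -> OK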

The post Google reiterates suggested 33-character limit in ETA headlines to avoid truncation appeared first on Search Engine Land.

View full post on Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

How will smart cities avoid data overload?

[Image: Shanghai city network technology]

Smart cities will soon churn out so much information that flash memory data centers might be the only way to control the environmental footprint.

In an essay on IT ProPortal, NetApp’s Laurence James makes the case for smart cities switching to solid-state drives (SSDs), or flash memory, in their quickly growing data centers. NetApp is a Sunnyvale, California-headquartered maker of data storage and management systems.

Gartner predicts 1.6 billion connected devices will be hooked up to the larger smart city infrastructure by the end of this year.

And while we are currently in the relatively early days of smart cities, fears are spreading that we could soon be overwhelmed by a data tsunami from these fast-proliferating connected devices.

“To deal with the day-to-day and long-term data demands of a smart city, massive amounts of data need to be stored cheaply, locally and with rapid access,” says James.

The current approach to mass data storage usually relies on vast server farms in remote locations that are big power hogs. But James warns that distant data farms consuming massive amounts of electricity are not compatible with smart cities, which need fast data access with minimal environmental impact.

“One of the challenges smart cities poses for IT and storage infrastructure decision makers is an environmental one,” he says, adding that “…dramatic increases in data volume will be accompanied by a huge upsurge in data centre footprint, which from a power usage and preservation perspective is undesirable.”

Solid state = smart advances

But rapid improvements in solid-state drive (SSD) data storage may hold the solution.

Toshiba predicts that by 2020, technological advances in SSD systems will allow them to store 128 terabytes (TB) of data, compared with a hard drive storage capacity of 40TB. Better still, today’s high-capacity SSDs draw only a tenth of the power of equivalent-capacity hard drives, with only 6% of the physical footprint.
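
Taking those figures at face value, a back-of-the-envelope comparison for a hypothetical 10-petabyte facility (the relative power and footprint lines simply restate the 10% and 6% claims above):

    # Figures quoted above: 128 TB per SSD vs 40 TB per hard drive, with SSDs
    # at roughly 10% of the power and 6% of the footprint for equivalent capacity.
    TARGET_TB = 10_000  # hypothetical 10 PB data centre

    hdd_drives = TARGET_TB / 40
    ssd_drives = TARGET_TB / 128

    print(f"Drives needed: {hdd_drives:.0f} HDDs vs {ssd_drives:.0f} SSDs")
    print("Relative power draw:  SSD is roughly 10% of HDD")
    print("Relative floor space: SSD is roughly 6% of HDD")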

“With SSDs outpacing Moore’s Law in terms of gigabytes per inch, rather than colossal data centres sprouting up all over the country, the all-Flash data centre paves the way for much needed stability in terms of power requirements and footprint,” James says.

Perhaps all-flash data centers will make all these data overload fears a distant memory, and a solid state one at that.

The post How will smart cities avoid data overload? appeared first on ReadWrite.

View full post on ReadWrite

Profit by Search now offers SEO services to avoid the impact of new Google mobile usability testing tool – PR Newswire (press release)


NEW YORK, July 1, 2016 /PRNewswire/ — #1 SEO Company in India, Profit by Search is now offering search engine optimization services to help businesses in avoiding the impact of new Google mobile usability testing tool. Presently, beta testing is being …

Related coverage:
  • Where SEO and User Experience (UX) Collide – Search Engine Journal
  • SoCal Marketing Consultants say SEO Improves Visibility, Rankings, Traffic, Leads & Sales for Businesses – Digital Journal

View full post on seo optimization – Google News

Houston SEO Tips: 5 Common Optimization Mistakes To Avoid – Huffington Post


We know SEO is extremely important to brand management, which is why it's also vital to perform these optimization techniques correctly. If you don't know how to write accurately for search engines, you could be doing some things that may be hurting …

Related coverage:
  • topseos.com Selects SEO Brand as the Top Search Engine Optimization Company for June 2016 – Marketwired (press release)
  • Two tried and true methods to make social media impact SEO – Search Engine Land
  • topseos.com Unveils SEO Brand as the Best Search Engine Optimization Company for the Month of June 2016 – EIN News (press release)
  • Also covered by Forbes, Business 2 Community and Digital Journal

View full post on seo optimization – Google News

Copyright © 1992-2016, DC2NET All rights reserved