Posts tagged help
Studying how end users, not algorithms, search for solutions online can help improve your SEO efforts.
Coming from a background in neuroscience, I’m still learning about the technical side of search engine optimization.
I know a little about the terminology thanks to an array of online communities and experts like my ConsumerAffairs colleague and SEO expert, Jessica Sanford.
I’ve read the Google Search Quality Rating Guidelines from cover to cover and understand more about the content and signals Google looks for to signify quality websites.
I know what’s at stake when it comes to the impact of SEO…
User behaviour on the first SERP
Around 84% of all clicks on search engine results pages (SERPs) go to the ads and results above the fourth organic result. Of those, more than 32% go to the first organic result, with the second result claiming fewer than 12% and click-through rates (CTRs) decreasing dramatically from there.
Dropping from the first to the second or third spot on SERPs can have a tremendous impact on revenue, leaving businesses scrambling to find the cause.
And just so you know, ConsumerAffairs has skin in this game: more than 80% of the traffic to ConsumerAffairs comes from organic search.
What I’m still not sold on is the obsession we seem to have with figuring out and gaming Google’s well-guarded search algorithm.
What we get wrong about Google
Entire companies have been built (and have subsequently crumbled) around exploitable quirks in Google’s algorithms—stuffing pages full of keywords at the cost of making sense, gathering backlinks through less-than-honest methods, etc.—and it seems like every time Matt Cutts says something into a microphone it lights up the blogosphere with 1,000-word posts dissecting every word.
The reason this focus on figuring out how Google calculates SERP rankings feels wrong to me is that, for all of our effort, we forget about one thing that Google never seems to: the user.
Moz co-founder Rand Fishkin wrote about this disconnect as early as 2007:
“We need to realize that search engines are a tool—a resource driven by intent… The search box is fundamentally different than a visit to a bookmark…it’s unique from a click on the “stumble” button…or a visit to your favorite blog—searches have a direct intent behind them; the user wants to find something.”
One of the reasons Google has been so wildly successful (Google owns more than 65% of the organic search market, with its closest competitor, Bing, controlling around 33%) is that it never seems to lose track of this fact.
Google’s goal, as a business, is to understand exactly what search users are looking for and to provide the most accurate answer to their questions.
In his study on the search giant, John Battelle described Google’s goal to become a “database of intentions” able to understand your “desires, needs, wants and preferences.”
Google pursues this goal maniacally, and I would wager that having to serve SERPs with more than the exact item you’re looking for (which would be shown as a single link) is probably a mark of shame as it strives to make perfect predictions.
With Google’s intentions so clear, it baffles me that so many companies are still focused on figuring out the algorithm rather than creating quality content designed to answer audience members’ questions.
And yet, here we are, with brands poised to spend $65 billion on SEO in 2016, much of which, according to Foxtail Marketing CEO Mike Templeman, will be wasted on efforts that are either fruitless or will eventually be penalized by Google’s engineers.
Defining search engine intent and values of queries for your business
There have been some earlier attempts by search engine experts to provide a taxonomy of search engine intent (SEI). Fishkin segmented queries into four groups of intent:
- Navigational queries – Consumers use organic search like a white pages directory, navigating to a particular site when they don’t necessarily know the URL.
- Informational queries – Queries focused on finding specific information, whether local weather, the street address of the best ice cream parlor in town or in which films Meg Ryan appeared with Tom Hanks.
- Commercial investigation queries – Focused primarily on research for future purchases, finding the best brand of scuba goggles or the best cat food for cats with eczema.
- Transactional queries – Searches aimed directly at making a purchase, branded queries or queries that meet an immediate need (where is the best fried catfish restaurant in this neighborhood).
Although the first two types of queries could be important to your business, the last two carry high purchase intent, meaning they are the most likely to end with consumers handing over their credit card numbers. That makes aiming your site’s content and your AdWords campaigns at ranking for these queries very important.
How does one figure out the relevant search queries with high intent?
Based on traffic or average cost per click (CPC)? Based on what your competitors use for keywords (via SEMrush or SpyFu)? Or based on budget limitations that force you into pursuing only low-volume, long-tail keywords?
My proposed solution, although it seems simplistic, is to ask end users what their search queries would be based on certain intentions (so they make the keyword list for you) and to ask what they think is a “fair price” (internal reference price) for the item they are trying to find.
With that intentional information plus the traffic/costs information you have for those search queries, you can decide if those keywords are worth going after based on making a profit or loss with the fair price they have in mind.
That’s what I think would be efficient. But talk is cheap, and I suppose, being a data scientist, I should probably have some numbers to back up my claims. What I have collected is far from definitive, but it does provide a nice pilot to give you a feel for applying search engine intent (SEI) to your SEO practices.
Studying search engine intent
After a screening process which included a battery of questions testing basic SEO knowledge, I was left with 57 participants who completed the survey and “failed” the SEO test.
I wanted to study participants who failed the test (which included the question “what does SEO stand for?”) because I wanted results that represented the average consumer, rather than those who work tirelessly to understand how to get their page to rank higher in SERPs.
I gave participants the following task: Below are some scenarios that I want you to imagine yourself in (even if you don’t have a kid or a pet in real life). In the two spaces provided for each scenario, I want you to answer the following:
- Exactly what would you type into a search engine to find it?
- What do you believe would be a fair price* for the item/service?
With that task in mind, I gave participants 20 intentions to provide answers to, from needing to buy a new washing machine to wanting to buy gold as an investment.
We then used the Google AdWords and Bing Ads keyword planners to look up the traffic, estimated clicks and estimated cost per click for every search query our participants came up with. We calculated the profit/loss for each search query based on the following equation:
Profit/Loss = [Fair Price x Daily Clicks x Assumed Click to Sale Rate] – [Daily Clicks x Average CPC]
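As a sketch of the arithmetic (the function and variable names are mine, not from the study), the equation translates directly into code, and setting profit to zero also gives the break-even click-to-sale rate:

```python
def daily_profit(fair_price, daily_clicks, click_to_sale_rate, avg_cpc):
    """Profit/Loss = [Fair Price x Daily Clicks x Click-to-Sale Rate]
                     - [Daily Clicks x Average CPC]"""
    revenue = fair_price * daily_clicks * click_to_sale_rate
    ad_spend = daily_clicks * avg_cpc
    return revenue - ad_spend

def breakeven_click_to_sale_rate(fair_price, avg_cpc):
    """The conversion rate at which a query stops losing money
    (note it is independent of click volume)."""
    return avg_cpc / fair_price
```

For example, at a $500 fair price, 100 daily clicks, a 2% click-to-sale rate and a $4 average CPC, a query nets about $600 a day, and it breaks even at a 0.8% conversion rate.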
What we learned about search engine intent
1) Click-to-sale rate determines if you should use Google AdWords or Bing Ads for your PPC campaigns
One particularly interesting conclusion I teased out of this data set was how click-to-sale rate is critical for determining if you should run your paid campaigns on Google AdWords, Bing Ads, both or neither.
For low-converting verticals, Bing will routinely lead to more profit than AdWords (which will usually be a net loss). But when the click-to-sale rate gets closer to 10%, AdWords is clearly the more profitable platform.
The big caveat is that even at a 10% click-to-sale rate, there are still categories that net a loss on one or both platforms.
2) Organic search is more profitable than PPC
I used paid advertising metrics to help estimate what these search queries are worth in the organic search context of SEO.
If we extrapolate these numbers based on our internal data, the profits from organic search would be much higher than what is stated in these sheets, and this is an important point.
Wordstream gives a good list of high-intent keywords in two categories: buy now keywords and product keywords. Buy now keywords include queries like:
- Free shipping
These are typically expensive campaigns to run in AdWords and difficult to rank for in organic search. Product keywords include comparison queries.
These keywords, although highly competitive and difficult to rank for on your own, present a unique opportunity.
As marketing manager Danica Jones wrote in a recent post on Search Engine Watch, third-party review sites rank high for coveted root and consumer-focused queries. Using these listings to present a positive and transparent brand experience is therefore one of the most cost-effective ways to climb the ranks and increase SERP real estate more quickly than more traditional SEO efforts.
Investing in third-party listings can be more cost effective than running pay-per-click advertising for high-intent keywords.
And while an ad may reach the right person at the wrong time, consumers on review sites are by and large actively researching before a purchase (70%, according to an internal survey we conducted).
There is still so much that can be taken from this study, and our team wanted to share it with peers in the spirit of focusing more on our end users and less on algorithms.
We want to empower you to run similar studies for your relevant verticals (where you can collect thousands of responses to intentions).
Ultimately this method could prove helpful for the individual needs and focus of your own company. Study the end user’s intentions and stop spending an outsized amount of time trying to figure out search engine algorithms.
This is the real focus of Google’s efforts to perfect their search engine, and if the experts are focusing on the end user, we should probably follow suit.
View full post on Search Engine Watch
Pokémon Go is hot right now, but how can you use this new mobile gaming craze to your advantage? Columnist Tony Edward has some suggestions.
The post How Pokémon Go can help generate SEO and foot traffic appeared first on Search Engine Land.
Please visit Search Engine Land for the full article.
The sales process is all about nurturing long-term relationships and building trust, and luckily there are some great tools to help!
There is no magic spell for better sales, no matter how much we wish there were. Likewise there is no special set of words, prospect investigation device, or rated handshake that is going to get someone to buy your products.
Good old connections are pretty much all you have to go on. That includes relationship building, which is one of the most important factors in gaining long-term customers who will come to you again and again.
Here are seven tools that can help you build those relationships and increase your sales.
1. Salesmate
Salesmate creates and organizes a solid ground for your relationship by collecting all kinds of interactions you have had with your lead. Additionally, it saves you time entering data by automatically generating and cross-referencing details and interaction history between lists and databases.
It’s also one of the most affordable sales management tools available, so give it a try!
2. Rapportive
Get information on your contacts right in your Gmail through Rapportive. It connects contact information to LinkedIn profiles and shows the latest info on each prospect. You will get a picture, work information, former work information, and more, all right there in your Gmail account.
It is a super easy way to customize your communications, by drawing out any relevant information that you may want to include in your message. You would be surprised by what a difference those personal touches make.
3. Charlie App
Want even more info? Charlie App is an amazing tool that will do all the contact research on your prospects for you, and then create a report for you to use. I love using this one before meetings, where there are multiple people involved in the process.
You can get your report, be prepared when you walk in, and impress them with your extensive knowledge on who they are, what they do, and even their latest social media posts. It is a great way to personalize meetings, and be on the right foot the moment you enter the door.
4. Intercom
Convert people right from your website using Intercom‘s live chat feature. Communicate in the long term with those customers to keep them happy, and give awesome customer service from the same platform.
The entire point of this tool is to make your customers feel valued and individually catered to. No automated CRMs, no bots, just one-on-one communication between your team and the people who make your business grow.
5. Calendly
Wouldn’t it be nice if you could let your prospects pick the best time to meet, while still fitting your busy schedule? Calendly is just that magical tool. It lets you set your availability, then send a link to anyone. They can pick a time that works best for them and set a meeting when you are able to have one.
Then it automatically adds that meeting to your calendar so you never have a scheduling faux pas. Your whole team can use this tool and better manage their meetings and appointments, whether they are in person, on the phone, or online.
6. ClickMeeting
ClickMeeting lets you set up sales meetings and demo calls easily. It integrates with numerous apps including YouTube, Google Calendar and more. You can get analytics and other important data on your meetings. They also offer free mobile apps, which is so important these days.
While it might cost a bit, it is worth it if you are serious about offering audio and video conference calls. You don’t want to blow a sale because you were trying to connect over Skype and kept dropping the call.
7. Boomerang
One of the most annoying things about Gmail is that it still doesn’t have a feature that allows you to schedule emails to be sent at a later date. If you want to do that, you have to use another tool. Luckily, Boomerang is there.
It is a Chrome extension that lets you schedule those messages, as well as ‘postpone’ messages you aren’t ready to deal with until a later time. So that message you got after hours that you know will drive you crazy if it is sitting in your inbox, taunting you? Hide it with a click!
Do you have any tools to add? Anything that helps build relationships for sales teams? Let us know in the comments!
No more arguing about what Google means by positions in the Google Search Analytics report. Google has defined the multitude of metrics and cases in a help document.
The post New Google help document defines Search Analytics impressions, position and clicks appeared first on Search Engine Land.
Looking to get the most SEO value out of your content marketing efforts? Columnist Tamar Weinberg explains how to breathe new life into your content assets.
The post How your old content can help with SEO appeared first on Search Engine Land.
People like fast websites and so does Google.
In fact, your website’s speed is a ranking factor in Google search engine results.
If your site loads quickly, it’s more likely to appear when people search for your brand. This, along with the fact that a fast site provides a better user experience (UX), means that a faster website can lead to higher conversions.
If your website isn’t loading as quickly as you’d like, it’s very likely that your images are to blame.
Here are a few common mistakes people make regarding optimising images for their website.
Images are too big
Many marketers and publishers like to use big, high-resolution images on their site, believing that these images will provide a better user experience.
The problem is that high-res images often have a very large file size, and take a long time to load, especially when there are multiple images on the same web page.
We’ve seen many publishers uploading images in the range of 2MB to 5MB in their blog or content posts. This image size is way too large for the web, and is one of the most common mistakes that slows down websites.
If your image is larger than 500KB, something might be wrong, and the image could probably be compressed.
Before you upload new images to your web page or blog post, remember these tips:
- Before you upload any image, double-check the file size (right click the image, and choose properties)
- Keep image file sizes below 500KB (and below 100KB if possible)
- There are many online tools that can help you compress your images to get a smaller file size, such as Compressor.io, CompressJPG, and TinyPNG.
- If you use Photoshop to prepare your images, keep an eye on the dimensions and make sure the DPI is set to 72dpi (Image/Image Size) and remember to ‘Save for Web’ in order to control the final outputted file size.
- Convert your images to the proper file types. In most cases, you’ll want to use JPG. However, if your image uses transparency (such as an image with a “see through” background) you’ll need to use PNG. There are some rare cases when GIF is best, but, when in doubt, always use JPG.
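As a rough aid to the checklist above, a short script can flag every image on disk that breaks the 500KB rule of thumb. This is a sketch; the extension list and threshold are my assumptions, so adjust them to your setup:

```python
import os

# Common web image formats (an assumption; extend as needed).
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}

def oversized_images(folder, max_bytes=500 * 1024):
    """Walk a folder tree and return (path, size_in_bytes) for
    every image file larger than max_bytes (500KB by default)."""
    flagged = []
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if os.path.splitext(name)[1].lower() in IMAGE_EXTENSIONS:
                path = os.path.join(root, name)
                size = os.path.getsize(path)
                if size > max_bytes:
                    flagged.append((path, size))
    return flagged
```

Run it against your uploads directory before publishing, and compress anything it flags.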
A specific example: an exclusive online designer footwear brand used a lot of large banners and product images on its fashion site, which dragged its Google PageSpeed score down to just 20/100.
We created a daily cron job (automated task that runs daily) to automatically resize big images down to smaller web standards, while maintaining a good quality.
In the screenshot below, we reduced the file size of an image from 1.3MB to only 142KB.
Simply by reducing image file size, we increased the Google PageSpeed score from 20/100 to 58/100.
Auto-scaling large images
Another common mistake with images is auto-scaling large images so they display smaller than they really are.
Doing this is often more convenient for the developer and content creators, but can really slow down a website.
For example, a big photo banner in a post might also be used as a small thumbnail elsewhere on the site.
The developer, rather than creating multiple versions of the image (e.g. 1000×425 for the banner and 64×64 for a side column), uses code to auto-scale the same big image to display as a small thumbnail. So a big image is being loaded unnecessarily. This shortens development time, but the page speed pays the price.
Not to mention, auto-scaled images can end up looking distorted because they’ve been stretched with code. For example, the thumbnail below is auto-scaled from 1000×425 pixels down to 64×64 pixels, and becomes distorted.
Keep an eye out for times when the same image is used many times on your site. If your site requires a dozen different size variations used in a dozen locations (something like 25×25, 40×40, 200×200, 658×258, 56×56, 64×64, 92×92, 150×156, 110×110, 160×160, and 180×180), that’s probably too many, and you might want to limit that to fewer than four.
Then create a separate image for each different size, and load the correctly-sized image version rather than auto-scaling large images to look smaller than they really are.
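The fix for distorted thumbnails is to compute the scaled dimensions before exporting each size, preserving the aspect ratio. A minimal sketch (the function name is mine):

```python
def fit_within(width, height, max_width, max_height):
    """Scale (width, height) to fit inside (max_width, max_height)
    without changing the aspect ratio. Never upscales."""
    scale = min(max_width / width, max_height / height, 1.0)
    return round(width * scale), round(height * scale)
```

For the 1000×425 banner above, fitting into a 64×64 box yields 64×27, not a stretched 64×64.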
Lack of image caching
Even if you use proper image compression, and serve properly-scaled images, a page that’s very image-heavy can still take a long time to load. Since images are static content, a great way to speed up the load time is to use CDN caching.
Caching (pronounced “cashing”) is the process of storing data in a temporary storage area called a cache. For example, you’ve probably noticed that a website you’ve visited in the past loads more quickly than a site you’ve never been to. This is because your browser has cached parts of the visited website.
A CDN (Content Delivery Network) is a network of servers that delivers cached content (such as images) from websites to users, based on the geographic location of the user.
For example, if you’re in New York and you’re looking at a website hosted in India, you can load its images from a server that’s actually in New York, rather than from halfway around the world.
A site using CDN caching can deliver images and other static content much faster, especially at peak traffic times, because images are not loaded from the origin web server but from a nearby cache server that responds much faster.
On top of this, a CDN also helps you serve more visitors at the same time. If your site experiences a sudden or unexpected spike in traffic, a CDN can keep your site functioning effectively.
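Caching only works if your origin server tells browsers and CDN edge nodes how long an image may be kept. A sketch of the response headers involved (the one-year max-age is a common choice for static assets, not a figure from this article):

```python
def static_image_headers(max_age_days=365):
    """Build HTTP response headers that let browsers and CDN edge
    nodes cache a static image for max_age_days."""
    max_age = max_age_days * 24 * 60 * 60  # seconds
    return {
        # "public" allows shared (CDN) caches; "immutable" tells the
        # browser not to revalidate an unchanged asset.
        "Cache-Control": f"public, max-age={max_age}, immutable",
        # Vary on encoding so compressed and uncompressed variants
        # are cached separately.
        "Vary": "Accept-Encoding",
    }
```

Your web server or CDN dashboard will usually expose these same settings; the point is that the cache lifetime is something you set, not something the CDN guesses.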
MasterCard is partnering with 100 Resilient Cities (100RC) to boost digital payments as a way of making smart cities more efficient and disaster-proof.
According to Pymnts.com, the U.S. credit card behemoth is teaming up with the 100RC group, which provides free expertise, tools and services to member cities. The group, led by the Rockefeller Foundation, comprises various non-profits, businesses and government agencies.
This comes as financial services companies are increasingly looking to develop the “banking of things” which seeks to harness the torrents of data being produced in smart cities.
MasterCard said it will use its resources to promote digital payments as an alternative to riskier cash-based transactions in 100RC cities.
It also will use its technology to help these cities develop their own digital payments strategies to “help cities around the world become more resilient to the physical, social and economic challenges that are a growing part of the 21st century world.”
“In the face of rapid urbanization, removing cash from the economy has been shown to create far-reaching and cumulative benefits to all — citizens, businesses, tourists and governments,” said MasterCard’s vice president of government services Craig Driver. “Working with 100RC, we can tap into our expertise creating digital payment solutions for governments to help cities achieve greater cost savings and efficiencies, drive revenue, reduce crime, establish digital identities for their citizens, expand financial inclusion and improve overall quality of life.”
Payments alternatives can help during calamities
From the 100RC perspective, MasterCard’s expertise not only assists smart cities in becoming more economically efficient, but will improve their ability to weather unforeseen calamities.
“The complex problems facing cities in the 21st century require thinking and partnerships from experts across all sectors,” said 100RC president Michael Berkowitz. “MasterCard is uniquely positioned to assist 100RC members build efficiencies in their local economies and create cities better able to thrive and withstand both sudden and long-term disasters.”
MasterCard is currently working with 60 governments around the world on such payment issues as transit, procurement, government payroll and social benefits.
For example, Transport for London (TfL) worked with MasterCard to develop a fare collection system that accepts contactless bank cards. The result was a 5% drop in fare collection costs in just over a year, as well as simplified access to buses and trains for riders.
Meanwhile local businesses in Toronto and other Canadian cities have teamed up with MasterCard to glean digital insights that will improve competitiveness, business performance and understanding of customer behavior.
The post Can digital payments help disaster-proof smart cities? appeared first on ReadWrite.
View full post on ReadWrite
Search Engine Journal
Help! I just launched a new website and my search rankings tanked!
Search Engine Land
Don't make the mistake of forgetting to optimize the new images, or else the pages on your new site may load so slowly that potential customers exit before viewing any of the content. Relying on Flash elements also can cause huge problems for SEO and …
Profit by Search now offers SEO services to avoid the impact of new Google mobile usability testing tool
Where SEO and User Experience (UX) Collide
SoCal Marketing Consultants say SEO Improves Visibility, Rankings, Traffic, Leads & Sales for Businesses
View full post on seo optimization – Google News
Inadvertently ruining your newly redesigned website’s SEO can be a nightmare. Columnist Will Scott explains four mistakes you must avoid before you launch.
The post Help! I just launched a new website and my search rankings tanked! appeared first on Search Engine Land.
New Zealand is in the midst of an Internet of Things (IoT) craze that will drive data traffic to double by the year 2020.
A New Zealand Reseller News article cites the results of a recent Cisco Visual Networking Index report that predicts the nation’s IP traffic will double by 2020 to reach 50 Gigabytes per capita, which represents a compound annual growth rate (CAGR) of 15%. The report covers traffic that includes Internet data along with managed networks like video-on-demand services.
As an illustration of the sheer volume of New Zealand Internet traffic, the data will grow to an equivalent of more than 72,000 DVDs every hour in four years.
“Digital transformation is happening quickly, and has the potential to improve the way New Zealanders live and work,” says Cisco New Zealand’s Head of Digital Transformation Glen Bearman.
The strongest trends that will drive growth in data traffic are expected to be the Internet of Things and digital transformation.
Millions more connected devices expected in New Zealand
The number of connected devices in New Zealand is expected to grow from 20 million in 2015 to 37 million by 2020, which will significantly increase the data load on the nation’s IP networks.
Among the applications expected to proliferate and drive traffic are digital health monitors, video surveillance and smart meters. Machine-to-machine connections will also notably increase their presence, growing to a 70% share of the country’s total IP connections. Cisco expects this growth to reach 7.7 devices/connections per capita.
The study also said increased immigration to New Zealand will drive IP data growth as 500,000 new citizens go online.
Bearman also cites faster broadband speeds as a potent accelerant of IP traffic growth.
“Higher speeds mean the consumption of rich media will continue to rise, with video being the dominant application across the globe,” he said.
The post IoT will help New Zealand double web traffic by 2020 appeared first on ReadWrite.
The number of digital skills you need in order to be a functional and useful member of your organisation is increasing at a rate you might be struggling to keep up with.
As well as the ability to understand your analytics and be fully aware of basic SEO skills, you need to be able to present information and data in the clearest manner possible to members of your team and, of course, your senior management.
Luckily you don’t have to be a graphic design wizard to achieve this.
Here is a list of various free and premium visualisation tools that will help you communicate your ideas in a variety of formats, for a range of different experience levels. Hopefully you’ll pick up some impressive new skills here too.
With Silk you can publish attractive looking webpages featuring a variety of different interactive visualisations, based on your data-sets. These can exist as standalone pages, linking back to your own site for a little SEO benefit, or you can embed them wherever you like.
Think of Sketch as a much easier to use, far more intuitive and BS-free version of Photoshop that’s also a damn sight cheaper. I’m including it here because after a recommendation from my learned friend Chris Lake, within the afternoon I had installed Sketch, messed around for a couple of hours and finished a fairly complex but crystal-clear multichannel content marketing plan. I love it.
Google Fusion Tables
Fusion Tables is a web app that allows you to gather, visualize, and share data tables.
You can filter and summarize across thousands of rows, then adapt the data to an embeddable and shareable chart, map or custom layout. Plus all your data organization is automatically saved in Google Drive.
I’ve recommended Piktochart so many times – as has everyone else on the internet in the business of making your data-vizs and infographics look brilliant. It’s just so easy to use and the templates help you achieve results very quickly.
Gephi is an open-source data-viz tool for graphs and networks. It also allows for exploratory data, link and social analysis.
Easel.ly possibly has the most satisfyingly meta-textual name on this list. It also has thousands of infographic templates at your disposal, as well as the ability to create one from scratch.
Need a simple bar chart, line chart, Venn diagram or graph rustled up in a flash, without any extra complicated bells and whistles? Hohli should have everything you need.
Gliffy allows you to make great looking flowcharts and diagrams, but its secret weapon is the fact it has collaboration at its core.
Infogr.am has a really beautiful collection of templates for data-visualisation, possibly some of the best looking here, and it’s very easy to use.
With Leaflet, you can create incredible looking maps, that are fully interactive and mobile friendly, with tonnes of customisable features.
D3 basically stands for Data-Driven Documents, but there is little basic about this tool. In fact, you should probably only use this one if you have some expertise already. However, the results can be more than worth the work.
The analytics workspace, Bime has a great eye for stylish design, and its multi-device capabilities are impressive. Although it does come at a premium.
The “world’s easiest” bar chart building app, plus also one of the nicest looking and quickest to use too.
Dygraphs lets you make interactive charts which you can mouse over to highlight individual values, then click and drag to zoom-in, zoom-out or pan around.
With Timeline you can create embeddable sequential timelines by uploading your data from a Google Spreadsheet. Each timeline is customisable and interactive, plus even though it’s open source it’s relatively easy to use for beginners.