Posts tagged just

YAP’s beacon just wants your shopping to be fun

While at the Mobile World Congress Shanghai conference, I had an opportunity to walk through a day in the life of a Yapper, and it was an eye-opening experience.

Launched in 2014, South Korea-based Yap is a location-based integrated online-to-offline (O2O) commerce and payments platform — and now, that means beacons.

Calling it a “breakthrough innovation for the indolent people of the world,” the company says that once the beacon is deployed in stores or other outlets, it will broadcast pop-up coupons, offers, memberships and other useful information, as well as an opportunity to purchase, as long as the Android app is installed; it does not even need to be running.

The beacon also uses a unique tone to remind the user and the store of each other’s whereabouts. It was the first hybrid beacon in Korea, using both Bluetooth and ultrasound to communicate with passing users.

YAP and the buying decision

The service is currently available in South Korea and Vietnam, and is coming to China soon. A North American rollout is also expected.

While the U.S. is only just getting around to widespread adoption of payment methods that don’t require swiping a decades-old magnetic stripe stuck to a piece of plastic, the battle in the payments arena will likely focus on “beacons versus NFC” in the near future.

YAP may be a concept whose time is now, especially in North America. What the connected consumer experience looks like for brand-conscious buyers could wind up centering more on payment systems than on impulse indulgence.

Being reminded that you’d like to buy something without instant fulfillment could doom some branded IoT technologies to fad status — really, Amazon, how much Play-Doh do we need — rather than actually enabling the impulse decision to purchase with a seamless transaction process.

It may be a consumer’s emotional “chicken and egg” question: do we browse for fun and buy because we have money, or are we always ready to buy, with our financial resources as the only limit? Outstanding credit analysis of the U.S. consumer marketplace might provide clues, but the fewer steps needed to incentivize a customer to buy, the easier that buying decision becomes.

And, of course, YAP could use the beacon to educate users about wise purchasing choices, too. It will be interesting to see how technologies like these play out.

Google’s Keyword Planner tool just became even more inaccurate

You’re probably familiar with the Keyword Planner tool, which is one of the best sources we have to spot opportunities and make the business case for an investment into paid or organic search campaigns.

One of the things it provides is guidance on the volume of searches for any given query. The numbers reported in the tool have always been somewhat vague: they are rounded and always end in at least one zero, so a pinch of salt has always been required when digesting the data.

It turns out that these numbers are now even more imprecise.

Jennifer Slegg spotted that Google has started to combine related terms, pooling them all together and reporting one (bigger) number.

No longer can you separate the data for keyword variants, such as plurals, acronyms, words with spaces, and words with punctuation.

As such it would be easy to get a false impression of search volumes, unless you’re aware of the change. No sudden jump in search queries, just an amalgamated number. Be warned.
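
As a rough illustration of what that pooling means in practice – this is a hypothetical normalisation, not Google’s actual logic, and the keywords and volumes below are invented – distinct variant rows now collapse into a single bucket:

    import re
    from collections import defaultdict

    def normalise(keyword):
        """Crudely collapse variants: lowercase, strip punctuation and spaces,
        and trim a trailing 's' as a naive plural rule (an assumption for
        illustration, not Google's documented behaviour)."""
        k = re.sub(r"[^\w\s]", "", keyword.lower())  # drop punctuation
        k = re.sub(r"\s+", "", k)                    # drop spaces
        return k[:-1] if k.endswith("s") else k

    # Hypothetical volumes that the tool used to report as separate rows
    variant_volumes = {
        "wedding flower": 5400,
        "wedding flowers": 8100,
        "weddingflowers": 320,
        "wedding-flowers": 210,
    }

    pooled = defaultdict(int)
    for keyword, volume in variant_volumes.items():
        pooled[normalise(keyword)] += volume

    print(dict(pooled))  # {'weddingflower': 14030} - one (bigger) number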

Here are a couple of examples…

Bundling together anagrams and regional spellings

Lumping together plurals and phrases without spaces

The problem could be exacerbated by third party tools. Jennifer says:

“For those that don’t notice the change – or worse, pulling the data from tools that haven’t updated to take into account the change – this means that some advertisers and SEOs are grossly overestimating those numbers, since many tools will combine data, and there is no notification alert on the results to show that how Google calculates average monthly searches has been changed.”

So yeah, this isn’t exactly good news. In fact, I can’t think of any benefit to the end user, but Google has a history of obfuscating data, so perhaps it shouldn’t come as a surprise.

That said, it once again pushes the focus towards relevance and context rather than pure volume. Advertisers and content creators would do well to focus on optimising clickthrough rate and landing page performance, rather than just shotgun marketing.

Guesstimated data aside, you can use Search Console to make sense of actual performance. Map your page impressions to organic (or paid) positions and you’ll get a sense of how accurate the Keyword Planner data is for any given term.
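
For instance (a minimal sketch with made-up numbers, assuming you have already exported query-level data from both tools), you could compare Search Console impressions for queries where you rank prominently against the volumes Keyword Planner reports:

    # Hypothetical exports: Keyword Planner volumes vs. Search Console
    # impressions for queries where the site sits in the top few positions,
    # so impressions approximate a lower bound on real monthly searches.
    planner_volume = {"keyword planner": 12100, "search console": 9900}
    gsc_impressions = {"keyword planner": 18400, "search console": 7200}

    for query, volume in planner_volume.items():
        impressions = gsc_impressions.get(query, 0)
        ratio = impressions / volume if volume else 0
        # Arbitrary thresholds, purely for illustration
        flag = "roughly in line" if 0.5 <= ratio <= 1.2 else "worth a closer look"
        print(f"{query}: planner={volume}, impressions={impressions}, "
              f"ratio={ratio:.2f} ({flag})")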

It’s also worth remembering that there are seasonal factors at play in the reported data. The volumes shown are an approximate average based on 12 months of search data. You might get a better idea of the true monthly figures if you cross-reference the data with Google Trends, which will show seasonal spikes (February is a big month for flowers).
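
A back-of-the-envelope way to fold that in (a sketch with an invented interest index, not real Google Trends output) is to redistribute the annual total in proportion to each month’s relative interest:

    # Keyword Planner reports an average monthly volume; multiply by 12 for the
    # annual total, then split it using a Trends-style 0-100 interest index
    # (these index values are invented for illustration).
    avg_monthly_volume = 10000
    seasonal_index = {
        "Jan": 40, "Feb": 100, "Mar": 55, "Apr": 50, "May": 60, "Jun": 45,
        "Jul": 40, "Aug": 40, "Sep": 45, "Oct": 45, "Nov": 50, "Dec": 55,
    }

    annual_total = avg_monthly_volume * 12
    index_total = sum(seasonal_index.values())

    for month, index in seasonal_index.items():
        estimate = annual_total * index / index_total
        print(f"{month}: ~{estimate:,.0f} searches")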

Keyword Planner replaced Google’s Keyword Tool and Traffic Estimator about three years ago. Users of the old tools initially complained about missing the broad match and phrase match options. Now, they’re going to miss even more detail around keywords and data.

Proceed with caution, as ever.

Just because they’re sharing, it doesn’t mean they’re reading

If you’re visiting this article before or after sharing it on a social channel, then may I offer you a warm welcome to an increasingly exclusive club. For you are just one of the 41% of people who not only shared the article but actually read it too. 

In news that will embolden some, depress others and possibly surprise nobody, a new study by computer scientists at Columbia University and the French National Institute reveals that 59% of links shared on social media have never actually been clicked.

As the Washington Post put it this weekend in one of their best headlines ever: “6 in 10 of you will share this link without reading it, a new, depressing study says”.

Back in 2014, it was estimated that social media referrals were responsible for 30% of total visits to websites. However, according to the research published by HAL (yes, a group of computer scientists publish their research under the name HAL, what of it? Why are you terrified?), which drew on a dataset amounting to 2.8 million shares, 75 billion potential views and 9.6 million actual clicks to 59,088 unique resources, most people just retweet news without ever reading it.

According to the study’s co-author, Arnaud Legout, “This is typical of modern information consumption. People form an opinion based on a summary, or a summary of summaries, without making the effort to go deeper.”

These blind retweeters are also, worryingly, shaping the news agenda by sharing what is already ‘viral’ and feeding social platforms’ ‘trend-watching algorithms’ without first stopping to read what they’re actually sharing.

Or are our favourite news sources so trustworthy that we can put blind faith in anything they publish? To be honest, Facebook will probably just ignore much of this anyway.

For proof of this, you need not look any further than May 26, when a certain social media manager (*cough*) tweeted a headline but accidentally forgot to include the link to the article.

And yet the tweet enjoyed 25 retweets and 28 likes. That’s one of our most popular tweets, and yet not one person noticed the lack of a link. Maybe that’s the key for us… black and white photo + non sequitur headline – link = engagement gold.

A peek behind the wizard’s curtain

To add our own two pence (or cents, depending on where you are right now) to this debate, let’s open up our own analytics and let you see what influence SEW’s social channels have on traffic to the site.

Let’s look at our own Twitter analytics for May 2016…

[Image: SEW’s Twitter analytics for May 2016]

A ‘robust’ 2.5 million impressions from only 465 tweets is pretty good. But what about actual click-through rate (CTR)?

Let’s take a look at our top tweet in May…

[Image: SEW’s top tweet in May 2016, from Twitter analytics]

The tweet and its 29 retweets gained 18,202 impressions and ultimately 40 link clicks. That works out at a CTR of 0.2%, which is about our average. Sadly, this is a little lower than the industry average.
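
For reference, the arithmetic is simply clicks divided by impressions (a quick sketch using the figures above):

    impressions = 18_202
    link_clicks = 40

    ctr = link_clicks / impressions
    print(f"CTR: {ctr:.2%}")  # CTR: 0.22%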

According to HubSpot, the average Twitter CTR is 1.64%, and the more followers you have, the fewer clicks you’ll receive on your tweets.

  • Users with 50 – 1,000 followers had a 6.16% CTR.
  • Users with 1,000 – 5,000 followers had a 1.45% CTR.
  • Users with 5,000 – 10,000 followers had a 0.55% CTR.
  • Users with 10,000+ followers had a 0.45% CTR.

And according to this Quora thread on Twitter CTR, links shared by Mashable’s Twitter account, despite its 7+ million followers, result in a CTR of just 0.11%.

If that’s not enough to leave you completely disheartened, let’s open up Google Analytics and see how much traffic social drove to SEW in May.

[Image: Google Analytics channel breakdown for SEW, May 2016]

Over the course of 31 days in May, only 4% of our total traffic came from social. The majority of our traffic comes from organic search (as you would hope and expect from a site with ‘search engine’ in the title), with direct, email and referral all coming in above social.

To break it down further by social channel, it’s 47% from Twitter, 24% from Facebook, 11% from LinkedIn and, uh, 0.3% from Pinterest.

However, if we look at Twitter, the most popular social channel we operate, it drove more than 16,000 sessions to the site, 40% of which came from unique users.

So despite a low CTR, these aren’t inconsiderable numbers, and certainly the research presented by HAL should not be used as an excuse to ‘switch off’ your social activity.

You should just be aware that, if your boss is asking for metrics to measure your engagement, a simple number of retweets isn’t good enough.

Why technical SEO is so much more than just ‘make-up’

Technical SEO is vitally important for your website, and provides the foundations for an effective search strategy.

I was prompted to write this article having read a post on the subject this week, written by Clayburn Griffin.

The author of that article doesn’t seem to value technical SEO so much. He says that “Technical SEO is easy, breezy, beautiful, but it’s no game-changer.” For me, this falls a long way from explaining what technical SEO actually is, and how important it is for your website.

In my role I help clients with a huge range of SEO issues, and a large majority of those are technical in nature. These can range from an erroneous implementation of hreflang or JSON-LD to helping a client target territories with no ISO country code – where GeoIP would only get you ~60% accuracy and browser location isn’t an option.

As an SEO who specialises in technical SEO, for me it is more of a process than a set of “esoteric skills” – although those skills do come in handy. It is about working with the client’s needs and the developers’ needs, and coming up with creative SEO solutions, as it is not always possible to implement best practices.

The world without tech SEO

At the core, the assertion of the post is: Technical SEO can tart your site up for search engines, but won’t bring in the money.

And it is this assertion that I would like to challenge most vehemently, taking four examples from the technical SEO world.

1. Botched migration

We’ve all been there. A client is going through a migration but doesn’t want to shell out for a full service, saying “Our development team has done loads of these” – only for you to watch from the sidelines as some fairly basic errors cost them all of their visibility, and only then to be asked to help fix it.

There is so much that can go wrong technically in a migration:

  • No 301 mapping
  • .htaccess rules not written correctly/efficiently
  • Staging environment gets indexed
  • Staging robots.txt or meta robots data brought over to live
  • Different versions of PHP/Apache/jQuery between environments

I could go on, but the point is that without technical SEO any one of these numerous issues could kill your site dead, overnight.
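
As a minimal sketch of that safety net (the URL mapping below is invented, and it assumes the third-party requests library), you can spot-check that each legacy URL returns a single 301 to the right destination and that a staging robots.txt hasn’t been pushed live:

    import requests

    # Hypothetical 301 map exported from the migration plan
    redirect_map = {
        "https://example.com/old-category/widget": "https://example.com/widgets/widget",
        "https://example.com/about-us.html": "https://example.com/about/",
    }

    for old_url, expected in redirect_map.items():
        r = requests.get(old_url, allow_redirects=False, timeout=10)
        location = r.headers.get("Location")
        ok = r.status_code == 301 and location == expected
        print(f"{old_url}: {r.status_code} -> {location} {'OK' if ok else 'CHECK'}")

    # Crude check that a staging robots.txt hasn't been pushed to live
    robots = requests.get("https://example.com/robots.txt", timeout=10).text
    if "Disallow: /" in robots.splitlines():
        print("Warning: robots.txt appears to block the whole site")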

[Image: site migration visibility chart, finance sector]

2. Faceted search

This is a common technical SEO problem for anyone working in ecommerce: how to deal with your faceted navigation.

  • Are you creating duplicate content?
  • Should you use canonicals or noindex in robots.txt?
  • If so, where and which ones?
  • Which faceted navs can be indexed and which should not?
  • Rel prev/next on pagination or canonicals?
  • Add more facets or remove?
  • How should URL rewriting be handled?
  • How many products per page?
  • Do your facets mirror your IA?

These are just some of the questions that come up for just this aspect of the site, and there is no definitive correct answer to them all; you can’t just check Google’s guidelines.

The answers to these questions change with every site, its needs and its current visibility.
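
To make that concrete, here is a deliberately simplified sketch of the kind of decision logic involved; the facet names, parameters and thresholds are invented for illustration, not a recommendation for any particular site:

    from urllib.parse import urlparse, parse_qs

    # Facets we have (hypothetically) decided deserve their own indexable pages
    INDEXABLE_FACETS = {"colour", "brand"}

    def facet_policy(url):
        """Suggest a treatment for a faceted URL: index it, canonicalise it
        to the unfiltered category, or noindex it."""
        params = parse_qs(urlparse(url).query)
        facets = {k for k in params if k not in ("page", "sort")}  # not facets

        if not facets:
            return "index"           # plain category page
        if len(facets) == 1 and facets <= INDEXABLE_FACETS:
            return "index"           # single facet with real search demand
        if len(facets) == 1:
            return "canonicalise"    # low-value single facet
        return "noindex"             # facet combinations explode the crawl space

    for u in [
        "https://shop.example/dresses",
        "https://shop.example/dresses?colour=red",
        "https://shop.example/dresses?size=10",
        "https://shop.example/dresses?colour=red&size=10&sort=price",
    ]:
        print(u, "->", facet_policy(u))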

3. Hreflang implementation

Here’s one for anyone who has gone international. Hreflang is not the simplest thing to implement and is often done incorrectly. People forget to self-reference, use the wrong codes, use incomplete codes, miss just one territory from their list, fail to include all versions of the page, or simply forget to audit their implementation regularly.

Any one of those mistakes can break the entire implementation and leave you with duplicate content and rapidly falling rankings.
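
A small sketch of the kind of audit that catches most of those mistakes (the page-to-alternates map is invented; in practice you would crawl it from the annotations themselves, and the code pattern below is only a loose approximation of valid values):

    import re

    # Hypothetical hreflang annotations crawled from each page:
    # page URL -> {hreflang value: alternate URL}
    annotations = {
        "https://example.com/en-gb/": {"en-GB": "https://example.com/en-gb/",
                                       "en-US": "https://example.com/en-us/"},
        "https://example.com/en-us/": {"en-US": "https://example.com/en-us/"},
    }

    VALID_CODE = re.compile(r"^[a-z]{2,3}(-[A-Za-z]{2})?$|^x-default$")

    for page, alternates in annotations.items():
        # 1. Every page should reference itself
        if page not in alternates.values():
            print(f"{page}: missing self-reference")
        for code, alt_url in alternates.items():
            # 2. Codes should look like a language with an optional region
            if not VALID_CODE.match(code):
                print(f"{page}: invalid hreflang code '{code}'")
            # 3. Annotations must be reciprocal: the alternate must point back
            if page not in annotations.get(alt_url, {}).values():
                print(f"{page}: {alt_url} does not link back (not reciprocal)")

Run against the sample data above, the only warning is the missing return link from the en-US page, which is exactly the kind of quiet mistake that breaks the whole implementation.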

4. Optimising page load

This is perhaps one of the most all-encompassing technical issues an SEO can deal with. It is also one of the most frequently overlooked and poorly addressed, due in part to the wide range of skills required to fully address it, but also to the cost of a lot of the solutions.

I always find this surprising, as an old study from Amazon.com found that for every 100ms they improved page load, their conversion rate increased by 1%! It doesn’t take a genius to work out that’s a whole load of extra PlayStations getting sold at Christmas.

Page load speed can be affected by simple things like large image sizes, too many HTTP requests and multiple DNS lookups, all the way through to poorly configured servers and inefficient or badly curated code.
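
As a rough first pass (a sketch using the third-party requests and BeautifulSoup libraries against a placeholder URL; it only counts references in the HTML, not what a browser actually downloads), you can get a feel for response time and how many extra requests a page asks for:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse

    url = "https://example.com/"  # placeholder URL
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    # Rough proxies for the issues above: image weight, request count, DNS lookups
    images = soup.find_all("img")
    scripts = soup.find_all("script", src=True)
    styles = soup.find_all("link", rel="stylesheet")

    hosts = {urlparse(tag.get("src") or tag.get("href") or "").netloc
             for tag in images + scripts + styles}
    hosts.discard("")  # relative URLs resolve to the page's own host

    print(f"Server response time: {response.elapsed.total_seconds():.2f}s")
    print(f"Referenced resources: {len(images)} images, "
          f"{len(scripts)} scripts, {len(styles)} stylesheets")
    print(f"Distinct resource hosts (a proxy for DNS lookups): {len(hosts)}")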

Truly improving page load speed could, and has, required completely rebuilding a website: re-platforming, migrating subdomains and servers. And at every step of the way there are serious technical SEO considerations to take into account.

What is technical SEO?

Technical SEO, for me, is the science of search. It is bringing together what you know and testing it. Google won’t always tell you the truth about what works best, and may not necessarily know themselves.

You have to pore over technical specifications, Stack Exchange and Webmaster Central, as well as conduct your own experiments and test what you know.

More practically it is not just about going through the same tired checklist of robots.txt, 301 redirects and pipes vs hyphens, but about working with all other elements of a broader SEO strategy and with the client’s development, marketing, senior management and web teams to present the best possible version of their site to the world.

I do agree with some points in the article though. For one, there are plenty of people and agencies who say they understand technical SEO and can help you with it, only to come unstuck when someone asks them the best solution for becoming mobile friendly.

And this does no favours for those of us still trying desperately to get away from the image of snake-oil salesmen and spammy, shoddy tactics.

However, without technical SEO your site will not rank for your keywords on Google. I agree that it should always be part of a broader SEO strategy encompassing content and offsite optimisation. It’s far more than mere make-up.

Image credit: PI Datametrics
