Tag Archives: Moz Blog

The Anatomy of a $97 Million Page: A CRO Case Study

Posted by jkuria

In this post, we share a CRO case study from Protalus, one of the fastest-growing footwear companies in the world. They make an insole that corrects the misalignment suffered by roughly 85% of the population. Misalignment is the cause of most back, knee, and foot pain; back pain alone is estimated to be a $100-billion-a-year problem.


Summary

  • We (with Protalus’ team) increased direct sales by 91% in about 6 months through one-click upsells and CRO.
  • Based on the direct sales increase, current run-rate revenue, the “Virtuous Cycle of CRO”-fueled growth rate, and revenue multiple for their industry, we estimate this will add about $97 million to the company’s valuation over the next 12–18 months*.
  • A concrete example of the Virtuous Cycle of CRO: Before we increased the conversion rate and average order value, Google AdWords was not a viable channel. Now it is, opening a whole new floodgate of profitable sales! Ditto for at least two other channels. In part due to our work, Protalus’ annual run-rate revenue has grown by 1,212% in less than a year.

* Protalus’ core product is differentiated, patent protected, and high margin. They also have a strong brand and raving fans. In the Shoes & Apparel category, they’re most similar to Lululemon Athletica, which has a 4x-plus revenue multiple. While Nike and Under Armour engage in a bloody price war and margin-eroding celebrity endorsements, Lululemon commands significantly higher prices than its peers, without big-name backers! Business gurus Warren Buffett and Charlie Munger often say that the true test of a defensive moat around a business is “Can you raise prices without hurting sales?” Protalus has this in spades. They’ve raised prices several times while simultaneously increasing units sold — from $39 to $49 to $59 to $69 to $79 to $99 to $119.


One-click upsells: A 21% sales boost

When we do engagements, the first order of business is to uncover low-hanging-fruit growth opportunities. This accomplishes two things:

  1. It helps the client get an immediate ROI on the engagement
  2. It earns us goodwill and credibility within the company. We then have wide latitude to run the big, bold experiments that produce huge conversion lifts

In Protalus’ case, we determined they were not doing post-purchase one-click upsells. Adding these immediately boosted sales by 21%. Here’s how we did it:

  • On their main sales landing page, Protalus has an offer where you get $30 off on the second pair of insoles, as well as free expedited shipping for both. About 30% of customers were taking this offer.
  • For those who didn’t, right after they purchased but BEFORE they got to the “Thank You” page, we presented the offer again, which led to the 21% sales increase.

Done correctly, one-click upsells easily boost sales, as customers do not have to re-enter credit card details. Here’s the best way to do them: The Little Secret that Made McDonald’s a $106 Billion Behemoth.
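To make the flow concrete, here is a minimal Python sketch of a post-purchase one-click upsell. The gateway function, token, and prices are illustrative stand-ins, not Protalus’ actual implementation:

```python
# Hypothetical sketch of a post-purchase one-click upsell.
# charge_saved_token() stands in for a real payment gateway's
# "charge stored card" API; the prices are illustrative.

INSOLE_PRICE = 79       # illustrative price in dollars
UPSELL_DISCOUNT = 30    # the "$30 off the second pair" offer

def charge_saved_token(token, amount_cents):
    # A real gateway call would go here; this sketch always succeeds.
    return {"token": token, "charged_cents": amount_cents, "status": "ok"}

def offer_upsell(order):
    """Runs after the first charge clears but before the thank-you page.

    Customers who declined the bundle see the discounted second pair
    again; because their card token is already on file, accepting takes
    a single click with no re-entry of payment details.
    """
    if order["took_bundle"]:   # already bought two pairs at checkout
        return None
    price = INSOLE_PRICE - UPSELL_DISCOUNT
    return charge_saved_token(order["card_token"], price * 100)

receipt = offer_upsell({"card_token": "tok_demo", "took_bundle": False})
```

The key design point is that the charge reuses the token stored during checkout, which is what makes the upsell “one click.”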

Below is the final upsell page that got the 21% sales increase:

[Screenshot: the final upsell page]

We tested our way to it. The key effective elements are:

1. Including “free upgrade to expedited shipping” in the headline: 145% lift

The original page had it lower in the body copy:

Google Experiments screenshot showing 145% lift

2. Adding celebrity testimonials: 60% lift

Google Experiments screenshot showing a 60% lift

Elisabeth Howard’s (Ms. Senior America) unsolicited endorsement is especially effective because about 60% of Protalus’ customers are female and almost one-third are retired. We uncovered these gems by reviewing all 11,000 customer testimonials on file at the time.

3. Explaining the reasons why other customers bought additional insoles.

See the three bulleted reasons on the first screenshot (convenience, different models, purchasing for loved ones).


Radical re-design and long-form page: A 58% conversion lift

With the upsells producing positive ROI for the client, we turned to re-designing the main sales page. The new page produced a cumulative lift of 58%, attained in two steps.

[Step 1] 35% lift: Long-form content-rich page

Optimizely screenshot shows 35% lift at 99% statistical significance

Note that even after reaching 99% statistical significance, the lift fluctuated between 33% and 37%, so we’ll claim 35%.
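If you want to sanity-check a claim like “99% statistical significance” on your own tests, a two-proportion z-test is the standard tool. The visitor and conversion counts below are made up for illustration; they are not Protalus’ data:

```python
# Minimal one-sided two-proportion z-test for a conversion lift.
from statistics import NormalDist

def lift_significance(conv_a, n_a, conv_b, n_b):
    """Return (relative lift of B over A, one-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se                              # standardized difference
    p_value = 1 - NormalDist().cdf(z)
    return (p_b - p_a) / p_a, p_value

# Illustrative numbers: 2.0% vs. 2.7% conversion on 10,000 visitors each.
lift, p = lift_significance(200, 10_000, 270, 10_000)
```

With these made-up counts, the 35% relative lift comes out significant at well past the 99% level, which is the kind of check an A/B testing tool performs before declaring a winner.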

[Step 2] 17% lift: Performance improvements

The new page was quite a bit longer, so its “fully loaded” time increased a lot — especially on mobile devices with poor connections. A combination of lazy loading, lossless image shrinking, CSS sprites, and other ninja tactics led to a further 17% lift.

These optimizations reduced the page load time by 40% and cut the page weight to a quarter of its original size!

The total cumulative lift was therefore 58% (1.35 x 1.17 = 1.58).

With the earlier 21% sales gain from one-click upsells, that’s a 91% sales increase (1.21 x 1.35 x 1.17 = 1.91).
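As a quick aside on the arithmetic, lifts compound multiplicatively rather than additively, which a few lines of Python make explicit (the percentages are the ones reported above):

```python
# Lifts compound multiplicatively: each one scales the previous baseline.
lifts = {"long_form_page": 0.35, "performance": 0.17, "upsells": 0.21}

def compound(*pcts):
    """Combine relative lifts (e.g. 0.35 for 35%) into one total lift."""
    total = 1.0
    for p in pcts:
        total *= 1 + p
    return round(total - 1, 2)

page_lift = compound(lifts["long_form_page"], lifts["performance"])
overall = compound(lifts["upsells"], lifts["long_form_page"], lifts["performance"])
```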


Dissecting the anatomy of the winning page

To determine what vital few elements to change, we surveyed the non-converting visitors. Much of the work in A/B testing is the tedious research required to understand non-converting visitors.

“Give me six hours to chop a tree and I’ll spend the first four sharpening the axe.” – Abraham Lincoln

All CRO practitioners would do well to learn from good, ol’ honest Abe! We used Mouseflow’s feedback feature to survey bouncing visitors from the main landing page and the check-out page. The top objection themes were:

  1. Price is too high/product too expensive
  2. Not sure it will work (because others didn’t work before)
  3. Not sure it will work for my specific condition
  4. Difficulty in using website

We then came up with specific counter-objections for each. A landing page is “salesmanship in digital print,” so many of the techniques that work in face-to-face selling also apply.

On a landing page, though, you must overcorrect because you lack the back-and-forth conversation of a live selling situation. Below is the list of key elements on the winning page.

1. Price is too high/product is too expensive

This was by far the biggest objection, cited by over 50% of all respondents. Thus, we spent a disproportionate amount of effort and page real estate on it.

Protalus’ insoles cost $79, whereas Dr. Scholls (the 100-year-old brand) cost less than $10. When asked what other products they considered, customers frequently said Dr. Scholls.

Coupled with this, nearly one-third of customers are retired and living on a fixed income.

“I ain’t gonna pay no stinkin’ $79! They cost more than my shoes,” one visitor remarked.

To overcome the price objection, we did a couple of things.

Articulated the core value proposition and attacked the price from the top

When prospects complain about price, it simply means that they do not understand or appreciate the product’s value proposition. They are seeing this:

The product’s cost exceeds the perceived value

To effectively deal with price, you must tilt the scale so that it looks like this instead:

The perceived value exceeds cost

While the sub-$10 Dr. Scholls was the reference point for many, we also learned that some customers had tried custom orthotics ($600 to $3,000) and Protalus’ insoles compared favorably.

We therefore decided our core value proposition would be:

“Avoid paying $600 for custom orthotics. Protalus insoles are almost as effective but cost 87% less.”

…forcing the $600 reference point, instead of the $10 for Dr. Scholls. In the conversion rate heuristic we use, the value proposition is the single biggest lever.

We explained all this from a “neutral” educational standpoint (rather than a salesy one) in three steps:

1. First, we use “market data” to explain the cause of most pain and establish that custom orthotics are more effective than over-the-counter insoles. Market data is always more compelling than product data, so you should lead with it.

[Screenshot]

2. Next, like a good trial lawyer, we show why Protalus insoles are similar to custom orthotics but cost 87% less:

[Screenshot]

3. Finally, we deal with the “elephant in the room” and explain how Protalus insoles are fundamentally different from Dr. Scholls:

[Screenshot]

We also used several verbatim customer testimonials to reinforce this point:

[Screenshots]

Whenever possible, let others do your bragging!

Attacked price from the bottom

Here, we used a technique known as “break the price down to the ridiculous.” $79 is just 44 cents per day, less than a K-cup of coffee — which most people consume once or twice a day! This makes the price more palatable.

[Screenshot]
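The “ridiculous” breakdown is easy to verify. The roughly six-month useful life is our assumption for illustration; the page itself states only the per-day figure:

```python
# "Break the price down to the ridiculous": spread the price over the
# product's useful life. The ~6-month lifespan is an assumed figure.
price = 79.00
lifespan_days = 180          # assumed roughly six months of daily use
per_day = price / lifespan_days   # about $0.44/day, less than a K-cup
```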

Used the quality argument

The quality technique is from Zig Ziglar’s Sales Training. You say to a prospect:

“Many years ago, our company/founder/founding team made a basic decision. We decided it would be easier to use the highest quality materials and explain price one time than it would be to apologize for low quality forever. When you use the product/service, you’ll be glad we made that decision.”

It’s especially effective if the company has a well-known “maker” founder (like Yvon Chouinard at Patagonia). It doesn’t work as well for MBAs or suits, much as we need them!

Protalus’ founder Chris Buck designed the insoles and has a cult-like following, so it works for him.

Dire outcomes of not taking action

Here we talked about the dire outcomes if you do not get the insoles; for example, surgery, doctors’ bills, and lost productivity at work! Many customers work on their feet all day (nurses, steelworkers, etc.) so this last point is highly relevant.

[Screenshot]

Microsoft employed this technique successfully against Linux in the early 2000s. While Linux was free, Microsoft argued that its “Total Cost of Ownership” was much higher than Windows’ once you considered support, frequent bugs, less accountability, fewer feature updates, and so on.

2. Not sure the product will work

For this objection, we did the following:

Used Dr. Romansky

We prominently featured Dr. Romansky, Protalus’ resident podiatrist. A consultant to the US Men’s and Women’s soccer teams and the Philadelphia Phillies baseball team, he has serious credibility.

[Screenshot]

The “educational” part of the landing page (above the fold) is done in “his voice.” Before, only his name appeared on a rarely visited page. This is an example of a “hidden wealth” opportunity!

Used celebrity testimonials on the main landing page

Back in 1997, a sports writer asked Phil Knight (Nike’s founder): “Is there no better way for you to spend $100 million?”

You see, Knight had just paid that staggering sum to a young Tiger Woods — and it seemed extravagant!

Knight’s answer? An emphatic “No!” That $100 million would generate several billion dollars in sales for Nike over the next decade!

Celebrity testimonials work. Period.

Since our celebrity endorsements increased the one-click upsell take-rate by 60%, we also used them on the main page:

[Screenshots]

Used expert reviews

We solicited and included expert reviews from industry and medical professionals. Below are two of the four we used:

[Screenshots]

These also helped address the price concern because some site visitors had expressed discomfort paying so much for an over-the-counter product without a doctor’s recommendation.

3. Not sure the product will work for me

This is different from “Not sure the product will work” and needs to be treated separately. If there’s one thing we’ve learned over the years, it is that everyone thinks their situation is one-in-a-million unique!

We listed all the conditions that Protalus insoles address, as well as those they do not.

[Screenshot]

In addition, we clearly stated that the product does not work for 15% of the population.

By conspicuously admitting this (NOT just in the fine print section!) you are more credible. This is expressed in the Prospect’s Protest as:

“First tell me what your product CANNOT do and I might believe you when you tell me what it can do!”

4. Difficulty in using the site

Several visitors reported difficulty using the site, so we used Mouseflow’s powerful features to detect and fix usability issues.

Interestingly, the visitor session recordings confirmed that price was a big issue as we could clearly see prospects navigate to the price, stare incredulously, and then leave!

Accentuate the customers’ reasons for buying

Most of the opportunity in CRO is in the non-converting visitors (often over 90%), but understanding converting ones can yield crucial insights.*

For Protalus, the top reasons for buying were:

  • Desperation/too much leg, knee, or back pain/willing to try anything (this is the “4m,” for motivation, in the strategic formula we use)
  • The testimonials were persuasive
  • Video was convincing

On the last point, the Mouseflow heatmaps showed that those who watched the video bought at a much higher rate, yet few watched it.

We therefore placed the video higher above the fold, used an arrow to draw attention, and inserted a sub-headline:

[Screenshot]

A million-dollar question we ask buyers is:

“Was there any reason you ALMOST DID NOT buy?”

Devised by Cambridge-educated Dr. Karl Blanks, who coined the term “conversion rate optimization” in 2006, this question earned him a knighthood from the Queen of England! Thanks, Sir Karl!

It’s a great question because its answer is usually the reason many others didn’t buy. For every person who almost didn’t buy for reason X, I guarantee at least three others did not buy!

Given the low response rates when surveying non-converting visitors, this question helps get additional intelligence. In our case, price came up again.

*Sometimes the customers’ reasons for buying will surprise you. One of our past clients is in the e-cigarette/vaping business and a common reason cited by men for vaping was “to quit smoking because of my young daughter.” They almost never said “child” or “son”! Armed with this knowledge, we converted a whole new segment of smokers who had not considered vaping.

Speed testimonials

One of the most frequently asked questions was “How soon can I expect relief?” While Protalus addressed this in their Q&A section, we included conspicuous “speed testimonials” on the main page:

[Screenshot]

For someone in excruciating pain, the promise of fast relief is persuasive!

Patent protection exclusivity & social proof

[Screenshot]

Many of Protalus’ site visitors are older and still prefer to buy in physical stores, as we learned from our survey. They may like the product, but then think “I’ll buy them at the store.” We clarified that the product is only available on Protalus’ site.

Mentioning the patent protection added exclusivity, one of the two required elements for a compelling value proposition.

At its core, landing page optimization isn’t about optimizing pages. A page just happens to be the medium used to optimize thought sequences in the prospect’s mind.

Dr. Flint likes to say, “The geography of the page determines the chronology of thought sequences in the prospect’s mind.” As shown above, we repeated the social proof elements at the point of purchase.

Tying it all together

After systematically addressing each objection and adding various appeal elements, we strung them all together into the cohesive long-form page below.

We start with a powerful headline and Elisabeth’s story because it’s both intriguing and relevant to Protalus’ audience, which skews female and over 55. The only goal of a headline is to get visitors to read what comes next — NOT to sell.

The product’s price is not mentioned until we have told a compelling story, educated visitors and engaged them emotionally.

Note that the winning page is several times longer than the control. There is a mistaken belief that you “just need to get to the point” because people won’t read long pages. In fact, a previous consultant told Protalus that their sales were low because the “buy button” wasn’t high enough on the page. 🙂

Nothing could be further from the truth. For a high-priced product, you must articulate a compelling value proposition before you sell!

But also note the page is “as long as necessary, but as short as possible.” Buy buttons are sprinkled liberally after the initial third of the page so that those who are convinced needn’t “sit through the entire presentation.”


Acknowledgement

We’d like to thank team Protalus for giving us wide latitude to conduct bold experiments and for allowing us to publish this. Their entrepreneurial culture has been refreshing. We are most grateful to Don Vasquez, their forward-thinking CMO (and minority owner), for trusting the process and standing by us when the first test caused some revenue loss.

Thanks to Hayk Saakian, Nick Jordan, Yin-so Chen, and Jon Powell for reading drafts of this piece.


Free CRO audit

I can’t stress this enough: CRO is hard work. We spent countless hours on market research, studied visitor behavior, and reviewed tens of thousands of customer comments before we ran a single A/B test. We also solicited additional testimonials from industry experts and doctors. There is no magical silver bullet — just lots of little lead ones!

Results like this don’t happen by accident. If you are unhappy with your current conversion rate for sales, leads or app downloads, first, we encourage you to review the tried-and-true strategic formula. Next, we would like to offer Moz readers a free CRO audit. We’ll also throw in a free SEO (Search Engine Optimization) review. While we specialize in CRO, we’ve partnered with one of the best SEO firms due to client demand. Lastly, we are hiring. Review the roles and reasons why you should come work for us!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/cro-case-study
via IFTTT


Understanding and Harnessing the Flow of Link Equity to Maximize SEO Ranking Opportunity – Whiteboard Friday

Posted by randfish

How does the flow of link equity work these days, and how can you harness its potential to help improve your rankings? Whether you’re in need of a refresher or you’ve always wanted a firmer grasp of the concept, this week’s Whiteboard Friday is required watching. Rand covers the basic principles of link equity, outlines common flow issues your site might be encountering, and provides a series of action items to ensure your site is riding the right currents.

https://fast.wistia.net/embed/iframe/oes88397ak?videoFoam=true


[Whiteboard image: Link equity flow]

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about understanding and harnessing link equity flow, primarily internal link equity flow, so that you can get better rankings and execute on your SEO. A big thank you to William Chou, @WChouWMX on Twitter, for suggesting this topic. If you have a topic or something that you would like to see on Whiteboard Friday, tweet at me. We’ll add it to the list.

Principles of link equity

So some principles of link equity first to be aware of before we dive into some examples.

1. External links generally give more ranking value and potential ranking boosts than internal links.

That is not to say, though, that internal links provide no link equity. In fact, many pages that earn few or no external links can still rank well if the domain itself is well linked to and the page has links from other good, important pages on the domain. But if a page is orphaned, or if a domain has no links at all, it is extremely difficult for that page to rank.

2. Well-linked-to pages, both internal and external, pass more link equity than those that are poorly linked to.

I think this makes intuitive sense to all of us who have understood the concept of PageRank over the years. Basically, if a page accrues many links, especially from other important pages, that page’s ability to pass its link equity to other pages, to give a boost in ranking ability is stronger than if a page is very poorly linked to or not linked to at all.

3. Pages with fewer links tend to pass more equity to their targets than pages with more links.

Again, going off the old concept of PageRank, if you have a page with hundreds or thousands of links on it, each of those receives a much more fractional, smaller amount of the link equity that could be passed to it than if you have a page with only a few links on it. This is not universally… well, I just want to say this doesn’t scale perfectly. So it’s not the case that if you were to trim down your high link earning pages to having only one link and point to this particular page on your site, then you suddenly get tremendously more benefit than if you had your normal navigation on that page and you link to your homepage and About page and products page. That’s not really the case. But if you had a page that had hundreds of links in a row and you instead made that page have only a few links to the most important, most valuable places, you’ll get more equity out of that, more rank boosting ability.
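A toy PageRank-style iteration makes principles 2 and 3 concrete. The four-page site and damping factor below are hypothetical; real search engines use far more elaborate models:

```python
# Toy PageRank-style equity flow over a hypothetical 4-page site.
# Principle 2: well-linked-to pages accumulate more equity to pass on.
# Principle 3: each outlink receives a 1/len(outlinks) share of it.
DAMPING, ITERATIONS = 0.85, 50

links = {                       # page -> pages it links out to
    "home":     ["about", "products", "blog"],
    "about":    ["home"],
    "products": ["home", "blog"],
    "blog":     ["home", "products"],
}

rank = {page: 1 / len(links) for page in links}
for _ in range(ITERATIONS):
    new = {page: (1 - DAMPING) / len(links) for page in links}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)  # equity split evenly
        for target in outlinks:
            new[target] += share
    rank = new
```

Running this, the homepage (linked to by every other page) ends up with the highest score, while pages with fewer inbound links settle lower, which is exactly the behavior the two principles describe.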

4. Hacks and tricks like “nofollow” are often ineffective at shaping the flow of link equity.

Using rel=”nofollow”, or embedding links in remotely executed JavaScript so that browsers and visitors can see the links but Google is unlikely to see or follow them, to shape the flow of your link equity is generally (a) a poor use of your time, because it doesn’t affect things that much; the old-school PageRank algorithm is not that hugely important anymore. And (b) Google is often pretty good at interpreting and discounting these things. So it tends to not be worth your time at all.

5. Redirects and canonicalization lose a small amount of link equity. Non-ideal ones like 302s, JS redirects, etc. may lose more than 301, rel=canonical, etc.

So if I have a 301 or a rel=canonical from one page to another, those will lose or cost you a small, a very small amount of link equity. But more potentially costly would be using non-ideal types of redirects or canonicalization methods, like a JavaScript-based redirect or a 302 or a 307 instead of a 301. If you’re going to do a redirect or if you’re going to do canonicalization, 301s or rel=canonicals are the way to go.
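To make the preferred redirect concrete, here is a minimal WSGI sketch of a 301 with a Location header; the paths are made up, and in practice you would usually configure redirects in the web server itself rather than in application code:

```python
# Minimal WSGI app issuing a 301 permanent redirect (illustrative paths).
REDIRECTS = {"/old-page": "/new-page"}

def app(environ, start_response):
    target = REDIRECTS.get(environ["PATH_INFO"])
    if target:
        # 301 tells crawlers the move is permanent, so link equity
        # consolidates on the new URL (minus the small loss noted above).
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]

# Exercise the app without running a real server:
captured = {}
def fake_start(status, headers):
    captured["status"], captured["headers"] = status, dict(headers)

body = app({"PATH_INFO": "/old-page"}, fake_start)
```

Swapping the status line to "302 Found" is the one-character mistake this principle warns about: the response still redirects users, but it signals a temporary move.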

So keeping in mind these principles, let’s talk through three of the most common link equity flow issues that we see websites facing.

Common link equity flow issues

A. A few pages on a large site get all the external links:

You have a relatively large site, let’s say thousands to tens of thousands, maybe even hundreds of thousands of pages, and only a few of those pages are earning any substantial quantity of external links. I have highlighted those in pink. So these pages are pointing to these pink ones. But on this website you have other pages, pages like these purple ones, where you essentially are wanting to earn link equity, because you know that you need to rank for these terms and pages that these purple ones are targeting, but they’re not getting the external links that these pink pages are. In these cases, it’s important to try a few things.

  1. We want to identify the most important non-link earning pages, these purple ones. We’ve got to figure out what these actually are. What are the pages that you wish would rank that are not yet ranking for their terms and phrases that they’re targeting?
  2. We want to optimize our internal links from these pink pages to these purple ones. So in an ideal world, we would say, “Aha, these pages are very strong. They’ve earned a lot of link equity.” You could use Open Site Explorer and look at Top Pages, or Ahrefs or any of our other competitors and look at your pages, the ones that have earned the most links and the most link equity. Then you could say, “Hey, can I find some relevance between these two or some user stories where someone who reaches this page needs something over here, and thus I’m going to create a link to and from there?” That’s a great way to pass equity.
  3. Retrofitting and republishing. So what I mean by this is essentially I’m going to take these pages, these purple ones that I want to be earning links, that are not doing well yet, and consider reworking their content, taking the lessons that I have learned from the pink pages, the ones that have earned link equity, that have earned external links and saying, “What did these guys do right that we haven’t done right on these guys, and what could we do to fix that situation?” Then I’m going to republish and restart a marketing, a link building campaign to try and get those links.

B. Only the homepage of a smaller site gets any external links.

This time we’re dealing with a small site, a very, very small site, 5 pages, 10 pages, maybe even up to 50 pages, but generally a very small site. Often a lot of small businesses, a lot of local businesses have this type of presence, and only the homepage gets any link equity at all. So what do we do in those cases? There’s not a whole lot to spread around. The homepage can only link to so many places. We have to serve users first. If we don’t, we’re definitely going to fall in the search engine rankings.

So in this case, where the pink link earner is the homepage, there are two things we can do:

  1. Make sure that the homepage is targeting and serves the most critical keyword targets. So we have some keyword targets that we know we want to go after. If there’s one phrase in particular that’s very important, rather than having the homepage target our brand, we could consider having the homepage target that specific query. Many times small businesses and small websites will make this mistake where they say, “Oh, our most important keyword, we’ll make that this page. We’ll try and rank it. We’ll link to it from the homepage.” That is generally not nearly as effective as making a homepage target that searcher intent. If it can fit with the user journey as well, that’s one of the best ways you can go.
  2. Consider some new pages for content, like essentially saying, “Hey, I recognize that these other pages, maybe they’re About and my Terms of Service and some of my products and services and whatnot, and they’re just not that link-worthy. They don’t deserve links. They’re not the type of pages that would naturally earn links.” So we might need to consider what are two or three types of pages or pages that we could produce, pieces of content that could earn those links, and think about it this way. You know who the people who are already linking to you are. It’s these folks. I have just made up some domains here. But the folks who are already linking to your homepage, those are likely to be the kinds of people who will link to your internal pages as well. So I would think about them as link targets and say, “What would I be pretty confident that they would link to, if only they knew that it existed on our website?” That’s going to give you a lot of success. Then I would check out some of our link building sections here on Whiteboard Friday and across the Moz Blog for more tips.

C. Mid-long tail KW-targeting pages are hidden or minimized by the site’s nav/IA.

So this is essentially where I have a large site, and I have pages that are targeting keywords that don’t get a ton of volume, but they’re still important. They could really boost the value that we get from our website, because they’re hyper-targeted to good customers for us. In this case, one of the challenges is they’re hidden by your information architecture. So your top-level navigation and maybe even your secondary-level navigation just doesn’t link to them. So they’re just buried deep down in the website, under a whole bunch of other stuff. In these cases, there are some really good solutions.

  1. Find semantic and user intent relationships. So semantic is these words appeared on those pages. Let’s say one of these pages here is targeting the word “toothpaste,” for example, and I find that, oh, you know what, this page over here, which is well linked to in our navigation, mentions the word “toothpaste,” but it doesn’t link over here yet. I’m going to go create those links. That’s a semantic relationship. A user intent relationship would be, hey, this page over here talks about oral health. Well, oral health and toothpaste are actually pretty relevant. Let me make sure that I can create that user journey, because I know that people who’ve read about oral health on our website probably also later want to read about toothpaste, at least some of them. So let’s make that relationship also happen between those two pages. That would be a user intent type of relationship. You’re going find those between your highly linked to external pages and your well-linked-to internal pages and these long tail pages that you’re trying to target. Then you’re going to create those new links.
  2. Try and leverage the top-level category pages that you already have. If you have a top-level navigation and it links to whatever it is — home, products, services, About Us, Contact, the usual types of things — it’s those pages that are extremely well linked to already internally where you can add in content links to those long-tail pages and potentially benefit.
  3. Consider new top-level or second-level pages. If you’re having trouble adding them to these pages, they already have too many links, there’s no user story that makes good sense here, it’s too weird to jam them in, maybe engineering or your web dev team thinks that that’s ridiculous to try and jam those in there, consider creating new top-level pages. So essentially saying, “Hey, I want to add a page to our top-level navigation that is called whatever it is, Additional Resources or Resources for the Curious or whatever.” In this case in my oral health and dentistry example, potentially I want an oral health page that is linked to from the top-level navigation. Then you get to use that new top-level page to link down and flow the link equity to all these different pages that you care about and currently are getting buried in your navigation system.

All right, everyone. Hope you’ve enjoyed this edition of Whiteboard Friday. Give us your tips in the comments for how you’ve seen link equity flow, the benefits or drawbacks that you’ve seen to try and controlling and optimizing that flow. We’ll see again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


from Moz Blog https://moz.com/blog/harnessing-link-equity


Pros and Cons of HTTPS Services: Traditional vs Let’s Encrypt vs Cloudflare

Posted by jrridley

If you have a website property verified in Google Search Console, and the website is not HTTPS-secured, you’ve likely seen some form of the following message in your dashboard recently:

After months of talk and speculation, Google has finally started to move forward with its plan to secure the web by enforcing HTTPS. Although HTTPS had previously only been a concern for e-commerce sites or sites with login functionality, this latest update affects significantly more sites. The vast majority of websites have a contact page (or something similar) that contains a contact or subscription form. Those forms almost always contain text input fields like the ones Google warns about in the message above. The “NOT SECURE” warning has already been appearing on insecure sites that collect payment information or passwords. It looks like this in a user’s URL bar:

Now that this warning will be displaying for a much larger percentage of the web, webmasters can’t put off an HTTPS implementation any longer. Unfortunately, Google’s advice to webmasters for solving this problem is about as vague and unhelpful as you might imagine:

Thanks, Google.

Implementing HTTPS is not a simple process. The Washington Post published a blog post outlining their 10-month HTTPS migration back in 2015, and numerous sites (including Moz) have reported experiencing major traffic fluctuations following their migrations. The time and resources required to migrate to HTTPS are no minor investment; we’re talking about a substantial website overhaul. In spite of these obstacles, Google has shown little sympathy for the plight of webmasters:


Google’s singular focus in this area is to provide a better user experience to web visitors by improving Internet security. On its surface, there’s nothing wrong with this movement. However, Google’s blatant disregard for the complexities this creates for webmasters leaves a less-than-pleasant taste in my mouth, despite their good intentions.

Luckily, there’s a bit of a silver lining to these HTTPS concerns. Over the last few years, we’ve worked with a number of different clients to implement HTTPS on their sites using a variety of different methods. Each experience was unique and presented its own set of challenges and obstacles. In a previous post, I wrote about the steps to take before, during, and after a migration based on our experience. In this post, my focus is instead on highlighting the pros and cons of various HTTPS services, including non-traditional implementations.

Here are the three methods we’ve worked with for our clients:

  1. Traditional HTTPS implementation
  2. Let’s Encrypt
  3. Cloudflare

Method 1: Traditional HTTPS implementation

A traditional HTTPS implementation starts with purchasing an SSL certificate from a trusted provider, like Digicert or GeoTrust (hint: if a site selling SSL certificates is not HTTPS-secured, don’t buy from them!). After that, you’ll need to verify the certificate with the Certificate Authority you purchased it from through a Certificate Signing Request (CSR); this just proves that you do manage the site you claim to be managing. At this point, your SSL certificate will be validated, but you’ll still have to implement it across your site. Namecheap has a great article about installing SSL certificates depending on your server type. Once that SSL certificate has been installed, your site will be secured, and you can take additional steps to enable HSTS or forced HTTPS rewrites at this point.
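To make the CSR step concrete, here’s a minimal sketch using OpenSSL. The domain (example.com) and every field in the -subj string are placeholders to replace with your own details; check your CA’s documentation for any extra requirements:

```shell
# Generate a 2048-bit private key for the (hypothetical) domain example.com
openssl genrsa -out example.com.key 2048

# Create the CSR your Certificate Authority will ask for;
# every -subj field here is a placeholder
openssl req -new -key example.com.key -out example.com.csr \
  -subj "/C=US/ST=Oregon/L=Portland/O=Example Inc/CN=example.com"

# Sanity-check the CSR before submitting it to the CA
openssl req -in example.com.csr -noout -subject
```

Your CA takes the .csr file and returns the signed certificate; the .key file stays on your server and should never be shared.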

Pros

  1. Complete security. With a fully validated SSL certificate installed on your root server, there is no possibility of having a compromised connection between your server and site, or between your site and the site visitor.
  2. Customizable. One of the features of a full SSL implementation is that you can purchase an Extended Validation (EV) SSL certificate. This not only provides your green padlock in the browser bar, but also includes your company name to provide further assurance to visitors that your site is safe and secure.
  3. Easier to implement across multiple subdomains. If you have multiple subdomains, what you’ll likely need for your HTTPS implementation is either a separate SSL certificate for each subdomain or a wildcard certificate for all variations of your domain. A traditional SSL service is often the easiest way to set up a wildcard certificate if you need to secure several variations.

Cons

  1. Expensive. Though basic SSL certificates may be available for as little as $150, depending on the complexity of your site, these costs can quickly increase to several thousand dollars if you need more advanced security features, a better CDN network, etc. This also doesn’t include the cost of having developers implement the SSL certificate, which can be extensive as well.
  2. Time to implement. As mentioned above, it took the Washington Post 10 months to complete their HTTPS migration. Other companies have reported similar timeframes, especially for larger, more complex websites. It’s very hard to know in advance what kinds of issues you’ll have to resolve with your site configuration, what kinds of mixed content you may run into, etc., so plan lots of extra time to address these issues if you go with a standard implementation.

Method 2: Let’s Encrypt

Let’s Encrypt is a free nonprofit service provided by the Internet Security Research Group to promote web security by providing free SSL certificates. Implementing Let’s Encrypt is very similar to a traditional HTTPS implementation: You still need to validate the Certificate Authority, install the SSL certificate on your server, then enable HSTS or Forced HTTPS rewrites. However, implementing Let’s Encrypt is often much simpler through the help of services like Certbot, which will provide the implementation code needed for your particular software and server configuration.
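Once the certificate is installed, the HSTS and forced-rewrite step might look like the following nginx sketch. This is a hypothetical fragment, not a drop-in config: example.com is a placeholder, and the certificate paths assume Certbot’s default layout.

```nginx
# Redirect all plain-HTTP traffic to HTTPS (hypothetical domain example.com)
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;

    # Certificate paths as laid out by Certbot by default (placeholders)
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # HSTS: instruct browsers to use HTTPS for roughly the next 6 months
    add_header Strict-Transport-Security "max-age=15768000" always;
}
```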

Pros

  1. Free. The cost is zero, zippo, nada. No fine print or hidden details.
  2. Ease of implementation. Let’s Encrypt SSL is often much simpler to implement on your site than a traditional HTTPS implementation. Although not quite as simple as Cloudflare (see below), this ease of implementation can solve a lot of technical hurdles for people looking to install an SSL certificate.
  3. Complete security. Like with a traditional HTTPS implementation, the entire connection between site visitor and site server is secure, leaving no possibility of a compromised connection.

Cons

  1. Compatibility issues. Let’s Encrypt is known to be incompatible with a few different platforms, though the ones it is incompatible with are not likely to be a major source of traffic to your site (Blackberry, Nintendo 3DS, etc.).
  2. 90-day certificates. While traditional SSL certificates are often valid for a year or more, Let’s Encrypt certificates are only valid for 90 days, and the project recommends renewing every 60 days. Forgetting to renew on schedule will leave your site serving an expired certificate, with browsers showing security warnings to every visitor.
  3. Limited customization. Let’s Encrypt will only offer Domain Validation certificates, meaning that you can’t purchase a certificate to get that EV green bar SSL certificate. Also, Let’s Encrypt does not currently offer wildcard certificates to secure all of your subdomains, though they’ve announced this will be rolling out in January 2018.
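In practice, the 90-day renewal cycle above is automated rather than remembered. A hypothetical crontab entry (the path and times are placeholders; `certbot renew` only replaces certificates that are close to expiry, so running it frequently is safe):

```
# Attempt renewal twice a day; certbot only acts when a certificate
# is within its renewal window, so this is safe to run often.
17 2,14 * * * /usr/bin/certbot renew --quiet
```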

Method 3: Cloudflare

This is one of my favorite HTTPS implementations, simply because of how easy it is to enable. Cloudflare offers a Flexible SSL service, which removes almost all of the hassle of implementing an SSL certificate directly on your site. Instead, Cloudflare will host a cached version of your site on their servers and secure the connection to the site visitors through their own SSL protection. You can see what this looks like in the picture below:

In doing so, Cloudflare makes this process about as simple as you can ask for. All you have to do is update your DNS records to point to Cloudflare’s nameservers. Boom, done. And as with Let’s Encrypt, the process is entirely free.

Pros

  1. Free. The cost is zero, zippo, nada. No fine print or hidden details. Cloudflare does offer more advanced features if you upgrade to one of their paid plans, but the base SSL service comes completely free.
  2. Easiest implementation. As I mentioned above, all that’s required for implementing Cloudflare’s SSL service is creating an account and updating your DNS records. There’s no update to the server configuration and no time spent trying to resolve additional configuration issues. Additionally, implementing HSTS and forced HTTPS rewrites can be done directly through the Cloudflare dashboard, so there’s really almost no work involved on your end.
  3. PageSpeed optimizations. In addition to SSL security, Cloudflare’s HTTPS implementation also provides several additional services that can preserve PageSpeed scores and page load times. While a traditional HTTPS implementation (or Let’s Encrypt) can often have negative consequences for your site’s page load times, Cloudflare offers the ability to auto-minify JS, CSS, and HTML; Accelerated Mobile Pages (AMP); and a Rocket loader for faster JS load times. All of these features (along with Cloudflare serving a cached version of your site to visitors) will help prevent any increase in page load times on your site.

Cons

  1. Incomplete encryption. As you can see in the picture above, Cloudflare encrypts the connection between the visitor and the cached version of your site on Cloudflare, but it doesn’t encrypt the connection between your site and your server. While this means that site visitors can feel secure while visiting your site, there is still the chance that your server connection will be compromised. While you can upgrade to a full SSL implementation that does enable this setup, that is not part of the free service.
  2. Security concerns. Cloudflare was infamously hacked earlier this year, exposing lots of sensitive user information. While it appears they have resolved and tightened security since then, it’s still important to be aware of this development.
  3. Lack of customization. Like with Let’s Encrypt, Cloudflare’s free SSL service doesn’t provide any kind of EV green bar SSL for your site. While you can upgrade to full SSL which does provide this functionality, the service is no longer free at that point.

Which type of HTTPS implementation is best?

It really depends on your site. Smaller sites that just need enough security that Google won’t punish them in Chrome can likely use Cloudflare. The same goes for agencies providing HTTPS recommendations to clients when you don’t have development control of the site. On the other hand, major e-commerce or publication sites are going to want a fully customized HTTPS implementation through traditional means (or via Let’s Encrypt’s wildcard certificate, once that arrives next year). Ultimately, you’ll have to decide which implementation makes the most sense for your situation.


from Moz Blog https://moz.com/blog/traditional-vs-lets-encrypt-vs-cloudflare


Announcing 5 NEW Feature Upgrades to Moz Pro’s Site Crawl, Including Pixel-Length Title Data

Posted by Dr-Pete

While Moz is hard at work on some major new product features (we’re hoping for two more big launches in 2017), we’re also working hard to iterate on recent advances. I’m happy to announce that, based on your thoughtful feedback, and our own ever-growing wish lists, we’ve recently launched five upgrades to Site Crawl.

1. Mark Issues as Fixed

It’s fine to ignore issues that don’t matter to your site or business, but many of you asked for a way to audit fixes or just let us know that you’ve made a fix prior to our next data update. So, from any issues page, you can now select items and “Mark as fixed” (screens below edited for content).

Fixed items will immediately be highlighted and, like Ignored issues, can be easily restored…

Unlike the “Ignore” feature, we’ll also monitor these issues for you and warn you if they reappear. In a perfect world, you’d fix an issue once and be done, but we all know that real web development just doesn’t work out that way.

2. View/Ignore/Fix More Issues

When we launched the “Ignore” feature, many of you were very happy (it was, frankly, long overdue), until you realized you could only ignore issues in chunks of 25 at a time. We have heard you loud and clear (seriously, Carl, stop calling) and have taken two steps. First, you can now view, ignore, and fix issues 100 at a time. This is the default – no action or extra clicks required.

3. Ignore Issues by Type

Second, you can now ignore entire issue types. Let’s say, for example, that Moz.com intentionally has 33,000 Meta Noindex tags. We really don’t need to be reminded of that every week. So, once we make sure none of those are unintentional, we can go to the top of the issue page and click “Ignore Issue Type”:

Look for this in the upper-right of any individual issue page. Just like individual issues, you can easily track all of your ignored issues and start paying attention to them again at any time. We just want to help you clear out the noise so that you can focus on what really matters to you.

4. Pixel-length Title Data

For years now, we’ve known that Google cut display titles by pixel length. We’ve provided research on this subject and have built our popular title tag checker around pixel length, but providing this data at product scale proved to be challenging. I’m happy to say that we’ve finally overcome those challenges, and “Pixel Length” has replaced Character Length in our title tag diagnostics.

Google currently uses a 600-pixel container, but you may notice that you receive warnings below that length. Because Google needs room to append the “…” (among other considerations), our research has shown that the true cut-off point Google uses is closer to 570 pixels. Site Crawl reflects our latest research on the subject.

As with other issues, you can export the full data to CSV, to sort and filter as desired:

Looks like we’ve got some work to do when it comes to brevity. Long title tags aren’t always a bad thing, but this data will help you much better understand how and when Google may be cutting off your display titles in SERPs and decide whether you want to address it in specific cases.

5. Full Issue List Export

When we rebuilt Site Crawl, we were thrilled to provide data and exports on all pages crawled. Unfortunately, we took away the export of all issues (choosing to divide those up into major issue types). Some of you had clearly come to rely on the all issues export, and so we’ve re-added that functionality. You can find it next to “All Issues” on the main “Site Crawl Overview” page:

We hope you’ll try out all of the new features and report back as we continue to improve on our Site Crawl engine and UI over the coming year. We’d love to hear what’s working for you and what kind of results you’re seeing as you fix your most pressing technical SEO issues.

Find and fix site issues now


from Moz Blog https://moz.com/blog/site-crawl-upgrades


The Beginner’s Guide to Structured Data for SEO: How to Implement Structured Data

Posted by bridget.randolph

Part 2: How to implement structured data for SEO

Welcome to Part 2 of The Beginner’s Guide to Structured Data: How to Implement Structured Data for SEO. In Part 1, we focused on gaining a high-level understanding of what structured data is and how it can be used to support SEO efforts.

(If you missed Part 1, you can go check it out here).

In Part 2, we’ll be looking at the steps to identify opportunities and implement structured data for SEO on your website. Since this is an introductory guide, I’ll be focusing on the most basic types of markup you can add and the most common use cases, and providing resources with additional detail for the more technical aspects of implementation.

Is structured data right for you?

Generally speaking, implementing structured data for SEO is worthwhile for most people. However, it does require a certain level of effort and resources, and you may be asking yourself whether it’s worth prioritizing.

Here are some signs that it’s a good time to prioritize structured data for SEO:

  • Search is a key value-driving channel for your business
  • You’ve recently audited your site for basic optimization issues and you know that you’ve achieved a competitive baseline with your keyword targeting, backlinks profile, site structure, and technical setup
  • You’re in a competitive vertical and need your results to stand out in the SERPs
  • You want to use AMP (Accelerated Mobile Pages) as a way to show up in featured areas of the SERP, including carousels
  • You have a lot of article-style content related to key head terms (e.g. 10 chicken recipes) and you’d like a way to display multiple results for those terms in the SERP
  • You’re ranking fairly well (position 15 or higher) already for terms with significant search volume (5000–50,000 searches/month)*
  • You have solid development resources with availability on staff and can implement with minimal time and financial investment
  • You’re in any of the following verticals: e-commerce, publishing, educational products, events/ticketing, creative production, TV/movie/book reviews, job listings, local business

*What is considered significant volume may vary according to how niche your market is.

If you said yes to any of these statements, then implementing structured data is particularly relevant to you! And if these criteria don’t currently apply to you, of course you can still go ahead and implement; you might have great results. The above are just a few of the most common indicators that it’s a worthwhile investment.

Implementing structured data on your site

In this guide, we will be looking solely at opportunities to implement Schema.org markup, as this is the most extensive vocabulary for our purposes. Also, because it was developed by the search engine companies themselves, it aligns with what they support now and should continue to be the most supported framework going forward.

How is Schema.org data structured?

The Schema.org vocabulary is organized into different “Types” (Recipe, Product, Article, Person, Organization, etc.) that represent entities, kinds of data, and/or content types.

Each Type has its own set of “properties” that you can use to identify the attributes of that item. For example, a “Recipe” Type includes properties like “image,” “cookTime,” “nutritionInformation,” etc. When you mark up a recipe on your site with these properties, Google is able to present those details visually in the SERP, like this:


In order to mark up your content with Schema.org vocabulary, you’ll need to define the specific properties for the Type you’re indicating.

For example:

If you’re marking up a recipe page, you need to include the title and at least two other attributes. These could be properties like:

  • aggregateRating: The averaged star rating of the recipe by your users
  • author: The person who created the recipe
  • prepTime: The length of time required to prepare the dish for cooking
  • cookTime: The length of time required to cook the dish
  • datePublished: Date of the article’s publication
  • image: An image of the dish
  • nutritionInformation: Number of calories in the dish
  • review: A review of the dish
  • …and more.

Each Type has different “required” properties in order to work correctly, as well as additional properties you can include if relevant. (You can view a full list of the Recipe properties at Schema.org/Recipe, or check out Google’s overview of Recipe markup.)

Once you know what Types, properties and data need to be included in your markup, you can generate the code.

The code: Microdata vs JSON-LD

There are two common approaches to adding Schema.org markup to your pages: Microdata (in-line annotations added directly to the relevant HTML) and JSON-LD (which uses a Javascript script tag to insert the markup into the head of the page).

JSON-LD is Google’s recommended approach, and in general is a cleaner, simpler implementation… but it is worth noting that Bing does not yet officially support JSON-LD. Also, if you have a WordPress site, you may be able to use a plugin (although be aware that not all WordPress plugins work the way they’re supposed to, so it’s especially important to choose one with good reviews, and to test thoroughly after implementation).

Whatever option you choose to use, always test your implementation to make sure Google is seeing it show up correctly.

What does this code look like?

Let’s look at an example of marking up a very simple news article (Schema.org/NewsArticle).


Here’s the article content (excluding body copy), with my notes about what each element is:

[posted by publisher ‘Google’]
[headline]Article Headline
[author byline]By John Doe
[date published] Feb 5, 2015
[description] A most wonderful article
[image]
[company logo]

And here’s the basic HTML version of that article:

<div>
  <h2>Article headline</h2>
  <h3>By John Doe</h3>
  <img src="https://google.com/thumbnail1.jpg"/>
  <p>A most wonderful article</p>
</div>


If you use Microdata, you’ll nest your content inside the relevant meta tags for each piece of data. For this article example, your Microdata code might look like this (within the <body> of the page):

<div itemscope itemtype="https://schema.org/NewsArticle">
  <meta itemscope itemprop="mainEntityOfPage" itemtype="https://schema.org/WebPage" itemid="https://google.com/article"/>
  <h2 itemprop="headline">Article headline</h2>
  <h3>By <span itemprop="author" itemscope itemtype="https://schema.org/Person">
    <span itemprop="name">John Doe</span>
  </span></h3>
  <span itemprop="description">A most wonderful article</span>
  <div itemprop="image" itemscope itemtype="https://schema.org/ImageObject">
    <img src="https://google.com/thumbnail1.jpg"/>
    <meta itemprop="url" content="https://google.com/thumbnail1.jpg">
    <meta itemprop="width" content="800">
    <meta itemprop="height" content="800">
  </div>
  <div itemprop="publisher" itemscope itemtype="https://schema.org/Organization">
    <div itemprop="logo" itemscope itemtype="https://schema.org/ImageObject">
      <img src="https://google.com/logo.jpg"/>
      <meta itemprop="url" content="https://google.com/logo.jpg">
      <meta itemprop="width" content="600">
      <meta itemprop="height" content="60">
    </div>
    <meta itemprop="name" content="Google">
  </div>
  <meta itemprop="datePublished" content="2015-02-05T08:00:00+08:00"/>
  <meta itemprop="dateModified" content="2015-02-05T09:20:00+08:00"/>
</div>

The JSON-LD version would usually be added to the <head> of the page, rather than integrated with the <body> content (although adding it in the <body> is still valid).

JSON-LD code for this same article would look like this:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "NewsArticle",
  "mainEntityOfPage": {
    "@type": "WebPage",
    "@id": "https://google.com/article"
  },
  "headline": "Article headline",
  "image": {
    "@type": "ImageObject",
    "url": "https://google.com/thumbnail1.jpg",
    "height": 800,
    "width": 800
  },
  "datePublished": "2015-02-05T08:00:00+08:00",
  "dateModified": "2015-02-05T09:20:00+08:00",
  "author": {
    "@type": "Person",
    "name": "John Doe"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Google",
    "logo": {
      "@type": "ImageObject",
      "url": "https://google.com/logo.jpg",
      "width": 600,
      "height": 60
    }
  },
  "description": "A most wonderful article"
}
</script>

This is the general style for Microdata and JSON-LD code (for Schema.org/Article). The Schema.org website has a full list of every supported Type and its Properties, and Google has created “feature guides” with example code for the most common structured data use cases, which you can use as a reference for your own code.

How to identify structured data opportunities (and issues)

If structured data has previously been added to your site (or if you’re not sure whether it has), the first place to check is the Structured Data Report in Google Search Console.

This report will tell you not only how many pages have been identified as containing structured data (and how many of these have errors), but may also be able to identify where and/or why the error is occurring. You can also use the Structured Data Testing Tool for debugging any flagged errors: as you edit the code in the tool interface, it will flag any errors or warnings.

If you don’t have structured data implemented yet, or want to overhaul your setup from scratch, the best way to identify opportunities is with a quick content audit of your site, based on the kind of business you have.

A note on keeping it simple

There are lots of options when it comes to Schema.org markup, and it can be tempting to go crazy marking up everything you possibly can. But best practice is to keep focused and generally use a single top-level Type on a given page. In other words, you might include review data on your product page, but the primary Type you’d be using is Schema.org/Product. The goal is to tell search engines what this page is about.

Structured data must be representative of the main content of the page, and marked up content should not be hidden from the user. Google will penalize sites which they believe are using structured data markup in scammy ways.

There are some other general guidelines from Google, including:

  • Add your markup to the page it describes (so Product markup would be added to the individual product page, not the homepage)
  • For duplicated pages with a canonical version, add the same markup to all versions of the page (not just the canonical)
  • Don’t block your marked-up pages from search engines
  • Be as specific as possible when choosing a Type to add to a page
  • Multiple entities on the same page must each be marked up individually (so for a list of products, each product should have its own Product markup added)
  • As a rule, you should only be adding markup for content which is being shown on the page you add it to

So how do you know which Schema.org Types are relevant for your site? That depends on the type of business and website you run.

Schema.org for websites in general

There are certain types of Schema.org markup which almost any business can benefit from, and there are also more specific use cases for certain types of business.

General opportunities to be aware of are:

  • Sitelinks Search Box: if you have search functionality on your site, you can add markup which enables a search box to appear in your sitelinks:


  • VideoObject: if you have video content on your site, this markup can enable video snippets in SERPs, with info about uploader, duration, a thumbnail image, and more:
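As a concrete sketch, the Sitelinks Search Box markup is a WebSite Type with a SearchAction. The domain and search-URL pattern below are placeholder assumptions; swap in your own site’s search URL:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```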

A note about Star reviews in the SERP

You’ll often see recommendations about “marking up your reviews” to get star ratings in the SERP results. “Reviews” have their own type, Schema.org/Review, with properties that you’ll need to include; but they can also be embedded into other types using that type’s “review” property.

You can see an example of this above, in the Recipes image, where some of the recipes in the SERP display a star rating. This is because they have included the aggregate user rating for that recipe in the “review” property within the Schema.org/Recipe type.

You’ll see a similar implementation for other properties which have their own type, such as Schema.org/Duration, Schema.org/Date, and Schema.org/Person. It can feel really complicated, but it’s actually just about organizing your information in terms of category > subcategory > discrete object.

If this feels a little confusing, it might help to think about it in terms of how we define a physical thing, like an ingredient in a recipe. Chicken broth is a dish that you can make, and each food item that goes into making the chicken broth would be classified as an ingredient. But you could also have a recipe that calls for chicken broth as an ingredient. So depending on whether you’re writing out a recipe for chicken broth, or a recipe that includes chicken broth, you’ll classify it differently.

In the same way, attributes like “Review,” “Date,” and “Duration” can be their own thing (Type), or a property of another Type. This is just something to be aware of when you start implementing this kind of markup. So when it comes to “markup for reviews,” unless the page itself is primarily a review of something, you’ll usually want to implement Review markup as a property of the primary Type for the page.
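To make the “Review as a property” idea concrete, here’s a hedged JSON-LD sketch of a Recipe with an embedded aggregate rating (the recipe name, author, times, and numbers are all invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Recipe",
  "name": "Simple Chicken Broth",
  "image": "https://www.example.com/photos/chicken-broth.jpg",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT2H",
  "recipeIngredient": ["1 whole chicken", "2 carrots", "8 cups water"],
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "214"
  }
}
</script>
```

Here AggregateRating is its own Type, but it appears as a property of the page’s primary Recipe Type, which is exactly the category > subcategory > discrete object nesting described above.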


In addition to this generally applicable markup, there are certain Schema.org Types which are particularly helpful for specific kinds of businesses:

  • E-commerce
    • including online course providers
  • Recipes Sites
  • Publishers
  • Events/Ticketing Sites
    • including educational institutions which offer courses
  • Local Businesses
  • Specific Industries (small business and larger organizations)
  • Creative Producers

Schema.org for e-commerce

If you have an e-commerce site, you’ll want to check out:

  • Product: this allows you to display product information, such as price, in the search result. You can use this markup on an individual product page, or an aggregator page which shows information about different sellers offering an individual product.
  • Offer: this can be combined with Schema.org/Product to show a special offer on your product (and encourage higher CTRs).
  • Review: if your site has product reviews, you can aggregate the star ratings for each individual product and display it in the SERP for that product page, using Schema.org/aggregateRating.
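Putting those three together, a product page’s markup might look like this hypothetical sketch (product name, SKU, price, and rating figures are invented):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Trail Shoe",
  "image": "https://www.example.com/photos/trail-shoe.jpg",
  "sku": "TS-001",
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/products/trail-shoe",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "http://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```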

Things to watch out for…

  • Product markup is designed for individual products, not lists of products. If you have a category page and want to mark it up, you’ll need to mark up each individual product on the page with its own data.
  • Review markup is designed for reviews of specific items, goods, services, and organizations. You can mark up your site with reviews of your business, but you should do this on the homepage as part of your organization markup.
  • If you are marking up reviews, they must be generated by your site, rather than via a third-party source.
  • Course markup should not be used for how-to content, or for general lectures which do not include a curriculum, specific outcomes, or a set student list.

Schema.org for recipes sites

For sites that publish a lot of recipe content, Recipe markup is a fantastic way to add additional context to your recipe pages and get a lot of visual impact in the SERPs.

Things to watch out for…

If you’re implementing Recipe Rich Cards, you’ll want to be aware of some extra guidelines:

Schema.org for publishers

If you have a publisher site, you’ll want to check out the following:

  • Article and its subtypes,
    • NewsArticle: this indicates that the content is a news article
    • BlogPosting: similar to Article and NewsArticle, but specifies that the content is a blog post
  • Fact Check: If your site reviews or discusses “claims made by others,” as Google diplomatically puts it, you can add a “fact check” to your snippet using Schema.org/ClaimReview markup.


  • CriticReview: if your site offers critic-written reviews of local businesses (such as a restaurant critic’s review), books, and/or movies, you can mark these up with Schema.org/CriticReview.
    • Note that this is a feature being tested, and is a knowledge box feature rather than a rich snippet enhancement of your own search result.


Things to watch out for…

Schema.org for events/ticketing sites

If your business hosts or lists events, and/or sells tickets, you can use:

  • Events: you can mark up your events pages with Schema.org/Event and get your event details listed in the SERP, both in a regular search result and as instant answers at the top of the SERP:

  • Courses: If your event is a course (i.e., instructor-led with a student roster), you can also use Schema.org/Course markup.
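A hypothetical Event sketch (every detail below is invented) showing the start/end dates, location, and ticket offer that power those SERP listings:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Example SEO Workshop",
  "startDate": "2017-11-03T09:00",
  "endDate": "2017-11-03T17:00",
  "location": {
    "@type": "Place",
    "name": "Example Conference Center",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Portland",
      "addressRegion": "OR",
      "postalCode": "97201"
    }
  },
  "offers": {
    "@type": "Offer",
    "price": "30",
    "priceCurrency": "USD",
    "url": "https://www.example.com/workshop-tickets"
  }
}
</script>
```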

Things to watch out for…

  • Don’t use Events markup to mark up time-bound non-events like travel packages or business hours.
  • As with products and recipes, don’t mark up multiple events listed on a page with a single usage of Event markup.
    • For a single event running over several days, you should mark this up as an individual event and make sure you indicate start and end dates;
    • For an event series, with multiple connected events running over time, mark up each individual event separately.
  • Course markup should not be used for how-to content, or for general events/lectures which do not include a curriculum, specific outcomes, and an enrolled student list.
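Putting the start-and-end-date guideline above into practice, here is a minimal Event object sketch (all details hypothetical) serialized to the JSON-LD you would embed on the event page:

```python
import json

# Minimal Event markup sketch for a single multi-day event, with explicit
# startDate and endDate as the guidelines require. All values are hypothetical.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Whiteboard Marketing Workshop",
    "startDate": "2017-10-02T09:00",
    "endDate": "2017-10-04T17:00",
    "location": {
        "@type": "Place",
        "name": "Example Conference Center",
        "address": "123 Main St, Seattle, WA",
    },
}
print(json.dumps(event, indent=2))
```

For an event series, you would emit one such object per individual event rather than a single object spanning the series.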

Schema.org for job sites

If your site offers job listings, you can use Schema.org/JobPosting markup to appear in Google’s new Jobs listing feature:

Note that, like Google Flights, this is a Google aggregator feature rather than a rich snippet enhancement of your own result.

Things to watch out for…

  • Mark up each job post individually, and do not mark up a jobs listings page.
  • Include your job posts in your sitemap, and update your sitemap at least once daily.
  • You can include Review markup if you have review data about the employer advertising the job.
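Following those guidelines, here is a minimal JobPosting sketch (all values hypothetical) of the kind you would embed on an individual job post page:

```python
import json

# Minimal JobPosting markup sketch for one job post page.
# Every value below is a hypothetical example.
job = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "SEO Specialist",
    "datePosted": "2017-09-11",
    "validThrough": "2017-12-11",
    "employmentType": "FULL_TIME",
    "hiringOrganization": {"@type": "Organization", "name": "Example Co"},
    "jobLocation": {"@type": "Place", "address": "Dublin, Ireland"},
}
print(json.dumps(job, indent=2))
```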

Schema.org for local businesses

If you have a local business or a store with a brick-and-mortar location (or locations), you can use structured data markup on your homepage and contact page to help flag your location for Maps data as well as note your “local” status:

  • LocalBusiness: this allows you to specify things like your opening hours and accepted payment types
  • PostalAddress: this is a good supplement to getting all those NAP citations consistent
  • OrderAction and ReservationAction: if users can place orders or book reservations on your website, you may want to add action markup as well.

You should also get set up with Google My Business.
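As a sketch of how those pieces fit together, here is a minimal LocalBusiness object (hypothetical business details) combining opening hours, accepted payments, and a PostalAddress that should match your NAP citations:

```python
import json

# Minimal LocalBusiness markup sketch for a homepage or contact page.
# All business details here are hypothetical examples.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Whiteboard Pens",
    "openingHours": "Mo-Fr 09:00-17:00",
    "paymentAccepted": "Cash, Credit Card",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Seattle",
        "addressRegion": "WA",
        "postalCode": "98101",
    },
}
print(json.dumps(business, indent=2))
```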

☆ Additional resources for local business markup

Here’s an article from Whitespark specifically about using Schema.org markup and JSON-LD for local businesses, and another from Phil Rozek about choosing the right Schema.org Type. For further advice on local optimization, check out the local SEO learning center and this recent post about common pitfalls.

Schema.org for specific industries

There are certain industries and/or types of organization which get specific Schema.org types, because they have a very individual set of data that they need to specify. You can implement these Types on the homepage of your website, along with your Brand Information.

These include LocalBusiness Types:

And a few larger organizations, such as:

Things to watch out for…

  • When you’re adding markup that describes your business as a whole, it might seem like you should add that markup to every page on the site. However, best practice is to add this markup only to the homepage.

Schema.org for creative producers

If you create a product or type of content which could be considered a “creative work” (e.g. content produced for reading, viewing, listening, or other consumption), you can use CreativeWork markup.

More specific types within CreativeWork include:

Schema.org new features (limited availability)

Google is always developing new SERP features to test, and you can participate in the testing for some of these. For some, the feature is an addition to an existing Type; for others, it is only being offered as part of a limited test group. At the time of this writing, these are some of the new features being tested:

Structured data beyond SEO

As mentioned in Part 1 of this guide, structured data can be useful for other marketing channels as well, including:

For more detail on this, see the section in Part 1 titled: “Common Uses for Structured Data.”

How to generate and test your structured data implementation

Once you’ve decided which Schema.org Types are relevant to you, you’ll want to add the markup to your site. If you need help generating the code, you may find Google’s Data Highlighter tool useful. You can also try this tool from Joe Hall. Note that these tools are limited to a handful of Schema.org Types.

After you generate the markup, you’ll want to test it at two stages of the implementation using the Structured Data Testing Tool from Google — first, before you add it to the site, and then again once it’s live. In that pre-implementation test, you’ll be able to see any errors or issues with the code and correct before adding it to the site. Afterwards, you’ll want to test again to make sure that nothing went wrong in the implementation.

In addition to the Google tools listed above, you should also test your implementation with Bing’s Markup Validator tool and (if applicable) the Yandex structured data validator tool. Bing’s tool can only be used with a URL, but Yandex’s tool will validate a URL or a code snippet, like Google’s SDT tool.

You can also check out Aaron Bradley’s roundup of Structured Data Markup Visualization, Validation, and Testing Tools for more options.

Once you have live structured data on your site, you’ll also want to regularly check the Structured Data Report in Google Search Console, to ensure that your implementation is still working correctly.

Common mistakes in Schema.org structured data implementation

When implementing Schema.org on your site, there are a few things you’ll want to be extra careful about. Marking up content with irrelevant or incorrect Schema.org Types looks spammy, and can result in a “spammy structured markup” penalty from Google. Here are a few of the most common mistakes people make with their Schema.org markup implementation:

Mishandling multiple entities

Marking up categories or lists of items (Products, Recipes, etc.), or anything that isn’t a specific item, with markup for a single entity

  • Recipe and Product markup are designed for individual recipes and products, not for listings pages with multiple recipes or products on a single page. If you have multiple entities on a single page, mark up each item individually with the relevant markup.
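A sketch of the correct approach for a category page listing two products (hypothetical names): each item gets its own complete object, rather than one Product object wrapping the whole list.

```python
import json

# Each product on the listings page is marked up as a separate, complete
# Product object. Product names and prices here are hypothetical examples.
products = [
    {"@context": "https://schema.org", "@type": "Product",
     "name": "Insole Model A",
     "offers": {"@type": "Offer", "price": "79.00", "priceCurrency": "USD"}},
    {"@context": "https://schema.org", "@type": "Product",
     "name": "Insole Model B",
     "offers": {"@type": "Offer", "price": "99.00", "priceCurrency": "USD"}},
]

# One JSON-LD block per item, not one block for the whole list.
for p in products:
    print(json.dumps(p, indent=2))
```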

Misapplying Recipes markup

Using Recipe markup for something that isn’t food

  • Recipe markup should only be used for content about preparing food. Other types of content, such as “diy skin treatment” or “date night ideas,” are not valid names for a dish.

Misapplying Reviews and Ratings markup

Using Review markup to display “name” content which is not a reviewer’s name or aggregate rating

  • If your markup includes a single review, the reviewer’s name must be an actual organization or person. Other types of content, like “50% off ingredients,” are considered invalid data to include in the “name” property.

Adding your overall business rating with aggregateRating markup across all pages on your site

  • If your business has reviews with an aggregateRating score, this can be included in the “review” property on your Organization or LocalBusiness.

Using overall service score as a product review score

  • The “review” property in Schema.org/Product is only for reviews of that specific product. Don’t combine all product or business ratings and include those in this property.

Marking up third-party reviews of local businesses with Schema.org markup

  • You should not use structured data markup on reviews which are generated via third-party sites. While these reviews are fine to have on your site, they should not be used for generating rich snippets. The only UGC review content you should mark up is reviews which are displayed on your website, and generated there by your users.

General errors

Using organization markup on multiple pages/pages other than the homepage

  • It might seem counter-intuitive, but organization and LocalBusiness markup should only be used on the pages which are actually about your business (e.g. homepage, about page, and/or contact page).

Improper nesting

  • This is why it’s important to validate your code before implementing. Especially if you’re using Microdata tags, you need to make sure that the nesting of attributes and tags is done correctly.

So there you have it — a beginner’s guide to understanding and implementing structured data for SEO! There’s so much to learn around this topic that a single article or guide can’t cover everything, but if you’ve made it to the end of this series you should have a pretty good understanding of how structured data can help you with SEO and other marketing efforts. Happy implementing!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/structured-data-for-seo-2

The 3 Easiest Link Building Tactics Any Website Can Use to Acquire Their First 50 Links – Whiteboard Friday

Posted by randfish

Without a solid base of links, your site won’t be competitive in the SERPs — even if you do everything else right. But building your first few links can be difficult and discouraging, especially for new websites. Never fear — Rand is here to share three relatively quick, easy, and tool-free (read: actually free) methods to build that solid base and earn yourself links.

https://fast.wistia.net/embed/iframe/f253a5sane?videoFoam=true

Link Building Tactics to Acquire Your 50 First Links

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about how to get those first few links that every website needs to be able to compete. For many folks I know, getting started with link building can seem daunting and overwhelming.

So let me walk you through what is essentially half a day of work, maybe three or four hours, to try these three tactics that will almost certainly get your business or organization the first handful, let’s say 50 links, that you need to start being able to compete. Content can take you a long way. Keywords can take you a long way. Engagement and interaction can take you a long way. But you’ve got to have a base of links. So let’s get started here.

#1. Your brand name, domain name, and founders’/execs’ names

The first one is basically looking for links that come from your own name, your brand name, your domain name potentially, and the names of the founders or people who run your company.

Step One: Search Google for the names in quotes.

So if it was me and Moz, you’d be searching for “Rand Fishkin” or “Moz.com” in quotes, in the Google search bar rather than in the URL field. Moz also has other meanings, including the singer Morrissey, which makes for confusing results. If you have that problem, you’ll need to use your brand name plus some sort of signifier or identifier. It’s very rare that Morrissey gets mentioned along with search engine optimization, but it’s very common for Moz to be mentioned along with SEO, so I can combine those terms. Any of these searches will return a big list of Google results.

Step Two: Manually check the top, let’s say, 50 to 100 results to confirm that…

  1. They link to the right place, and if they don’t, if there are mentions of Rand Fishkin that don’t link to Moz, we should fix that. We’re going to contact those people.
  2. If you can control the anchor text and where the link location points, you can update it. For example, I can go to my LinkedIn. My LinkedIn has a link to Moz. I could update that if I were at a different company or if Moz’s domain name changed, for example when it did change from SEOmoz to just Moz.
  3. If it’s missing or wrong, I find the right people, I email them, and I fix it. As a result, I should have something like this. Every single mention in Google has a link on the page to my website. I can get that from brand name, from domain name, and from founders and executives. That’s a lot of great links.

#2. Sites that list your competition

So this is essentially saying we’re going to…

Step One: Identify your top 5 or 10 most visible competitors on the web.

This is a process that you can go through on your own to identify, well, these are the 5 or 10 that we see on the web very frequently for searches that we wish we competed for, or we see them mentioned in the press a ton, whatever it is.

Step Two: Search Google not for each one individually, but rather for combinations, usually two, three, or four of them all together.

For example, if I were making a new whiteboard pen company, I would look for the existing ones, like Pilot and Expo and Quartet and PandaBoard. I might search for Pilot and PandaBoard first. Then I might search for Pilot and Expo. Then I might search for PandaBoard and Quartet and all these various combinations of these different ones.
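That combination step is mechanical enough to script. A quick sketch, using the transcript's hypothetical whiteboard pen brands, that generates every two- and three-brand quoted query:

```python
from itertools import combinations

# Hypothetical competitor brand names from the transcript's example.
competitors = ["Pilot", "Expo", "Quartet", "PandaBoard"]

# Build quoted search queries for every pair and triple of brands.
queries = ['"{}" "{}"'.format(a, b) for a, b in combinations(competitors, 2)]
queries += ['"{}" "{}" "{}"'.format(a, b, c)
            for a, b, c in combinations(competitors, 3)]

for q in queries:
    print(q)
```

Paste each query into Google, then visit any result that lists multiple competitors.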

Step Three: Visit any sites in the SERPs that list multiple competitors in any sort of format (a directory structure, comparisons, a list, etc.)

Then in each of those cases, I would submit to, contact, or otherwise get in touch with whoever runs that list and say, “Hey, my company, my organization also belongs on here because, like these other ones you’ve listed, we do the same thing.” So if the page lists whiteboard pen brands like Expo, PandaBoard, and Quartet, your site belongs on it too, and it should now link to YourSite.com.

This is a little more challenging, and you won’t have as high a hit rate as you will with your own brand name. But again, it’s a great way to expand your link portfolio. You can almost always get 20 or 30 different sites that list businesses in your field and get onto those lists.

#3. Sites that list people/orgs in your field, your geography, with your attributes.

This is sites that list people or organizations in a particular field, a particular region, with particular attributes, or some combination of those three. So they’re saying here are European-based whiteboard pen manufacturers or European-based manufacturers who were founded by women.

So you can say, “Aha, that’s a unique attribute, that’s a geography, and that’s my field. I’m in manufacturing. I make whiteboard pens. Our cofounder was a woman, and we are in Europe. So therefore we count in all three of those. We should be on that list.” You’re looking for lists like these, which might not list your competitors, but are high-quality opportunities to get good links.

Step One:

  1. List your organization’s areas of operation. So that would be like we are in technology, or we’re in manufacturing or software or services, or we’re a utility, or we’re finance tech, or whatever we are. You can start from macro and go down to micro at each of those levels.
  2. List your geography in the same format from macro to micro. You want to go as broad as continent, for example Europe, down to country, region, county, city, even neighborhood. There are websites that list, “Oh, well, these are startups that are based in Ballard, Seattle, Washington in the United States in North America.” So you go, “Okay, I can fit in there.”
  3. List your unique attributes. Were you founded by someone whose attributes are different than normal? Moz, obviously my cofounder was my mom, Gillian. So Moz is a cofounded-by-a-woman company. Are you eco-friendly? Maybe you buy carbon credits to offset, or maybe you have a very eco-friendly energy policy. Or you have committed to donating to charity, like Salesforce has. Or you have an all-remote team. Or maybe you’re very GLBTQIA-friendly. Or you have a very generous family leave policy. Whatever interesting attributes there are about you, you can list those and then you can combine them.

Step Two: Search Google for lists of businesses or websites or organizations that have some of these attributes in your region or with your focus.

For example, Washington state venture-backed companies. Moz is a venture-backed company, so I could potentially get on that list. Or the EU-based manufacturing companies started by women, and I could get on that list with my whiteboard pen company based there. You can find lots and lots of these if you sort of take from your list, start searching Google and discover those results. You’ll use the same process you did here.

You know what the great thing about all three of these is? No tools required. You don’t have to pay for a single tool. You don’t have to worry about Domain Authority. You don’t have to worry about any sort of link qualification process or paying for something expensive. You can do this manually by yourself with Google as your only tool, and that will get you some of those first early links.

If you’ve got additional suggestions, please leave them down in the comments. I look forward to chatting with you there. We’ll see you again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com


from Moz Blog https://moz.com/blog/link-building-tactics-to-acquire-50-links

The E-Commerce Benchmark KPI Study 2017: 15 Essential Takeaways

Posted by Alan_Coleman

Is your website beating, meeting, or behind the industry average?

Wolfgang Digital’s 2017 E-Commerce Benchmark KPI Study is out with an even bigger sample size than ever before. Analyzing 143 million website sessions and $531 million in online revenues, the study gives e-commerce marketers essential insights to help benchmark their business’s online performance and understand which metrics drive e-commerce success.

This study is our gift to the global e-commerce industry. The objective is to reveal the state of play in the industry over the last 12 months and ultimately help digital marketers make better digital marketing decisions by:

  1. Better understanding their website performance through comparing key performance indicators (KPIs) with industry benchmarks.
  2. Gaining insights into which key metrics will ensure e-commerce success

You can digest the full study here.

Skim through the key takeaways below:


1. Google remains people’s window to the web, but its dominance is in decline.

The search giant generates 62% of all traffic and 63% of all revenue. This is down from 69% of traffic and 67% of revenue in last year’s study. In numerical terms, Google is growing — it’s simply that the big G’s share of the pie is in decline.

2. Google’s influence is declining as consumers’ paths to purchase become more diverse, with “dark traffic” on the rise.

This occurs when Google Analytics doesn’t recognize a source by default, like people sharing links on WhatsApp. Dark traffic shows up as direct traffic in Google Analytics. Direct traffic grew from 17% to 18% of traffic.

3. Consumers’ paths to purchase have gotten longer.

It now takes 12% more clicks to generate a million euro online than it did 12 months ago, with 360,000 clicks being the magic million-euro number in 2017.

4. Mobile earns more share, yet desktop still delivers the dollars.

2017 is the first year mobile claimed more sessions (52%) than desktop (36%) and tablet (12%) combined. Desktop generates 61% of all online revenue, with users 164% more likely to convert than those browsing on mobile. Plus, when desktop users convert, they spend an average of 20% more per order than mobile shoppers.

5. The almighty conversion rate: e-commerce sites average 1.6%.

E-commerce websites averaged 1.6% overall. Travel came in at 2.4%. Online-only retailers saw 1.8% conversion rates, while their multichannel counterparts averaged 1.2%.

6. Don’t shop if you’re hungry.

Conversion rates for food ordering sites are fifteen times those of typical retail e-commerce!

***Correlation explanation: The most unique and most useful part of our study is our correlation calculation. We analyze which website metrics correlate with e-commerce success. Before I jump into our correlation findings, let me explain how to read them. Zero means no correlation between the two metrics. One means perfect correlation; for example, “every time I sneeze, I close my eyes.” Point five (0.5) means that as one metric increases 100%, the other metric increases 50%. A negative correlation means that as one variable increases, the other decreases.

From our experience compiling these stats over the years, any correlation over 0.2 is worth noting. North of 0.4 is a very strong correlation. I’ve ranked the following correlations below in order of strength, starting with the strongest.
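For readers who want to see how a figure like this is computed, here is a Pearson correlation coefficient in a few lines of plain Python. The time-on-site and conversion-rate numbers are made up for illustration and are not taken from the study:

```python
# Hypothetical (time-on-site, conversion-rate) pairs for five sites.
time_on_site = [120, 150, 180, 210, 240]      # seconds
conversion_rate = [1.2, 1.4, 1.5, 1.9, 2.0]   # percent

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length number lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A value near 1 means the two metrics rise and fall together.
print(round(pearson(time_on_site, conversion_rate), 2))
```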

7. Sticky websites sell more (0.6).

The strongest correlation in the study was between time spent on a website and conversion rate (0.6 correlation). By increasing time on site by 16%, conversion rates ramp up 10%. Pages per session also correlated solidly with revenue growth (0.25).

8. People trust Google (0.48).

According to Forbes, Google is the world’s second most valuable brand. Our figures agree. People who got more than average organic traffic from Google enjoyed a savagely strong conversion rate (0.48). It seems that when Google gives prominent organic coverage to a website, that website enjoys higher trust and, in turn, higher conversion rates from consumers.

9. Tablet shoppers love a bit of luxury (0.4).

Higher-than-average tablet sessions correlated very strongly with high average order values (0.4). However, pricey purchases require more clicks, no matter the device.

10. Loyal online shoppers are invaluable (0.35).

Your best-converting customers are always your returning loyal customers. Typically they show up as direct traffic, high levels of which correlated very strongly with conversion rates (0.35).

11. Speed matters (0.25).


Average site speed was 6 seconds. This is far higher than the generally recommended 2 seconds. There was a strong inverse correlation between average page load time and revenue growth (0.25). Reducing the average load time by 1.6 seconds would increase annual revenue growth by 10%.

12. Mobile is a money-making machine (0.25).


Websites that got more mobile pageviews (0.25) and more tablet pageviews (0.24) grew revenue faster.

13. Email pays dividends (0.24).


Email delivers three times as much revenue as Facebook on a last-click basis. Those who get more traffic from email also enjoy a higher AOV (0.24).

14. Bing CPC represents a quick win (0.22).

Websites with a higher share of Bing CPC traffic tend to see a higher AOV (0.22). This, coupled with lower CPCs, makes Bing an attractive low-volume high-profit proposition. Bing has made the route into Bing Ads much easier, introducing a simple one-click tool which will convert your AdWords campaigns into Bing Ad campaigns.

15. Pinterest can be powerful (0.22).

Websites with more Pinterest traffic enjoyed higher AOVs (0.22). This demonstrates Pinterest’s power as a visual research engine, a place where people research ideas before taking an action — for example, planning a wedding, designing a living room, or purchasing a pair of pumps. The good news for digital marketers is that Pinterest recently launched its self-service ad platform.


Black holes

We used Google Analytics to compile the report. Once installed correctly, Google Analytics is very accurate in the numbers it reports. However, there are two areas it struggles to report on that digital marketers need to keep in mind:

  1. Offline conversions: For 99% of our data set, there is no offline conversion tracking setup. Google is introducing measures to make it easier to track this. Once marketing directors get visibility on the offline impact of their online spend, we expect more offline budget to migrate online.
  2. Cross-device conversions: It’s currently very difficult to measure cross-device conversions. According to Google themselves, 90% of goals occur on more than one device. Yet Google Analytics favors the sturdy desktop, as it generates the most same-device conversions. The major loser here is social, with 9 out of 10 Facebook sessions being mobile sessions. Instagram and Snapchat don’t even have a desktop version of their app!

Google is preparing to launch enhanced reporting in the coming months, which will give greater visibility on cross-device conversions. Hopefully this will give us a clearer picture of social’s role in conversion for our 2018 study.

The full report is available here and I’d love to answer your questions in the comments section below.


from Moz Blog https://moz.com/blog/ecommerce-benchmark-kpi-study-2017