
Is the New, Most Powerful Ranking Factor “Searcher Task Accomplishment?” – Whiteboard Friday

Posted by randfish

Move over, links, content, and RankBrain — there’s a new ranking factor in town, and it’s a doozy. All kidding aside, the idea of searcher task accomplishment is a compelling argument for how we should be optimizing our sites. Are they actually solving the problems searchers seek answers for? In today’s Whiteboard Friday, Rand explains how searcher task accomplishment is what Google ultimately looks for, and how you can keep up.

https://fast.wistia.net/embed/iframe/owwsbo80qz?videoFoam=true


Searcher Task Accomplishment

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, we’re chatting about a new Google ranking factor.

Now, I want to be clear. This is not something that’s directly in Google’s algorithm for sure. It’s just that they’re measuring a lot of things that lead us to this conclusion. This is essentially what Google is optimizing toward with all of their ranking signals, and therefore it’s what SEOs nowadays have to think about optimizing for with our content. And that is searcher task accomplishment.

So what do I mean by this? Well, look, when someone does a search like “disinfect a cut,” they’re trying to actually accomplish something. In fact, no matter what someone is searching for, it’s not just that they want a set of results. They’re actually trying to solve a problem. For Google, the results that solve that problem fastest and best and with the most quality are the ones that they want to rank.

In the past, they’ve had to build all sorts of algorithms to try and get at this from oblique angles. But now, with a lot of the work that they’re doing around measuring engagement and with all of the data that’s coming to them through Chrome and through Android, they’re able to get much, much closer to what is truly accomplishing the searcher’s task. That’s because they really want results that satisfy the query and fulfill the searcher’s task.

So pretty much every informational and transactional search — I’m excluding navigational searches, where the searcher just wants to go to a particular website — boils down to this: I have an expression of need. That’s what I’m telling Google. But behind that, there’s a bunch of underlying goals, things that I want to do. I want to know information. I want to accomplish something. I want to complete an activity.

When I do that, when I perform my search, I have this sort of evaluation of results. Is this going to help me do what I want? Then I choose one, and then I figure out whether that result actually helps me complete my task. If it does, I might have discovery of additional needs around that, like once you’ve answered my disinfect a cut, now it’s, okay, now I kind of want to know how to prevent an infection, because you described using disinfectant and then you said infections are real scary. So let me go look up how do I prevent that from happening. So there’s that discovery of additional needs. Or you decide, hey, this did not help me complete my task. I’m going to go back to evaluation of results, or I’m going to go back to my expression of need in the form of a different search query.

That’s what gives Google the information to say, “Yes, this result helped the searcher accomplish their task,” or, “No, this result did not help them do it.”

Some examples of searcher task accomplishment

This is true for a bunch of things. I’ll walk you through some examples.

If I search for how to get a book published, that’s an expression of need. But underlying that is a bunch of different goals like, well, you’re going to be asking about like traditional versus self-publishing, and then you’re going to want to know about agents and publishers and the publishing process and the pitch process, which is very involved. Then you’re going to get into things like covers and book marketing and tracking sales and all this different stuff, because once you reach your evaluation down here and you get into discovery of additional needs, you find all these other things that you need to know.

If I search for “invest in Ethereum,” well maybe I know enough to start investing right away, but probably, especially recently because there’s been a ton of search activity around it, I probably need to understand: What the heck is the blockchain and what is cryptocurrency, this blockchain-powered currency system, and what’s the market for that like, and what has it been doing lately, and what’s my purchase process, and where can I actually go to buy it, and what do I have to do to complete that transaction?

If I search for something like “FHA loans,” well that might mean I’m in the mindset of thinking about real estate. With an FHA loan, I’m usually buying my first house, and that means I need to know things about conditions by region, the application process, which providers are in my area, and how I can go apply — all of these different things.

If I do a search for “Seattle event venues,” well that means I’m probably looking for a list of multiple event venues, and then I need to narrow down my selection by the criteria I care about, like region, capacity, the price, the amenities. Then once I have all that, I need contact information so that I can go to them.

In all of these scenarios, Google is going to reward the results that help me accomplish the task, discover the additional needs, and solve those additional needs as well, rather than the ones that maybe provide a slice of what I need and then make me go back to the search results and choose something else or change my query to figure out more.

Google is also going to reward — and you can see this in all of these results — the ones that give me all the information I need and help me accomplish my task before they ask for something in return. The ones that are basically just a landing page that says, “Oh yeah, Seattle event venues, enter your email address and all this other information, and we’ll be in touch with a list of venues that are right for you” — yeah, guess what? It doesn’t matter how many links you have, you are not ranking, my friends.

That is so different from how it used to be. It used to be that you could have that contact form. You could have that on there. You could not solve the searcher’s query. You could basically be very conversion rate-focused on your page, and so long as you could get the right links and the right anchor text and use the right keywords on the page, guess what? You could rank. Those days are ending. I’m not going to say they’re gone, but they are ending, and this new era of searcher task accomplishment is here.

Challenge: The conflict between SEO & CRO

There’s a challenge. I want to be totally up front that there is a real conflict between this world of optimizing for searcher task accomplishment and the classic world of we want our conversions. So the CRO in your organization — which might be your director of marketing, or it might be your CEO, or, if your team is big enough, you might have a dedicated conversion rate optimization specialist on hand — they’re thinking, “Hey, I need the highest percent of form completions possible.”

So when someone lands on this page, I’m trying to get from two percent to four percent. How do we get four percent of people visiting this page to complete the form? That means removing distractions. That means not providing information up front. That means having a great teaser that says like, “Hey, we can give this to you, and here are testimonials that say we can provide this information. But let’s not give it right up front. Don’t give away the golden goose, my friend. We want these conversions. We need to get our qualified leads into the funnel,” versus the SEO, who today has to think about, “How do I get searchers to accomplish their task without friction?” This lead capture form, that’s friction.

So every organization, I think, needs to decide which way they’re going to go. Are they going to go for basically long-term SEO, which is I’m going to solve the searcher’s task, and then I’m going to figure out ways later to monetize and to capture value? Or am I going to basically lose out in the search results to people who are willing to do this and go this route instead and drive traffic from other sources? Maybe I’ll rank with different pages and I’ll send some people here, or maybe I will pay for my traffic, or I’ll try and do some barnacle SEO and get links from people who do rank up top there, but I won’t do it directly myself. This is a choice we all have.

How do we nail searcher task accomplishment?

All right. So how do you do this? Let’s say you’ve gone the SEO path. You’ve decided, “Yes, Rand, I’m in. I want to help the searcher accomplish their task. I recognize that I’m going to have to be willing to sacrifice some conversion rate optimization.” Well, there are two things here.

1. Gain a deep understanding of what drives searchers to search.

2. Understand what makes some searchers come away unsatisfied.

Once they’ve performed this query, why do they click the back button? Why do they choose a different result? Why do they change their query to something else? There are ways we can figure out both of these.

To help with number 1, try:

Some of the best things that you can do are talk to people who actually have those problems and who are actually performing those searches or have performed them through…

  • Interviews
  • Surveys

I will provide you with a link to a document that I put together specifically around how to get a book published. I ran a survey that looked at searcher task accomplishment and what people hoped that content would have for them, and the results are quite remarkable. I’ll also embed my presentation on searcher task accomplishment in this Whiteboard Friday and make sure to link to that as well.

  • In-person conversations — powerful things can come out of those that you wouldn’t get through remote surveys or email.
  • You can certainly look at competitors. So check out what your competitors are saying and what they’re doing that you may not have considered yet.
  • You can try putting yourself in your searcher’s shoes.

What if I searched for disinfect a cut? What would I want to know? What if I searched for FHA loans? I’m buying a house for the first time, what am I thinking about? Well, I’m thinking about a bunch of things. I’m thinking about price and neighborhood and all this. Okay, how do I accomplish all that in my content, or at least how do I provide navigation so that people can accomplish all that without having to go back to the search results?

To help with number 2, try:

Understanding what makes those searchers come away unsatisfied.

  • Auto-suggest and related searches are great. In fact, related searches, which are at the very bottom of the page in a set of search results, are usually searches people performed after they performed the initial search. I say usually because there can be some other things in there. But usually someone who searched for FHA loans then searches for jumbo loans or 30-year fixed loans or mortgage rates or those kinds of things. That’s the next step. So you can say, “You know what? I know what you want next. Let me go help you.” Auto-suggest and related searches are both great for that.
  • Internal search analytics for people who landed on a page and performed a site search or clicked on a Next link on your site. What did they want to do? Where did they want to go next? That helps tell you what those people need.
  • Having conversations with those who only got partway through your funnel. So if you have a lead capture at some point or you collect email at some point, you can reach out to people who initially came to you for a solution but didn’t get all the way through that process and talk to them.
  • Tracking the SERPs and watching who rises vs falls in the rankings. Finally, if you track the search results, generally speaking what we see here at Moz, what I see for almost all the results I’m tracking is that more and more people who do a great job of this, of searcher task accomplishment, are rising in the rankings, and the folks who are not are falling.

So over time, if you watch those in your spaces and do some rank tracking competitively, you can see what types of content are helping people accomplish those tasks and what Google is rewarding.

That said, I look forward to your comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Why We Can’t Do SEO Without CRO from Rand Fishkin

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/searcher-task-accomplishment

“SEO Is Always Changing”… Or Is It?: Debunking the Myth and Getting Back to Basics

Posted by bridget.randolph

Recently I made the shift to freelancing full-time, and it’s led me to participate in a few online communities for entrepreneurs, freelancers, and small business owners. I’ve noticed a trend in the way many of them talk about SEO; specifically, the blocks they face in attempting to “do SEO” for their businesses. Again and again, the concept that “SEO is too hard to stay on top of… it’s always changing” was being stated as a major reason that people feel a) overwhelmed by SEO; b) intimidated by SEO; and c) uninformed about SEO.

And it’s not just non-SEOs who use this phrase. The concept of “the ever-changing landscape of SEO” is common within SEO circles as well. In fact, I’ve almost certainly used this phrase myself.

But is it actually true?

To answer that question, we have to separate the theory of search engine optimization from the various tactics which we as SEO professionals spend so much time debating and testing. The more that I work with smaller businesses and individuals, the clearer it becomes to me that although the technology is always evolving and developing, and tactics (particularly those that attempt to trick Google rather than follow their guidelines) do need to adapt fairly rapidly, there are certain fundamentals of SEO that change very little over time, and which a non-specialist can easily understand.

The unchanging fundamentals of SEO

Google’s algorithm is based on an academia-inspired model of categorization and citations, which uses keywords as a way to decipher the topic of a page, and links from other sites (known as “backlinks”) to determine the relative authority of that site. Their methods and technology keep getting more sophisticated over time, but the principles have remained the same.

So what are these basic principles?

It comes down to answering the following questions:

  1. Can the search engine find your content? (Crawlability)
  2. How should the search engine organize and prioritize this content? (Site structure)
  3. What is your content about? (Keywords)
  4. How does the search engine know that your content provides trustworthy information about this topic? (Backlinks)

If your website is set up to help Google and other search engines answer these 4 questions, you will have covered the basic fundamentals of search engine optimization.

There is a lot more that you can do to optimize in all of these areas and beyond, but for businesses that are just starting out and/or on a tight budget, these are the baseline concepts you’ll need to know.

Crawlability

You could have the best content in the world, but it won’t drive any search traffic if the search engines can’t find it. This means that the crawlability of your site is one of the most important factors in ensuring a solid SEO foundation.

In order to find your content and rank it in the search results, a search engine needs to be able to:

  1. Access the content (at least the pages that you want to rank)
  2. Read the content

This is primarily a technical task, although it is related to having a good site structure (the next core area). You may need to adapt the code, and/or use an SEO plugin if your site runs on WordPress.
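One piece of that technical task is making sure your robots.txt file isn’t accidentally blocking the pages you want to rank. As a rough sketch, Python’s standard library can parse a robots.txt file and tell you whether a given path is crawlable — the site, paths, and rules below are hypothetical, purely for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- not taken from any real site.
robots_txt = """
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a generic crawler may fetch each page.
for path in ["/blog/seo-basics", "/admin/settings", "/cart/checkout"]:
    allowed = parser.can_fetch("*", f"https://example.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

Running a check like this against the URLs you care about is a quick sanity test that the content you want ranked is actually reachable by crawlers.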

For more in-depth guides to technical SEO and crawlability, check out the following posts:

Site structure

In addition to making sure that your content is accessible and crawlable, it’s also important to help search engines understand the hierarchy and relative importance of that content. It can be tempting to think that every page is equally important to rank, but failing to structure your site in a hierarchical way often dilutes the impact of your “money” pages. Instead, you should think about what the most important pages are, and structure the rest of your site around these.

When Google and other search engine crawlers visit a site, they start from the homepage and then follow every link they find. Googlebot assumes that the pages it sees the most are the most important pages. So when you can reach a page with a single click from the homepage, or when it is linked to on every page (for example, in a top or side navigation bar, or a site footer section), Googlebot will see those pages more, and will therefore consider them to be more important. For less important pages, you’ll still need to link to them from somewhere for search engines to be able to see them, but you don’t need to emphasize them quite as frequently or keep them as close to the homepage.

The main question to ask is: Can search engines tell what your most important pages are, just by looking at the structure of your website? Google’s goal is to save users steps, so the easier you make it for them to find and prioritize your content, the more they’ll like it.
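One way to audit this “distance from the homepage” idea is to compute each page’s click depth over your internal-link graph with a breadth-first search. This is a simplified sketch — the page names and link graph below are invented for illustration:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "home": ["services", "blog", "about"],
    "services": ["service-a", "service-b"],
    "blog": ["post-1", "post-2"],
    "about": [],
    "service-a": [], "service-b": [],
    "post-1": [], "post-2": [],
}

def click_depths(start="home"):
    """Breadth-first search from the homepage; smaller depth suggests
    the page is easier for crawlers (and users) to reach."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths())
```

If a page you consider important comes back with a large depth (or never appears at all), that’s a structural signal worth fixing.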

For more in-depth guides to good site structure, check out the following posts:

Keywords

Once the content you create is accessible to crawlers, the next step is to make sure that you’re giving the search engines an accurate picture of what that content is about, to help them understand which search queries your pages would be relevant to. This is where keywords come into the mix.

We use keywords to tell the search engine what each page is about, so that they can rank our content for queries which are most relevant to our website. You might hear advice to use your keywords over and over again on a page in order to rank well. The problem with this approach is that it doesn’t always create a great experience for users, and over time Google has stopped ranking pages which it perceives as being a poor user experience.

Instead, what Google is looking for in terms of keyword usage is that you:

  1. Answer the questions that real people actually have about your topic
  2. Use the terminology that real people (specifically, your target audience) actually use to refer to your topic
  3. Use the term in the way that Google thinks real people use it (this is often referred to as “user intent” or “searcher intent”).

You should only ever target one primary keyword (or phrase) per page. You can include “secondary” keywords, which are related to the primary keyword directly (think category vs subcategory). I sometimes see people attempting to target too many topics with a single page, in an effort to widen the net. But it is better to separate these out so that there’s a different page for each different angle on the topic.

The easiest way to think about this is in physical terms. Search engines’ methods are roughly based on the concept of library card catalogs, and so we can imagine that Google is categorizing pages in a similar way to a library using the Dewey decimal system to categorize books. You might have a book categorized as Romance, subcategory Gothic Romance; but you wouldn’t be able to categorize it as Romance and also Horror, even though it might be related to both topics. You can’t have the same physical book on 2 different shelves in 2 different sections of the library. Keyword targeting works the same way: 1 primary topic per page.
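A quick practical check for “one primary topic per page” is whether the page’s title and main heading both reflect the primary keyword. Here is a minimal sketch using Python’s standard-library HTML parser; the page markup and the keyword are hypothetical examples, not a prescription for how Google parses pages:

```python
from html.parser import HTMLParser

class KeywordCheck(HTMLParser):
    """Collects <title> and <h1> text so we can check keyword targeting."""
    def __init__(self):
        super().__init__()
        self._tag = None
        self.found = {"title": "", "h1": ""}

    def handle_starttag(self, tag, attrs):
        if tag in self.found:
            self._tag = tag

    def handle_endtag(self, tag):
        if tag == self._tag:
            self._tag = None

    def handle_data(self, data):
        if self._tag:
            self.found[self._tag] += data

# Hypothetical page targeting one primary keyword phrase.
page = """<html><head><title>Gothic Romance Novels: A Reader's Guide</title></head>
<body><h1>The Best Gothic Romance Novels</h1><p>...</p></body></html>"""

checker = KeywordCheck()
checker.feed(page)
keyword = "gothic romance novels"
for tag, text in checker.found.items():
    print(tag, "targets keyword:", keyword in text.lower())
```

If the title and the h1 are pulling in different directions, that’s often a sign the page is trying to target more than one primary topic.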

For more in-depth guides to keyword research and keyword targeting, check out the following posts:

Backlinks

Another longstanding ranking factor is the number of links from other sites to your content, known as backlinks.

It’s not enough for you to say that you’re the expert in something, if no one else sees it that way. If you were looking for a new doctor, you wouldn’t just go with the guy who says “I’m the world’s best doctor.” But if a trusted friend told you that they loved their doctor and that they thought you’d like her too, you’d almost certainly make an appointment.

When other websites link to your site, it helps to answer the question: “Do other people see you as a trustworthy resource?” Google wants to provide correct and complete information to people’s queries. The more trusted your content is by others, the more that indicates the value of that information and your authority as an expert.

When Google looks at a site’s backlinks, they are effectively doing the same thing that humans do when they read reviews and testimonials to decide which product to buy, which movie to see, or which restaurant to go to for dinner. If you haven’t worked with a product or business, other people’s reviews point you to what’s good and what’s not. In Google’s case, a link from another site serves as a vote of confidence for your content.

That being said, not all backlinks are treated equally when it comes to boosting your site’s rankings. They are weighted differently according to how Google perceives the quality and authority of the site that’s doing the linking. This can feel a little confusing, but when you think about it in the context of a recommendation, it becomes a lot easier to understand whether the backlinks your site is collecting are useful or not. After all, think about the last time you saw a movie. How did you choose what to see? Maybe you checked well-known critics’ reviews, checked Rotten Tomatoes, asked friends’ opinions, looked at Netflix’s suggestions list, or saw acquaintances posting about the film on social media.

When it comes to making a decision, who do you trust? As humans, we tend to use an (often unconscious) hierarchy of trust:

  1. Personalized recommendation: Close friends who know me well are most likely to recommend something I’ll like;
  2. Expert recommendation: Professional reviewers who are authorities on the art of film are likely to have a useful opinion, although it may not always totally match my personal taste;
  3. Popular recommendation: If a high percentage of random people liked the movie, this might mean it has a wide appeal and will likely be a good experience for me as well;
  4. Negative association: If someone is raving about a movie on social media and I know that they’re a terrible human with terrible taste… well, in the absence of other positive signals, that fact might actually influence me not to see the movie.

To bring this back to SEO, you can think about backlinks as the SEO version of reviews. And the same hierarchy comes into play.

  1. Personalized/contextual recommendation: For local businesses or niche markets, very specific websites like a local city’s tourism site, local business directory or very in-depth, niche fan site might be the equivalent of the “best friend recommendation”. They may not be an expert in what everyone likes, but they definitely know what works for you as an individual and in some cases, that’s more valuable.
  2. Expert recommendation: Well-known sites with a lot of inherent trust, like the BBC or Harvard University, are like the established movie critics. Broadly speaking they are the most trustworthy, but possibly lacking the context for a specific person’s needs. In the absence of a highly targeted type of content or service, these will be your strongest links.
  3. Popular recommendation: All things being equal, a lot of backlinks from a lot of different sites is seen as a signal that the content is relevant and useful.
  4. Negative association: Links that are placed via spam tactics, that you buy in bulk, or that sit on sites that look like garbage, are the website equivalent of that terrible person whose recommendation actually turns you off the movie.

If a site collects too many links from poor-quality sites, it could look like those links were bought, rather than “earned” recommendations (similar to businesses paying people to write positive reviews). Google views the buying of links as a dishonest practice, and a way of gaming their system, and therefore if they believe that you are doing this intentionally it may trigger a penalty. Even if they don’t cause a penalty, you won’t gain any real value from poor quality links, so they’re certainly not something to aim for. Because of this, some people become very risk-averse about backlinks, even the ones that came to them naturally. But as long as you are getting links from other trustworthy sources, and these high quality links make up a substantially higher percentage of your total, having a handful of lower quality sites linking to you shouldn’t prevent you from benefiting from the high quality ones.
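The idea that a link is worth more when the linking site is itself well linked-to can be illustrated with a toy PageRank-style calculation. This is a deliberately simplified stand-in for whatever Google actually computes, and the site names and link graph are invented:

```python
# Hypothetical link graph: site -> sites it links to.
links = {
    "harvard": ["trusted-news"],
    "bbc": ["trusted-news"],
    "trusted-news": ["your-site"],
    "spam-farm": ["competitor"],
    "your-site": [],
    "competitor": [],
}

damping = 0.85
scores = {site: 1.0 / len(links) for site in links}

# Iterate until scores settle: each site splits its score among the
# sites it links to, so authority flows through the graph.
for _ in range(30):
    new = {site: (1 - damping) / len(links) for site in links}
    for site, outlinks in links.items():
        if outlinks:
            share = damping * scores[site] / len(outlinks)
            for target in outlinks:
                new[target] += share
        else:
            # Dangling pages redistribute their score evenly.
            for target in links:
                new[target] += damping * scores[site] / len(links)
    scores = new

print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

Even though “your-site” and “competitor” each receive exactly one backlink, the one from the well-endorsed site ends up worth far more — which is the intuition behind weighting recommendations by the recommender’s own reputation.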

For more in-depth guides to backlinks, check out the following posts:

Theory of Links

Getting More Links

Mitigating Risk of Links

Does anything about SEO actually change?

If SEO is really this simple, why do people talk about how it changes all the time? This is where we have to separate the theory of SEO from the tactics we use as SEO professionals to grow traffic and optimize for better rankings.

The fundamentals that we’ve covered here — crawlability, keywords, backlinks, and site structure — are the theory of SEO. But when it comes to actually making it work, you need to use tactics to optimize these areas. And this is where we see a lot of changes happening on a regular basis, because Google and the other search engines are constantly tweaking the way the algorithm understands and utilizes information from those four main areas in determining how a site’s content should rank on a results page.

The important thing to know is that, although the tactics which people use will change all the time, the goal for the search engine is always the same: to provide searchers with the information they need, as quickly and easily as possible. That means that whatever tactics and strategies you choose to pursue, the important thing is that they enable you to optimize for your main keywords, structure your site clearly, keep your site accessible, and get more backlinks from more sites, while still keeping the quality of the site and the backlinks high.

The quality test (EAT)

Because Google’s goal is to provide high-quality results, the changes that they make to the algorithm are designed to improve their ability to identify the highest quality content possible. Therefore, when tactics stop working (or worse, backfire and incur penalties), it is usually related to the fact that these tactics didn’t create high-quality outputs.

Like the fundamentals of SEO theory which we’ve already covered, the criteria that Google uses to determine whether a website or page is good quality haven’t changed all that much since the beginning. They’ve just gotten better at enforcing them. This means that you can use these criteria as a “sniff test” when considering whether a tactic is likely to be a sustainable approach long-term.

Google themselves refer to these criteria in their Search Quality Rating Guidelines with the acronym EAT, which stands for:

  • Expertise
  • Authoritativeness
  • Trustworthiness

In order to be viewed as high-quality content (on your own site) or a high-quality link (from another site to your site), the content needs to tick at least one of these boxes.

Expertise

Does this content answer a question people have? Is it a *good* answer? Do you have a more in-depth degree of knowledge about this topic than most people?

This is why you will see people talk about Google penalizing “thin” content — that just refers to content which isn’t really worth having on its own page, because it doesn’t provide any real value to the reader.

Authority

Are you someone who is respected and cited by others who know something about this topic?

This is where the value of backlinks can come in. One way to demonstrate that you are an authority on a topic is if Google sees a lot of other reputable sources referring to your content as a source or resource.

Trust

Are you a reputable person or business? Can you be trusted to take good care of your users and their information?

Because trustworthiness is a factor in determining a site’s quality, Google has compiled a list of indicators which might mean a site is untrustworthy or spammy. These include things like a high proportion of ads to regular content, behavior that forces or manipulates users into taking actions they didn’t want to take, hiding some content and only showing it to search engines to manipulate rankings, not using a secure platform to take payment information, etc.

It’s always the same end goal

Yes, SEO can be technical, and yes, it can change rapidly. But at the end of the day, what doesn’t change is the end goal. Google and the other search engines make money through advertising, and in order to get more users to see (and click on) their ads, they have to provide a great user experience. Therefore, their goal is always going to be to give the searchers the best information they can, as easily as they can, so that people will keep using their service.

As long as you understand this, the theory of SEO is pretty straightforward. It’s just about making it easy for Google to answer these questions:

  1. What is your site about?
    1. What information does it provide?
    2. What service or function does it provide?
  2. How do we know that you’ll provide the best answer or product or service for our users’ needs?
  3. Does your content demonstrate Expertise, Authoritativeness, and/or Trustworthiness (EAT)?

This is why the fundamentals have changed so little, despite the fact that the industry, technology and tactics have transformed rapidly over time.

A brief caveat

My goal with this post is not to provide step-by-step instruction in how to “do SEO,” but rather to demystify the basic theory for those who find the topic too overwhelming to know where to start, or who believe that it’s too complicated to understand without years of study. With this goal in mind, I am intentionally taking a simplified and high-level perspective. This is not to dismiss the importance of an SEO expert in driving strategy and continuing to develop and maximize value from the search channel. My hope is that those business owners and entrepreneurs who currently feel overwhelmed by this topic can gain a better grasp on the way SEO works, and a greater confidence and ease in approaching their search strategy going forward.

I have provided a few in-depth resources for each of the key areas — but you will likely want to hire a specialist or consultant to assist with analysis and implementation (certainly if you want to develop your search strategy beyond simply the “table stakes” as Rand calls it, you will need a more nuanced understanding of the topic than I can provide in a single blog post).

At the end of the day, the ideas behind SEO are actually pretty simple — it’s the execution that can be more complex or simply time-consuming. That’s why it’s important to understand that theory — so that you can be more informed if and when you do decide to partner with someone who is offering that expertise. As long as you understand the basic concepts and end goal, you’ll be able to go into that process with confidence. Good luck!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/seo-back-to-basics
via IFTTT

from Blogger http://imlocalseo.blogspot.com/2017/07/seo-is-always-changing-or-is-it.html
via IFTTT

Fighting Review Spam: The Complete Guide for the Local Enterprise

Posted by MiriamEllis

It’s 105 degrees outside my office right now, and the only thing hotter in this summer of 2017 is the local SEO industry’s discussion of review spam. It’s become increasingly clear that major review sites represent an irresistible temptation to spammers, highlighting systemic platform weaknesses and the critical need for review monitoring that scales.

Just as every local brand, large and small, has had to adjust to the reality of reviews’ substantial impact on modern consumer behavior, competitive businesses must now prepare themselves to manage the facts of fraudulent sentiment. Equip your team and clients with this article, which will cover every aspect of review spam and includes a handy list for reporting fake reviews to major platforms.

What is review spam?

A false review is one that misrepresents the relationship of the reviewer to the business, misrepresents the nature of the interaction the reviewer had with the business, or breaks a platform guideline. Examples:

  • The reviewer is actually a competitor of the business he is reviewing; he’s writing the review to hurt a competitor and help himself
  • The reviewer is actually the owner, an employee, or a marketer of the business he is reviewing; he’s falsifying a review to manipulate public opinion via fictitious positive sentiment
  • The reviewer never had a transaction with the business he is reviewing; he’s pretending he’s a customer in order to help/hurt the business
  • The reviewer had a transaction, but is lying about the details of it; he’s trying to hurt the company by misrepresenting facts for some gain of his own
  • The reviewer received an incentive to write the review, monetary or otherwise; his sentiment stems from a form of reward and is therefore biased
  • The reviewer violates any of the guidelines on the platform on which he’s writing his review; this could include personal attacks, hate speech or advertising

All of the above practices are forbidden by the major review platforms and should result in the review being reported and removed.

What isn’t review spam?

A review is not spam if:

  • It’s left directly by a genuine customer who experienced a transaction
  • It represents the facts of a transaction with reasonable, though subjective, accuracy
  • It adheres to the policies of the platform on which it’s published

Reviews that contain negative (but accurate) consumer sentiment shouldn’t be viewed as spam. For example, it may be embarrassing to a brand to see a consumer complain that an order was filled incorrectly, that an item was cold, that a tab was miscalculated or that a table was dirty, but if the customer is correctly cataloging his negative experience, then his review isn’t a misrepresentation.

There’s some inherent complexity here, as the brand and the consumer can differ widely in their beliefs about how satisfying a transaction may have been. A restaurant franchise may believe that its meals are priced fairly, but a consumer can label them as too expensive. Negative sentiment can be subjective, so unless the reviewer is deliberately misrepresenting facts and the business can prove it, it’s not useful to report this type of review as spam as it’s unlikely to be removed.

Why do individuals and businesses write spam reviews?

Unfortunately, the motives can be as unpleasant as they are multitudinous:

Blackmail/extortion

There’s the case of the diner who was filmed putting her own hair in her food in hopes of extorting a free meal under threat of negative reviews as a form of blackmail. And then there’s blackmail as a business model, as this unfortunate business reported to the GMB forum after being bulk-spammed with 1-star reviews and then contacted by the spammer with a demand for money to raise the ratings to 5-stars.

Revenge

The classic case is the former employee of a business venting his frustrations by posing as a customer to leave a highly negative review. There are also numerous instances of unhappy personal relationships leading to fake negative reviews of businesses.

Protest or punishment

Consumer sentiment may sometimes appear en masse as a form of protest against an individual or institution, as the US recently witnessed following the election of President Trump and the ensuing avalanche of spam reviews his various businesses received.

It should be noted here that attempting to shame a business with fake negative reviews can have the (likely undesirable) effect of rewarding it with high local rankings, based on the sheer number of reviews it receives. We saw this outcome in the infamous case of the dentist who made national news and received an onslaught of shaming reviews for killing a lion.

Finally, there is the toxic reviewer, a form of Internet troll who may be an actual customer but whose personality leads them to write abusive or libelous reviews as a matter of course. While these reviews should definitely be reported and removed if they fail to meet guidelines, discussion is open and ongoing in the local SEO industry as to how to manage the reality of consumers of this type.

Ranking manipulation

The total review count of a business (regardless of the sentiment the reviews contain) can positively impact Google’s local pack rankings or the internal rankings of certain review platforms. For the sake of boosting rankings, some business owners review themselves, tell their employees to review their employer, offer incentives to others in exchange for reviews, or even engage marketers to hook them up to a network of review spammers.

Public perception manipulation

This is a two-sided coin. A business can either positively review itself or negatively review its competitors in an effort to sway consumer perception. The latter is a particularly prevalent form of review spam, with the GMB forum overflowing with at least 10,000 discussions of this topic. Given that respected surveys indicate that 91% of consumers now read online reviews, 84% trust them as much as personal recommendations and 86% will hesitate to patronize a business with negative reviews, the motives for gaming online sentiment, either positively or negatively, are exceedingly strong.

Wages

Expert local SEO Mike Blumenthal is currently doing groundbreaking work uncovering a global review spam network that’s responsible for tens or hundreds of thousands of fake reviews. In this scenario, spammers are apparently employed to write reviews of businesses around the world depicting sets of transactions that not even the most jet-setting globetrotter could possibly have experienced. As Mike describes one such reviewer:

“She will, of course, be educated at the mortuary school in Illinois and will have visited a dentist in Austin after having reviewed four other dentists … Oh, and then she will have bought her engagement ring in Israel, and then searched out a private investigator in Kuru, Philippines eight months later to find her missing husband. And all of this has taken place in the period of a year, right?”

The scale of this network makes it clear that review spam has become big business.

Lack of awareness

Not all review spammers are dastardly characters. Some small-timers are only guilty of a lack of awareness of guidelines or a lack of foresight about the potential negative outcomes of fake reviews to their brand. I’ve sometimes heard small local business owners state they had their family review their newly-opened business to “get the ball rolling,” not realizing that they were breaking a guideline and not considering how embarrassing and costly it could prove if consumers or the platform catch on. In this scenario, I try to teach that faking success is not a viable business model — you have to earn it.

Lack of consequences

Unfortunately, some of the most visible and powerful review platforms have become enablers of the review spam industry due to a lack of guideline enforcement. When a platform fails to identify and remove fake reviews, either because of algorithmic weaknesses or insufficient support staffing, spammers are encouraged to run amok in an environment devoid of consequences. For unethical parties, no further justification for manipulating online sentiment is needed than that they can “get away with it.” Ironically, there are consequences to bear for lack of adequate policing, and until they fall on the spammer, they will fall on any platform whose content becomes labeled as untrustworthy in the eyes of consumers.

What is the scope of review spam?

No one knows for sure, but as we’ve seen, the playing field ranges from the single business owner having his family write a couple of reviews on Yelp to the global network employing staff to inundate Google with hundreds of thousands of fake reviews. And, we’ve seen two sides to the review spam environment:

  1. People who write reviews to help themselves (in terms of positive rankings, perception, and earnings for themselves either directly from increased visibility or indirectly via extortion, and/or in terms of negative outcomes for competitors).
  2. People who write reviews to hurt others (for the sake of revenge with little or no consequence).

The unifying motive of all forms of review spam is manipulation, creating an unfair and untrustworthy playing field for consumers, enterprises and platforms alike. One Harvard study suggests that 20% of Yelp reviews are fake, but it would be up to the major review platforms to transparently publicize the total number of spam reviews they receive. Just the segment I’ve seen as an individual local SEO has convinced me that review spam has now become an industry, just like “black hat” SEO once did.

How to spot spam reviews

Here are some basic tips:

Strange patterns:

A reviewer’s profile indicates that they’ve been in too many geographic locations at once. Or, they have a habit of giving 1-star reviews to one business and 5-star reviews to its direct competitor. While neither is proof positive of spam, think of these as possible red flags.

Strange language:

Numerous 5-star reviews that fawn on the business owner by name (e.g. “Bill is the greatest man ever to walk the earth”) may be fishy. If adulation seems to be going overboard, pay attention.

Strange timing:

Over the course of a few weeks, a business skyrockets from zero reviews to 30, 50, or 100 of them. Unless an onslaught of sentiment stems from something major happening in the national news, chances are good the company has launched some kind of program. If you suspect spam, you’ll need to research whether the reviews seem natural or could be stemming from some form of compensation.

Strange numbers:

The sheer number of reviews a business has earned seems inconsistent with its geography or industry. Some business models (restaurants) legitimately earn hundreds of reviews each year on a given platform, but others (mortuaries) are unlikely to have the same pattern. If a competitor of yours has 5x as many reviews as seems normal for your geo-industry, it could be a first indicator of spam.

Strange “facts”:

None of your staff can recall that a transaction matching the description in a negative review ever took place, or a transaction can be remembered but the way the reviewer is presenting it is demonstrably false. Example: a guest claims you rudely refused to seat him, but your in-store cam proves that he simply chose not to wait in line like other patrons.

Obvious threats:

If any individual or entity threatens your company with a negative review to extort freebies or money from you, take it seriously and document everything you can.

Obvious guideline violations:

Virtually every major review platform prohibits profane, obscene, and hateful content. If your brand is victimized by this type of attack, definitely report it.

In a nutshell, the first step to spotting review spam is review monitoring. You’ll want to manually check direct competitors for peculiar patterns, and, more importantly, all local businesses must have a schedule for regularly checking their own incoming sentiment. For larger enterprises and multi-location business models, this process must be scaled to minimize manual workloads and cover all bases.
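For teams monitoring reviews programmatically, the “strange timing” heuristic above is simple enough to sketch in code. This is a minimal illustration, not any platform’s actual detection logic; the 14-day window and 20-review threshold are hypothetical and should be tuned to what’s normal for your geo-industry:

```python
from collections import Counter
from datetime import date

def flag_review_bursts(review_dates, window_days=14, burst_threshold=20):
    """Flag rolling windows in which review volume spikes suspiciously.

    review_dates: one datetime.date per incoming review.
    Returns a list of (window_start, count) pairs exceeding the threshold.
    """
    counts = Counter(review_dates)
    flagged = []
    for start in sorted(counts):
        in_window = sum(
            n for d, n in counts.items()
            if 0 <= (d - start).days < window_days
        )
        if in_window >= burst_threshold:
            flagged.append((start, in_window))
    return flagged

# A business that skyrockets from zero to 30 reviews in under two weeks:
dates = ([date(2017, 7, 1)] * 12 + [date(2017, 7, 5)] * 10
         + [date(2017, 7, 10)] * 8)
print(flag_review_bursts(dates))  # [(datetime.date(2017, 7, 1), 30)]
```

A flagged window isn’t proof of spam on its own — it’s a prompt to research whether the reviews seem natural or could stem from some form of compensation.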

Scaling review management

On an average day, one Moz Local customer with 100 retail locations in the U.S. receives 20 reviews across the various platforms we track. Some are just ratings, but many feature text. Many are very positive. A few contain concerns or complaints that must be quickly addressed to protect reputation/budget by taking action to satisfy and retain an existing customer while proving responsiveness to the general consumer public. Some could turn out to be spam.

Over the course of an average week for this national brand, 100–120 such reviews will come in, totaling up to more than 400 pieces of customer feedback in a month that must be assessed for signs of success at specific locations or emerging quality control issues at others. Parse this out to a year’s time, and this company must be prepared to receive and manage close to 5,000 consumer inputs in the form of reviews and ratings, not just for positive and negative sentiment, but for the purposes of detecting spam.

Spam detection starts with awareness, which can only come from the ability to track and audit a large volume of reviews to identify some of the suspicious hallmarks we’ve covered above. At the multi-location or enterprise level, the solution to this lies in acquiring review monitoring software and putting it in the hands of a designated department or staffer. Using a product like Moz Local, monitoring and detection of questionable reviews can be scaled to meet the needs of even the largest brands.

What should your business do if it has been victimized by review spam?

Once you’ve become reasonably certain that a review or a body of reviews violates the guidelines of a specific platform, it’s time to act. The following list contains links to the policies of 7 dominant review platforms that are applicable to all industries, and also contains tips and links outlining reporting options:

Google

Policy: https://support.google.com/business/answer/2622994?hl=en

Review reporting tips

Flag the review by mousing over it, clicking the flag symbol that appears and then entering your email address and choosing a radio button. If you’re the owner, use the owner response function to mention that you’ve reported the review to Google for guideline violations. Then, contact GMB support via their Twitter account and/or post your case in the GMB forum to ask for additional help. Cross your fingers!

Yelp

Policy: https://www.yelp.com/guidelines

Review reporting tips

Yelp offers these guidelines for reporting reviews and also advises owners to respond to reviews that violate guidelines. Yelp takes review quality seriously and has set high standards other platforms might do well to follow, in terms of catching spammers and warning the public against bad actors.

Facebook

Policy: https://www.facebook.com/communitystandards

Review reporting tips

Here are Facebook’s instructions for reporting reviews that fail to meet community standards. Note that you can only report reviews with text — you can’t report solo ratings. Interestingly, you can turn off reviews on Facebook, but to do so out of fear would be to forego the considerable benefits they can provide.

Yellow Pages

Policy: https://www.yellowpages.com/about/legal/terms-conditions#user-generated-content

Review reporting tips

In 2016, YP.com began showing TripAdvisor reviews alongside internal reviews. If review spam stems from a YP review, click the “Flag” link in the lower right corner of the review and fill out the form to report your reasons for flagging. If the review spam stems from TripAdvisor, you’ll need to deal with them directly and read their extensive guidelines. TripAdvisor states that they screen reviews for quality purposes, but that fake reviews can slip through. If you’re the owner, you can report fraudulent reviews from the Management Center of your TripAdvisor dashboard. Click the “concerned about a review” link and fill out the form. If you’re simply a member of the public, you’ll need to sign into TripAdvisor and click the flag link next to the review to report a concern.

SuperPages

Policy: https://my.dexmedia.com/spportal/jsp/popups/businessprofile/reviewGuidelines.jsp

Review reporting tips

The policy I’ve linked to (from Dex Media, which owns SuperPages) is the best I can find. It’s reasonably thorough but somewhat broken. To report a fake review to SuperPages, you’ll need either a SuperPages or Facebook account. Then, click the “flag abuse” link associated with the review and fill out a short form.

CitySearch

Policy: http://www.citysearch.com/aboutcitysearch/about_us

Review reporting tips

If you receive a fake review on CitySearch, email customerservice@citygrid.com. In your email, link to the business that has received the spam review, include the date of the review and the name of the reviewer and then cite the guidelines you feel the review violates.

Foursquare

Policy: https://foursquare.com/legal/terms

Review reporting tips

The “Rules and Conduct” section I’ve linked to in Foursquare’s TOS outlines their content policy. Foursquare is a bit different in the language they use to describe tips/reviews. They offer these suggestions for reporting abusive tips.

*If you need to find the guidelines and reporting options for an industry-specific review platform like FindLaw or HealthGrades, Phil Rozek’s definitive list will be a good starting point for further research.

Review spam can feel like being stuck between a rock and a hard place

I feel a lot of empathy in this regard. Google, Facebook, Yelp, and other major review platforms have the visibility to drive massive traffic and revenue to your enterprise. That’s the positive side of this equation. But there’s another side — the uneasy side that I believe has its roots in entities like Google originating their local business indexes via aggregation from third-party sources rather than as a print-YellowPages-style opt-in program, and subsequently failing to adequately support the millions of brands they were then representing to the Internet public.

To this day, there are companies that are stunned to discover that their business is listed on 35 different websites, and being actively reviewed on 5 or 10 of them when the company took no action to initiate this. There’s an understandable feeling of a loss of control that can be particularly difficult for large brands, with their carefully planned quality structures, to adjust to.

This sense of powerlessness is further compounded when the business isn’t just being listed and discussed on platforms it doesn’t control, but is being spammed. I’ve seen business owners on Facebook declaring they’ve decided to disable reviews because they feel so victimized and unsupported after being inundated with suspicious 1-star ratings which Facebook won’t investigate or remove. By doing so, these companies are choosing to forego the considerable benefits reviews drive because meaningful processes for protecting the business aren’t yet available.

These troubling aspects of the highly visible world of reviews can leave owners feeling like they’re stuck between a rock and a hard place. Their companies will be listed, will be reviewed, and may be spammed whether the brand actively participates or not, and they may or may not be able to get spam removed.

It’s not a reality from which any competitive enterprise can opt-out, so my best advice is to realize that it’s better to opt-in fully, with the understanding that some control is better than none. There are avenues for getting many spam reviews taken down, with the right information and a healthy dose of perseverance. Know, too, that every one of your competitors is in the same boat, riding a rising tide that will hopefully grow to the point of offering real-world support for managing consumer sentiment that impacts bottom-line revenue in such a very real way.

There ought to be a law

While legitimate negative reviews have legal protection under the Consumer Review Fairness Act of 2016, fraudulent reviews are another matter.

Section 5(a) of the Federal Trade Commission Act states:

“Unfair methods of competition in or affecting commerce, and unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful.”

Provisions like these are what allowed the FTC to successfully sue Sage Automotive Group for $3.6 million for deceptive advertising practices and deceptive online reviews, but it’s important to note that this appears to be the first instance in which the FTC has involved itself in bringing charges on the basis of fraudulent reviews. At this point, it’s simply not reasonable to expect the FTC to step in if your enterprise receives some suspicious reviews, unless your research uncovers a truly major case.

Lawsuits amongst platforms, brands, and consumers, however, are proliferating. Yelp has sued agencies and local businesses over the publication of fake reviews. Companies have sued their competitors over malicious, false sentiment, and they’ve sued their customers with allegations of the same.

Should your enterprise be targeted with spam reviews, some cases may be egregious enough to warrant legal action. In such instances, definitely don’t attempt to have the spam reviews removed by the host platform, as they could provide important evidence. Contact a lawyer before you take a step in any direction, and avoid using the owner response function to take verbal revenge on the person you believe has spammed you, as we now have a precedent in Dietz v. Perez for such cases being declared a draw.

In many scenarios, however, the business may not wish to become involved in a noisy court battle, and seeking removal can be a quieter way to address the problem.

Local enterprises, consumers, and marketers must advocate for themselves

According to one survey, 90% of consumers read less than 10 reviews before forming an opinion about a business. If some of those 10 reviews are the result of negative spam, the cost to the business is simply too high to ignore, and it’s imperative that owners hold not just spammers, but review platforms, accountable.

Local businesses, consumers, and marketers don’t own review sites, but they do have the power to advocate. A single business could persistently blog about spam it has documented. Multiple businesses could partner up to request a meeting with a specific platform to present pain points. Legitimate consumers could email or call their favorite platforms to explain that they don’t want their volunteer hours writing reviews to be wasted on a website that is failing to police its content. Marketers can thoughtfully raise these issues repeatedly at conferences attended by review platform reps. There is no cause to take an adversarial tone in this, but there is every need for squeaky wheels to highlight the costliness of spam to all parties, advocating for platforms to devote all possible resources to:

  • Increasing the sophistication of algorithmic spam detection
  • Increasing staffing for manual detection
  • Providing real-time support to businesses so that spam can be reported, evaluated and removed as quickly as possible

All of the above could begin to better address the reality of review spam. In the meantime, if your business is being targeted right now, I would suggest using every possible avenue to go public with the problem. Blog, use social media, report the issue on the platform’s forum if it has one. Do anything you can to bring maximum attention to the attack on your brand. I can’t promise results from persistence and publicity, but I’ve seen this method work enough times to recommend it.

Why review platforms must act aggressively to minimize spam

I’ve mentioned the empathy I feel for owners when it comes to review platforms, and I also feel empathy for the platforms, themselves. I’ve gotten the sense, sometimes, that different entities jumped into the review game and have been struggling to handle its emerging complexities as they’ve rolled out in real time. What is a fair and just policy? How can you best automate spam detection? How deeply should a platform be expected to wade into disputes between customers and brands?

With sincere respect for the big job review sites have on their hands, I think it’s important to state:

  • If brands and consumers didn’t exist, neither would review platforms. Businesses and reviewers should be viewed and treated as MVPs.
  • Platforms which fail to offer meaningful support options to business owners are not earning goodwill or a good reputation.
  • The relationship between local businesses and review platforms isn’t an entirely comfortable one. Increasing comfort could turn wary brands into beneficial advocates.
  • Platforms that allow themselves to become inundated with spam will lose consumers’ trust, and then advertisers’ trust. They won’t survive.

Every review platform has a major stake in this game, but, to be perfectly honest, some of them don’t act like it.

Google My Business Forum Top Contributor and expert local SEO Joy Hawkins recently wrote an open letter to Google offering them four actionable tips for improving their handling of their massive review spam problem. It’s a great example of a marketer advocating for her industry, and, of interest, some of Joy’s best advice to Google is taken from Yelp’s own playbook. Yelp may be doing the best of all platforms in combating spam, in that they have very strong filters and place public warnings on the profiles of suspicious reviewers and brands.

What Joy Hawkins, Mike Blumenthal, other industry experts, and local business owners seem to be saying to review platforms could be summed up like this:

“We recognize the power of reviews and appreciate the benefits they provide, but a responsibility comes with setting your platform up as a hub of reputation for millions of businesses. Don’t see spammed reputations as acceptable losses — they represent the livelihoods of real people. If you’re going to trade responsibly in representing us, you’ve got to back your product up with adequate quality controls and adequate support. A fair and trustworthy environment is better for us, better for consumers and better for you.”

Key takeaways for taking control of review spam

  • All local enterprises need to know that review spam is a real problem
  • Its scope ranges from individual spammers to global networks
  • Enterprises must monitor all incoming reviews, and scale this with software where necessary
  • Designated staff must be on the lookout for suspicious patterns
  • All major review platforms have some form of support for reporting spam reviews, but it’s not always adequate and may not lead to removal
  • Because of this, brands must advocate for better support from review platforms
  • Review platforms need to listen and act, because their stake in the game is real

Being the subject of a review spam attack can be a stressful event that I wish no brand ever had to face, but it’s my hope that this article has empowered you to meet a possible challenge with complete information and a smart plan of action.


from Moz Blog https://moz.com/blog/review-spam
via IFTTT

from Blogger http://imlocalseo.blogspot.com/2017/07/fighting-review-spam-complete-guide-for.html
via IFTTT

SEO for Copywriters: Tips on Measuring SEO Impact – Next Level

Posted by BrianChilds

Welcome to the newest installment of our educational Next Level series! In our last episode, Brian Childs shared a few handy shortcuts for targeting multiple keywords with one page. Today, he’s back to share how to use Google Analytics to measure the SEO impact of your content. Read on and level up!

Understanding how to write web content for SEO is important. But equally important is knowing how to measure the SEO impact of your content after it’s published. In this article I’ll describe how to use Google Analytics to create reports that evaluate the performance of articles or the writers creating those articles.

Let’s start with some definitions.

What is SEO content?

Creating search engine optimized content is the strategic process of researching and writing website copy with the goal of maximizing its impact in the SERPs. This requires having a keyword strategy, the ability to conduct competitive analyses, and knowledge of current ranking factors.

If you’re a copywriter, you’ve likely already been asked by your clients to create content “written for SEO.” Translating this into action often means the writer needs to have a greater role in both strategy and research. Words matter in SEO, and spending the time to get them right is a big part of creating content effectively. Adding SEO research and analysis to the process of researching content often fits nicely.

So the question is: How do I measure the effectiveness of my content team?

We go in greater depth on the research and reporting processes during the Moz seminar SEO for Content Writers, but I’ll explain a few useful concepts here.

What should I measure?

Well-defined goals are at the heart of any good digital marketing strategy, whether you’re doing SEO or PPC. Goals will differ by client, and I’ve found that part of my role as a digital marketer is to help the client translate their business goals into measurable actions taken by visitors on their site.

Ideally, goals have a few essential traits. They should:

  • Have measurable value (revenue, leads generated, event registrations)
  • Be identifiable on the site (PDF downloads, button clicks, confirmation page views)
  • Lead to business growth (part of an online campaign, useful to sales team, etc.)

Broad goals such as “increase organic sessions on site” are rarely specific enough for clients to want to invest in after the first 3–6 months of a relationship.

One tool you can use to measure goals is Google Analytics (GA). The nice part about GA is that almost everyone has an account (even if they don’t know how to use it) and it integrates nicely with almost all major SEO software platforms.

Lay the foundation for your SEO research by taking a free trial of Moz Pro. After you’ve researched your content strategy and competition with Keyword Explorer and Open Site Explorer, you can begin measuring the content you create in Google Analytics.

Let me show you how I set this up.

How to measure SEO content using Google Analytics

Step 1: Review conversion actions on site

As I mentioned before, your SEO goals should tie to a business outcome. We discuss setting up goals, including a worksheet that shows monthly performance, during the Reporting on SEO Bootcamp.

During the launch phase of a new project, locate the on-site actions that contribute to your client’s business and then consider how your content can drive traffic to those pages. Some articles have CTAs pointing to a whitepaper; others may suggest setting up a consultation.

When interviewing your client about these potential conversion locations (contact us page, whitepaper download, etc), ask them about the value of a new customer or lead. For nonprofits, maybe the objective is to increase awareness of events or increase donations. Regardless of the goal, it’s important that you define a value for each conversion before creating goals in Google Analytics.
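A common back-of-the-envelope way to turn those interview answers into a goal value is to multiply the worth of a new customer by the rate at which leads become customers. The figures below are purely hypothetical placeholders for whatever your client tells you:

```python
# Hypothetical client figures: a new customer is worth $500 in revenue,
# and the sales team closes roughly 1 in 10 leads from the website.
customer_value = 500.00
lead_close_rate = 0.10

# The dollar value to enter when creating the goal in Google Analytics:
goal_value = customer_value * lead_close_rate
print(f"${goal_value:.2f}")  # $50.00
```

For a nonprofit, the same arithmetic works with an average donation amount in place of customer value.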

Step 2: Navigate to the Admin panel in Google Analytics

Once you have goals identified and have settled on an acceptable value for that goal, open up Google Analytics and navigate to the admin panel. At the time of writing this, you can find the Admin panel by clicking on a little gear icon at the bottom-left corner of the screen.

Step 3: Create a goal (including dollar value)

There are three columns in the Admin view: Account, Property, and View. In the “View” column, you will see a section marked “Goals.”

Once you are in Goals, select “+New Goal.”

I usually select “Custom” rather than the pre-filled templates. It’s up to you. I’d give the Custom option a spin just to familiarize yourself with the selectors.

Now fill out the goal based on the analysis conducted in step #1. One goal should be filled out for each conversion action you’ve identified. The most important factor is filling out a value. This is the dollar amount for this goal.
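If you manage goals across many client views, the same setup can be scripted. Below is a minimal sketch of a goal body for the Google Analytics Management API (v3); the goal name, destination URL, and dollar value are hypothetical placeholders, so adapt them to the conversion actions you identified in step 1.

```python
# Sketch: build a destination-goal payload for the GA Management API (v3).
# All names, URLs, and values below are hypothetical placeholders.

def build_destination_goal(goal_id, name, destination_url, value):
    """Return a goal body suitable for management().goals().insert()."""
    return {
        "id": str(goal_id),
        "name": name,
        "type": "URL_DESTINATION",       # fires when a visitor reaches a page
        "active": True,
        "value": float(value),           # the dollar amount for this goal
        "urlDestinationDetails": {
            "url": destination_url,
            "matchType": "EXACT",
            "caseSensitive": False,
        },
    }

goal = build_destination_goal(1, "Whitepaper download", "/thank-you", 50.0)

# With an authorized service object you would then call (not run here):
# service.management().goals().insert(
#     accountId="12345678", webPropertyId="UA-12345678-1",
#     profileId="987654", body=goal).execute()
```

The key point, mirroring the step above, is that the payload always carries a `value`: without it, the reports in step 6 can't tell you what the traffic is worth.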

The Google description of how to create goals is located here: Create or Edit Goals

Step 4: Create and apply a “Segment” for Organic Traffic

Once you have your goals set up, you’ll want to set up and automate reporting. Since we’re analyzing traffic from search engines, we want to isolate only traffic coming from the Organic Channel.

Organic traffic = people who arrive on your site after clicking on a link from a search engine results page.

An easy way to isolate traffic of a certain type or from a certain source is to create a segment.

Navigate to any Google Analytics page in the reports section. You will see some boxes near the top of the page, one of them labeled “All Users” (assuming segments haven’t been configured in the past).

Select the box that says “All Users” and it will open up a list with checkboxes.

Scroll down until you find the checkbox that says “Organic Traffic,” then select and apply that.

Now, no matter what reports you look at in Google Analytics, you’ll only be viewing the traffic from search engines.
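For reporting outside the GA interface, the same filter can be applied through the Analytics Reporting API v4 by referencing the built-in segment's ID (to my knowledge, `gaid::-5` is the built-in Organic Traffic segment). A sketch; the view ID is a placeholder:

```python
# Sketch: a Reporting API v4 request body that applies the built-in
# "Organic Traffic" segment. The view ID is a hypothetical placeholder.
organic_report_request = {
    "reportRequests": [{
        "viewId": "987654",
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"},
                    {"expression": "ga:goalValueAll"}],
        "dimensions": [{"name": "ga:landingPagePath"},
                       {"name": "ga:segment"}],   # required when segmenting
        "segments": [{"segmentId": "gaid::-5"}],  # built-in Organic Traffic
    }]
}
```

You would pass this body to the API's `batchGet` method with an authorized client; the response then only contains sessions that began from a search engine results page.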

Step 5: Review the Google Analytics Landing Page Report

Now that we’ve isolated only traffic from search engines using a Google Analytics Segment, we can view our content performance and assess what is delivering the most favorable metrics. There are several reports you can use, but I prefer the “Landing Pages” report. It shows you the page where a visitor begins their session. If I want to measure blog writers, I want to know whose writing is generating the most traffic for me. The Landing Pages report will help do that.

To get to the Landing Pages report in Google Analytics, select this sequence of subheadings on the left sidebar:

Behavior > Site Content > Landing Pages

This report will show you, for any period of time, which pages are delivering the most visits. I suggest going deeper and sorting the content by the columns “Pages per session” and “Session Duration.” Identify the articles that are generating the highest average page depth and longest average session duration. Google will take these behaviors as a signal that you’re delivering value to your visitors. That is good for SEO.
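If you export the Landing Pages report to a CSV, that same sort takes a few lines of pandas. The data and column names below are made-up examples of what an export might look like:

```python
import pandas as pd

# Sketch: a hypothetical Landing Pages export (column names are assumptions).
report = pd.DataFrame({
    "landing_page": ["/blog/post-a", "/blog/post-b", "/blog/post-c"],
    "sessions": [1200, 800, 950],
    "pages_per_session": [1.4, 3.2, 2.1],
    "avg_session_duration": [35, 210, 95],   # seconds
})

# Surface the articles with the deepest, longest visits first.
engaged = report.sort_values(
    ["pages_per_session", "avg_session_duration"], ascending=False)

print(engaged.iloc[0]["landing_page"])  # prints /blog/post-b
```

Sorting on both columns together keeps a high-traffic but shallow page from masquerading as your most engaging content.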

Step 6: Review the conversion value of your writers

Remember those goals we created? In the far right columns of the Landing Pages report, you will find the value being delivered by each page on your site. This is where you can help answer the question, “Which article topics or writers are consistently delivering the most business value?”
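To answer that question at the writer level, you can map each landing page to its author and total the goal value. A sketch with hypothetical data:

```python
import pandas as pd

# Sketch: join landing pages to their authors and total the goal value
# each writer's articles deliver. All data here is made up.
pages = pd.DataFrame({
    "landing_page": ["/blog/post-a", "/blog/post-b", "/blog/post-c"],
    "author": ["Dana", "Lee", "Dana"],
    "goal_value": [120.0, 480.0, 260.0],   # from the goals set up earlier
})

value_by_writer = (pages.groupby("author")["goal_value"]
                        .sum()
                        .sort_values(ascending=False))
print(value_by_writer)
```

The resulting table reads directly as an answer to the business question: which writers consistently deliver the most dollars.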

If you want to share this report with your team to help increase transparency, I recommend navigating up to the top of the page and, just beneath the name of the report, you’ll see a link called “Email.”

Automate your reporting by setting up an email that delivers either a .csv file or PDF on a monthly basis. It’s super easy and will save you a ton of time.

Want to learn more SEO content tips?

If you find this kind of step-by-step process helpful, consider joining Moz for our online training course focused on SEO for copywriters. You can find the upcoming class schedule here:

See upcoming schedule

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/seo-for-copywriters-next-level
via IFTTT

from Blogger http://imlocalseo.blogspot.com/2017/07/seo-for-copywriters-tips-on-measuring.html
via IFTTT

SEO Best Practices for Canonical URLs + the Rel=Canonical Tag – Whiteboard Friday

Posted by randfish

If you’ve ever had any questions about the canonical tag, well, have we got the Whiteboard Friday for you. In today’s episode, Rand defines what rel=canonical means and its intended purpose, when it’s recommended you use it, how to use it, and sticky situations to avoid.

https://fast.wistia.net/embed/iframe/y02fcc4v1f?videoFoam=true

https://fast.wistia.net/assets/external/E-v1.js

SEO best practices for canonical URLs

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, we’re going to chat about some SEO best practices for canonicalization and use of the rel=canonical tag.

Before we do that, I think it pays to talk about what a canonical URL is, because a canonical URL doesn’t just refer to a page upon which we are targeting or using the rel=canonical tag. Canonicalization has been around, in fact, much longer than the rel=canonical tag itself, which came out in 2009, and there are a bunch of different things that a canonical URL means.

What is a “canonical” URL?

So first off, what we’re trying to say is this URL is the one that we want Google and the other search engines to index and to rank. These other URLs that potentially have similar content or that are serving a similar purpose or perhaps are exact duplicates, but, for some reason, we have additional URLs of them, those ones should all tell the search engines, “No, no, this guy over here is the one you want.”

So, for example, I’ve got a canonical URL, ABC.com/a.

Then I have a duplicate of that for some reason. Maybe it’s a historical artifact or a problem in my site architecture. Maybe I intentionally did it. Maybe I’m doing it for some sort of tracking or testing purposes. But that URL is at ABC.com/b.

Then I have this other version, ABC.com/a?ref=twitter. What’s going on there? Well, that’s a URL parameter. The URL parameter doesn’t change the content. The content is exactly the same as A, but I really don’t want Google to get confused and rank this version, which can happen by the way. You’ll see URLs that are not the original version, that have some weird URL parameter ranking in Google sometimes. Sometimes this version gets more links than this version because they’re shared on Twitter, and so that’s the one everybody picked up and copied and pasted and linked to. That’s all fine and well, so long as we canonicalize it.

Or this one, it’s a print version. It’s ABC.com/aprint.html. So, in all of these cases, what I want to do is I want to tell Google, “Don’t index this one. Index this one. Don’t index this one. Index this one. Don’t index this one. Index this one.”

I can do that using this, the link rel=canonical, the href telling Google, “This is the page.” You put this in the header tag of any document and Google will know, “Aha, this is a copy or a clone or a duplicate of this other one. I should canonicalize all of my ranking signals, and I should make sure that this other version ranks.”
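Concretely, using the example URLs above, the tag looks like this:

```html
<!-- Placed in the <head> of ABC.com/b (and the other duplicate URLs) -->
<link rel="canonical" href="http://abc.com/a" />
```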

By the way, you can be self-referential. So it is perfectly fine for ABC.com/a to go ahead and use this as well, pointing to itself. That way, in the event that someone you’ve never even met decides to plug in question mark, some weird parameter and point that to you, you’re still telling Google, “Hey, guess what? This is the original version.”

Great. So since I don’t want Google to be confused, I can use this canonicalization process to do it. The rel=canonical tag is a great way to go. By the way, FYI, it can be used cross-domain. So, for example, if I republish the content on A at something like a Medium.com/@RandFish, which is, I think, my Medium account, /a, guess what? I can put in a cross-domain rel=canonical telling them, “This one over here.” Now, even if Google crawls this other website, they are going to know that this is the original version. Pretty darn cool.

Different ways to canonicalize multiple URLs

There are different ways to canonicalize multiple URLs.

1. Rel=canonical.

I mention that rel=canonical isn’t the only one. It’s one of the most strongly recommended, and that’s why I’m putting it at number one. But there are other ways to do it, and sometimes we want to apply some of these other ones. There are also not-recommended ways to do it, and I’m going to discuss those as well.

2. 301 redirect.

The 301 redirect, this is basically a status code telling Google, “Hey, you know what? I’m going to take /b, I’m going to point it to /a. It was a mistake to ever have /b. I don’t want anyone visiting it. I don’t want it clogging up my web analytics with visit data. You know what? Let’s just 301 redirect that old URL over to this new one, over to the right one.”
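On an Apache server, for instance, that can be a one-line rule in .htaccess (a sketch; equivalent directives exist for nginx and other servers):

```apache
# .htaccess sketch: permanently redirect the duplicate /b to the canonical /a
Redirect 301 /b /a
```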

3. Passive parameters in Google Search Console.

Some parts of me like this, some parts of me don’t. I think for very complex websites with tons of URL parameters and a ton of URLs, it can be just an incredible pain sometimes to go to your web dev team and say like, “Hey, we got to clean up all these URL parameters. I need you to add the rel=canonical tag to all these different kinds of pages, and here’s what they should point to. Here’s the logic to do it.” They’re like, “Yeah, guess what? SEO is not a priority for us for the next six months, so you’re going to have to deal with it.”

Probably lots of SEOs out there have heard that from their web dev teams. Well, guess what? You can do an end-run around it, and this is a fine way to do that in the short term. Log in to the Google Search Console account that’s connected to your website. Make sure you’re verified. Then you can basically tell Google, through the URL Parameters section, to treat certain kinds of parameters as passive.

So, for example, you have sessionid=blah, blah, blah. You can set that to be passive. You can set it to be passive on certain kinds of URLs. You can set it to be passive on all types of URLs. That helps tell Google, “Hey, guess what? Whenever you see this URL parameter, just treat it like it doesn’t exist at all.” That can be a helpful way to canonicalize.

4. Use location hashes.

So let’s say that my goal with /b was basically to have exactly the same content as /a but with one slight difference, which was I was going to take a block of content about a subsection of the topic and place that at the top. So A has the section about whiteboard pens at the top, but B puts the section about whiteboard pens toward the bottom, and they put the section about whiteboards themselves up at the top. Well, it’s the same content, same search intent behind it. I’m doing the same thing.

Well, guess what? You can use the hash in the URL. So it’s a#b and that will jump someone — it’s also called a fragment URL — jump someone to that specific section on the page. You can see this, for example, Moz.com/about/jobs. I think if you plug in #listings, it will take you right to the job listings. Instead of reading about what it’s like to work here, you can just get directly to the list of jobs themselves. Now, Google considers that all one URL. So they’re not going to rank them differently. They don’t get indexed differently. They’re essentially canonicalized to the same URL.

NOT RECOMMENDED

I do not recommend…

5. Blocking Google from crawling one URL but not the other version.

Because guess what? Even if you use robots.txt and you block Googlebot’s spider and you send them away and they can’t reach it because you said robots.txt disallow /b, Google will not know that /b and /a have the same content on them. How could they?

They can’t crawl it. So they can’t see anything that’s here. It’s invisible to them. Therefore, they’ll have no idea that any ranking signals, any links that happen to point there, any engagement signals, any content signals, whatever ranking signals that might have helped A rank better, they can’t see them. If you canonicalize in one of these ways, now you’re telling Google, yes, B is the same as A, combine their forces, give me all the rankings ability.

6. I would also not recommend blocking indexation.

So you might say, “Ah, well Rand, I’ll use the meta robots no index tag, so that way Google can crawl it, they can see that the content is the same, but I won’t allow them to index it.” Guess what? Same problem. They can see that the content is the same, but unless Google is smart enough to automatically canonicalize, which I would not trust them on, I would always trust yourself first, you are essentially, again, preventing them from combining the ranking signals of B into A, and combining those signals is something you really want.

7. I would not recommend using the 302, the 307, or any other 30x other than the 301.

This is the guy that you want. It is a permanent redirect. It is the most likely to be most successful in canonicalization, even though Google has said, “We often treat 301s and 302s similarly.” The exception to that rule is that a 301 is probably better for canonicalization. Guess what we’re trying to do? Canonicalize!

8. Don’t 40x the non-canonical version.

So don’t take /b and be like, “Oh, okay, that’s not the version we want anymore. We’ll 404 it.” Don’t 404 it when you could 301. If you send it over here with a 301 or you use the rel=canonical in your header, you take all the signals and you point them to A. You lose them if you 404 that in B. Now, all the signals from B are gone. That’s a sad and terrible thing. You don’t want to do that either.

The only time I might do this is if the page is very new or it was just an error. You don’t think it has any ranking signals, and you’ve got a bunch of other problems. You don’t want to deal with having to maintain the URL and the redirect long term. Fine. But if this was a real URL and real people visited it and real people linked to it, guess what? You need to redirect it because you want to save those signals.

When to canonicalize URLs

Last but not least, when should we canonicalize URLs versus not?

I. If the content is extremely similar or exactly duplicate.

Well, if it is the case that the content is either extremely similar or exactly duplicate on two different URLs, two or more URLs, you should always collapse and canonicalize those to a single one.

II. If the content is serving the same (or nearly the same) searcher intent (even if the KW targets vary somewhat).

If the content is not duplicate, maybe you have two completely unique pages about whiteboard pens and whiteboards, but even though the content is unique, meaning the phrasing and the sentence structures are different, that does not mean that you shouldn’t canonicalize.

For example, this Whiteboard Friday about using the rel=canonical, about canonicalization is going to replace an old version from 2009. We are going to take that old version and we are going to use the rel=canonical. Why are we going to use the rel=canonical? So that you can still access the old one if for some reason you want to see the version that we originally came out with in 2009. But we definitely don’t want people visiting that one, and we want to tell Google, “Hey, the most up-to-date one, the new one, the best one is this new version that you’re watching right now.” I know this is slightly meta, but that is a perfectly reasonable use.

What I’m trying to aim at is searcher intent. So if the content is serving the same or nearly the same searcher intent, even if the keyword targeting is slightly different, you want to canonicalize those multiple versions. Google is going to do a much better job of ranking a single piece of content that has lots of good ranking signals for many, many keywords that are related to it, rather than splitting up your link equity and your other ranking signal equity across many, many pages that all target slightly different variations. Plus, it’s a pain in the butt to come up with all that different content. You would be best served by the very best content in one place.

III. If you’re republishing or refreshing or updating old content.

Like the Whiteboard Friday example I just used, you should use the rel=canonical in most cases. There are some exceptions. If you want to maintain the old version but send its ranking signals to the new one, you can republish the old content at a new URL like /a-old, then publish the new version at /a and make that the canonical one. The old version still exists, just at the /old URL you’ve created for it. So for republishing, refreshing, or updating old content, canonicalization is generally the way to go, and you can preserve the old version if you want.

IV. If content, a product, an event, etc. is no longer available and there’s a near best match on another URL.

If you have content that is expiring, a piece of content, a product, an event, something like that that’s going away, it’s no longer available and there’s a next best version, the version that you think is most likely to solve the searcher’s problems and that they’re probably looking for anyway, you can canonicalize in that case, usually with a 301 rather than with a rel=canonical, because you don’t want someone visiting the old page where nothing is available. You want both searchers and engines to get redirected to the new version, so good idea to essentially 301 at that point.

Okay, folks. Look forward to your questions about rel=canonicals, canonical URLs, and canonicalization in general in SEO. And we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


from Moz Blog https://moz.com/blog/rel-canonical

Moz Transitions: Rand to Step Away from Operations and into Advisory Role in Early 2018

Posted by SarahBird

I have some big news to share with you.

As many of you know, three and a half years ago, Rand began to shift his role at Moz. He transitioned from CEO into a product architect role where he could focus his passion and have hands-on impact in evolving our tools. Now, over the next 6 to 9 months he will transition into a supporting role as a Moz Associate. He will continue to be a passionate speaker and evangelist, and you’ll still see his enthusiastic face in Whiteboard Fridays, on the Moz Blog, and on various conference stages. And of course, he is one of our largest shareholders and will remain Chairman of the Board.

This is hard. Rand started Moz (formerly seomoz.org) over 16 years ago as a blog to record what he was learning about this new field. He and his co-founder Gillian Muessig created a marketing agency that focused on helping websites get found in search. They launched their first SaaS product in February 2007, and I joined the company nine months later as the 8th employee. We’ve come a long way. Today, we have over 36,000 customers, 160 team members, a strong values-based culture, great investors, over $42 million in revenue last year, and a large and growing community of marketers. So many people have helped us reach this point.

What else is next for Rand? We’re excited to find out. His book about the last 16 years at Moz comes out next year.

When you see Rand, please show him gratitude and support. He is an incredibly talented, passionate, and productive individual with a commitment to helping others. I know he’s going to continue to make marketing better and spread TAGFEE in all his future roles.


from Moz Blog https://moz.com/blog/rand-to-move-into-advisory-role

How “Message Match” Can Lift Conversion Rates by 212.74% [Case Study]

Posted by bsmarketer

Google offered to build a free mobile website for our past client. But rather than take them up on that very generous offer, they hired us to rebuild it for them (at a cost of $20,000+, versus Google’s free offer).

Smart or dumb?

The problem is that shoving an outdated legacy design onto a smaller screen won’t fix your problems. In fact, it’ll only amplify them. Instead, the trick is to zoom back out to the big picture. Then it’s a fairly straightforward process of:

  1. Figuring out who your customers are
  2. What they want
  3. And how they want it

That way, you can align all of the critical variables (thereby making your “messages match”) in order to improve their experience. Which, if done correctly, should also improve your bottom line; in the end, our client saw a 69.39% cost per conversion decrease with a 212.74% conversion rate lift.

Here’s how you can do the same.

How AdWords pricing works

AdWords is an auction. Kinda, sorta.

It’s an auction-based system where (typically) the highest bidder receives the best positions on the page. But that’s not always the case. It’s possible for someone to rank in the coveted 1–2 positions above you and actually pay less per click than you. (Not to mention convert those people at a higher percentage once they hit your site — but we’ll leave that until later.)

Any marketer worth their salt knows what’s coming up next.

The Quality Score begins to dictate effective pricing. It’s not the end-all be-all PPC metric. But it’s a helpful gauge that lets you know if you’re on the right path to prosperity and profits — or not. It’s a blend of several factors, including the expected click-through rate, ad relevance, and landing page experience. Ad Rank is used in conjunction to determine position based on an ad’s performance. (That’s the 30-second explanation, anyway.)

Years ago, Larry Kim analyzed Quality Score in-depth to determine just what kind of impact it had on what you pay. You should read the full thing. But one of the key takeaways was this:

Note that if your Quality Score is below average, you’ll basically pay a penalty — up to 64% more per conversion than your average advertiser. In a nutshell, for every Quality Score point above the average 5/10 score, your CPA will drop by 16%, on average. Conversely, for every Quality Score point below the average of 5/10, your CPA will rise by 16%.
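Treating the 16%-per-point figure as compounding, the rule of thumb can be sketched like this (an illustration of Larry Kim's estimate, not an official AdWords formula):

```python
# Sketch of Larry Kim's rule of thumb: each Quality Score point above the
# 5/10 average cuts CPA ~16%; each point below raises it by the same factor.
def estimated_cpa(average_cpa, quality_score, step=0.16):
    """Compound the per-point adjustment around a 5/10 baseline."""
    return average_cpa * (1 - step) ** (quality_score - 5)

print(round(estimated_cpa(100, 7), 2))   # two points above average: 70.56
print(round(estimated_cpa(100, 3), 2))   # two points below average: 141.72
```

With a $100 average CPA, climbing from 5/10 to 7/10 saves roughly $30 per conversion under this model, which is why Quality Score work tends to pay for itself.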


Fast forward to just a few months ago, and Disruptive Advertising’s Jacob Baadsgaard analyzed their 2,000+ AdWords accounts (with millions in ad spend) to filter out a similar analysis. They ended up with strikingly similar results:

“In fact, our results are strikingly similar to those reported by Larry Kim. If your quality score increases by 1 point, your cost-per-conversion decreases by 13% (Larry puts it at 16%). If your quality score decreases by 1 point, your cost-per-conversion increases by 13%.”


Coincidence? Unlikely.

But wait, there’s more!

Jumping platforms for a second, Facebook introduced a “Relevance Score” recently. AdEspresso analyzed 104,256 ads over a 45-day period and saw a similar correlation between a higher Relevance Score and lower CPC rates. The inverse is also true.


Okay. Three different analyses, by three different people, across two channels, with three similar results. What can we learn from this?

That the alignment of your ads, your keyword or audience targeting, and your landing pages significantly influence costs (not to mention, eventual results). And what’s the one underlying concept that affects these?

Your “message match.”

How to get message match right

Oli from Unbounce is a masochist. You’d have to be anyway, in order to spend a day clicking on 300 different paid ads, noting message match along the way.

The final tally?

98% of the 300 ads Oli clicked on did NOT successfully match. That’s incredibly bad, as this doesn’t take any PPC ninja skills. All it takes is a little attention to detail. Because what is message match?

You use the same headline, description or value proposition, and image from your ad:

[Image: an ad with strong message match]

And include those same elements on the landing page people visit:

[Image: the matching landing page]

Sure, you probably don’t want to use clip art in your ads and on your landing pages in 2017, but at least they’ve got the basics down.

When you think about this concept holistically, it makes perfect sense. In real life, the majority of communication is nonverbal. Fifty-five percent, in fact, comes down to your expressions, gestures, and posture.

Online you lack that nuance and context. It’s difficult (if not impossible) to strike the same emotional chord with a text-only headline limited to 25 characters as you can through audio and video. It (literally) pays to be as specific and explicit as possible. And while it could take hours to distill all of this down, here’s the CliffsNotes version.

Step #1: Your audience/keywords

AdWords generated about 68% of Google’s revenue in 2014. Last year they made $75 billion. So we’re talking billions with a B here.

A lot of that comes down to a searcher’s (1) intent and (2) urgency, where you bid on classically bottom-of-the-funnel keyphrases and convert ~2–10% of those clicks.


(Facebook’s kind of a different beast, where you instead build a funnel for each step.)

Even though it sounds trite, the best way to come up with keyphrases is to develop a deeper understanding of what makes your potential customers tick (besides doing the obvious and dropping your competitor’s domain name into SEMrush or SpyFu to see what they’re all bidding on).

A nice, actionable example of this is The Ad Grid from Digital Marketer, which helps you figure out which potential “hooks” should/would work for each customer type.

From there, you would obviously hit the keyword research tools, with your Keyword Explorers and SEMrushes, and then distill all of your information down into one nice, neat little package.

Again borrowing from the excellence of others, my favorite approach would be single-keyword ad group (SKAG) from Johnathan Dane at KlientBoost.

For example, one Ad Group would have a single keyphrase with each match type, like the following:

  • Broad: +marriage +proposal +planners
  • Phrase: “marriage proposal planners”
  • Exact: [marriage proposal planners]
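Generating those three variants for any keyphrase is mechanical enough to script. A small sketch (the broad version uses broad match modifier syntax, as in the list above):

```python
# Sketch: produce the three match-type variants for a SKAG from one keyphrase.
def skag_variants(keyphrase):
    words = keyphrase.split()
    return {
        "broad": " ".join("+" + w for w in words),  # broad match modifier
        "phrase": f'"{keyphrase}"',
        "exact": f"[{keyphrase}]",
    }

variants = skag_variants("marriage proposal planners")
print(variants["broad"])   # prints +marriage +proposal +planners
```

Run over a whole keyword list, this turns the tedious part of SKAG building into a copy-paste job.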

This, unsurprisingly, seems time-consuming. That’s because it is.

Don’t worry, because it’s about to get even worse.

Step #2: Your ads

The best way to scale your PPC ad writing is to create a formula. You have different variables that you mix-and-match, watching CTRs and other metrics to determine which combination works best.

Start with something simple, like Johnathan + Klientboost’s example that incorporates the appropriate balance of keyphrase + benefits + action:


For bottom-of-the-funnel, no-frills keyphrases, sometimes simple and direct works best. You don’t have to get overly clever with reinventing the wheel. You just slap in your keyphrase in that little headline space and try to emphasize your primary value prop, USP, or benefit that might get people to click on your ad instead of all the others that look just like it.
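Treating the headline as keyphrase + benefit + action variables, you can enumerate the combinations to test. The benefits and CTAs below are made-up examples:

```python
from itertools import product

# Sketch: mix-and-match ad variables (keyphrase + benefit + action).
# All strings here are hypothetical examples.
keyphrase = "Marriage Proposal Planners"
benefits = ["5-Star Rated", "Stress-Free Planning"]
actions = ["Get a Free Quote", "Book a Consult Today"]

ads = [f"{keyphrase} | {b} | {a}" for b, a in product(benefits, actions)]
for ad in ads:
    print(ad)
```

Each combination becomes one ad variant; let CTR data decide which benefit/action pairing stays.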

Ad writing can get difficult and messy if you get lost in the intangible fluffiness of jargon.

Don’t.

Instead, focus on emphasizing concrete examples, benefits, and outcomes of whatever it is you’re advertising. Here are some of Digital Marketer’s hooks to borrow from:

  1. How does it compare the before and after effect?
  2. How does it make them feel emotionally?
  3. How (specifically) does it improve their average day?
  4. How does it affect their status or vanity?
  5. Is there quantifiable proof of results?
  6. What’s the expected time to results (i.e. speed)?

You can then again strip away the minutia by boiling everything down to variables.


For more reading on this topic, here’s a deeper dive into scaling PPC ad writing on WordStream.

Step #3: Landing page

Okay — here comes the fun part.

Marketing efforts in general fail when we can only make (or are only allowed to make) surface-level changes. Marketing doesn’t equal just advertising, after all.

Made a ton of updates to an AdWords account? Great. You’ll still struggle until you can take full control over the destinations those ads send traffic to, and create new dedicated pages for each campaign.

In an ideal world, each of your SKAGs created above would have their own specific landing page too. If you’re good at math, that landing page total in your head just jumped another 5X most likely. But as we’ve alluded, it’s worth it.

You start with a single new landing page template. Then think of each element as its own interchangeable variable you can mix and match (get it?). For example, the headline, hero image, bullet points and CTAs can evolve or update for one type of customer:

[Image: a landing page variant targeting attorney insurance quotes]

And be quickly duplicated/cloned, then switched out for another to increase message match as much as possible:

[Image: the same page cloned for dentist insurance quotes]

Perfect. Another incredibly time-consuming task to add to your list to get done this week.

Fortunately, there are a few tricks to scale this approach too.

Possibility #1: Dynamic Text Replacement

Unbounce’s ready-made solution will allow you to create a standard landing page, and then automatically (or dynamically) switch out that content based on what someone just searched for.

You can enter these dynamic text fields using their visual builder, then hook it up to your AdWords account so you literally don’t have to lift a finger.


Each section allows you to specify default text to use (similar to how you’d specify a fallback font for all browsers for example).
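The default-text fallback works roughly like this (a generic sketch of the idea, not Unbounce's actual implementation; the `kw` parameter name is made up):

```python
from urllib.parse import urlparse, parse_qs

# Sketch: dynamic text replacement with a default fallback, the way a
# landing page tool substitutes the searched keyword into the headline.
def dynamic_headline(landing_url, template, default_keyword):
    params = parse_qs(urlparse(landing_url).query)
    keyword = params.get("kw", [default_keyword])[0]  # fall back if absent
    return template.format(keyword=keyword)

print(dynamic_headline("https://example.com/lp?kw=dentist%20insurance",
                       "Compare {keyword} Quotes", "business insurance"))
# prints: Compare dentist insurance Quotes
```

If the visitor arrives without the parameter, the template simply renders with the default keyword, so the page never shows an empty headline.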

Possibility #2: Advanced Custom Fields

This one requires a little bit of extra leg work, but it makes technical people smile.

My company used Advanced Custom Fields + Flexible Content to create these variable options on the backend of WordPress pages, so we (and clients) can simply mass-produce unique content at scale.

For the example used earlier, here’s what switching out the Hero section on the earlier landing page example would look like:

Click and upload an image to a pre-formatted space. Select a few radio options for page placement. Easy-peasy.

The headline and subhead spaces work the same way.

Now making changes or updates to landing pages (to get message match right) takes just a few seconds per page.

We even build out these options for secondary calls-to-action on a page as well, like footer CTAs:

This way, with the click of a button, we can set up and test how different CTA options might work.

For example, you can quickly test how a simple, direct CTA compares with one of the hooks that we came up with in a previous step.

For extra credit, you can combine these customizable features based on your inbound traffic segmentation with your exit intent (or overlay) messaging.


How increasing PPC message match drives results

So back to the results.

After updating the ad account and making major modifications to our client’s landing page infrastructure, here’s what improved message match can deliver (in a competitive industry with mid-five-figure monthly spend).

In 2015, before all of this work, the cost per converted click was $482.41 and conversion rate across all accounts was only 4.08%.


During the same 30-day period in 2016 (after all of this work), the cost per converted click fell to only $147.65 and the conversion rate jumped to 12.76%.


That means way more leads, for far less. And this just scratches the surface, because in many cases, AdWords conversions are still just leads. Not true sales.
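Putting those two before/after figures side by side makes the improvement concrete. A quick check of the arithmetic:

```python
# Before/after figures quoted above (same 30-day period, 2015 vs. 2016).
cpc_before, cpc_after = 482.41, 147.65  # cost per converted click, USD
cr_before, cr_after = 4.08, 12.76       # conversion rate, percent

cost_reduction = (cpc_before - cpc_after) / cpc_before  # fraction saved
cr_lift = cr_after / cr_before                          # multiple on conv. rate

print(f"Cost per converted click fell {cost_reduction:.0%}")  # fell 69%
print(f"Conversion rate multiplied {cr_lift:.1f}x")           # 3.1x
```

In other words, each converted click cost roughly 69% less, while the conversion rate more than tripled.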

We haven’t even touched on post-lead conversion tactics like marketing automation, where you’d extend the same message match approach by sending targeted content that builds on the topics or hooks people originally searched for and converted on.

Or layering in newer (read: less competitive or expensive) options like Facebook, automatically syncing these leads to your aforementioned marketing automation workflows that are pre-configured with the same message match in mind.

The possibilities are endless, and the same laser-focus on aligning message match with each channel has the potential to increase results throughout the entire funnel.

Conclusion

When a sale moves from offline to online, we lose a lot of the context for communication that we commonly rely upon.

As a result, the focus tends to shift more towards clarity and specificity.

There’s no greater example than looking at how today’s most popular online ad platforms work, where the costs people pay are directly tied to their performance and ability to “match” or align their ads and content to what people are looking for.

Clever vs. clear?

Who cares — as long as your messages match.


from Moz Blog https://moz.com/blog/message-match-conversion-rates