Saturday, 31 August 2019

Building a marketing operations team from scratch, one year in

SAN JOSE, CA — Rachel Beck had led Cisco Meraki’s demand gen team for just over a year when she was named the global manager of marketing operations in April 2018. The newly appointed marketing ops leader was put in charge of building a team that would oversee the tools, processes and systems to power Cisco Meraki’s marketing team — something the company previously did not have.

“I was spending a lot of time on things that were foundational,” said Beck of her previous role before being named global manager of marketing operations. When the CMO realized Beck was more focused on foundational issues than demand gen priorities, he knew it was time to put a marketing ops team in place.

“Before the team existed, we had a queue of things to work on,” said Beck. She started by listing all the projects the team could potentially be giving time to, and of those, which ran the greatest risk of putting the company in jeopardy if they didn’t get done.

“You have to assess and prioritize because there is no end to the things you could do.”

Finding the right talent. In the 13 months Beck has been leading the marketing ops team, she has developed six roles. Her advice: learn to look far and wide for the right talent, but know that new hires may be available internally.

“Even if you’re starting a new team, don’t overlook resources that already exist within your company,” said Beck. When hiring for a new role, Beck tries to be as truthful as possible about the position.

“I try to be as specific and honest as I can in the job descriptions,” said Beck, noting that this hiring tactic may result in fewer candidates, but a better pool of potential hires.

“It’s counter-intuitive in some ways,” said Beck. “Instead of selling a role, you’re often saying we don’t know how to do this and hoping you do.”

Creating a mission and putting processes in place to support it. Beck admits that a mission statement isn’t going to get the work done, but she does believe it helps the entire team understand its purpose and set goals. She wrote the following mission statement for her group: “Our mission is to empower marketers and delight customers by intelligently scaling the department, maximizing marketing effectiveness and reducing operational friction.”

To keep her team on track and focused on its mission, she created a process to determine which projects they would take on and which were not worthy of her team’s effort. The process involves four primary requirements:

  • Is it a far-reaching objective that will impact multiple teams?
  • Is it scalable?
  • How impactful is it? Will it improve ROI or help increase the marketing team’s impact?
  • Is it future proof?

Beck said her CMO made it clear early on he didn’t want the marketing ops team to be a catch-all for every issue that came up, so she developed this process to set guidelines on how other departments should engage with her team.

Lessons learned. A year into her role, Beck has learned a number of lessons, starting with the importance of having a mission statement and using it to inform your goals. She also said getting buy-in from top management is crucial.

“Does the highest person in my company have at least some idea of what we do and why we do it?”

Beck also said it’s important to be transparent: the customer is always right (the customer being the internal teams that submit requests to marketing ops), but they still might not get what they want, because her team has to focus on the greater good. For example, marketing ops teams may have to control admin privileges for marketing tools, but such decisions benefit the entire organization, making processes more efficient for all marketers.

“People are angsty about not having access to marketing automation, but the greater good is we need to reduce risk and have governance in place.”


This story first appeared on MarTech Today.


About The Author

Amy Gesenhues is Third Door Media’s General Assignment Reporter, covering the latest news and updates for Marketing Land and Search Engine Land. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs.com, SoftwareCEO.com, and Sales and Marketing Management Magazine. Read more of Amy’s articles.


SEMrush expands to Amazon with Sellerly for product page testing

SEMrush is a popular competitive intelligence platform used by search marketers. The company, recently infused with $40 million in funding to expand beyond Google, Bing and Yahoo insights, has launched a new product called Sellerly specifically for Amazon sellers.

What is Sellerly? Announced Monday, Sellerly is designed to give Amazon sellers the ability to split test product detail pages.

“By introducing Sellerly as a seller’s buddy in Amazon marketing, we hope to improve hundreds of existing Amazon sellers’ strategies,” said SEMrush Chief Strategy Officer Eugene Levin in a statement. “Sellerly split testing is only the first step here. We’ve already started to build a community around the new product, which is very important to us. We believe that by combining feedback from users with our leading technology and 10 years of SEO software experience, we will be able to build something truly exceptional for Amazon sellers.”

How does it work? Sellerly is currently free to use. Amazon sellers connect their Amazon accounts to the tool in order to manage their product pages. Sellers can make changes to product detail pages to test against the controls. Sellerly collects data in real time and sellers can then choose winners based on views and conversions.

Sellers can run an unlimited number of tests.

Why we should care. Optimized product detail pages are a critical aspect of success on Amazon. As Amazon continues to generate an increasing share of e-commerce sales for merchants big and small, and competition only intensifies, product page optimization becomes even more critical. Amazon does not support A/B testing natively, and Sellerly is not the first split-testing product for Amazon product pages to reach the market; Splitly (paid) and Listing Dojo (free) offer similar services.

This story first appeared on Search Engine Land.


About The Author

Ginny Marvin is Third Door Media’s Editor-in-Chief, managing day-to-day editorial operations across all of our publications. Ginny writes about paid online marketing topics including paid search, paid social, display and retargeting for Search Engine Land, Marketing Land and MarTech Today. With more than 15 years of marketing experience, she has held both in-house and agency management positions. She can be found on Twitter as @ginnymarvin.


Be smart, advertisers. Here’s how to approach rising Google brand CPC

Branded keywords, or keywords that include the name of the advertiser bidding on those keywords, have long been a source of controversy in the paid search industry. For years, many paid search managers grouped these keywords into reports that reflected total account performance.

This tends to overinflate the value of paid search campaigns since most brand queries are navigational and reflect a user that is already intent on buying from the brand searched for. As such, brand conversion rate is typically significantly higher than that of non-brand traffic and high brand return on ad spend (ROAS) can cover up underperforming non-brand campaigns.

These days, most advertisers are hip to the fact that they should be looking at brand and non-brand performance separately. However, there have been a lot of changes over time that might impact an advertiser’s brand keyword strategy, starting with a significant increase in the price of these keywords over the years.

The price of brand keywords ain’t what it used to be

Google has long given advertisers an advantage over competitors in bidding on their brand terms by way of quality score, which is generally very high for an advertiser bidding on its own terms and lower for competitors trying to show ads on those terms. This makes a lot of sense in terms of providing users with a quality experience since the query indicates that the user is probably most interested in going to the website for that particular brand and that Google should prioritize the brand’s listing as opposed to a competitor.

This quality score advantage plays a direct role in the price advertisers pay for brand keywords and has long suppressed average cost-per-click below what many advertisers might be willing to pay for brand traffic. However, that gap is becoming smaller over time.

Evaluating Merkle (my employer) advertiser data, average brand CPC rose more than 20% between Q4 2017 and Q3 2018 before finally slowing over the last couple of quarters.

Google’s response to the increases that specific advertisers see typically references competitive forces encroaching on these auctions. That may well be true, but Google itself is responsible for the extent to which competitors can drive up brand CPC.

This goes back to the quality score advantage most advertisers have over competitors for their brand terms. Changes to advertisers’ relative quality score impact the ad ranks of those brands, which directly affects the CPC an advertiser must pay.

For example, say Google started giving competitors even worse quality scores for an advertiser’s brand keywords. If the advertiser were paying just enough to beat the ad rank of the closest competitor, this change should result in lower brand CPC, since competitors’ ad ranks would go down with worse quality score.

The opposite can certainly also happen, with Google giving competitors greater quality scores relative to an advertiser bidding on its brand terms. This would naturally increase an advertiser’s CPC.
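A commonly cited simplification of how Google prices a click helps make this concrete. Treat it as an approximation rather than the exact formula the live auction uses today, since the real system has more moving parts:

    \text{Actual CPC} \approx \frac{\text{Ad Rank of the advertiser below you}}{\text{your Quality Score}} + \$0.01

Lowering a competitor’s quality score lowers its ad rank (roughly its bid multiplied by its quality score), which shrinks the numerator and, with it, the brand’s CPC; raising competitors’ relative quality scores does the opposite.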

Of course, Google’s response to unpalatable CPC increases is to call out the fact that advertisers have control over how much they pay for branded traffic.

Don’t like brand CPC? Just lower the bid of course!

Google is well within its rights to charge as much as it wants for brand clicks as long as advertisers are paying less than the max CPC assigned to those keywords. As such, shrinking the gap between the price paid for a keyword and the max CPC bid is one method of keeping CPC increases in a palatable range.

Advertisers that have already seen brand CPC go up significantly often combat the increase by testing out lower bids in a step-down approach to figure out how much traffic is lost at different levels. This can be useful in determining a bid that keeps the advertiser visible for as many brand searches as possible but also limits exposure to increases in CPC by reducing the gap between the bid and the average CPC.

However, the auction can change at any time with updates, either by competitors or Google, that throw a wrench into the conclusions reached from past test results, and a bid that gets an advertiser the vast majority of brand traffic today might not cut it tomorrow. Sometimes such increases can spawn from mistakes on Google’s end, but the search giant has become far less generous over time in making sure advertisers feel whole after such events.

Don’t ask for a refund

Back in 2016, brand CPC briefly spiked on phones before coming back down after an article of mine exposed the issue.

The increase was the result of an unintentional issue on Google’s end, and affected advertisers received a credit for the overspend from Google.

Fast-forward to 2019, and we recently saw a similar spike in brand CPC for some advertisers, attributed by Google to what it described as a ‘bug.’ CPC soon returned to normal, but there were no credits given out this go around, even for advertisers that saw dramatic increases in ad spend from the issue. Where keeping good relations with advertisers may have once led to a goodwill gesture to make good, Google now seems to favor the argument that as long as average CPC is below max CPC, spikes in spend are on the advertiser.

This transition in attitude only makes it more important for advertisers to control the gap between average CPC and bids to ensure there’s only so much wiggle room for a similar bug to ramp up costs since Google clearly won’t be saving anyone from itself. Setting logical budgets to cap brand spend based on campaign history and creating systems for intraday checks can also go a long way towards limiting the potential damage from surges in CPC.

I think it’s particularly true that advertisers need to be protective in these ways in light of Google’s recent decision to sunset average position.

Don’t let new metrics drive you to bid too much

Google announced in February that it would be eliminating the average position metric come this September. Instead of average position, it recommends that advertisers rely on impression and click share metrics in assessing how competitive a particular ad is in relevant auctions.

The announcement highlighted that average position is often a messy metric to use when assessing where an ad is falling on the page. However, as with most updates, it stands to reason that there might be some upside to Google in transitioning advertisers away from average position and towards metrics like absolute top impression share. This is particularly true for brand keywords, which are often judged based primarily on how well they take up the top possible placement in search results.

For example, some advertisers see a perfect average position of 1.0 for brand keywords but an absolute top impression share of just 75%. Once these brands no longer have average position to fly by, it’s possible that they’ll turn to bidding based on achieving as high an absolute top impression share as possible. Given 75% might not seem good enough, it’s quite possible this will lead to increased bids, which would, in turn, give Google more wiggle room in charging higher average CPC.

Knowing this, it might make sense for advertisers to assess brand traffic growth over time and make bid adjustments based on that. We find that Merkle advertisers typically see Y/Y click growth around 5%.

However, brand traffic growth is hugely dependent on efforts outside of paid search, such as print and television advertising. As such, it’s necessary to adjust expectations based on such efforts as well other big-picture factors such as shifts in overall market share and consumer demand for the specific offerings of an advertiser. This gets messy quickly but is at least one alternative data point to reference in assessing whether bids should get ramped up to maximize absolute top impression share.

Of course, none of this takes into account the role of organic listings in brand search.

Can I just not pay for brand listings? Maybe!

After years of debate, the answer to whether a brand can forego bidding on brand keywords altogether and still receive all the traffic from branded queries remains the same: it depends on the brand.

If the brand is big enough and the competition sparse enough that all or nearly all brand searchers end up making their way to the brand’s website without a paid ad, it should certainly consider turning off brand ads to save the money. However, most brands do see a dip in traffic and orders when turning off brand ads, and the only way to measure just how significant that dip might be is through testing – though again, changes to the SERP can render any past tests useless at a moment’s notice.

Advertisers’ appetite for bidding on brand keywords despite higher CPCs isn’t infinite, and there is a point at which brands should call it quits despite the potential for lost clicks and sales, though Google certainly doesn’t want to reach that point. While paid search marketers are somewhat at the mercy of Google’s auction systems in determining CPC, they can still take proactive steps to learn as much as possible about the incremental lift coming from brand ads and install safeguards to ensure increases in CPC are controlled.


Opinions expressed in this article are those of the guest author and not necessarily Marketing Land.


About The Author

Andy Taylor is a Senior Research Analyst at RKG, responsible for analyzing trends across the digital marketing spectrum for best practices and industry commentary. A primary contributor to the Merkle | RKG Blog, Dossier, and quarterly Digital Marketing Report, his 4+ years of experience have seen him master and provide valuable insights into topics that extend across paid search, comparison shopping engines, display advertising, SEO, and social media. Prior to coming to RKG, Andy worked as an event organizer for a political campaign and dabbled in freelance writing. A graduate of the University of Virginia with a degree in Economics, he likes to spend his free time watching documentaries and selling homemade ice cream sandwiches at farmer’s markets with his girlfriend.


Salesforce’s Pardot went down for 15 hours, exposing data in the cloud

Last Friday’s Salesforce outage meant work came to a halt for thousands of marketing and sales users locked out of Pardot and Salesforce Marketing Cloud.

The outage, reportedly caused by a faulty Pardot database script, began after reports surfaced that users were able to see and edit all of their company’s data, regardless of their permission settings. Salesforce quickly responded by cutting off access to current and past Pardot customers as it worked to resolve the faulty script.

“As a result, customers who were not affected may have also experienced service disruption, including customers using Marketing Cloud integrations,” a Salesforce spokesperson said in a statement. Salesforce and Pardot have not responded to Marketing Land’s request to comment on the outage and remediation plan.

Why we should care

Productivity all but stopped for many organizations that rely on Salesforce for their sales and marketing efforts. If you couldn’t access any of your analytics, content, data, or contacts, how would your team operate?

It’s yet another reminder that internal and external workflows and processes we rely on to conduct business throughout our organizations could all potentially be impacted by an outage. The impact of lost productivity may have been especially devastating to small organizations that were unable to access their instances and solely depend on Salesforce for their business operations.

Digital marketers strive for seamless integrations and unified marketing execution, but this outage should serve as a wake-up call. Digital marketers need to develop contingency plans for handling worst-case scenarios and platform outages that are out of our control. If your organization doesn’t have a marketing-specific business continuity plan or disaster recovery plan, work with your internal stakeholders to identify risks and outline a plan of action in case of an emergency.

More on the outage

  • Users were locked out for 15 hours before Admins regained access on Friday.
  • Salesforce users who want to monitor for updates related to this issue can do so on the Salesforce status page.
  • Salesforce has shared two workarounds for Admins to restore production profiles and permissions from a Sandbox Copy:
    • Option 1:
    • Under “Administration/Users”, check the Profiles and Permission Sets in Setup to determine if your Sandbox Copy contains a valid data backup.
    • If your non-admin profiles are configured such that all of the “Standard Object Permissions” (read, create, edit, delete) are unchecked, the Sandbox org was not impacted and is not a valid source for recovery.
    • Permission Sets and User Profiles can be deployed from Sandbox to Production orgs through the following links:
    • Change Set Documentation
    • Ant Migration Tool Documentation
    • Please note that your sandbox configuration could be outdated and not identical to your production org before the incident. Carefully review these settings before deploying them to production. Salesforce recommends testing with one profile before migrating all profiles; a sample deployment manifest is sketched after this list.
    • Option 2:
    • If a Sandbox containing production profiles and permission sets does not exist and there is an organizational need for you to restore, Admins can manually modify Profile and Permission Set configurations to grant appropriate access to users:
    • Edit Profiles Documentation
    • Permission Set Documentation
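If you go the change set or Ant Migration Tool route, the deployment is driven by a package.xml manifest. The snippet below is a minimal sketch, not Salesforce’s own example, of a manifest that retrieves or deploys all profiles and permission sets; the API version shown is illustrative and should match your org:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- package.xml: lists the metadata types to move between orgs -->
    <Package xmlns="http://soap.sforce.com/2006/04/metadata">
        <types>
            <!-- all user profiles -->
            <members>*</members>
            <name>Profile</name>
        </types>
        <types>
            <!-- all permission sets -->
            <members>*</members>
            <name>PermissionSet</name>
        </types>
        <!-- illustrative API version; use the one matching your org -->
        <version>45.0</version>
    </Package>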

About The Author

Jennifer Videtta serves as Third Door Media’s Senior Editor, covering topics from email marketing and analytics to CRM and project management. With over a decade of organizational digital marketing experience, she has overseen digital marketing operations for NHL franchises and held roles at tech companies including Salesforce, advising enterprise marketers on maximizing their martech capabilities. Jennifer formerly organized the Inbound Marketing Summit and holds a certificate in Digital Marketing Analytics from MIT Sloan School of Management.


Complete guide to Google Search Console


At the frontlines in the battle for SEO is Google Search Console (GSC), an amazing tool that helps you become more visible in search engine results pages (SERPs) and provides an in-depth analysis of the web traffic being routed to your doorstep. And it does all this for free.

If your website marks your presence in cyberspace, GSC boosts viewership and increases traffic, conversions, and sales. In this guide, SEO strategists at Miromind explain how you benefit from GSC, how you integrate it with your website, and what you do with its reports to strategize the domain dominance of your brand.

What is Google Search Console (GSC)?

Created by Google, Google Webmaster Tools (GWT) initially targeted webmasters. Offered by Google as a free service, GWT metamorphosed into its present form, Google Search Console (GSC). It’s a cutting-edge tool widely used by an increasingly diverse group of digital marketing professionals, web designers, app developers, SEO specialists, and business owners.

For the uninitiated, GSC tells you everything that you wish to know about your website and the people who visit it daily. For example, how much web traffic you’re attracting, what people are searching for on your site, the kind of platform (mobile, app, desktop) people are using to find you, and, more importantly, what makes your site popular.

Then GSC takes you on a deeper dive under the hood to find and fix errors, submit sitemaps, and check file integrity.

Precisely what does Google Search Console do for you? These are the benefits.

1. Search engine visibility improves

Ever experienced the sinking sensation of having done everything demanded of you for creating a great website, but people who matter can’t locate you in a simple search? Search Console makes Google aware that you’re online.

2. The virtual image remains current and updated

When you’ve fixed broken links and coding issues, Search Console helps you update the changes in such a manner that Google’s search carries an accurate snapshot of your site minus its flaws.

3. Keywords are better optimized to attract traffic

Wouldn’t you agree that knowing what draws people to your website can help you shape a better user experience? Search Console opens a window to the keywords and key phrases that people frequently use to access your site. Armed with this knowledge, you can optimize the site to respond better to specific keywords.

4. Safety from cyber threats

Can you expect to grow business without adequate protection against external threats? Search Console helps you build efficient defenses against malware and spam, securing your growing business against cyber threats.

5. Content figures prominently in rich results

It’s not enough to merely figure in a search result. How effectively are your pages making it into Google rich results? These are the cards and snippets that carry extra information such as ratings and reviews, anything that makes for a better user experience for people searching for you. Search Console gives you a status report on how your content is figuring in rich results so you can remedy any deficit it detects.

6. Site becomes better equipped for AMP compliance

You’re probably aware that mobile friendliness has become a search engine ranking parameter. This means that the faster your pages load, the more user-friendly you’re deemed to be. The solution is to adopt accelerated mobile pages (AMP), and Search Console helpfully flags the areas where you’re not compliant.

7. Backlink analysis

Backlinks, the links from other websites pointing back to yours, give Google an indication of the popularity of your site and how worthy you are of citation. With Search Console, you get an overview of all the websites linking to you, and you get a deeper insight into what motivates and sustains your popularity.

8. The site becomes faster and more responsive to mobile users

If searchers are abandoning your website because of slow loading speeds or any other glitch, Search Console alerts you so you can take remedial steps and become mobile-friendly.

9. Google indexing keeps pace with real-time website changes

Significant changes that you make on the website could take weeks or months to show up in the Google Search Index if you sit tight and do nothing. With Search Console, you can edit, change, and modify your website endlessly, and request that the changes be indexed by Google promptly.

By now you have a pretty good idea why Google Search Console has become the must-have tool for optimizing your website pages for improved search results. It also helps ensure that your business grows in tandem with the traffic that you’re attracting and converting.

Your eight-step guide on how to use Google Search Console

1. How to set up your unique Google Search Console account

Assuming that you’re entirely new to GSC, your immediate priority is to add the tool and get your site verified by Google. By doing this, you’ll be ensuring that Google classifies you unambiguously as the owner of the site, whether you’re a webmaster, or merely an authorized user.

This simple precaution is necessary because you’ll be privy to an incredibly rich source of information that Google wouldn’t like unauthorized users to have access to.

You can use your existing Google account (or create a new one) to access Google Search Console. It helps if you’re already using Google Analytics because the same details can be used to login to GSC. Your next step is to open the console and click on “Add property”.

Screenshot of adding a property in Google Search Console

By adding your website URL into the adjacent box, you get an umbilical connection to the console so you can start using its incredible array of features. Take care to add the prefix “https” or “www” so Google loads the right data.

2. How to enable Google to verify your site ownership

Screenshot of Google verifying site ownership

Option one

How to add an HTML tag to help Google verify ownership

Once you have established your presence, Google will want to verify your site. At this stage, it helps to have some experience of working in HTML. It’ll be easier to handle the files you’re uploading; you’ll have a better appreciation of how the website’s size influences the Google crawl rate, and gain a clearer understanding of the Google programs already running on your website.

Screenshot of adding an HTML tag to help Google verify ownership

If all this sounds like rocket science, don’t fret because we’ll be hand-holding you through the process.

Your next step is to open your homepage code and paste the Search Console-provided HTML tag within the <head> section of your site’s HTML code.

The newly pasted code can coexist with any other code in the <head> section; it’s of no consequence.

An issue arises if you don’t see a <head> section, in which case you’ll need to create one to embed the Search Console-generated code so that Google can verify your site.

Save your work and come back to the homepage to view the source code; the console verification code should be clearly visible in the <head> section, confirming that you have done the embedding correctly.
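For illustration, the tag Search Console gives you is a single meta element placed inside the <head> section; the content value below is a made-up placeholder, and your own token will differ:

    <head>
        <title>Your page title</title>
        <!-- Search Console verification tag; the content value is a placeholder -->
        <meta name="google-site-verification" content="your-unique-verification-token" />
    </head>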

Your next step is to navigate back to the console dashboard and click “Verify”.

At this stage, you’ll see one of two messages: a screen confirming that Google has verified the site, or a pop-up listing onsite errors that need to be rectified before verification can complete. By following these steps, Google will be confirming your ownership of the site. It’s important to remember that once the Google Search Console code has been embedded onsite and verified, any attempt to tamper with or remove the code will have the effect of undoing all the good work, leaving your site in limbo.

Getting Google Search Console to verify a WordPress website using HTML tag

Even if you have a WordPress site, there’s no escape from the verification protocol if you want to link the site to reap the benefits of GSC.

Assuming that you’ve come through the stage of adding your site to GSC as a new property, this is what you do.

The WordPress SEO plugin by Yoast is widely acknowledged to be an awesome SEO solution tailor-made for WordPress websites. Installing and activating the plugin gives you a conduit to the Google Search Console.

Once Yoast is activated, open the Google Search Console verification page, and click the “Alternate methods” tab to get to the HTML tag.

You’ll see a central box highlighting a meta tag with certain instructions appearing above the box. Ignore these instructions, select and copy only the code located at the end of the thread (and not the whole thread).

Screenshot of verifying a WordPress website using Yoast

Now head back to your WordPress dashboard and click through SEO > Dashboard. In the new screen, clicking “Webmaster tools” opens the “Webmaster tools verification” window. The window displays three boxes; be sure to paste the previously copied HTML code into the Google Search Console box, and save the changes.

Now, all you have to do is revert to the Google Search Console and click “Verify” upon which the console will confirm that verification is a success. You are now ready to use GSC on your WordPress site.

Option two

How to upload an HTML file to help Google verify ownership

This is your second verification option. Once you’re in Google Search Console, proceed from “Manage site” to “Verify this site” to locate the “HTML file upload” option. If you don’t find the option under the recommended method, try the “Other verification methods”.

Once you’re there, you’ll be prompted to download an HTML file which must be uploaded in its specified location. If you change the file in any manner, Search Console won’t be able to verify the site, so take care to maintain the integrity of the download.

Once the HTML file has been uploaded, go back to Search Console and click “Verify”. If everything has been uploaded correctly, you will see a page letting you know that the site has been verified.

Once again, as in the first option we’ve listed, don’t change, modify, or delete the HTML file as that’ll bring the site back to the unverified status.

Option three

Using the Google Tag Manager route for site verification

Before you venture into the Google Search Console, you might find it useful to get the hang of Google Tag Manager (GTM). It’s a free tool that helps you manage and deploy marketing and analytics tags on your website or app.

You’ll observe that GTM doubles up as a useful tool to simplify site verification for Google Search Console. If you intend to use GTM for site verification, there are two precautions you need to take: open your GTM account and make sure you have the “View, Edit, and Manage” permission level, and ensure that the GTM container snippet appears immediately after the opening <body> tag in your HTML code.

Once you’re done with these simple steps, revert back to GSC and follow this route – Manage site > Verify this site > Google Tag Manager. By clicking the “Verify” option in Google Tag Manager, you should get a message indicating that the site has been verified.

Pop up screenshot of site verification using Google Tag Manager

Once again, as in the previous options, never attempt to change the character of the GTM code on your site as that may bring the site back to its unverified position.

Option four

Securing your status as the domain name provider

Once you’re done with the HTML file tagging or uploading, Google will prompt you to verify the domain that you’ve purchased or the server where your domain is hosted, if only to prove that you are the absolute owner of the domain, and all its subdomains or directories.

Open the Search Console dashboard and zero in on the “Verify this site” option under “Manage site”.

You should be able to locate the “Domain name provider” option either under the “Recommended method” or the “Alternate method” tab. When you are positioned in the “Domain name provider”, you’ll be shown a listing of domain hosting sites that Google provides for easy reference.

Screenshot of securing yourself as the domain name provider

At this stage, you have two options. If your host shows up in the list, select it and follow the instructions Google displays for that provider. If it doesn’t show up, click the “Other” tab to receive guidelines on creating a DNS TXT code aimed at your domain provider. In some instances, the DNS TXT code may not work with your provider; if that mirrors your dilemma, create a DNS TXT record or CNAME record customized for your provider.
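As a rough illustration, the TXT record Google asks for is simply a verification string published on your domain. The token below is a placeholder, and the exact steps for adding the record depend on your DNS provider’s control panel:

    ; DNS zone file entry (placeholder token shown)
    example.com.    3600    IN    TXT    "google-site-verification=abc123placeholdertoken"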

3. Integrating the Google Analytics code on your site

If you’re new to Google Analytics (GA), this is a good time to get to know this free tool. It gives you amazing feedback which adds teeth to digital marketing campaigns.

At a glance, GA helps you gather and analyze key website parameters that affect your business. It tracks the number of visitors converging on your domain, the time they spend browsing your pages, and the specific keywords in your site that are most popular with incoming traffic.

Most of all, GA gives you a fairly comprehensive idea of how efficiently your sales funnel is attracting leads and converting customers. The first thing you need to do is to verify whether the website has the GA tracker code inserted in the <head> segment of the homepage HTML code. If the GA code is to carry out its tracking functions correctly, you have to ensure that the code is placed only in the <head> segment and not elsewhere, such as in the <body> segment.
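For reference, the global site tag (gtag.js) version of the Google Analytics snippet placed inside the <head> segment looks roughly like the sketch below; “UA-XXXXXXXX-X” is a placeholder for your own property ID, and older sites may still use the analytics.js snippet instead:

    <!-- Global site tag (gtag.js) - Google Analytics; property ID is a placeholder -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXXX-X"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'UA-XXXXXXXX-X');
    </script>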

Back in the Google Search Console, follow the given path – Manage site > Verify this site till you come to the “Google Analytics tracking code” and follow the guidelines that are displayed. Once you get an acknowledgment that the GA code is verified, refrain from making any changes to the code to prevent the site from reverting to unverified status.

Google Analytics vs. Google Search Console – Knowing the difference and appreciating the benefits

For a newbie, both Google Analytics and Google Search Console appear like they’re focused on the same tasks and selling the same pitch, but nothing could be further from the truth.

Read also: An SEO’s guide to Google Analytics

GA’s unrelenting focus is on the traffic that your site is attracting. GA tells you how many people visit your site, the kind of platform or app they’re using to reach you, the geographical source of the incoming traffic, how much time each visitor spends browsing what you offer, and which are the most searched keywords on your site.

While GA gives you an in-depth analysis of the efficiency (or otherwise) of your marketing campaigns and customer conversion pitch, Google Search Console peeps under the hood of your website to show you how technically sound you are in meeting the challenges of the internet.

GSC provides insider information on questions such as:

  • Are there issues blocking the Google search bot from crawling?
  • Are website modifications being instantly indexed?
  • Who links to you and which are your top-linked pages?
  • Is there malware or some other cyber threat that needs to be quarantined and neutralized?
  • Is your keyword strategy optimized to fulfill searcher intent?

GSC also opens a window into manual actions, if any, issued against your site by Google for perceived non-compliance with the Webmaster guidelines.

If you open the manual actions report in the Search Console message center and see a green check mark, consider yourself safe. But if there’s a listing of non-compliances, you’ll need to fix either the individual pages or sometimes the whole website and place the matter before Google for a review.

Screenshot of a complying site on Google Search Console

Manual actions must be looked into because failure to respond places your pages in danger of being omitted from Google’s search results. Sometimes, your site may attract manual action for no fault of yours, like a spammy backlink that violates Webmaster quality guidelines, and which you can’t remove.

Screenshot of a site non-compliance on Google Search Console

In such instances, you can use the GSC “Disavow Tool” to upload a text file, listing the affected URLs, using the disavow links tool page in the console.

If approved, Google will recrawl the site and reprocess the search results pages to reflect the change.

Basically, GA is more invested in the kind of traffic that you’re attracting and converting, while GSC shows you how technically accomplished your site is in responding to searches and in defining the quality of user experience.

Packing power and performance by combining Google Analytics and Google Search Console

You could follow the option of treating GA and GSC as two distinct sources of information and analyze the reports you access, and the world would still go on turning.

But it may be pertinent to remember that both tools present information in vastly different formats, even in areas where they overlap. It follows that integrating both tools presents you with additional analytical reports that you’d otherwise be missing; reports that go the extra mile in giving you the kind of design and marketing inputs that lay the perfect foundation for great marketing strategies.

Assuming you’re convinced of the need for combining GA and GSC, this is what you do.

Open the Google Search Console, navigate to the gear (settings) icon, and click the “Google Analytics Property” tab.

Screenshot of how to combine Google Analytics and Google Search Console

This shows you a listing of all the GA accounts that are operational in the Google account.

Hit the save button on all the accounts that you’ll be focusing on, and with that small step, you’re primed to extract maximum juice from the excellent analytical reporting of the GA-GSC combo.

Just remember to carry out this step only after the website has been verified by Google by following the steps we had outlined earlier.

What should you do with Google Search Console?

1. How to create and submit a sitemap to Google Search Console

Is it practical to hand over the keys to your home (website) to Google and expect Google to navigate the rooms (webpages) without assistance?

You can help Google bots do a better job of crawling the site by submitting the site’s navigational blueprint or sitemap.

The sitemap is your way of showing Google how information is organized throughout your webpages. You can also position valuable details in the metadata, information on textual content, images, videos, and podcasts, and even mention the frequency with which the page is updated.

We’re not implying that a sitemap is mandatory for Google Search Console, and you’re not going to be penalized if you don’t submit the sitemap.

But it is in your interests to ensure that Google has access to all the information it needs to do its job and improve your visibility in search engines, and the sitemap makes the job easier. Ultimately, it works in your favor when you’re submitting a sitemap for an extensive website with many pages and subcategories.

For starters, decide which web pages you want Google bots to crawl, and then specify the canonical version of each page.

What this means is that you’re telling Google to crawl the original version of any page to the exclusion of all other versions.

Then create a sitemap either manually or using a third-party tool.

At this stage, you have the option of adding the sitemap to the robots.txt file in your source code or submitting it directly through Search Console.

Read also: Robots.txt best practice guide + examples

Assuming that you’ve taken the trouble to get the site verified by GSC, revert back to the search console, and then navigate to “Crawl” and its subcategory “Sitemaps.”

On clicking “Sitemaps” you will see a field “Add a new sitemap”. Enter the URL of your sitemap in a .xml format and then click “Submit”.

Screenshot of adding a site map

With these simple steps, you’ve effectively submitted your sitemap to Google Search Console.
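For reference, the file you just submitted follows the sitemaps.org protocol: one <url> entry per canonical page. A bare-bones sketch with placeholder URLs and dates looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <!-- one <url> block per canonical page; values below are placeholders -->
        <url>
            <loc>https://www.yourwebsite.com/</loc>
            <lastmod>2019-08-01</lastmod>
        </url>
        <url>
            <loc>https://www.yourwebsite.com/specificcategory</loc>
            <lastmod>2019-07-15</lastmod>
        </url>
    </urlset>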

2. How to modify your robots.txt file so search engine bots can crawl efficiently

There’s a file embedded in your website that doesn’t figure too frequently in SEO optimization circles. The minor tweaking of this file has major SEO boosting potential. It’s virtually a can of high-potency SEO juice that a lot of people ignore and very few open.

It’s called the robots exclusion protocol or standard. If that freaks you out, we’ll keep it simple and call it the robots.txt file.

Even without technical expertise, you can open your source code and you’ll find this file.

The robots.txt is your website’s point of contact with search engine bots.

Before turning to your webpages, the search bot will peep into this text file to see if there are any instructions about which pages should be crawled and which pages can be ignored (that’s why it helps to reference your sitemap here).

The bot will follow the robots exclusion protocol that your file suggests regarding which pages are allowed for crawling and which are disallowed. This is your site’s way of guiding search engines to pages that you wish to highlight and also helps in exclusion of content that you do not want to share.

There’s no guarantee that robots.txt instructions will be followed by bots, because bots designed for specific jobs may react differently to the same set of instructions. Also, the system doesn’t block other websites from linking to your content even if you wouldn’t want the content indexed.
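As a simple illustration, a robots.txt file pairs user-agent lines with allow and disallow rules and can point bots at your sitemap. The paths below are placeholders for your own site structure:

    # robots.txt - placeholder paths for illustration
    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/
    Allow: /

    Sitemap: https://www.yourwebsite.com/sitemap.xml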

Before proceeding further, please ensure that you’ve already verified the site; then open the GSC dashboard and click the “Crawl” tab to proceed to “robots.txt Tester.”

This tool enables you to do three things:

  • Peep into the robots.txt file to see which actions are currently allowed or disallowed
  • Check if there are any crawl errors in the past 90 days
  • Make changes to suit your desired mode of interacting with search bots

Once you’ve made necessary changes, it’s vital that the robots.txt file in your source code reflects those changes immediately.

To do that, shortly after making changes, click the “Submit” button below the editing box in the search console and proceed to upload the changed file to update the source code. The updated file should then be live at www.yourwebsite.com/robots.txt.

To confirm that you’ve completed the mission, go back to the search console’s robots.txt testing tool and click “Verify live version” following which you should get a message verifying the modification.

3. How to use the “Fetch as Google” option to update regular website changes

On-page content and title tags undergo regular changes in the website’s life cycle, and it’s a chore to manually get these changes recorded and updated in the Google search engine. Fortunately, GSC comes up with a solution.

Once you’ve located the page that needs a change or update, open the search console, go to the “Crawl” option and zero in on the “Fetch as Google” option. You’ll see a blank URL box in the center.

Screenshot of how to use the “Fetch as Google” in Google Search Console

Enter the URL of the modified page in the box (for example, http://yourwebsite.com/specificcategory), then click “Fetch and Render.”

After completing this step go to the “Request indexing” button and consider the options before you.

What you have just done is authorize the Googlebot to index all the changes that you’ve put through, and within a couple of days, the changes become visible in the Google search results.

4. How to use Google Search Console to identify and locate site errors

A site error is a technical malfunction which prevents Google search bots from indexing your site correctly.

Naturally, when your site is wrongly configured or slowing down, you are creating a barrier between the site and search engines. This blocks content from figuring in top search results.

Even if you only suspect that something is wrong with your site, you can’t afford to wait for the error to surface on its own, because by then it will already have done its damage.

So, you turn to Google Search Console for instant troubleshooting. With GSC, you get a tool that keeps you notified on errors that creep into your website.

When you’ve opened the Google Search Console, you’ll see the “Crawl” tab appearing on the left side of the screen. Click the tab and open “Crawl errors”.

What you see now is a listing of all the page errors that Google bots encountered while they were busy indexing the site. The pop up will tell you when the page was last crawled and when the first error was detected, followed by a brief description of the error.

Once the error is identified, you can hand over the problem for rectification to your in-house webmaster.

When you click the “Crawl” tab, you’ll find “Crawl stats.” This is your gateway to loads of statistically significant graphs that show you all the pages that were crawled in the previous 90 days, the kilobytes downloaded during this period, and precisely how much time it took for Google to access and download a page. These stats give a fair indication of your website’s speed and user-friendliness.

Conclusion

A virtual galaxy of webmasters, SEO specialists, and digital marketing honchos would give their right arm and left leg for tools that empower SEO and bring in more customers that’ll ring the cash registers. Tools with all the bells and whistles are within reach, but many of them will make you pay hefty fees to access benefits.

But here’s a tool that’s within easy reach, a tool that promises high and delivers true to expectations without costing you a dollar, and paradoxically, very few people use it.

GSC is every designer’s dream come true, every SEO expert’s plan B, every digital marketer’s Holy Grail when it comes to SEO and the art of website maintenance.

What you gain using GSC are invaluable insights useful in propelling effective organic SEO strategies, and a tool that packs a punch when used in conjunction with Google Analytics.

When you fire the double-barreled gun of Google Analytics and Google Search Console, you can aim for higher search engine rankings and boost traffic to your site, traffic that converts to paying customers.

Dmitriy Shelepin is an SEO expert and co-founder of Miromind.



Friday, 30 August 2019

Seven reasons why your rankings dropped and how to fix them


Do you know the feeling of triumph when your content finally hits the first page of Google and attracts significant traffic? Unfortunately, nobody is safe from a sudden drop in rankings. The reasons for it can vary and are often far from obvious.

In this post, you’ll discover what could cause a sudden drop in traffic and how to fix the issue.

The tip of an iceberg

Unfortunately, there’s no one-size-fits-all solution when it comes to SEO. When you face a drop in your rankings or traffic, you’re only seeing the tip of an iceberg. So, get ready to check lots of issues before you identify the problem.

Graph on issues that cause ranking drops

Note: Percentages assigned in the above graph are derived from personal observation.

I’ve illustrated the most common reasons for a plummet. Start by checking these parameters to find out how you can recover your rankings and drive traffic back to your website.

Algorithms test

First of all, check the SERP. What if it’s not only your website that changed its positions in search results? These sharp shifts may happen when Google tests its algorithms. In this case, you don’t even have to take any further steps, as the rankings will be restored soon.

If you track your rankings with Serpstat, you can analyze your competitors’ positions as well. It’ll help you understand whether the SERP has been changing a lot lately. From the moment you create a new project, the tool starts tracking the history of top-100 search rankings’ changes for the selected keywords. The “Storm” graph illustrates the effect of the changes that have occurred in the search results.

Screenshot of the “Storm” graph in Serpstat

On this chart, you see that for the “cakes for dads” keyword the storm score was pretty high on 21st March. Now, let’s look at how the top-10 positions were changing on this date.

Graph showing a phrase-wise rise and drop in the SERP

The graph shows a sharp drop and rise that occurred in most of the positions. In a few days, all the rankings were back to normal again.

This example tells us that whenever you witness a significant drop in your search rankings, you should start with analyzing the whole SERP. If there’s a high storm score, all you need to do is to wait a bit.

In case you checked your competitors’ positions and didn’t see any movements, here’s the next step for you.

Technical issues

Technical SEO affects how search robots crawl and index your site’s content. Even if you have optimized your website technically, problems may occur every time you add or remove files or pages. So, make sure you’re aware of technical SEO issues on your site. With Google’s URL Inspection tool, you can check the way search engines see your website.

These are the main factors crucial for your rankings:

1. Server overload

If your server isn’t prepared for traffic surges, it can take your site down at any minute. To fix this problem, you can add a CDN to your website, cache your content, set up a load balancer, or move to cloud hosting.

2. Page speed

The more images, files, and pop-ups you add to your content, the longer your pages take to load. Keep in mind that page speed isn’t only a ranking factor; it also influences user experience. To quickly check for issues, you can use Google’s PageSpeed Insights. And to speed up your website, you can do the following (a small markup example follows the list):

  • Minimize HTTP requests or minify and combine files
  • Use asynchronous loading for CSS and JavaScript files
  • Defer JavaScript loading
  • Minimize time to first byte
  • Reduce server response time
  • Enable browser caching
  • Reduce image sizes
  • Use CDN again
  • Optimize CSS delivery
  • Prioritize above-the-fold content (lazy loading)
  • Reduce the number of plugins you use on your site
  • Reduce redirects and external scripts
  • Monitor mobile page speed
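A couple of these tips can be expressed directly in markup. The snippet below is a minimal sketch with placeholder file names, showing a deferred script, an asynchronously loaded script, and a natively lazy-loaded image:

    <!-- Placeholder file names; defer/async keep scripts from blocking rendering -->
    <script src="/js/main.js" defer></script>
    <script src="/js/analytics.js" async></script>

    <!-- Native lazy loading delays offscreen images until the user scrolls near them -->
    <img src="/images/product.jpg" alt="Product photo" loading="lazy" width="600" height="400">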

3. Redirections

It’s the most common cause of lost rankings. When you migrate to a new server or change the structure of your site, never forget to set up 301 redirects. Otherwise, search engines will either fail to index your new pages or even penalize your site for duplicate content.
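How you set up a 301 depends on your server. As a minimal sketch, on an Apache server the rules might live in an .htaccess file like the one below (the URLs are placeholders); Nginx and most CMSs have their own equivalents:

    # .htaccess on Apache - placeholder URLs
    # Redirect a single moved page
    Redirect 301 /old-page/ https://www.yourwebsite.com/new-page/

    # Redirect an entire renamed directory
    RedirectMatch 301 ^/old-blog/(.*)$ https://www.yourwebsite.com/blog/$1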

Detecting site errors can be quite difficult, especially if an error is confined to a single page. Inspecting every page manually would be time-consuming, and costly if you’re running a business. To speed up the process of identifying such errors, you can use SEO and site audit tools such as Serpstat or OnCrawl.

Wrong keywords

Are you using the right keywords? If you didn’t consider user intent when collecting your keywords, that alone can cause problems. Even if your site was ranking high for these queries for some time, Google could have changed the way it understands your site’s intent.

I’ll provide two examples to illustrate the issue.

Case one

Take the website of an Oxford summer school, “oxford-royale.co.uk”. The site contained only service pages, with no long-form descriptions. Once Google began to rank the website for queries with informational intent, SEO experts noticed that traffic dropped. After they added more text to the service pages, they succeeded in fixing the problem.

Case two

This case involved a flower delivery agency. While the website was ranking for transactional queries, everything was alright. Then Google decided the site better suited informational intent. To restore the site’s rankings, SEOs had to add keywords with high transactional intent, such as “order” and “buy”.

To collect the keywords that are right for your business goals, you can use KWFinder. With the tool, you can identify relevant keywords that you can easily rank for.

Screenshot of a suitable keyword list in KWFinder

Outdated content

This one doesn't need a long introduction. If your content is no longer fresh and up to date, people won't stay long on your site. Outdated content also doesn't attract shares and links. All of this gives search engines good reasons to lower your positions.

There's an easy way to fix it: update your content regularly and keep promoting it so you don't lose traffic. Trends keep changing, and if you've written a comprehensive guide on a topic, you don't want it to go stale. Instead of creating a new guide every time, update the old one with fresh data.

Lost links

Everybody knows your link profile is a crucial part of your site's SEO. Website owners put real effort into building quality links to new pieces of content. But even after you've earned a large number of backlinks, you shouldn't stop monitoring your link profile.

To find out whether your link profile has changed over the last few weeks, use Moz or Majestic. These tools show your lost and newly discovered links for a selected period.

Screenshot of discovered and lost linking domains in Moz

If you find you've lost links from trustworthy sources, try to identify why they were removed. If they're broken, you can always fix them. If website owners removed your links by accident (for example, while updating their websites), ask them to restore the links. If they did it intentionally, nothing stops you from building new ones.

Poor user experience

User experience is one more thing crucial to your site's rankings. If Google started ranking your page high in search results and then noticed it didn't meet users' expectations, your rankings could have suffered badly.

Search engines usually rely on metrics such as the click-through rate, time spent on your page, bounce rate, the number of visits, and more. That’s why you should remember the following rules when optimizing your site:

1. Provide relevant metadata

Since metadata is used to build snippets, it should describe your content accurately. If your snippets aren't engaging enough, users won't click through to your site. On the other hand, if they make false promises, your bounce rate will increase.
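
For a quick, informal check, a TypeScript sketch like the one below reads the current page's meta description and flags it when it's missing or unusually long; the 160-character threshold is a rough rule of thumb, not an official limit.

    // Inspect the current page's meta description.
    const meta = document.querySelector<HTMLMetaElement>('meta[name="description"]');
    const description = meta?.content ?? "";

    if (!description) {
      console.warn("No meta description found on this page.");
    } else if (description.length > 160) {
      console.warn(`Meta description is ${description.length} characters; it may get truncated in snippets.`);
    } else {
      console.log("Meta description length looks fine:", description.length);
    }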

2. Create an effective content structure

It should be easy for users to extract the information they need. Most visitors look at your content structure when deciding whether to read the post.

Break the text into paragraphs and call out the main ideas in subheadings. This helps you engage visitors who are looking for answers to very specific questions.

3. Avoid complicated design and pop-ups

Content isn't the only thing your audience looks at. People may also leave your website because of jarring colors, fonts, or pop-up ads. Keep the design simple and minimize the number of intrusive windows.

Competition from other websites

What if none of the steps worked? It might mean that your rankings dropped because your competitors were performing better. Monitor changes in their positions and identify the SERP leaders.

You can analyze your competitors' strategies with Serpstat or Moz. These tools reveal their backlink sources, the keywords they rank for, their top content, and more. This will help you come up with ideas for improving your own strategy.

Never stop tracking

You can't predict whether your rankings will drop one day. It's much better to catch the problem before it costs you traffic and conversions. So keep tracking your positions and be ready to react quickly to any changes.

Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter.

Related reading

What defines high-quality links in 2019 and how to get them
The evolution of SEO and the shift from point solutions to platform
On-site analytics tactics to adopt now: Heatmaps, intent analysis, and more

This marketing news is not the copyright of Scott.Services – please click here to see the original source of this article. Author: Inna Yatsyna

For more SEO, PPC, internet marketing news please check out https://news.scott.services

Why not check out our SEO, PPC marketing services at https://www.scott.services

We’re also on:
https://www.facebook.com/scottdotservices/
https://twitter.com/scottdsmith
https://plus.google.com/112865305341039147737

The post Seven reasons why your rankings dropped and how to fix them appeared first on Scott.Services Online Marketing News.



source https://news.scott.services/seven-reasons-why-your-rankings-dropped-and-how-to-fix-them/

As CCPA deadline approaches, only 14% of enterprises fully compliant so far

A new survey has found that only 14% of companies subject to looming California Consumer Privacy Act regulation consider themselves fully compliant, yet the majority (84%) said they had started the compliance process and 56% said they were in the process of implementation.

TrustArc, the company behind TRUSTe certification, commissioned a survey of IT and legal professionals at 250 companies from a range of industries. Half of the firms were subject to both GDPR and CCPA, while the other half were subject to CCPA only.

Also interesting: when asked how much they expected to spend on CCPA compliance, 71% of respondents said their spending would exceed $100,000; 39% said it would be more than $500,000, and 19% said it would be more than $1 million. The top areas of investment were technology and tools (72%), consultants (61%), lawyers (55%) and internal hiring (45%).

This week marks the one-year anniversary of GDPR, which is increasingly the model for data privacy laws around the world. In California, there’s a pending bill to make CCPA more like GDPR, even as other pro-business factions seek to weaken it.

CCPA is currently scheduled to take effect on January 1, 2020. If the pro-privacy amendment (AB 1760) is passed, it would push back the CCPA compliance date to January 1, 2021.

The market a bigger motivator than the law. Perhaps the most interesting finding of the survey is that the primary motivation for “investing in CCPA compliance” was not avoiding liability and legal sanctions. It was meeting customer and partner expectations, which is indirectly about liability and sanctions. Still, the concern is that business will be lost if these firms aren’t in compliance. In other words, the market is already starting to enforce the new privacy rules.

Why we should care. These results, if they can be generalized, indicate that most companies are aware of CCPA and are somewhere on the compliance spectrum. This is encouraging, despite the uncertainty surrounding what the specific requirements will be and the lobbying to both strengthen and weaken the law.

It’s highly unlikely that Congress will pass any privacy legislation before 2020 that will pre-empt the California law. Accordingly, companies from Washington to Florida will need to get ready to comply with the new privacy and data protection framework under CCPA.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes a personal blog, Screenwerk, about connecting the dots between digital media and real-world consumer behavior. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

This marketing news is not the copyright of Scott.Services – please click here to see the original source of this article. Author: Greg Sterling

For more SEO, PPC, internet marketing news please check out https://news.scott.services

Why not check out our SEO, PPC marketing services at https://www.scott.services

We’re also on:
https://www.facebook.com/scottdotservices/
https://twitter.com/scottdsmith
https://plus.google.com/112865305341039147737

The post As CCPA deadline approaches, only 14% of enterprises fully compliant so far appeared first on Scott.Services Online Marketing News.



source https://news.scott.services/as-ccpa-deadline-approaches-only-14-of-enterprises-fully-compliant-so-far/