Why does structured data matter for SEO?

Learn why now is the time to optimize your structured data.

Structured data represents a huge opportunity for SEOs to communicate key information to search engines, boost content visibility, and reach target audiences. It makes it easier for search engine crawlers to extract and understand specific information related to the content: for a product page, for example, the kind of product, its aggregate rating, available offers, and product reviews. This allows the crawler to understand your content with increased accuracy.
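
As an illustration, here is a minimal JSON-LD sketch of schema.org Product markup covering those properties. The product name, rating, price, and review are hypothetical placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Acme Trail Running Shoe",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128"
      },
      "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      },
      "review": [{
        "@type": "Review",
        "reviewRating": { "@type": "Rating", "ratingValue": "5" },
        "author": { "@type": "Person", "name": "Jane Doe" },
        "reviewBody": "Light, comfortable, and grippy on wet rock."
      }]
    }
    </script>

Placing a block like this in the page's HTML hands crawlers the product type, rating, offer, and review data in machine-readable form, independent of how the visible page is laid out.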

This report from Botify describes how structured data can give SEOs a competitive advantage and lead to significantly increased levels of search visibility and engagement rates. Visit Digital Marketing Depot to download “Structured Data: Why now is the time to optimize.”

Original Article Here

I USED ONLY BING FOR 3 MONTHS. HERE’S WHAT I FOUND—AND WHAT I DIDN’T

What finally broke me was the recipes.

On July 1, I abandoned Google search and committed myself instead to Bing. I downloaded the Bing app on my phone. I made it the default search mode in Chrome. (I didn’t switch to Edge, Microsoft’s browser, because I decided to limit this experiment strictly to search.) Since then, for the most part, any time I’ve asked the internet a question, Bing has answered.

A stunt? Sure, a little. But also an earnest attempt to figure out how the other half—or the other 6 percent overall, or 24 percent on desktop, or 33 percent in the US, depending on whose numbers you believe—finds their information online.

And Bing is big! The second-largest search engine by market share in the US, and one of the 50 most visited sites on the internet, according to Alexa rankings. (That’s the Amazon-owned analytics site, not the Amazon-made voice assistant.) I wanted to know how those people experienced the web, how much of a difference it makes when a different set of algorithms decides what knowledge you should see. The internet is a window on the world; a search engine warps and tints it.

There’s also never been a better time to give Bing an honest appraisal. If Google’s data-hoovering didn’t creep you out before, its attitude toward location tracking and Google+ privacy failings should. And while privacy-focused search options like DuckDuckGo go further to solve that problem, Bing is the most full-featured alternative out there. It’s the logical first stop on the express train out of Googletown.

Original Article Here

Facebook cracking down on ads with clickbait headlines, sensationalized language

The company says any ads containing engagement bait, sensationalized language or headlines that withhold information will be penalized.

In May 2017, Facebook announced it was doing more to demote ads that included shocking, disruptive or malicious content. On Wednesday, the company announced it was expanding its efforts in this area by further reducing distribution of — or disapproving — low quality ads.

Marketers using less than credible ad content will need to reconsider their strategy as their ads will continue to get less and less play — or be removed. Here’s a look at what’s changing.

Ads that withhold information. Any ad that withholds information to get people to click on a link to understand the full meaning of the post will be demoted or disallowed. Headlines like “You won’t believe what happened next” or “You’ll be shocked when you see the results” are prime examples of this type of clickbait.

Engagement bait and sensationalized ads. Ads that use sensationalized or exaggerated headlines to generate a reaction, but fail to deliver the anticipated response on the landing page, will also be penalized.

Original Article Here

Bing says it is improving web crawler efficiency

Bing is working to make sure its crawler doesn’t miss new content while also not overloading your web servers.

Responding to user feedback. The update is a follow-up to Bing principal program manager Fabrice Canel’s talk at SMX Advanced in June, during which he announced an 18-month effort to improve BingBot and asked the audience to submit suggestions and feedback.

In a blog post Tuesday, Canel said the team has made numerous improvements based on this feedback and thanked the SMX audience for its contributions. He said the team is “continuing to improve” the crawler and will share what they’ve done in a new “BingBot series” on the Bing webmaster blog.

BingBot’s goal. In this first post, Canel outlined the goal for BingBot, which is to use an algorithm to determine “which sites to crawl, how often, and how many pages to fetch from each site.” To ensure sites’ servers aren’t overloaded by the crawler, BingBot aims to limit its “crawl footprint” on a site while keeping the content in Bing’s index as fresh as possible.

This “crawl efficiency” is the balance Bing is working to strike at scale. Canel said, “We’ve heard concerns that bingbot doesn’t crawl frequently enough and their content isn’t fresh within the index; while at the same time we’ve heard that bingbot crawls too often causing constraints on the websites resources.” It’s a work in progress.
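
While Bing works on that balance, site owners already have one blunt lever: the Crawl-delay directive in robots.txt, which bingbot supports (Google ignores it). A minimal sketch, with an arbitrary five-second value you would tune to your server’s capacity:

    # Ask bingbot to pause roughly 5 seconds between requests
    User-agent: bingbot
    Crawl-delay: 5

Keep in mind that a long delay caps how many pages bingbot can fetch per day, so it trades server load against index freshness, the same trade-off Canel describes.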

Why should you care? Bing is clearly listening to the webmaster and SEO community. The Webmaster Tools team is making changes to ensure its crawler does not overload your servers while becoming faster and more efficient at finding new content on your web site. Bing is actively working on this and says it will continue to do so.

How does this impact you? If you add new content to your web site and Bing doesn’t see it, it won’t rank it. That means searchers using Bing will not find your new content.

Recently, Bing shut down the anonymous submit URL tool, and we have seen reports that Bing is not acting on submit URL requests even in Bing Webmaster Tools. It is possible the tweaks and changes Bing is making are causing some of this current slowness with crawling and indexing. But ultimately, Bing is clearly working on the issue.

Original Article Here

A basic SEO audit for small businesses

Most small businesses can perform a simple audit and improve their SEO considerably without involving an agency. Ten steps you can take to get started.

A crucial aspect of getting SEO on point for your small business is a basic SEO audit. This analysis provides you with a tool, much like an SEO SWOT analysis, that highlights the areas you need to focus on. And while there are many tools out there that provide an overview of your SEO, learning how to perform a basic SEO audit using free tools like Screaming Frog is a skill that will serve you well and ensure the precious time you devote to SEO is well-spent.

Having the skills to perform a basic SEO audit really empowers you as a business owner with regard to your SEO. It’s tempting to think that web designers and developers all understand SEO, but in most cases, they are simply not experts, and nobody knows your business better than you do.

By following the steps outlined in this post, you will be able to assess your SEO, identify opportunities and troubleshoot issues as they arise.

Tools of the trade

There are a number of tools out there that will help you audit your site. But the majority of these come with a monthly fee. For this audit, though, all you will need is the following freely available tools:

  • Google Analytics.
  • Google Search Console.
  • Screaming Frog SEO Spider.

Note: Screaming Frog is free for up to 500 pages or is licensed yearly. Even the paid version is an affordable tool that will help you achieve your goals, so register it if you feel it adds value (Hint: It will).

Crawl your site

The first step here is to crawl your site: start the Screaming Frog software and run a crawl. This will provide the information we will check as we step through the audit.

Original Article Here

Google floods webmasters with ‘mobile-first indexing enabled’ notifications

Did you get a notice from Google that your site is now enabled for mobile-first indexing? You’re not alone.

Since Tuesday, Google has been sending notices that mobile-first indexing has been enabled to a huge number of webmasters. This appears to be by far the largest single batch since Google officially began sending these notifications for the mobile-first indexing change.

I shared a screenshot of just some of the notices I received for sites that I have access to within Google Search Console.

So if you received a notice, trust me, you are not alone.

What is mobile-first indexing? Google explained their “crawling, indexing and ranking systems have typically used the desktop version of a page’s content, which may cause issues for mobile searchers when that version is vastly different from the mobile version. Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our — primarily mobile — users find what they’re looking for.” In short, Google is crawling the web as a smartphone browser would render your website, as opposed to how a desktop browser would.

How does it impact my rankings? Google hopes that it won’t have an impact on your rankings. But if your site’s content is vastly different on desktop versus mobile, it can impact your rankings in Google search. Google has historically moved over sites that show little change between their mobile and desktop versions (or that have a desktop-only version). But with this batch, Google may have moved over sites with more differences between their desktop and mobile versions.

Advice from Google? Google issued these clarifications a few months back with new and clearer advice around this change.

Original Article Here

Website redesign mistakes that destroy SEO

To keep up with user preferences, you have to redesign your website now and then. Learn how to avoid the most common pitfalls when you do.

Redesigning a website, whether it’s your own or a client’s, is an essential part of marketing today. It’s essential because technology, trends, and the expectations of users change over time, and if we want to remain competitive, we must keep pace with these changes.

But this task, while essential, also presents certain risks from an SEO perspective. A number of things can go wrong during the process. These issues can potentially cause search engines to no longer view that website as the authoritative answer to relevant queries. In some cases, certain mistakes can even result in penalties.

No one wants that.

So in this article, we’re going to explore some of the common web design mistakes that can destroy SEO. Knowing the potential risks may help you avoid making the kind of mistakes that tank your organic search traffic.

Leaving the development environment crawlable / indexable

People handle development environments in a lot of different ways. Most simply set up a subfolder under their domain. Some may create a domain strictly for development. Then there are those who take the kind of precautions to hide their development environment that would give a CIA agent a warm fuzzy feeling in that empty spot where their heart should be.

I tend to fall into the latter category.

Search engines are generally going to follow links and index the content they find along the way — sometimes even when you explicitly tell them not to. That creates problems because they could index two versions of the same website, potentially causing issues with both content and links.

Because of that, I place as many roadblocks as possible in the way of search engines trying to access my development environment.

Here’s what I do. The first step is to use a clean URL that has never been used for a live website before. This ensures there are no links pointing to it. Next, disallow all bots using robots.txt, and set up an empty index page so that other folders are not visible. In the past, I’ve even gone as far as setting up password protection, but in most cases, that may be overkill. You can make that call.
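
For the robots.txt step, a minimal sketch served from the root of the development domain looks like this:

    # Development environment only: block all compliant crawlers
    User-agent: *
    Disallow: /

Remember that robots.txt is a request, not an enforcement mechanism. Crawlers can ignore it, and URLs linked from elsewhere can still surface, which is why the clean-URL and password-protection steps are worth the trouble.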

From there, I’ll set up a separate folder for each website in development. Typically, the folder name will be a combination of incomplete words so that it’s unlikely to be found randomly. WordPress will then be installed in these folders, and configured to also block bots at this level.

Arbitrarily changing image names on pages that rank well

This isn’t always an issue, but if a web page is ranking well, changing the name of an image on that page may cause a loss of ranking, especially if the web designer doesn’t know what they’re doing.

I’ve seen this happen more than a few times, where a client hires a web designer who doesn’t understand SEO to redesign a website that already ranks well. As part of the redesign process, they replace old images with new, larger images, but, lacking the appropriate experience, they use stupid image names that provide zero SEO value, like image1.jpg.

This takes away a vital piece of context that search engines use to determine where a particular web page should rank.
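
As an illustration (the file name and alt text here are invented), compare a generic image reference with a descriptive one:

    <!-- Generic: tells search engines nothing about the page -->
    <img src="/images/image1.jpg" alt="">

    <!-- Descriptive file name and alt text preserve that context -->
    <img src="/images/red-leather-office-chair.jpg"
         alt="Red leather office chair, side view">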

Deleting pages or changing page URLs without redirecting them

During a redesign, some pages will almost certainly no longer be needed, and less experienced web designers will often simply delete them. Other pages may be moved or renamed, which in most cases changes their URL. Inexperienced web designers often make these changes and consider the task complete.

This is a big mistake because some of those pages may already rank well. They might have inbound links pointing to them or have been bookmarked by visitors.

When you delete pages that already have inbound links, you’ll lose all of the SEO value from those links. In some cases, this could result in a drastic loss of ranking.

The issue goes even deeper though. Anyone clicking those links or bookmarks will be greeted by a 404 page. That presents zero value to anyone, and more importantly, it creates a negative user experience. This is important because Google has confirmed that user experience is a ranking factor.

The proper way to delete a page is to redirect its URL to the most relevant page that currently exists. As for moving pages, which includes anything that changes the URL of a page in any way, it’s equally important to redirect the old URL to the new one.

In both scenarios, a 301 redirect should generally be used. This tells search engines that the old page has been permanently moved to the new location. For most hosting platforms, this is best accomplished by adding the appropriate entry into your .htaccess file.
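
For example, on an Apache server, a 301 redirect in .htaccess can be a single line per page; the paths below are hypothetical placeholders:

    # Deleted page: send visitors and link equity to the closest replacement
    Redirect 301 /old-services-page/ https://www.example.com/services/

    # Renamed page: point the old URL at the new one
    Redirect 301 /about-us.html https://www.example.com/about/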

If you’re unable to see a .htaccess file on your server, you may need to adjust the settings on your FTP program to view hidden files.

Some specialized hosting platforms may utilize a different method, so you may need to check with their support team to determine how to accomplish it.

Original Article Here

Compare 17 top SEO tools and platforms

Let MarTech Today’s ‘Enterprise SEO Platforms: A Marketer’s Guide’ help you make the informed choice.

Organic search remains the most important step in the purchase funnel. But with hundreds, thousands, tens of thousands, and even millions of pages, sites, social conversations, images and keywords to manage and optimize, SEO has become increasingly complicated and time-consuming. Using an enterprise SEO platform can increase efficiency and productivity while reducing the time and errors involved in managing organic search campaigns.

MarTech Today’s “Enterprise SEO Platforms: A Marketer’s Guide” examines the market for SEO platforms and the considerations involved in implementing this software into your business.

This 58-page report includes profiles of 17 leading SEO tool vendors, along with pricing information, capabilities comparisons and recommended steps for evaluating and purchasing.

Visit Digital Marketing Depot to download “Enterprise SEO Platforms: A Marketer’s Guide.”

Original Article Here

Is it time to graduate to Google Analytics 360?

Your business is growing. Has it outgrown the free version of Google Analytics?

It’s a common concern for marketers and analysts worldwide: As your business has grown, so have your needs for complete and accurate data that can be integrated with other platforms. It might be time to upgrade to Google Analytics 360.

This eBook from InfoTrust will provide you with everything you need to know when considering and purchasing a Google Analytics 360 license for your organization. It covers:

  • The differences between Google Analytics and Google Analytics 360
  • Identifying if there is a business case for migrating to Google Analytics 360
  • Negotiating the right deal with a Google Analytics 360 Reseller

Visit Digital Marketing Depot to download “Google Analytics 360: A Purchasing Guide.”

Original Article Here

How to thrive within the fast-paced SEO environment

There will always be new mistakes and new challenges. The key is to learn and evolve and make yourself a better marketer.

One of the best things about the search space is that it’s hard. We are constantly working to keep our finger on the pulse, experiment with new ideas, and drive results. It’s why I love it.

At the same time, because it’s hard and because it’s constantly evolving, no matter how long you’ve been doing it, you are bound to make mistakes. In fact, search “SEO mistakes” and you’ll find about 18 million other people who agree.

I have made any number of mistakes in the past 13 years and while I’m not going to list them out line by line (we don’t have that kind of time), I do think there’s value in discussing what we can do to avoid some of the more common ones. Let’s jump in.

1. Always track changes

It’s an age-old tale: someone in an organization (the client, the dev team, the CEO) decides to make an update to the site without communicating it. Pages are gone or moved, content has been changed, and even worse, you don’t notice until a few weeks later, when traffic is gone and rankings have tanked.

Unfortunately, as much as we communicate, as much as we try to stay involved, situations like this are bound to occur. The best thing to do is to prepare. Here’s how.

Set up change alerts

Tools like SEORadar or VisualPing will notify you when changes are made to a site. Whether it’s on-page or in the code, you will get an alert and immediately be able to see where the change occurred. For larger e-commerce sites where changes are made frequently, a tool like SEORadar will allow you to choose the types of changes you want to be notified about. A good feature considering none of us want to be bombarded with useless emails.

Keep a changelog

At KoMarketing, we use a combination of Basecamp and Google Drive to ensure we can easily find existing recommendations. After all, if a page is accidentally removed or you need to revert content or tagging, finding the approved content becomes pretty important. Even more importantly, if a site tanks, it’s good to be able to see what drove it.

A few things we do to stay organized:

  • Shared Changelog. For a number of clients, we keep a shared changelog with the dev team. This way we know the when, what, and where of site updates.
  • Analytics Annotations. When an update is released, recommendations are implemented, or a big announcement is made (ex: mobile indexing), make an annotation in your analytics platform. A year from now, when you are pulling data and wondering what happened, you’ll have it right in front of you. Annotations can be lifesavers.
  • Close out messages. For example, if a page was updated, make a note in the original message, noting the date of the change and the URL. Record keeping FTW!!

2. Clean data = Good data

You spent hours creating a report. The results look good. You’re showing value. And just when it’s time to present the report to the team, you hear:

“Does this include login traffic?”

or

“We actually switched to a new profile.”

or

“We need to take out traffic from X.”

Make sure you’re using the right data from the start. I can’t tell you how many times I’ve been on a project when one team has been using one data point and another team a different one. And you’d be surprised by the number of reports I’ve had to redo because we had the wrong information or the client wanted certain data points removed.

As you sync up with your team and the client, also make sure your analytics is set up properly from the start: Is tracking on all pages? Is sub-domain tracking set up? Are the correct goal URLs set up? Is event tracking working properly?
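
As a quick sanity check in classic Universal Analytics (analytics.js), every page should carry the same snippet, and creating the tracker with 'auto' sets the cookie on the top-level domain so sub-domains share one tracker. The property ID and event names below are placeholders:

    <script>
    // Standard analytics.js loader; should appear on every page of the site
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    // 'auto' cookie domain enables sub-domain tracking across your domain
    ga('create', 'UA-XXXXXX-1', 'auto');
    ga('send', 'pageview');

    // Sample event hit, useful for verifying event tracking end to end
    ga('send', 'event', 'Forms', 'submit', 'contact-form');
    </script>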

One of the biggest challenges we have in SEO is showing value and we rely on analytics data to help us. Without the right data in place, our challenge becomes even greater.

3. Knowledge is power

I’ve said it before and I’ll say it again and again and again…there is a lack of education in the marketing space when it comes to SEO. Not only that, but the value of SEO is still being questioned, as Simon Heseltine pointed out in a recent post, “Is SEO table stakes? (Hint: No!)”.

Here’s the thing – while it’s changing, it’s not changing fast enough and we can’t get mad because someone doesn’t understand the value of what we are doing or understand everything that’s involved in the process. More importantly, we have to be able to explain things in a way that matters to the stakeholders. Here’s how:

Know Your Audience

How we talk to the PR team is different than how we talk to the Dev team and certainly different than how we talk to the CMO. Guess what? The CMO probably doesn’t care about the type of redirect you are recommending. What they do care about is the impact it has on the overall business.

Know who it is you are talking to, what their knowledge is, and what they care about. If you are unsure, ask ahead of time. During our initial discovery, we not only ask questions related to SEO but also get backgrounds on the people we will be working with.

  • What is their role?
  • What are their goals?
  • Have they worked with an SEO team in the past?

This type of information can be really helpful.

Avoid the SEO bubble

Last week I was providing a recommendation on duplicate content. The client set up a sub-domain and a sub-folder containing the same information. As I started to explain the way search engines index pages, I realized they didn’t care and they didn’t need to know that information. What they needed to know was the result and why it was important we fix it.

Look, we spend hours of our lives analyzing Google, so I get why we want to share our knowledge. The thing is, it doesn’t always matter. Sometimes we have to step out of our SEO bubble and talk like regular humans.

4. Don’t forget the customer

Back in May, I gave a presentation on identifying gaps in your content strategy. One of the case studies I used involved lots of content, huge increases in traffic and rankings, and an unhappy client.

See, it turned out that while we were building an amazing portfolio of content that was driving people to the site, we were actually building an amazing portfolio of top to medium funnel content. We weren’t focused on conversions and we weren’t focused on existing customers. Fail!

As search marketers, it’s so easy to forget what it is we are trying to do. There’s so much pressure to improve results and improve position that we often forget why we are doing it in the first place…sales.

Make sure you are focused on the customer and what it is they want.

5. There’s more than one way

Can we all just agree there’s often more than one right way? That yes, maybe this way worked great for you but this other way worked great for someone else. Perhaps SEO has a lot of intricacies and nuances and is often specific to a site or industry or platform. Maybe?

I am harping on this a bit but the reason is that we often get too caught up in the “this has to be done a certain way” mentality. We get on calls with developers and tell them the way we want it done. We fight battles over meta tag lengths or how a title tag should be written. Come on.

To be a good SEO means being able to compromise and figure out how to make things work even if it’s not the way you would’ve done it. We have to pick our battles and push for the things that really matter. And remember, just because Google says jump, doesn’t mean you have to jump.

More lessons to be learned

If I’ve learned anything over the course of my career, it’s that there will always be new lessons. That I will always make another mistake and I will always have to face new challenges. The key is to learn and evolve and make myself a better marketer.

Remember, focus on the things that matter to the business, to the stakeholders, to the customer, and most importantly, remember that a successful SEO needs the support of those around them.

Original Article Here