The best online backup services with the best encryption features

Find out which ones give you more (or less) control over your data.

Many people resist backing up their data to an online backup service like MozyHome, Carbonite, or Backblaze because they worry their data will be poked through by company employees, hijacked by criminals, or provided to law enforcement or government agents without due process.

The sanctity of your data boils down to whether the encryption key used to scramble your data can be recovered by anyone other than yourself. Below I outline the various methods and levels of encryption that can be employed by these services, and then evaluate five of the best options for home users. Several give subscribers full control of their encryption. If you’re already using a service, it’s possible you can even upgrade to take advantage of greater ownership options.

[Editor’s note: This article originally posted in September 2016. It was updated in August 2017 to remove CrashPlan for Home, one of the services offered by Code42. The company stopped accepting new subscribers and will shut the service down in October 2018. For information on options for migrating, see our article “How to move from CrashPlan for Home to another backup solution.” In this update, we examine Code42’s higher-priced CrashPlan for Small Business, as it’s being marketed to consumers as an alternative and some people may be evaluating whether to migrate from the Home offering.]

Choosing the services to evaluate

These are the parameters I set up for this roundup:

  • Focused on services that offer a personal edition, where you can purchase an account for a single computer or a bundle for a family.
  • Included services that are established or well-reviewed.
  • Excluded services that offer scant information about their security and encryption practices. Subscribers should always be privy to how their data is protected.
  • Excluded sync services, even those (like SugarSync) that offer continuous backup and versioning. I define a sync service as one that doesn’t encrypt data with a per-user key before being transmitted over a secure connection. That also leaves out Box, Dropbox, iCloud, Google Drive, and others.
  • I also bypassed services that offer bad advice about file retention or security practices, and ones whose information is years out of date.

Six companies remained after this winnowing: Backblaze, Carbonite, CrashPlan, iDrive, MozyHome, and SpiderOak ONE. Keep reading to see how they rate on encryption features and strength. (Note that we include CrashPlan for Small Business, for reasons noted above, even though it would otherwise not fit our criteria.)

The services are given a rating from Excellent to Poor, with a summary of the best and worst points in the pros and cons that follow each rating. For services that offer multiple ways to set up security and privacy, I’ve ranked based on the best method available, as outlined in the “Encryption: The ins and outs” section later in this article.

Backblaze

Encryption rating: Very good

Pros:

  • Data is encrypted before and during transit
  • Website lets you access encrypted backups
  • Platforms: macOS, Windows, iOS, Android

Cons: 

  • Passphrase must be transmitted for recovery
  • Lacks a client that can restore and browse with local encryption keys
  • All session keys can be unlocked with the passphrase that protects the master key
  • Backups of removable volumes are deleted after 30 days

Backblaze uses public-key cryptography—the same kind of encryption used widely across the internet, including web connections secured with the SSL/TLS cryptographic protocols. The app creates a public-private key pair and transmits the private key to Backblaze’s servers. For each backup session, Backblaze creates a new strong session key, encrypts it with the public key from that pair, and sends it to its servers. The session key is held only in memory on the client and is never stored in the clear on the server.
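To make that flow concrete, here is a minimal sketch of the general pattern (not Backblaze’s actual code), using the third-party Python cryptography package: a fresh AES session key encrypts the backup data, and the account’s RSA public key wraps that session key so only the holder of the private key can recover it.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# One long-lived key pair per account; the private half is what the service
# stores (optionally passphrase-protected, as described below).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_backup_session(file_data: bytes):
    """Encrypt one backup session's data under a brand-new session key."""
    session_key = AESGCM.generate_key(bit_length=128)  # new key per session
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, file_data, None)
    # Wrap the session key with the *public* key; only the private key can unwrap it.
    wrapped_key = public_key.encrypt(session_key, OAEP)
    return wrapped_key, nonce, ciphertext

def restore(wrapped_key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Restore: unwrap the session key with the private key, then decrypt the data."""
    session_key = private_key.decrypt(wrapped_key, OAEP)
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)
```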


An optional Private Encryption Key protects your encryption key, even though it’s stored on Backblaze’s servers.

However, you can opt to set a passphrase to encrypt the private key before it’s transmitted to the server. In that way, this master private key and each session key are held in escrow. Only someone with the passphrase can access the private key, which in turn can decrypt a session key that restores data associated with a backup session.
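Conceptually, the passphrase option just adds one more wrapper: the private key is encrypted under a key derived from your passphrase before it ever leaves your machine, and that opaque blob is what sits in escrow. A hedged sketch of the idea (again using the Python cryptography package, not Backblaze’s own code):

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

passphrase = b"correct horse battery staple"  # known only to the user
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Serialize the private key encrypted under the passphrase; this is the blob
# the server could hold "in escrow" without being able to read it.
escrow_blob = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(passphrase),
)

# At restore time the passphrase must be supplied again to recover a usable key.
recovered_key = serialization.load_pem_private_key(escrow_blob, password=passphrase)
```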

Backblaze has engineered its system so that restores all happen via its website, not in the native computer app, so you have to enter that passphrase to decrypt the private key. The passphrase is also required for viewing information about backups through its website and mobile clients. The private key is also held only in memory on its servers and dumped when file browsing and restore operations finish.

This isn’t ideal. Backblaze falls short of other backup services by not offering a client that can handle restoring and browsing with encryption keys kept entirely locally. And while each backup session has a unique key, the fact that all of them can be unlocked with the passphrase protecting the master private key makes that less impressive. In practice, you’re more secure if you never restore files or browse file lists.

One of Backblaze’s advantages, however, is cost: unlimited storage for $5 a month per computer, with discounts for one-year and two-year subscriptions.

Carbonite

Encryption rating: Excellent on Windows, Poor on Mac

Pros:

  • Data is encrypted before transit with Private Key encryption for Windows users
  • Website lets you access encrypted backups (only through Auto option)
  • Platforms: macOS (limited), Windows, iOS, Android

Cons: 

  • Data is encrypted before transit with Private Key encryption for Windows users, but not with Auto Encryption (Mac users’ only choice)
  • Mac users get a server-side key that’s stored on the server

Carbonite is a mixed bag. It offers only Windows users the opportunity to passphrase-protect a private key. Mac users rely on a server-side key that’s generated and stored there. Worse, Carbonite doesn’t encrypt Mac users’ data with a key before transmitting it with its default Automatic Encryption option; it encrypts data only after it’s been received at its servers. That’s not the case with what it calls Private Key under Windows.


Carbonite allows only Windows users to turn on advanced backup settings and set a private key that the company never accesses.

Because encryption happens on the far end, restored files are also decrypted before being transmitted back to a Mac user. Carbonite should step up and provide Private Key for Mac users, as the current situation doesn’t meet the bar for robust protection for backups or restores.

CrashPlan for Small Business

Encryption rating: Excellent 

Pros:

  • Data is encrypted before and during transit
  • Password is not transmitted for recovery
  • Website lets you access encrypted backups
  • Platforms: macOS, Windows, Linux, iOS, Android, Windows Phone

Cons: 

  • The archive key reset via reminder question is not a secure method
  • CrashPlan requires a Java app, with associated security, reliability, and usability issues
  • Expensive relative to comparable services

Code42’s CrashPlan for Small Business offers three distinct options for setting up password and key control:

  • Standard: At the basic level, Code42 maintains on its servers an encryption key generated by its backup app. Your password manages access to the account as well as tasks like adding computers, using mobile clients, and restoring files.
  • Archive key password: The CrashPlan client generates a key, but you set a separate passphrase to encrypt the key, which is then stored in escrow at CrashPlan’s servers. You can upgrade from Standard to Archive without dumping existing backups. The archive key can be changed. There’s even an option to add an archive key reset with a reminder question. This reduces security enormously, however, because it effectively means your easier-to-remember answer is now the weakest link in accessing backups. I recommend against using it.
  • Custom key: You generate a lengthy key by one of several methods; it’s never stored in any fashion on Code42’s servers. This custom key option is unique among the services surveyed—all others rely on either a key generated by the app, which a user may be able to escrow at a server, or an algorithm that converts a passphrase into the encryption key (see the sketch after this list). If you switch from the standard or archive key option, your previous data is dumped, and you can’t downgrade encryption of newly archived files.
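For illustration, the “lengthy key” the custom option expects is simply a large random value generated locally and never shared; here is a minimal Python sketch (the exact format Code42 accepts isn’t covered here, so treat the encoding as a placeholder):

```python
import base64
import secrets

# 256 bits of cryptographically secure randomness, generated entirely on your machine.
custom_key = secrets.token_bytes(32)

# Keep a copy somewhere safe (password manager, printed sheet); if the key is lost,
# the archives it protects are unrecoverable, and no one, including Code42, can help.
print(base64.b64encode(custom_key).decode())
```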

You can choose among three kinds of archive encryption with CrashPlan. Custom Key keeps all information firmly in your hands.

CrashPlan can decrypt files entirely via its native app. The archive key or custom key need only be entered when restoring files via the web interface, or viewing files via the web or the mobile apps.

You have to weigh CrashPlan’s excellent security options against several negatives:

  • It relies on a non-native Java-based app which bundles a Java run-time distribution. Java is widely considered insecure, though in this form, it shouldn’t be vulnerable to outside attacks.
  • The app is slow and has an awkward and outdated interface, along with poor controls for restoring files.
  • Per computer, it costs as much as twice what services with similar features and security charge.
  • Code42 recently canceled its CrashPlan for Home service, and gave existing subscribers as little as 60 days to choose between retaining archived versions and migrating to its Small Business flavor, albeit with a large discount.

iDrive and MozyHome


Even though it says you’re setting an encryption key, you’re really entering a passphrase that iDrive converts into the key used to protect your data.

Encryption rating: Fair 

Pros:

  • Data is encrypted before and during transit
  • Password is not transmitted for recovery
  • Website lets you access encrypted backups
  • Platforms: macOS, Windows, Linux (iDrive only), iOS, Android

Con: 

  • Passphrase conversion carries some risk

iDrive and MozyHome are separate services that work in nearly identical ways. Both let a user create a passphrase—iDrive inaccurately calls it a “private encryption key”—which is transformed through a cryptographic algorithm into a 256-bit encryption key.

When you use this option, neither the passphrase nor the resulting key gets transmitted to the service. Both iDrive and MozyHome also admirably handle decryption in their respective clients without sending the passphrase or key to a remote server.
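That passphrase-to-key conversion is a standard key-derivation step. Here’s a rough sketch of the general technique (PBKDF2 from Python’s standard library is shown; the actual KDF, salt handling, and iteration counts iDrive and MozyHome use aren’t documented in this article):

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a user passphrase into a 256-bit encryption key, entirely on the client."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"),
                               salt, iterations, dklen=32)

salt = os.urandom(16)                  # stored with the backup metadata, not secret
key = derive_key("my backup passphrase", salt)
assert len(key) == 32                  # 256-bit key; the passphrase itself never leaves
```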


MozyHome lets users set a key for encryption, but requires use of standalone software (shown here) to decrypt restored files.

Even though the key is never sent (good), this passphrase conversion approach is weaker (bad) than a passphrase that locks a separate encryption key. That’s because an attacker only needs to obtain your unencrypted passphrase, or break it through brute force, to have access to your key. With that, if they can obtain your backup archives from the services or capture them in transit somehow, they can decrypt them.

While that scenario sounds unlikely, there have been exploits in the past that allow crackers to break encrypted data transmission. When a passphrase locks a separate encryption key, an attacker might need to obtain your account name and password and the passphrase, and then would either have to break into the backup service’s systems or log in directly and use the backup service’s interface to retrieve files, leaving a trail.

Outside of security, consider that MozyHome pricing is $6 a month for 50GB of storage and $10 a month for 120GB for one computer, and $2 a month for each additional computer. This is substantially more expensive than Backblaze, while also offering worse encryption options.

SpiderOak ONE

Encryption rating: Very Good 

Pros:

  • Data is encrypted before and during transit
  • Password is not transmitted for recovery
  • Website lets you access encrypted backups
  • Highly granular shared secure data areas
  • Platforms: macOS, Windows, Linux, iOS, Android

Cons: 

  • If you want to share files or use the website, you have to enter the password

SpiderOak ONE is a bit of a hybrid between the iDrive/MozyHome and Backblaze approaches, and its sole method is highly secure—there’s no account password-only default tier.

With SpiderOak, you create a password in the desktop client, and the software derives many, many encryption keys from that. The password is never stored or transmitted to SpiderOak, but the keys—generated uniquely for each data block of the backup, each folder, and each file revision—are wrapped in a layer of encryption and stored on the backup servers.
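One common way to get “many, many” keys out of a single secret is to run a key-derivation function once per item, with a distinct context label for each folder, file revision, or data block. The sketch below shows that general idea using HKDF from the Python cryptography package; SpiderOak’s real scheme is more elaborate and also wraps these keys so they can be shared:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_item_key(master_secret: bytes, context: str) -> bytes:
    """Derive a distinct 256-bit key for one folder, file revision, or data block."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=context.encode("utf-8"),  # a unique label keeps each derived key independent
    ).derive(master_secret)

# The master secret would itself come from the account password and never be uploaded.
master = b"secret derived locally from the account password"
folder_key = derive_item_key(master, "folder:/Users/me/Documents")
block_key = derive_item_key(master, "block:00042:revision:3")
assert folder_key != block_key
```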


By default, SpiderOak uses strong encryption for all backups; there’s no option to set.

This is because SpiderOak offers highly granular shared secure data areas, which require storing encryption keys on its servers in such a way that permission can be granted to multiple accounts to access files and folders. The same key can effectively be available to different users without storing it in such a way that SpiderOak (or a third party) can gain access.

In normal backup and restore operations, your password is never sent or used by the SpiderOak servers. However, if you want to share files, use the website for access, or use mobile clients, you have to enter the password to unlock access. As with other services, the keys generated from the password are stored in memory only while being used, and then flushed.

SpiderOak has multiple tiers based on how much data you want to back up, but has no additional fee per computer. Thus, you can pay $12 a month for 1TB of storage from an unlimited number of computers, which makes it competitive with Backblaze, which has unlimited storage per computer but charges for each machine.

Encryption: The ins and outs

Internet-hosted backups have several points of failure where encryption can protect a user’s data. I evaluated the services on each of these points:

Key possession. Encrypted backups require someone to create and possess the underlying key that’s used to encrypt your data before being stored by the host. But there are several aspects to this:

  • Who creates the encryption key? In all six cases, the native desktop backup software handles key creation, but with two services, you can opt to create a key.
  • Does the backup host hold the key in a form it can directly access, or in “escrow,” where it’s protected by a passphrase you set and the host doesn’t know? Or does the host never hold the key at all?
  • Is the passphrase converted through an algorithm into the actual encryption key, or is the passphrase used to unlock the encryption key? In the former case, an attacker who recovers the passphrase also effectively has the key, and can decrypt your backups.

If a backup service lets you reset your account password without losing access to your archives, it has full access to the encryption keys that guard your backups. If it can’t access your files’ contents (and sometimes even the listing of files) unless you enter your password or a custom key, you retain control.

Diversity of keys. Each service varies in whether it uses a single key for all backups, or various keys for different tasks. For instance, CrashPlan for Small Business uses the same encryption key to scramble all backed-up files across all sessions; Backblaze generates a new key for each backup session; SpiderOak ONE has unique keys for every folder, version, and individual data block within its backups, partly to enable a group encrypted sharing option.

The more unique keys are used, the less risk you face from a single leaked or cracked key, or from advances in cryptographic cracking.

Encrypted before transit. Hosted backups require native apps to scan drives for files and transmit them. Strong encryption should be used by the app before files are transferred to a hosted service.

Encrypted in transit. It’s vitally important that transferred data is strongly protected separately from the encryption that wraps data before it’s sent. That’s to guard against offline attacks, where someone can intercept encrypted data and then attempt various ways to break it, both now and in the future. Encryption that’s unbreakable in 2016 may still break in the future.

Protected at rest. Even encrypted data needs additional layers of security. Some hosts disclose additional information about how they safeguard your data, including certifications and audits from third parties.

Restoring files. When you restore a backup, there’s also a question of where the key winds up. Even for services that allow a user to create a custom full encryption key, that key has to be transmitted to the backup host in a form that can be decrypted in order to restore files.

“SEO Is Always Changing”… Or Is It?: Debunking the Myth and Getting Back to Basics

By: Bridget Randolph


Recently I made the shift to freelancing full-time, and it’s led me to participate in a few online communities for entrepreneurs, freelancers, and small business owners. I’ve noticed a trend in the way many of them talk about SEO; specifically, the blocks they face in attempting to “do SEO” for their businesses. Again and again, the concept that “SEO is too hard to stay on top of… it’s always changing” was being stated as a major reason that people feel a) overwhelmed by SEO; b) intimidated by SEO; and c) uninformed about SEO.

And it’s not just non-SEOs who use this phrase. The concept of “the ever-changing landscape of SEO” is common within SEO circles as well. In fact, I’ve almost certainly used this phrase myself.

But is it actually true?

To answer that question, we have to separate the theory of search engine optimization from the various tactics which we as SEO professionals spend so much time debating and testing. The more that I work with smaller businesses and individuals, the clearer it becomes to me that although the technology is always evolving and developing, and tactics (particularly those that attempt to trick Google rather than follow their guidelines) do need to adapt fairly rapidly, there are certain fundamentals of SEO that change very little over time, and which a non-specialist can easily understand.

The unchanging fundamentals of SEO

Google’s algorithm is based on an academia-inspired model of categorization and citations, which utilizes keywords as a way to decipher the topic of a page, and links from other sites (known as “backlinks”) to determine the relative authority of that site. Their method and technology keeps getting more sophisticated over time, but the principles have remained the same.

So what are these basic principles?

It comes down to answering the following questions:

  1. Can the search engine find your content? (Crawlability)
  2. How should the search engine organize and prioritize this content? (Site structure)
  3. What is your content about? (Keywords)
  4. How does the search engine know that your content provides trustworthy information about this topic? (Backlinks)

If your website is set up to help Google and other search engines answer these 4 questions, you will have covered the basic fundamentals of search engine optimization.

There is a lot more that you can do to optimize in all of these areas and beyond, but for businesses that are just starting out and/or on a tight budget, these are the baseline concepts you’ll need to know.

Crawlability

You could have the best content in the world, but it won’t drive any search traffic if the search engines can’t find it. This means that the crawlability of your site is one of the most important factors in ensuring a solid SEO foundation.

In order to find your content and rank it in the search results, a search engine needs to be able to:

  1. Access the content (at least the pages that you want to rank)
  2. Read the content

This is primarily a technical task, although it is related to having a good site structure (the next core area). You may need to adapt the code, and/or use an SEO plugin if your site runs on WordPress.
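A quick way to sanity-check the first requirement (can crawlers reach a page at all?) is to test your own URLs against your robots.txt rules. A minimal sketch using only Python’s standard library follows; the domain and paths are placeholders, and this checks robots.txt only, not meta robots tags or noindex headers:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt (example.com is a placeholder).
robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

pages = ["https://www.example.com/", "https://www.example.com/blog/my-post"]
for url in pages:
    # "Googlebot" is the user agent Google's crawler identifies itself as.
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```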


Site structure

In addition to making sure that your content is accessible and crawlable, it’s also important to help search engines understand the hierarchy and relative importance of that content. It can be tempting to think that every page is equally important to rank, but failing to structure your site in a hierarchical way often dilutes the impact of your “money” pages. Instead, you should think about what the most important pages are, and structure the rest of your site around these.

When Google and other search engine crawlers visit a site, they typically start at the homepage and then follow every link they find. Googlebot assumes that the pages it sees the most are the most important pages. So when you can reach a page with a single click from the homepage, or when it is linked to on every page (for example, in a top or side navigation bar, or a site footer section), Googlebot will see those pages more, and will therefore consider them to be more important. For less important pages, you’ll still need to link to them from somewhere for search engines to be able to see them, but you don’t need to emphasize them quite as frequently or keep them as close to the homepage.

The main question to ask is: Can search engines tell what your most important pages are, just by looking at the structure of your website? Google’s goal is to save users steps, so the easier you make it for them to find and prioritize your content, the more they’ll like it.
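Click depth (how many clicks a page sits from the homepage) is a handy proxy for this priority signal. Given a simple map of which pages link to which (hard-coded below as a hypothetical example; in practice you would build it from a crawl of your own site), a breadth-first search computes it:

```python
from collections import deque

# Toy internal-link graph: page -> pages it links to.
links = {
    "/": ["/services", "/blog", "/contact"],
    "/services": ["/services/seo", "/contact"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/services/seo"],
    "/blog/post-2": [],
    "/services/seo": ["/contact"],
    "/contact": [],
}

def click_depths(graph: dict, home: str = "/") -> dict:
    """Breadth-first search from the homepage; depth = minimum clicks to reach a page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

for page, depth in sorted(click_depths(links).items(), key=lambda item: item[1]):
    print(f"{depth} clicks from home: {page}")
```

Your most important pages should show up within a click or two of the homepage; anything buried much deeper is implicitly being marked as less important.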


Keywords

Once the content you create is accessible to crawlers, the next step is to make sure that you’re giving the search engines an accurate picture of what that content is about, to help them understand which search queries your pages would be relevant to. This is where keywords come into the mix.

We use keywords to tell the search engine what each page is about, so that they can rank our content for queries which are most relevant to our website. You might hear advice to use your keywords over and over again on a page in order to rank well. The problem with this approach is that it doesn’t always create a great experience for users, and over time Google has stopped ranking pages which it perceives as being a poor user experience.

Instead, what Google is looking for in terms of keyword usage is that you:

  1. Answer the questions that real people actually have about your topic
  2. Use the terminology that real people (specifically, your target audience) actually use to refer to your topic
  3. Use the term in the way that Google thinks real people use it (this is often referred to as “user intent” or “searcher intent”).

You should only ever target one primary keyword (or phrase) per page. You can include “secondary” keywords, which are related to the primary keyword directly (think category vs subcategory). I sometimes see people attempting to target too many topics with a single page, in an effort to widen the net. But it is better to separate these out so that there’s a different page for each different angle on the topic.

The easiest way to think about this is in physical terms. Search engines’ methods are roughly based on the concept of library card catalogs, and so we can imagine that Google is categorizing pages in a similar way to a library using the Dewey decimal system to categorize books. You might have a book categorized as Romance, subcategory Gothic Romance; but you wouldn’t be able to categorize it as Romance and also Horror, even though it might be related to both topics. You can’t have the same physical book on 2 different shelves in 2 different sections of the library. Keyword targeting works the same way: 1 primary topic per page.


Backlinks

Another longstanding ranking factor is the number of links from other sites to your content, known as backlinks.

It’s not enough for you to say that you’re the expert in something, if no one else sees it that way. If you were looking for a new doctor, you wouldn’t just go with the guy who says “I’m the world’s best doctor.” But if a trusted friend told you that they loved their doctor and that they thought you’d like her too, you’d almost certainly make an appointment.

When other websites link to your site, it helps to answer the question: “Do other people see you as a trustworthy resource?” Google wants to provide correct and complete information to people’s queries. The more trusted your content is by others, the more that indicates the value of that information and your authority as an expert.

When Google looks at a site’s backlinks, they are effectively doing the same thing that humans do when they read reviews and testimonials to decide which product to buy, which movie to see, or which restaurant to go to for dinner. If you haven’t worked with a product or business, other people’s reviews point you to what’s good and what’s not. In Google’s case, a link from another site serves as a vote of confidence for your content.

That being said, not all backlinks are treated equally when it comes to boosting your site’s rankings. They are weighted differently according to how Google perceives the quality and authority of the site that’s doing the linking. This can feel a little confusing, but when you think about it in the context of a recommendation, it becomes a lot easier to understand whether the backlinks your site is collecting are useful or not. After all, think about the last time you saw a movie. How did you choose what to see? Maybe you checked well-known critics’ reviews, checked Rotten Tomatoes, asked friends’ opinions, looked at Netflix’s suggestions list, or saw acquaintances posting about the film on social media.

When it comes to making a decision, who do you trust? As humans, we tend to use an (often unconscious) hierarchy of trust:

  1. Personalized recommendation: Close friends who know me well are most likely to recommend something I’ll like;
  2. Expert recommendation: Professional reviewers who are authorities on the art of film are likely to have a useful opinion, although it may not always totally match my personal taste;
  3. Popular recommendation: If a high percentage of random people liked the movie, this might mean it has a wide appeal and will likely be a good experience for me as well;
  4. Negative association: If someone is raving about a movie on social media and I know that they’re a terrible human with terrible taste… well, in the absence of other positive signals, that fact might actually influence me not to see the movie.

To bring this back to SEO, you can think about backlinks as the SEO version of reviews. And the same hierarchy comes into play.

  1. Personalized/contextual recommendation: For local businesses or niche markets, very specific websites like a local city’s tourism site, local business directory or very in-depth, niche fan site might be the equivalent of the “best friend recommendation”. They may not be an expert in what everyone likes, but they definitely know what works for you as an individual and in some cases, that’s more valuable.
  2. Expert recommendation: Well-known sites with a lot of inherent trust, like the BBC or Harvard University, are like the established movie critics. Broadly speaking they are the most trustworthy, but possibly lacking the context for a specific person’s needs. In the absence of a highly targeted type of content or service, these will be your strongest links.
  3. Popular recommendation: All things being equal, a lot of backlinks from a lot of different sites is seen as a signal that the content is relevant and useful.
  4. Negative association: Links that are placed via spam tactics, that you buy in bulk, or that sit on sites that look like garbage, are the website equivalent of that terrible person whose recommendation actually turns you off the movie.

If a site collects too many links from poor-quality sites, it could look like those links were bought, rather than “earned” recommendations (similar to businesses paying people to write positive reviews). Google views the buying of links as a dishonest practice, and a way of gaming their system, and therefore if they believe that you are doing this intentionally it may trigger a penalty. Even if they don’t cause a penalty, you won’t gain any real value from poor quality links, so they’re certainly not something to aim for. Because of this, some people become very risk-averse about backlinks, even the ones that came to them naturally. But as long as you are getting links from other trustworthy sources, and these high quality links make up a substantially higher percentage of your total, having a handful of lower quality sites linking to you shouldn’t prevent you from benefiting from the high quality ones.

For more in-depth guides to backlinks, check out the following posts:

  • Theory of Links
  • Getting More Links
  • Mitigating Risk of Links

Does anything about SEO actually change?

If SEO is really this simple, why do people talk about how it changes all the time? This is where we have to separate the theory of SEO from the tactics we use as SEO professionals to grow traffic and optimize for better rankings.

The fundamentals that we’ve covered here — crawlability, keywords, backlinks, and site structure — are the theory of SEO. But when it comes to actually making it work, you need to use tactics to optimize these areas. And this is where we see a lot of changes happening on a regular basis, because Google and the other search engines are constantly tweaking the way the algorithm understands and utilizes information from those four main areas in determining how a site’s content should rank on a results page.

The important thing to know is that, although the tactics which people use will change all the time, the goal for the search engine is always the same: to provide searchers with the information they need, as quickly and easily as possible. That means that whatever tactics and strategies you choose to pursue, the important thing is that they enable you to optimize for your main keywords, structure your site clearly, keep your site accessible, and get more backlinks from more sites, while still keeping the quality of the site and the backlinks high.

The quality test (EAT)

Because Google’s goal is to provide high-quality results, the changes that they make to the algorithm are designed to improve their ability to identify the highest quality content possible. Therefore, when tactics stop working (or worse, backfire and incur penalties), it is usually related to the fact that these tactics didn’t create high-quality outputs.

Like the fundamentals of SEO theory which we’ve already covered, the criteria that Google uses to determine whether a website or page is good quality haven’t changed all that much since the beginning. They’ve just gotten better at enforcing them. This means that you can use these criteria as a “sniff test” when considering whether a tactic is likely to be a sustainable approach long-term.

Google themselves refer to these criteria in their Search Quality Rating Guidelines with the acronym EAT, which stands for:

  • Expertise
  • Authoritativeness
  • Trustworthiness

In order to be viewed as high-quality content (on your own site) or a high-quality link (from another site to your site), the content needs to tick at least one of these boxes.

Expertise

Does this content answer a question people have? Is it a *good* answer? Do you have a more in-depth degree of knowledge about this topic than most people?

This is why you will see people talk about Google penalizing “thin” content — that just refers to content which isn’t really worth having on its own page, because it doesn’t provide any real value to the reader.

Authority

Are you someone who is respected and cited by others who know something about this topic?

This is where the value of backlinks can come in. One way to demonstrate that you are an authority on a topic is if Google sees a lot of other reputable sources referring to your content as a source or resource.

Trust

Are you a reputable person or business? Can you be trusted to take good care of your users and their information?

Because trustworthiness is a factor in determining a site’s quality, Google has compiled a list of indicators which might mean a site is untrustworthy or spammy. These include things like a high proportion of ads to regular content, behavior that forces or manipulates users into taking actions they didn’t want to take, hiding some content and only showing it to search engines to manipulate rankings, not using a secure platform to take payment information, etc.

It’s always the same end goal

Yes, SEO can be technical, and yes, it can change rapidly. But at the end of the day, what doesn’t change is the end goal. Google and the other search engines make money through advertising, and in order to get more users to see (and click on) their ads, they have to provide a great user experience. Therefore, their goal is always going to be to give the searchers the best information they can, as easily as they can, so that people will keep using their service.

As long as you understand this, the theory of SEO is pretty straightforward. It’s just about making it easy for Google to answer these questions:

  1. What is your site about?
    1. What information does it provide?
    2. What service or function does it provide?
  2. How do we know that you’ll provide the best answer or product or service for our users’ needs?
  3. Does your content demonstrate Expertise, Authoritativeness, and/or Trustworthiness (EAT)?

This is why the fundamentals have changed so little, despite the fact that the industry, technology and tactics have transformed rapidly over time.

A brief caveat

My goal with this post is not to provide step-by-step instruction in how to “do SEO,” but rather to demystify the basic theory for those who find the topic too overwhelming to know where to start, or who believe that it’s too complicated to understand without years of study. With this goal in mind, I am intentionally taking a simplified and high-level perspective. This is not to dismiss the importance of an SEO expert in driving strategy and continuing to develop and maximize value from the search channel. My hope is that those business owners and entrepreneurs who currently feel overwhelmed by this topic can gain a better grasp on the way SEO works, and a greater confidence and ease in approaching their search strategy going forward.

I have provided a few in-depth resources for each of the key areas — but you will likely want to hire a specialist or consultant to assist with analysis and implementation (certainly if you want to develop your search strategy beyond simply the “table stakes” as Rand calls it, you will need a more nuanced understanding of the topic than I can provide in a single blog post).

At the end of the day, the ideas behind SEO are actually pretty simple — it’s the execution that can be more complex or simply time-consuming. That’s why it’s important to understand that theory — so that you can be more informed if and when you do decide to partner with someone who is offering that expertise. As long as you understand the basic concepts and end goal, you’ll be able to go into that process with confidence. Good luck!

About Bridget Randolph —

Bridget Randolph is an SEO and marketing consultant. She recently took the leap into the freelance world after 4 years of agency life at Distilled, and a brief in-house stint with Hearst Digital Media. She especially enjoys learning about mobile technology, social media, and conversion rate optimization. Bridget is also interested in how different types of organizations implement a digital strategy, particularly arts and publishing brands.

Google: Check Your Mobile Site Has Alt Text Tags on Images

Most site owners are pretty good about ensuring their images have alt tags, especially ones that might also be used as links, as Google will use that alt text for SEO purposes.  But John Mueller had an interesting side note about the use of images when they are on a mobile site, which could have a pretty big impact for the upcoming mobile first index.

“Specifically around images in general, so not specific to image links, but images in general, especially on mobile, make sure that your images on mobile also have an alt text.”

This does make me wonder if this might be an issue Google is seeing as it crawls and compares the desktop version of a webpage with the mobile version in preparation for the mobile-first index, where Google will swap its search index from the desktop version of a page to the mobile version. We do know that links in general are one issue with mobile-first, so it is likely something they are also seeing with image links on mobile.

Bottom line, when auditing your website in preparation for mobile-first, checking that your images have appropriate alt tags should be on your list. And don’t forget it isn’t strictly for SEO purposes; it matters for usability too, for visitors who can’t load images due to connection speed, and for web accessibility reasons.
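Auditing a page for missing alt text is easy to script. Here’s a minimal sketch using only Python’s standard library; point it at the HTML of your mobile pages (the sample markup below is made up for illustration):

```python
from html.parser import HTMLParser

class MissingAltAuditor(HTMLParser):
    """Collect <img> tags that have no alt attribute (or an empty one)."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            # Note: an intentionally empty alt="" is valid for decorative images,
            # but it's worth reviewing, so it gets flagged here too.
            if not attributes.get("alt"):
                self.missing.append(attributes.get("src", "(no src)"))

sample_html = """
<img src="/images/product.jpg" alt="Red waterproof hiking boot">
<img src="/images/banner.jpg">
<img src="/images/logo.png" alt="">
"""

auditor = MissingAltAuditor()
auditor.feed(sample_html)
for src in auditor.missing:
    print("Missing or empty alt text:", src)
```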

 

The 7 rules of respectful marketing

As more people implement ad blockers and brand safety takes on new urgency, columnist Lewis Gersh discusses how brands can put the customer experience front and center.

Consumers are in open rebellion against how disrespectfully they’re treated by your marketing. Millions now use ad blockers to escape the relentless barrage of online ads, resulting in 32 percent of global page views being impacted by ad blocking. And the rest don’t need an ad blocker because they’re conditioned to ignore ads. Virtually all email users take advantage of spam protection software and cite receiving too many emails as the top reason for unsubscribing.

This is why it is absolutely imperative for brands to invest in respectful and relevant marketing. They must realize their customer experiences begin with their marketing, and that the customer has a reasonable expectation of respect. This is a two-way street, and you do harm to your own brand by plowing ahead with blinders on.

With this in mind, a brand should:

1. Be vigilant about where your partners (and their partners’ partners) get their data

When buying data for ad targeting and retargeting purposes, vet individual partners and use only those that are a validated, safe, primary data source. But even then, how do you know where that partner is getting their data? They could (unknowingly) be relying on a whole basket of providers from across the ethical spectrum.

Be thorough as you investigate, and only work with partners who use reliable, trustworthy data. They’re out there, and they generally have names you recognize.

2. Boycott data gathered from ambient listening and other unethical techniques

Ambient listening and scraping people’s text messages and emails are highly unethical and odious practices, and you want to stay far away from them. Add language to your contracts with data providers to ensure they will not knowingly give you data obtained unethically. Add a damages clause, and if they push back, run.

3. Favor publishers with good practices

Marketers should channel their dollars to publishers who prioritize quality content and a good user experience, especially ones that meet viewability guarantees without sacrificing UX.

Marketers should do business with publishers that have strong, plain-language privacy policies, and go publisher direct via private marketplaces where appropriate to reduce ad fraud.

4. Consider taking your programmatic in-house

Many major brands are setting up in-house programmatic efforts to reduce costs, gain transparency into what they’re buying and keep control of their first-party data. It’s a significant investment of time, money and resources — so at the very least, getting more involved in optimizations at the keyboard is worth the effort.

5. Fight ad fraud and deliver ‘healthy impressions’

A “healthy impression” is one that’s presented respectfully, at the proper time, in the proper context and situation. Fraudulent ads aren’t healthy, because they have no chance to influence a consumer. Capping the frequency of retargeting impressions and suppressing retargeting after the consumer has made a purchase are also respectful and healthy practices.

Marketers would be well served by a constant reminder that they are consumers as well. Whatever your personal threshold is, multiply that by 10 for the average consumer not enamored of (or employed by) marketing.

6. Consider regulation with teeth

Take an active, transparent and public stance. The industry should insist on effective regulation with criminal penalties to prevent some of the more egregious digital marketing abuses. CAN-SPAM was mostly ineffective at deterring spam; it wasn’t until the free market created spam filters that spam was brought under control.

7. Consider brand safety precautions

The recent furor over digital ads appearing in, or adjacent to, objectionable and low-quality content has raised questions about current (and expensive) automated brand safety services that don’t work well enough. Demand to see where your ads run in real time, not after the campaign has ended. Put the time into your whitelists and blacklists, as others are doing.

While it’s true that the above practices are expensive and difficult, the respect you give and results you get will be well worth it. Why invest in display advertising that’s more likely to hurt your brand and waste your budget? Doesn’t it make sense to pay more, in an effort to finance a well-targeted and well-received campaign?

Digital advertising has fallen into an unhealthy pattern. Each new technique introduced enjoys high ROI for a while. Marketers overuse the technique, then it gets abused. Consumers, feeling harassed and disrespected, rebel and stop responding. Efficacy falls. Marketers, desperate for results, focus on efficiency to do more with less. This exacerbates the situation and worsens the abuse of the consumer.

It’s up to marketers to restore good health to the medium. Instead of trying to do more with less, focus on doing better. Put customer experience ahead of marketing ROI for a moment — just for a moment! — and you’ll start seeing fewer ad-blocking customers and better results in the long run.



How to Remove Twitter Caches from iPhone and iPad

Twitter for iPhone and iPad has manual cache clearing features built into the app, offering a way to forcibly dump excessive caches and data stored within the application in iOS, thereby freeing up some storage. This is particularly nice because iOS does not offer a way to manually clear caches from an iPhone or iPad, so if you want to delete an app’s Documents and Data in iOS you have to either force the iOS “cleaning” process on a near-full device, or delete the app and re-download it.

But that’s not the case with the Twitter app, which is nice enough to include a way to manually clear out its own documents and data cache storage within the iOS app.

How to Empty Twitter Caches on iPhone, iPad

Clearing Twitter caches on the iPhone and iPad is easy; here’s all you need to do:

    1. Open the Twitter app and go to your profile page
    2. Tap on the gear icon
    3. Tap on “Settings” in the menu options
    4. Choose “Data usage” from the settings menu
    5. Under the ‘Storage’ section, look for “Media Storage” and “Web Storage” – alongside each you’ll see how much storage it is taking up
    6. Tap on either Media Storage or Web Storage and then choose “Clear Media Storage” or “Clear web page storage” to remove the caches for those items in the Twitter app
    7. Repeat with the other cache type if desired

This is a great tip for heavy Twitter users, particularly after the Twitter app has grown in size with a large “Documents & Data” storage burden, since manually removing those caches and stored data will free up a notable amount of space on an iPhone or iPad.

Of course if you have practically no data in the Twitter app with nothing cached and the app not taking up much storage, this won’t be particularly helpful to you. And obviously if you don’t use Twitter this won’t be useful to you either.

Hopefully at some point Apple will introduce a feature in iOS that allows users to force any apps on an iPhone or iPad to dump and clear out their built-in storage and documents and data without having to rely on the delete and re-download trick. But for now, only certain apps have manual cache clearing functionality, including the aforementioned Twitter app, and you can manually clear Google Maps caches on the iPhone too.

3 reasons SEO belongs at the beginning of a project, not the end
Search engine optimization (SEO) can sometimes be treated as an afterthought, but columnist Nate Dame outlines how keyword research can be instrumental in planning and developing an effective content calendar.
Nate Dame on July 26, 2017 at 11:28 am

Too many marketers still bring SEO in at the end of a content marketing project. They finish a blog post or finalize a new marketing campaign, and at the end of the line, SEO comes in to find related keywords and plug them into content.

Unfortunately, this approach is outdated and completely ineffective.

Robust, modern SEO research can decipher who your real audience is online, where visitors are in the buyer’s journey, what information they’re looking for, and what content format they prefer. These insights lead to more effective content strategies.

But if SEO is only given a voice at the end of the line, it’s too late to utilize the insights it provides. To fully enjoy the benefits — and optimize every piece of content — SEO must be a foundational part of every project from the beginning.

1. Keywords should help determine content, not decorate it

Most online experiences start with keywords — so marketers should, too.

There are a lot of ways to source ideas and inspiration for a content calendar, but questions that come in to customer service and comments overheard in the office aren’t necessarily what digital users are looking for. Content with SEO value needs to be inspired by the questions and needs that are being asked online.

And the best source for that information is Google.

Effective content strategies start with keyword research, because modern keyword research provides significant insight into what audiences want and need. The process enables marketers to identify user needs, brainstorm content ideas that satisfy those needs, and create the right content the first time. It also helps generate ample content ideas for filling editorial calendars.

2. Keyword research should define content

When expanded to discover user intent, keyword research tells marketers how to create content: what type of information is needed, who needs the information, and where searchers are in the buying journey.

Marketers can conduct searches for keywords and use search results to gather important content insights:

  • What type of information is needed? If search results for your target keyword offer beginner-level information, how-to guides or basic definitions, then the goal behind the keyword is general knowledge acquisition, or learn intent. If results include product feature or pricing comparisons, user reviews and brand landing pages, then the goal behind the keyword is to make a purchasing decision, or purchase intent.
  • Who needs the information? If search results are highly technical or provide very detailed information, the audience is likely individual contributors. If search results are high-level thought leadership pieces, the audience is more likely executives. User intent research provides insight into the persona who’s searching for a specific keyword, which allows marketers to personalize content for the appropriate audience.
  • Where are searchers in the buying journey? Keywords that result in introductory-level content are commonly used by searchers who are still in the “building awareness” stage — they may not even know they have a problem that needs to be solved. On the other hand, keywords that produce product comparison content suggest that users are aware they have a problem and have decided to make a purchase to solve it.

This type of keyword research enables the ideas captured in editorial calendars to be expanded with incredible detail. When content is written, it will include the right information, be addressed to the right audience and cater to the appropriate journey position, eradicating the waste caused by discovering these details after content is completed.

3. SERP analysis reveals Google’s ranking priorities

SERP analysis helps marketers decipher Google’s algorithmic preferences. In order to earn high rankings, marketers have to know which factors are priorities for their industry, content, niche, etc. Otherwise, they may waste time and effort pursuing things — such as certain keywords — that might not be important to their business.

Marketers can analyze page-one search results to develop a personalized list of ranking factors for their niche or industry. By comparing consistencies in top results — such as publication date, page load speeds, use of visual media, and related topics covered — the most pertinent ranking factors can be identified and addressed:

  • If most results point to video content, it’s because users who search for that keyword prefer videos.
  • If most results point to text content that exceeds 3,000 words, users are looking for comprehensive, long-form content.
  • If results are a blend of tables, infographics, videos, and slideshows, users searching for that keyword likely prefer visual content.

Identifying your top ranking factors allows marketers to anticipate user preferences and create the right content, in the right format, the first time.

Using SEO research to improve existing content

Don’t worry if your content has already been developed and published — it’s never too late to get started. Auditing existing content using SEO research insights is an extremely effective way to boost engagement and rankings without starting from scratch. Identify underperforming content, extract targeted keywords, and conduct user intent research.

  • Start by determining if existing content satisfies user intent. If search results for your target keyword are mostly defining the term and teaching about the concept, but your website is only offering a product page, you’re missing the mark on user intent. That page needs to either be updated to target the correct intent, or new content should be created to target the right intent.
  • Next, check content against buying journeys and personas. New insights might show that you have an excess of early-stage content, but nothing that really speaks to the decision-maker who’s ready to purchase. Make sure you have content that caters to every stage of the journey — and calls-to-action that help guide the user on your site — so you’re equipped to take leads all of the way through the funnel.
  • Finally, compare content as a whole to what you gathered in your research. Determine if content is in the best possible format for engaging users, take time to understand if it’s written for the right audience, and consider if calls-to-action and other navigational elements are appropriate for the stage of the buying journey that content caters to. Revise any elements that aren’t aligned with user intent insights.

This exercise allows marketers to derive more value from existing content, as well as expand the editorial calendar even further by identifying content gaps that need to be filled.

Modern SEO keyword research creates effective content strategies

Conducting keyword research at the end of a project, or after content is written, is little more than a shot in the dark for effective SEO. It’s user intent research that forms the groundwork for increased engagement and rankings by illuminating detailed information about who’s searching for specific keywords, what information they’re looking for, and what type of content they prefer.

Google — the master of connecting keywords to user goals — has all of the information needed to create effective content strategies, if we’re willing to read between the lines.

It’s never too late to change your approach to SEO. If your SEO efforts aren’t “working,” and your content isn’t increasing conversions, it’s time to shift. If you’re not convinced, or need to start slowly, begin with under-performing content. Look at search results for target keywords and see what insights you’ve missed.

11 Ways Social Media Will Evolve in the Future

By: Chirag Kulkarni, guest writer. Entrepreneur and marketer; CEO of Taco.

Where social media can improve and grow going forward.

The 2016 election was a painful time for most Americans. It was so mentally strenuous that psychologists are still talking about post-election anxiety several months after Election Night.  And where did we process all that anxiety and frustration?

Why, Facebook and Twitter, of course. Some have even gone so far as to blame the results and tone of the election entirely on social media and the way real and fake information was shared.

Who among us isn’t still suffering aftershocks? Who doesn’t have strained relationships with friends and family after one too many political opinion posts? Who hasn’t been affected by use of the “delete” button? Social media, in its role as ground zero for viral political commentary, is invaluable, unavoidable, and exhausting.

But that’s not the only social media shift happening.

Demographics of social media are changing. Teens have been leaving Facebook in droves for years, and many can’t even be bothered to join because it’s what their parents use. In 2016, Facebook saw a 21 percent drop in original, personal updates as users began communicating more and more through shared articles and memes alone.

Privacy concerns are getting more pronounced as people become more aware of data harvesting, adding to previous concerns of identity theft.

Many in the industry are predicting massive changes as we move into the dramatically shifted post-election social media landscape. I recently interviewed Jeanne Lewis, CEO of Capsure, a new private social network for preserving memories, and she says, “Social media has really gotten away from us. It’s gotten to the point where we work for it, not the other way around. With social media as it’s been, the users are the product, which has caused some real rifts and problems between loved ones. It’s just not connecting us the way it was supposed to.”

Lewis isn’t the only one who feels that way, but she’s something of an expert on the subject, and she had some excellent points on how social media will transform in 2017 and on.

1. A focus on relationships

One of the first social networks was Friendster, a name which implies its purpose: Forming and maintaining friendships. That was how MySpace and Facebook ostensibly began, as well. However, as they’ve progressed, they’ve become more about personal brand maintenance and attempts to form and join various short-lived zeitgeists.

“Social platforms today have evolved into a broadcast tool both for companies and individuals,” says Lewis. “While this is valuable when you have a broad announcement to share and want to reach as many people as possible, these are no longer the vehicles for sharing photos of your kids, recording audio or staying connected with your inner circle of family and friends.”

To fix this, social media will probably begin to draw the focus back into relationships by emphasizing personal posts, photographs and small, intimate connections over outside content like memes and articles.

2. Diversity of personal posts

Until now, posts have been limited to outside material, pictures, videos and text. Don’t be surprised if, going forward, new players will introduce more diverse posting options, intermingling audio and visual components to create a unique experience for people viewing and creating posts. As digital technology progresses, people will be hungry for new and interesting ways to share experience.

Lewis emphasizes the importance audio will play in social media’s future: “Just as many of us gathered around a cassette recorder in our early childhood, the unique power of audio can be experienced once again using our smartphones.”

3. Users will pay for peace

Premium service will make a splash. This one sounds counterintuitive — after all, who would pay for a social media experience when they’ve all been free up to this point? Two things will happen to change that previous wisdom. First, with a more personal, story-driven experience, customers will want high-quality images, videos and audio files stored for posterity. Second, having a place to escape constant advertisements will become very important, something a premium social media experience will offer.

“In order to ensure our digital memories are stored and preserved there should be a direct and clear relationship between compensation and the service provided,” says Lewis. “Otherwise, what assurance do users have?”

4. Different types of groups

Google Plus tried something like this before to little success, but spurred by the frustration caused by people seeing the wrong posts, social grouping will make a comeback. In the last political cycle, many relationships were tested unnecessarily when people felt attacked by never-ending political rants. If you don’t want Grandma to see your stances on gay marriage, put her in your non-political group. This will become very important for relationship maintenance.

“Context is everything,” says Lewis. “The person we are with our family is not necessarily the person we are with our college friends. Nevertheless, there’s a desire to stay connected with all of these groups but in a separate forum.”

5. Increased focus on privacy

Privacy concerns have plagued social media since its inception, and are only getting more pronounced. Expect future social media companies to offer more advanced network and profile privacy than ever before.

6. Less gamification

One of social media’s key components is that it’s highly addictive, even going so far as to be described as more addictive than cigarettes. New platforms will try to gear more toward long-term customer wellness as a feature by staving off more addictive qualities. They will focus more on the communal quality of social media rather than offer quick hits of serotonin from gratifying and frustrating outside content.

7. Legacy building

As has been said many times, the internet is written in ink. It cannot be erased, and in the future, people won’t want it to be. Users will want their social media to exist as an ongoing time capsule, a living record of their lives. Smart platform builders will realize posting shouldn’t be a burst about a single moment in time, consumed in a few seconds and immediately forgotten, but a multi-faceted, interactive diary involving many writers, all telling pieces of their own and others’ stories.

“We’ve arrived at a place where we are as thoughtful about capturing a personal moment to preserve as we are about carefully curating our Instagram feed,” says Lewis. “It’s a question of the legacy you want to leave behind.  If someone has 2 hours to flip through your life’s journey, what do you want them to see?”

8. Open to experimentation

The main social media giants are slow, lumbering machines, resistant to change, and unbearably clumsy when they do change.

Future models will have seen platforms of the past try different things to different levels of success and will be open to explore. They will try out wildly different ways of managing contacts, befriending people, organizing interface layouts, etc. Facebook has had basically the same layout since its beginning — don’t expect that to be the case with new platforms.

9. Mobile-native

A mammoth advantage new platforms will have is that they came about after the smartphone became ubiquitous. Facebook and Twitter both launched before they could really function on a mobile phone, but future platforms will be designed with phones in mind from the beginning.

No clumsy borrowing between web and phone platforms — seamless integration. The future of the internet is mobile, so it stands to reason that mobile-native platforms will be built to last.

10. Build us up, don’t tear us down

The self care and heartfulness movements are big right now for a reason. In a world as chaotic and terrifying as ours, with such a constant barrage of information and stimuli, personal well-being is a thing we must actively pursue and maintain.

Family and friend communities have been part of humanity since there was humanity, and they’re there to build us up. Social media will begin to recognize that again.

11. Video, video, and more video

In late 2016, we saw a major development in social media video when Instagram released Instagram Stories and Instagram Live. Instagram’s parent company Facebook also released Facebook Live and Messenger Day. The focus on the live format follows in the footsteps of Snapchat and Twitter’s Periscope.

According to Jay Singh, CEO of PHL Venture Company, “We continue to see a shift toward live content that is composed through a camera. The camera keeps growing in importance and the ability to see through other people’s lenses in real time is becoming a powerful force in social media.”

Conclusion

The most recent political cycle has exposed a lot of what was rotten in social media, making us all so constantly aware of what is wrong with ourselves and others that we barely have space in our heads for anything other than frustration and anxiety.

If they’re smart, new social media platforms will understand this and create a new kind of social networking — a kind that actually feels like a personal asset instead of a detriment.