Month: March 2017
Understanding HTTP Status Codes, for Ecommerce
MARCH 28, 2017 • ERIC DAVIS
An HTTP status code informs a browser about a web page or resource. When you click a link, your browser is asking the web server for the page that was linked to. If the server finds that page, your browser receives it, along with the status code of 200, meaning the page was found successfully.
There are many HTTP status codes. Every page, image, CSS stylesheet, JavaScript, and, in fact, every file transferred on the web using HTTP or HTTPS uses these status codes. They are an important technology that most people never see.
For example, loading the home page of Practical Ecommerce returns the 200 status code 83 times. That’s just one page on one site.
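If you want to poke at these codes yourself, Python’s standard library ships the full status-code registry, so you can look up a code and its standard phrase without memorizing anything. A quick sketch, assuming Python 3:

```python
from http import HTTPStatus

# Each status code carries a short human-readable phrase.
ok = HTTPStatus(200)
print(ok.value, ok.phrase)  # 200 OK

# Look up a few of the codes covered in this article.
for code in (301, 404, 500):
    status = HTTPStatus(code)
    print(code, status.phrase)
```

`HTTPStatus` members compare equal to plain integers, so `HTTPStatus.NOT_FOUND == 404` is true, which makes them convenient in ordinary comparison code.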
Knowing the common status codes is important for all web developers. They might not be looking for them specifically, but recognizing an error page (500) when it should have been a redirect (301) is a necessary skill.
It’s also useful for non-developers to be aware of status codes, especially the common ones. Since codes are the language of how the web works, being fluent with them can help understand how your store functions. This is especially important if you communicate or manage web developers.
All of that is doubly important if your store uses, consumes, or interacts with an application programming interface. HTTP status codes are, essentially, the switchboard for APIs.
Status Code Organization
Status codes are grouped into five categories.
100 group: Items in progress.
200 group: Successful responses.
300 group: Redirects, which tell the browser to look someplace else.
400 group: Browser errors, also called client errors.
500 group: Server errors.
This grouping helps you identify the general type of a status code even if you don’t know the exact code. If you saw a 403 error, you’d know it relates to the request your browser sent. A 511 code must have something to do with the server.
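Because the first digit carries the category, mapping any code to its group is a one-line calculation. A minimal sketch (the function name is mine, not a standard API):

```python
def status_category(code: int) -> str:
    """Map an HTTP status code to its general group via the first digit."""
    groups = {
        1: "in progress",
        2: "success",
        3: "redirect",
        4: "client error",
        5: "server error",
    }
    return groups.get(code // 100, "unknown")

print(status_category(403))  # client error
print(status_category(511))  # server error
```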
HTTP Status Codes
What follows are the status code groups and the codes within each group that are important to ecommerce sites. For the list of all status codes, see Wikipedia’s HTTP status code page, though it is very technical.
I’ve noted, below, when a status code is more common in API development.
100 Group: Items in Progress. You can ignore the 100 group of status codes. They are rarely used outside of data streaming.
200 Group: Success. The 200 group includes the generic 200 OK response, which is common.
200 OK. The request was accepted successfully with no problems.
201 Created (API). The request successfully created something.
204 No Content (API). The server processed the request successfully but has nothing to send back.
300 Group: Redirects. The 300 group contains the two common redirect responses, 301 and 302.
301 Moved Permanently. The request was successful but the browser should use a different URL. This usually includes the new URL. When a web browser gets this code, it will automatically open the new, redirected URL from the server.
302 Found. This is typically called a temporary redirect. It functions the same as the 301 but on a temporary basis.
304 Not Modified. This means that the browser already has the latest version of this URL and it should use that version. This is commonly used with caching to speed up repeat views of a page or file.
The 301 and 302 redirects have important search-engine-optimization considerations. A 301 will retain most of your existing SEO benefits — such as inbound links and page reputation — and transfer them to the new URL. A 302 will not. For more on this, read “3 Server Errors That Drain SEO” from contributor Jill Kocher.
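The redirect-following behavior described above is easy to simulate. The sketch below uses a hypothetical in-memory table standing in for real server responses, where each URL maps to a status code and an optional Location header:

```python
# Hypothetical responses: URL -> (status code, Location header or None).
RESPONSES = {
    "/old-shop": (301, "/shop"),   # permanent redirect
    "/sale": (302, "/shop"),       # temporary redirect
    "/shop": (200, None),          # final page
}

def resolve(url, max_hops=5):
    """Follow 301/302 redirects the way a browser would."""
    for _ in range(max_hops):
        status, location = RESPONSES[url]
        if status in (301, 302) and location:
            url = location  # the browser automatically requests the new URL
            continue
        return url, status
    raise RuntimeError("too many redirects")

print(resolve("/old-shop"))  # ('/shop', 200)
```

Real browsers and HTTP libraries apply a hop limit just like `max_hops` here, so a misconfigured redirect loop fails fast instead of hanging forever.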
400 Group: Client Errors. The 400 group includes one-half of the error codes. (The other half is 500 group errors, below.) 400 group errors relate to the client or browser.
400 Bad Request. This is a generic error returned when a browser sends a malformed or invalid request. While there are more specific error codes, some servers use just this catchall code.
401 Unauthorized (API). This happens when a browser is not authorized to see or use a page. Typically this protects private information.
403 Forbidden (API). This is similar to the 401 Unauthorized error. The difference is that with a 403 error, someone is logged in correctly but he doesn’t have permission to access something — e.g., John trying to look at Mary’s credit card details.
404 Not Found. The server cannot find anything at the requested URL.
405 Method Not Allowed (API). This is a common error with API development. It occurs when the incorrect HTTP method was used, such as a form that tries to send its data to a URL that doesn’t take form data.
429 Too Many Requests (API). Many APIs limit how quickly they can be used. When you exceed that limit, this error lets you know that you’ve hit it and should slow down your usage.
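The standard client-side response to a 429 is to wait and retry, backing off further on each attempt. Here is a simple exponential-backoff sketch with a fake API standing in for a real rate-limited endpoint; real clients should also honor the Retry-After header when the API sends one:

```python
import time

def call_with_backoff(request_fn, max_attempts=4, initial_delay=1.0):
    """Retry a request when it returns 429, waiting longer each time."""
    delay = initial_delay
    for _ in range(max_attempts):
        status, body = request_fn()
        if status != 429:
            return status, body
        time.sleep(delay)  # back off before trying again
        delay *= 2         # double the wait on each retry
    return status, body

# Fake API that rate-limits the first two calls, then succeeds.
attempts = {"n": 0}
def fake_api():
    attempts["n"] += 1
    return (429, None) if attempts["n"] < 3 else (200, "ok")

print(call_with_backoff(fake_api, initial_delay=0.01))  # (200, 'ok')
```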
500 Group: Server Errors. While the 400 group deals with client errors in the browser, the 500 group covers server errors. There aren’t as many of these, and the generic 500 error is the most common. Ideally, when a server error occurs on your site, your server should notify your team directly with all the details needed to fix it.
500 Internal Server Error. The generic error when something goes wrong on the server.
502 Bad Gateway (API). Sometimes servers communicate with other servers and if the other server doesn’t respond successfully, this code is sent to the browser.
503 Service Unavailable. When a server gets overloaded and fails, it will generate this message. It usually means the user should try again later.
504 Gateway Timeout (API). Similar to the 502 Bad Gateway, this error is more specific and deals with another server not responding at all. An example could be if your payment gateway goes offline.
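One practical payoff of the 400/500 split: a client can decide how to react from the group alone, since 4xx errors will fail again unchanged while 5xx errors are often transient. A minimal sketch (the function name is mine, not a standard API):

```python
def handle_response(status: int) -> str:
    """Decide what a client should do based on the status code group."""
    if 200 <= status < 300:
        return "ok"
    if status in (301, 302):
        return "follow redirect"
    if 400 <= status < 500:
        return "fix the request"  # retrying the same request won't help
    if 500 <= status < 600:
        return "retry later"      # the server may recover on its own
    return "unknown"

print(handle_response(503))  # retry later
```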
Why you need to get back to SEO basics
You can be well-versed on all the latest SEO trends, but columnist Ryan Shelley notes that you need to get the fundamentals down first.
Do a quick search on Google for “SEO tips” and you’ll get over 14 million results. That’s a lot of tips to wade through when trying to figure out the focus of your SEO strategy. What’s more overwhelming is that’s just one search.
Each year there are new posts listing the “hottest” tips and tricks that are “guaranteed” to work. While many of these tips are great, to really see results, you need a good foundation. In this post, I want to talk about getting back to the basics of SEO and why they are essential to long-term success.
When it comes to optimizing your site for search, the basics are some of the most important, yet often overlooked, aspects of SEO. The recent push of “content is king” has also caused many to forget the essentials and just focus on content distribution.
Here’s the deal: you can post all the content you want, but if your site isn’t optimized, you’re not going to get the rankings you want. So here are few basics you should cover before ever diving into the more complex elements of search.
Crawler access
If search engine crawlers have a hard time crawling your site, they’ll have a hard time indexing and ranking your pages, too. As a site owner or SEO, your first and most important job is to make sure that your site is crawlable. Using the robots.txt file, you can help direct and assist the web crawlers that are crawling your site.
There are certain pages on your site that you probably don’t want the crawlers to index, such as login pages or private directories. You can block files, pages and/or directories by specifying them as “disallowed,” like so:
User-agent: *
Disallow: /cgi-bin/
Disallow: /folder
Disallow: /private.html
You can also block certain crawlers from accessing your site using the following (replace “BadBot” with the actual bot name you’re trying to block):
User-agent: BadBot
Disallow: /
Just be careful when blocking crawlers from your entire site; in fact, don’t do it unless you know for a fact that a particular bot is causing you trouble. Otherwise, you may end up blocking crawlers that should have access to your website, which could interfere with indexing.
If you are using WordPress, there are a number of plugins that can help you do this. If you are not using WordPress, you can also easily set up a robots.txt file on your server. Learn more about robots.txt here.
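Before deploying a robots.txt, it’s worth checking that the rules actually block what you intend. Python’s standard library can parse the example rules from above and answer “can this bot fetch this path?” with no server involved:

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt rules from above, as a list of lines.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /folder
Disallow: /private.html
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ask whether a crawler matching User-agent "*" may fetch each path.
print(rp.can_fetch("*", "/private.html"))  # False
print(rp.can_fetch("*", "/products"))      # True
```

This is a handy sanity check: a stray `Disallow: /` typo, for instance, would make every `can_fetch` call return False.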
After you’ve created your robots.txt, it’s important to make sure Google can crawl your site. To do so, you’ll first need to create a site map. This can be done manually or with third-party tools. (If you have a WordPress site, there are many plugins available to create site maps for you.)
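A site map is just an XML file listing your URLs in the sitemaps.org format, so generating one by hand is straightforward. A minimal sketch using the standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal XML site map from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```

Real site maps can also carry optional `lastmod`, `changefreq`, and `priority` elements per URL, but `loc` is the only required one.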
Once you’ve created your site map, log in to Google Search Console. (If you haven’t set your site up on Search Console, check this out.) You’ll want to upload your site map by going to “Crawl,” then “Sitemaps” in the left-hand navigation, then clicking on the “Add/Test Sitemap” button in the upper right-hand corner. From there, you can test the site map and submit it to Google for indexation. (Note that it will take some time for Google to crawl and index your site.)
If you have already submitted a site map and just want to test/submit an individual page on your site, you can use the “Fetch as Google” feature, which is also under “Crawl” in the left-hand navigation.
- Once logged in, click “Crawl” in the left-hand navigation.
- Then select “Fetch as Google.”
- From there, enter the URL path of the page you want to test and click “Fetch.” (Leave this blank if you want to test the home page.)
- Check status. It should have a green check and say “Complete.”
- Click “Request Indexing” if available.
Making sure that Google can crawl your site is essential to getting indexed. Without having your site indexed, you will not rank no matter what you do.
Site structure
In today’s mobile-first, user-obsessed web culture, we sometimes overlook the simple and practical. While I am all for a good user experience and a huge believer in being mobile-first, I also believe we can’t forget the search engines. Having a solid site structure will add to your user experience and will help you rank better.
While this seems like a simple idea, building a good site structure takes time and planning. Not only does it impact your navigation and site links, it also helps the crawlers better understand your content and context. Site structure is all about putting your content together in a logical fashion. Don’t make your users or the search engines dig to find what they came to your site for. Learn how to create a great site structure here.
Titles and meta descriptions
Titles and meta descriptions are some of the most basic elements of SEO. While “titles” are considered in the ranking algorithm and descriptions are not, they both are still very important. Google may not use descriptions as a ranking signal, but that doesn’t mean they ignore them. The crawlers still read the descriptions — and any chance you have to tell the crawlers about your page, you should take it.
The title and the description are often the first things your potential visitors come in contact with in the SERPs. Here are a few tips for creating better titles and descriptions.
Titles
Optimize your title tag around the core focus of your page.
Don’t “keyword stuff.”
Stay within 50 to 60 characters.
Make it relevant to your users.
Don’t have duplicates.
Descriptions
Make it action-oriented.
Add your primary keyword.
Make copy easy to understand.
Stay within 135 to 160 characters.
Don’t have duplicates.
Having better titles and descriptions can lead to higher click-through rates and increase the visibility of your site in search. It’s important to note that if Google thinks your provided meta data doesn’t meet the users’ intent, they will alter it.
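Since the length guidelines above are concrete, they are easy to automate as a quick audit. A minimal sketch, using the character ranges suggested in the lists (the function name is mine):

```python
def check_meta(title: str, description: str):
    """Flag titles and descriptions outside the suggested length ranges."""
    problems = []
    if not 50 <= len(title) <= 60:
        problems.append("title should be 50-60 characters")
    if not 135 <= len(description) <= 160:
        problems.append("description should be 135-160 characters")
    return problems

print(check_meta("Short", "Too short"))  # both checks fail
```

Running this over every page in your site map is one way to catch truncated titles before Google does; remember that duplicates across pages are a separate check.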
Before jumping into the latest and greatest SEO tactic, make sure you do the basics first. It’s amazing what a few simple tweaks and adjustments can do for your site and overall online marketing strategy. Make sure your site is crawlable, create a structure that is both user- and search-engine-friendly, and take the time to create better titles and descriptions. Doing the basics will help you build a strong foundation for long-term success.
How Many Pages Should Your Website Have?
MARCH 2, 2017 BY ERIC ENGE
So an SEO walks into a store to buy a pair of pants…
Hey, that’s not just any SEO, that’s our CEO and lead author of The Art of SEO Eric Enge! And that pants salesman looks suspiciously like Stone Temple Senior Consultant Brian Weiss! Anyway, Eric just wants a simple pair of pants, but Brian gives him so many ridiculous choices, Eric just gives up and goes to another store.
Many large site owners treat Google the same way. They create so many pages that are essentially about the same thing, that Google gives up on trying to rank and index them. So how many pages should your site have? How do you determine which pages should stay and which should be cut? Find out in this video! (Transcript after the video)
Transcript
Eric: We see a lot of cases where sites end up creating many versions of the same pages, or many pages that are too similar to each other, because they think it’s going to be good for Google and it actually ends up getting them into trouble. So the question is, why are these sites creating so many unhelpful pages?
Brian: Well, fundamentally, search engines rank pages as opposed to sites, and the best way to rank for a given keyword or phrase is to have an individual page dedicated to that keyword.
Eric: So then, if I’m being a good webmaster or SEO manager, the best thing to do for SEO is to make sure I have an individual page for every single keyword phrase related to any step of the sales funnel for every product that I sell.
Brian: Well, we know that you’re being facetious about that. But actually, for a long time, that wasn’t such a bad strategy to take. The more pages the better is a philosophy that worked quite well for a lot of sites up until 2011, when Google first released the Panda algorithm as part of their ranking formula.
Eric: Yes. As an agency, we’ve dealt with around 50 Panda-penalized sites, and in nearly every case the problems were stemming from page bloat.
Brian: Right. Apart from Panda, Google has other algorithms including the doorway and quality updates that were targeted towards incentivizing the removal of low-value pages. Beyond the newer algorithmic adjustments, there are issues that have always existed of PageRank and crawl budget dilution, where at a certain point the marginal cost of adding pages simply becomes greater than the marginal benefit.
How do I determine the right number of pages for my site?
Eric: So if we need targeted pages in order to rank, but too many pages gets us into trouble, how do we determine what the right number of pages is?
Brian: Well, Eric, I’ll tell you. It’s a million pages.
Eric: Wow. There you have it, folks. No, wait. Really?
Brian: All right. Well, I lied. There’s no best number. But there are some guidelines you can use to determine where you should draw the line.
Would this page be part of my user experience if SEO wasn’t a factor? If the answer is no, then it’s a sign you may be on shaky ground.
Does this page have unique value, and is it substantially different than other pages on this site?
The Special Case of Faceted Navigation
Eric: So okay. Are there cases where a page might be good for users then, fulfilling rule number one, but not differentiated enough for Google?
Brian: Yes. The classic example of this is faceted navigation, in which you create pages to give users lots of flexibility to filter products. You then end up creating thousands or millions of combinations of attributes within each category that have very little differentiation between them from a search perspective. I’ve even seen an example on a major retailer, where there were over a billion URLs created…
Eric: Wow.
Brian: …just from the color options on the page. That’s the type of thing that’s going to make Google simply give up on crawling that section of the site. So if you have faceted navigation, you need to look at all the pages you’re creating and what the best way is to let Google index the most important pages without letting them choke on everything.
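The math behind that explosion is simple to sketch. The facet counts below are hypothetical, but they show how a handful of filters multiplies into tens of thousands of URLs per category:

```python
from math import prod

# Hypothetical facet counts for one category page.
facets = {"color": 12, "size": 8, "brand": 40, "price_band": 6}

# Each selectable combination of facet values can become its own URL.
# The +1 accounts for the "facet not selected" state.
combinations = prod(n + 1 for n in facets.values())
print(combinations)  # 33579
```

Four modest facets already yield over 33,000 URL variants of one category page, which is why faceted navigation needs deliberate indexing rules rather than letting every combination be crawled.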
Eric: This is also a place where having a crawler that can crawl 100 million-plus pages is a great deal of help. At Stone Temple, we’ve built a crawler that can operate at that level, and it gives us incredible insight into what’s going on in situations like these. All right. So we have some solid guidelines. But what do you do about the pages that fall in the grey area?
What if SEO risky pages are bringing traffic?
Brian: Well, part of that is really going to come down to your level of risk tolerance. So if you want to know that you’ll be absolutely safe, then it probably means losing some incremental traffic, since we don’t know exactly where Google is going to draw the line. In fact, the place they’re drawing the line could change tomorrow. But with those grey area pages, take a look at whether there are subsets you can cut that aren’t bringing in traffic; those are the easy cuts to make.
Eric: Yes. It’s quite a bit more difficult when the risky pages are bringing in traffic.
Brian: That’s right. There are a few things to look at if the pages are bringing in traffic:
Make sure they’re also bringing in conversions. If a page brings in traffic but doesn’t impact revenue, it’s safer to cut.
Look at whether this type of page is normal within your industry and across your competitors. Now, just because everybody else is doing it, doesn’t necessarily make it right. But it may help you determine where Google is currently drawing the line.
Try to figure out if there’s a way to improve the pages, so that they do provide unique value.
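The three checks above amount to a rough decision procedure. A minimal sketch (the function and labels are mine, not a formal methodology):

```python
def triage_page(traffic: int, conversions: int, is_unique: bool) -> str:
    """Rough triage for grey-area pages, per the guidelines above."""
    if traffic == 0:
        return "cut"            # easy cut: no traffic to lose
    if conversions == 0:
        return "safer to cut"   # traffic but no revenue impact
    if not is_unique:
        return "improve"        # try to add unique value instead
    return "keep"

print(triage_page(0, 0, False))     # cut
print(triage_page(500, 0, False))   # safer to cut
print(triage_page(500, 20, False))  # improve
```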
Eric: Got it. That’s a great point, and there’s a lot more we could drill into in each of these areas. Any other parting advice for someone who’s struggling with what their page count should be?
Brian: Well, as with most things, if it’s important to your bottom line, definitely get an expert opinion. So in this case, an expert would be someone with substantial experience with large sites, and maybe even with thin content or Panda recovery experience, who’s seen lots of examples of what gets people on the wrong side of the line with Google and what has gotten them out of trouble as well.
Mobile page speed is important, but not (yet) for SEO
Columnist Bryson Meunier suggests that mobile page speed, while good for conversions and customer retention, might not be doing much for your mobile search rankings.
There are plenty of good reasons to make your mobile site fast, and Google just reminded us of them with their new industry benchmarks for mobile page speed. Among them:
Improving conversion rate and increasing profit, as 40 percent of consumers will leave a page that takes longer than three seconds to load.
Customer retention, as 79 percent of shoppers who are dissatisfied with site performance say they’re less likely to purchase from the same site again.
But is SEO one of those reasons? Will businesses that optimize their Google page speed score to 100/100 on mobile be able to rank higher than businesses that don’t?
[Read the full article on Search Engine Land.]
8 Ways Technology Is Improving Your Health
We hear all the time about how technology is bad for us. Since the introduction of computers, we spend more time sitting at a desk than moving around at work. We have created a sedentary lifestyle that is wreaking havoc on our lives.
What if I were to tell you that technology has produced benefits? Would you believe me if I said that technology is good for your health?
Most of you wouldn’t believe it at first. You may be able to think of a couple of ways the computer has helped, but you are still stuck on all the negatives that ‘experts’ have shared in the past. The problem with those ‘experts’ is that they focus only on the negatives. They haven’t looked at the many benefits.
So, that’s what we’ll do today. We’ll consider all the ways that technology improves our health. We’ll discuss just how it has boosted results in certain areas of healthcare and what it does for us daily.
Technology Is Everywhere in Medicine
Before we move on to the benefits, it’s worth discussing just how technology is used. It is found everywhere in medicine. Think about the X-ray machines, MRI scanners, and even the research equipment used daily.
People use it every day of the week to find cures for ailments, discover why diseases spread, and create ways to prevent them. Practitioners perform tasks far more accurately than ever before, with keyhole surgery now a popular option for some of the most routine medical needs.
And the technology isn’t just in the hospital. It’s used in your own doctor’s office and even at home. It’s used to prolong life and create a better quality of life for those on around-the-clock care.
The improvements don’t just lead to better physical health. They support better mental health, which in turn improves physical health. Technology improves connections and relationships, offering support to everyone.
We can’t get rid of technology. If we did, we would suffer greatly. Here are just eight ways that technology is improving our health and our lives.
It Pushes Us to Do More Activity
Sure, technology has led to us sitting more. And sitting is the new smoking when it comes to health problems. However, technology has also helped to push us to do more activity.
We need only look at the Fitbit, pedometers, and apps that track our steps. They all encourage us to meet daily targets, letting us set personal goals we know are realistic. While the standard advice is to walk at least 10,000 steps a day, that just doesn’t seem realistic for many. Pedometers and smartphone apps give us more control.
The chances are that as we get closer to a goal, we’re going to work harder to achieve it. We see how we do daily and look for ways to improve our chances of meeting those goals. They don’t mean getting to the gym daily. They just involve getting out and doing more. Some goals can involve doing home workouts or even walking on the spot to increase our step count.
There isn’t much that we need to do to set up these pieces of technology. Most of them involve some type of phone app or computer software just to sign up and create free accounts. We sync our devices, and we get to go off and work our way to being healthier and fitter.
The devices also come with different settings. Some are just designed to count your steps. They’re basic items to get you to do a little more throughout the day. Those who want to increase the amount of exercise they do and track their heart rate will be able to get more advanced options. Some will have exercise modes, count stairs, count calories burned, and even monitor your sleep.
The aim for so many of these new devices isn’t just to improve your activity levels. They are there to improve your overall lifestyle. Devices are set to help you live a healthier and more fulfilling life, helping you monitor your sleep patterns and make sure you drink enough water throughout the day. There’s more to them than just improving one element of your life and making sure your whole body and mind are working together to create a better quality of life.
These apps and devices can also monitor your weight loss efforts. They help you stay within a healthy BMI, so you can focus on protecting your heart health. You will feel better for it, knowing you can keep yourself from accidentally exceeding your daily calorie target or creeping over a certain weight. Of course, staying within a healthy weight range is essential to keeping yourself healthy overall.
These are all personal devices. There’s no major cost for them, with many of them available for less than $200. Some of the apps are completely free to download, so you don’t even need to spend a penny on technology to improve your health.
Better Ability for Communication Between Doctors and Patients
With technology being widely available, chances are that everyone has some sort of access to doctor and health websites. These sites can offer chat boxes and instant messengers monitored by real doctors and nurses. When a patient comes on with a question, the doctors and nurses can provide factual answers and share their thoughts and advice.
Better communication is essential for protecting health. It helps keep confusion over online information to a minimum and reduces the number of people queuing up at the hospital with fears that they are dying. The professionals online can read the symptoms and share their assessments, helping to minimize worry.
Individuals who do need to seek medical help will be able to get to the hospital or their own doctor right away. They can take the transcript of the chats to aid with a discussion of the symptoms and working through the reasons for certain medical beliefs. They also have a better understanding of how doctors or nurses come to certain decisions.
Those who don’t need to seek immediate medical attention can reduce their anxiety over their health. This itself improves health, since anxiety leads to stress, and stress leads to high blood pressure and other health problems!
People who avoid doctors for fear of wasting time can get confirmation that they do need help. That eliminates the fear of being thought silly, so they have more confidence discussing all their health problems with their doctor.
When chat boxes aren’t available, telephones have made it easier to communicate and talk to a genuine doctor or nurse. This is the case with many emergency medical phone numbers, which can arrange an out-of-hours appointment when necessary.
Getting people seen immediately protects their health. It also helps to reduce the number of times they need to visit a doctor and keeps waiting times down, since minor ailments are taken care of before they can turn into something major.
More Ability to Do Research into Problems

The internet has certainly opened up the ability to research. We all tend to turn to Google, calling it Dr. Google at times. The search engine lets you input your symptoms, or ask questions about a certain symptom, to find the ailments that involve it. People can look through a list of other symptoms to gauge the chances of suffering from certain ailments.
This is useful when it comes to determining whether to speak to a doctor. An individual can get the basic information and use it to decide whether their condition needs immediate attention. They can also use that basic research to get onto the chat boxes to get the advice from real doctors and nurses, as mentioned above.
Those that already have a diagnosis can take to the internet to do their own research into it. This is especially the case for a condition that they haven’t heard of before or that could be hereditary. They want to find out future symptoms, especially if it is a condition that doesn’t have a form of treatment or cure.
Individuals can find out if there are natural remedies that they can try and talk to others with the same condition. They can follow blogs for people who have that same condition and are living with it. They get to hear about success stories with treatments and learn about support groups in the area. This is especially important for treatments that are either terminal or lead to a lower quality of living.
Those caring for people with certain conditions can also get some support and help. There will be support groups online for carers and advice for people who care for individuals 24/7. Suddenly, the world doesn’t seem as isolating, which can quickly help improve mental health. People gain more confidence in their abilities and find someone who can listen to their vents or problems without judgment and with full understanding; friends are good, but they’re not always able to be the most supportive.
It is important to use the internet sparingly, though. Unfortunately, it can also have the opposite effect and make health worse. You spend all this time researching conditions and fearing the worst, and you end up with anxiety problems. You can end up researching more than talking to a real doctor, dwelling on other patients’ horror stories. It’s important to step away and look for success stories and real doctors’ opinions to balance out some of the negatives.
When you are on websites, you also need to check where the information is coming from. Who writes it, and is it checked by someone in the medical profession? Does the person writing a personal blog really suffer from the same condition? People can write absolutely anything, and there is plenty of misinformation online.
There are reputable medical websites. They usually include links to official studies and reports to help you get all the medical information that you could need. They will consider both pharmaceutical and herbal remedies to help you save money and put your health first. Check the reviews and reputation of any website before you start looking through the information and start trusting it!
There Are Devices That Keep the Body Working as It Should

Some devices are created purposely to promote a healthy body. They are placed inside or outside the body to keep it working as it should. There are also other types of treatments that trigger reactions in the body to support the organs and overall health.
The pacemaker is just one that will come to mind for everyone. This is a device created for those who have heart problems. The pacemaker helps to send electrical currents into the heart to prevent it from suffering from spasms. This little device is a lifesaver for so many people. It keeps the heart pumping as it should, which will support the rest of the body.
This is one of those small devices that you will barely know that you have. It can be used on the young and the old to protect the heart and make sure it works exactly like it is supposed to. In one episode of Grey’s Anatomy, a 16-year-old girl was fitted with a pacemaker to stop seizures, which turned out to be a side effect of a heart defect rather than epilepsy.
The small electrical device runs on a long-lasting battery, and researchers are working on battery-less versions powered by the heart’s own motion. Those without one would live shorter lives and must restrict the things they do, as there will always be the risk of the heart’s natural rhythm and beat getting out of sync.
Pacemakers aren’t the only devices that help to keep the body working as it should. Bypass machines also help to sustain organ health while waiting for treatments or transplants. They are also used throughout surgeries to protect the health while undergoing some transplants and operations. For example, heart bypass machines are regularly used during some cardiac operations and for heart transplants. Without them, there is a higher risk of bleeding out and death on the operation table.
Bypass helps to change the flow of the blood. It isn’t just used for the heart; it can also support the kidneys during operations that involve the intestines, colon, and other organs in that area. Bypass keeps the other organs working as they should during the operation, helping to ensure a fully healthy life afterward.
There is now technology that keeps organs working while they are outside of the body. This helps to keep organs working while they are in the middle of transplants, which is exceptionally important when it comes to heart transplants.
This is another side of medicine that was touched on in Grey’s Anatomy. Cristina looked after a heart that was in a box—the technology kept the heart pumping until the time came to place it into the recipient’s body. It is a very real side of medicine that is being adapted and improved. Without it, there would be people on the transplant list that would need to wait longer for a replacement. They could end up dying or others lower down on the list would lose out on transplants because they can’t be moved up.
The use of technology to keep organs alive outside of the body will also help to reduce the problem of long donor lists. While the donor organ may not be a match for anyone immediately or anyone within a hospital immediately, the organ can be kept alive while waiting for a recipient to become eligible.
But can’t organs be used without being ‘kept alive’? Ice is sometimes used, and organs are sent around countries without being kept alive by a machine. However, there is a risk that the organs won’t work once they are in the recipient’s body; losing blood flow during transit can cause other problems. The technology eliminates that issue.
Without the advances in technology, there would certainly be people who are left without. The transplant list would grow longer, and people would remain on the lists until they die.
Better Treatment Options for Various Ailments and Diseases
It’s no secret that treatments have advanced in recent years to the point where some ailments are virtually unheard of. Vaccinations and various medical advances have completely eradicated the likes of smallpox and made polio rare and far more preventable.
Some of the advances have only come in the last few years, and are all due to technology. We’re able to do more research and test without the use of animals and humans. There are ways to create vaccinations and treatments without putting people at risk, increasing the chance of a better quality of life. Just look at how HIV treatments have changed since the disease was noted in the early 1980s. It is now at a point where the virus doesn’t have the chance to develop into AIDS.
There are treatments for small and major ailments. Even cancer patients have better life expectancies than they would have done in earlier years. There is the technology for earlier diagnosis and treatments to eradicate the cancerous cells. While not all is successful, there are certainly some positive steps—and that is all because of technology advancements.
Some of the treatments are to help keep the body working until a cure or transplant is possible. For example, dialysis is used by many patients waiting for organ transplants. Dialysis helps to remove the waste from the body when the kidneys will no longer do the work for them. This is an intermediate treatment option to keep someone alive while they wait for a kidney transplant.
Others will be on other machines and treatments while they wait for a liver, heart, or other transplant. Technology has helped to prolong life, allowing them the time that they need. Some technology has even helped them live some sort of life outside of hospitals, rather than being hooked to machines.
There isn’t just a physical benefit to these treatments; they have helped to support mental health too. Being stuck in a hospital bed forever is boring and depressing. Patients start to worry about the bills that are mounting up and the loss of time with their friends and family members. When they are in a positive mindset, patients are more likely to fight against the ailments that are keeping them tied down to machines. They are in a better state to accept transplants and focus on fighting infections and diseases. Their positive mindsets help the treatments work, and this is all because of technology advancements.
And we can’t forget about the ongoing research. This isn’t just about the treatment options but how viruses work and adapt. While there are vaccinations and treatments available, there is always something new that comes out. Viruses adapt to their environment to avoid being wiped out completely in some cases. They mix with other viruses or bacteria to create a far more resilient virus.
Technology helps to assess when this happens. Scientists can locate the newly created viruses and get to work almost immediately on a cure. There is the ability to transform some viruses into cures and help to create vaccines and treatments that have never been heard of before. It’s because of technology that the medical field can keep adapting.
Better technology has also made scans clearer. People can get better angles and catch problems early. Doctors can perform surgeries and use treatments that were never possible, simply because they could catch conditions before they advanced too far.
To top all this off, technology has opened the chance of developing organs and valves. While Grey’s Anatomy is just a TV show, it does rely on the current medical research and ideas. There are studies into 3D printing organs and heart valves to help support the health and life of an individual. The printing would use a person’s own cells to reduce the risk of rejecting organs, improving life expectancy and treatment of conditions.
There is still a long way to go until all the research is finished. In fact, it will never be finished. However, technology is opening doors to improve health in ways that wouldn’t have been imagined just 50 years ago.
Improved Prediction of Diagnosis and Life Expectancy
Ever wondered if you could get a disease later in life? Maybe you wonder if a current symptom is a sign that you could develop a condition. You could even wonder just how long you have left to live when you are diagnosed with a condition.
Technology has helped to improve the prediction process of a diagnosis. Doctors will have information all in one place and can see all the symptoms at the same time. They have formulas to work out averages of when a condition occurs.
You get this type of risk assessment, and doctors will be able to predict if you are more likely to suffer from a certain type of disease or ailment.
We need only look at pre-diabetes checks. You may have been told that you are pre-diabetic. This doesn’t mean that you currently have diabetes but that you have a high risk of developing it if you continue the way you are going. Before these technological advances, you would have only found out about diabetes once you started suffering from it. There wouldn’t have been the warning signs to help you change your lifestyle to prevent it from occurring.
In some cases, you wouldn’t have even known that you had the symptoms. You wouldn’t have known that you had a disease until it caused a serious medical issue or even death. Doctors didn’t have the ability to predict anything because it was so difficult to gather all the information.
Technology has made it possible for information to be kept in one place, updated in real time. Once blood test results come back, they can be added directly to your file; a file that is visible to any doctor who looks up your details. Your family doctor has your hospital records, even if those records have nothing to do with a current ailment.
While looking at all this information, doctors can see similarities and warning signs earlier. They can see symptoms that crossover and lead to specific conditions—similarities that could have been overlooked due to loss of paperwork or not having all the information in one place.
At the same time as predicting a condition, doctors can use technology to work out how long you have left to live. There are plenty of cases in history where individuals have missed out on events because they were given a life expectancy that wasn’t right. Doctors give people six months to live and then find out three years later that they are still alive and could have done some of the things they wanted. At the same time, people are given years to live, and then their health deteriorates within six months because the doctors got it wrong.
Technology has allowed for the creation of algorithms. Doctors can input certain figures and information into the algorithm to get the information that they need. There is more information stored about other patients with the same condition to help ensure that the algorithms get it right. There is simply far more accuracy in life expectancy predictions because of technology.
With better prediction, people aren’t just living healthier and changing their overall lifestyle. Their mental health is supported. Patients find that they can act and are more interested in doing so.
Gene mapping has also become a technological advancement to help with the prediction of conditions. Patients no longer need to have early symptoms to make changes to their lives. Doctors can look at the genes to determine if they are at a risk of developing certain health conditions.
This has become popular for some of the most damaging conditions for the whole family. People want to know if they have the genes that put them at a higher risk of breast cancer or Alzheimer’s disease.
Angelina Jolie is just one celebrity that stands out when it comes to this technological advancement. She found out that she had a high risk of developing breast cancer and decided to take preventative measures to avoid it by having a mastectomy. Many patients before her have had to wait until cancer has occurred and hoped there is a treatment, but she could prevent the heartbreak for her family and protect her health because of technology.
Cervical screening for women has improved thanks to technology. Researchers will see when cells are abnormal between tests to make sure that there are no earlier signs of cancer. While the cells could be abnormal for other reasons, patients get the help they need immediately to avoid lifelong and potentially terminal diseases.
Faster and More Accurate Diagnosis of Conditions
While the prediction side of diagnosis is improved, technology also improves the accuracy of a diagnosis. Like before, doctors gather all the information in one place and will be able to keep an eye on results more closely. They can also put together symptoms and signs sooner than before, meaning an earlier diagnosis for many people.
There have been many cases where doctors just haven’t had all the information. In some cases, the conditions are so rare that doctors haven’t even considered them. Instead, individuals are treated for conditions that are more common or seem more likely. The treatments do nothing, and by the time patients are diagnosed with the right condition, there is nothing they can do.
People lose out on time with their families due to a lack of diagnosis or incorrect treatment. They lose out because the diagnosis has just taken too long—and not because the doctors were inaccurate.
In some cases, the technology hasn’t been fast enough to get blood work back. Technology has been too poor to assess all the symptoms, or the waiting list is too long, so patients lose out. Scans aren’t clear enough, so earlier symptoms aren’t picked up in time.
This slowness of diagnosis means that people don’t get treatment soon enough. Their conditions advance and may become untreatable and terminal. This is the case with some cancers: it takes so long to get a diagnosis that the cancer spreads.
An accurate diagnosis means more accurate treatment. There are cases where a treatment can make a condition worse if it is used in the wrong way or given for the wrong diagnosis. For example, some over-the-counter medications can make the chicken pox virus far more severe and cause hospital admission.
Technology Improves Recording of Information in Real Time
Many of the benefits mentioned above rely on an accurate and timely recording of information. There is no denying that poor recording of symptoms between doctors has led to conditions not being diagnosed and the right treatment not being administered.
Before computers, doctors would write all the information on charts. They would document it through paperwork, and that paperwork would need to be sent to various doctors. If you changed family doctor, there was a chance of the information going missing. If you went to see a different doctor in between visits, such as at the hospital or a locum, you ran the risk of the information not being sent to your regular doctor.
Some key symptoms are often missed. It’s up to the patient to remember to share the previous symptoms with their regular doctor to make sure all the information is up to date.
Blood work results and other paperwork would take time to be sent between practices. Individuals were left waiting for phone calls for diagnoses and treatments, and we’ve already looked at how that could lead to problems.
Technology has made it better for the recording of information. This initially started with the use of larger computers. While paperwork would still be used, the information could be typed up, and doctors around the country could get the information and any symptoms. They could make a more accurate diagnosis.
However, this didn’t help when it came to immediate diagnosis needs. There were also issues with inaccurate reporting. Some doctors wouldn’t do their paperwork or would miss out things that patients said. Others who were trying to transcribe notes may not have been able to read handwriting, so elements were missing.
This is where technology continues to advance. Many doctors will now update the information on tablets and smart devices. They all have the software that allows for easier and far more accurate reporting. The information is updated in real time, meaning other doctors will be able to get the information later.
If you return, doctors will be able to see how often you are in. They will be able to look at previous visits quickly. This means they can spot any precursors or underlying problems, as we’ve already discussed above.
There is also the ability for the software to alert doctors to a problem. Doctors may have set the wrong dosage for a medication, or there may be an issue with clashing medications. Doctors can stop themselves in their tracks and make sure your health is put first.
The better recording also helps with communication between doctors and patients. Patients feel like their health is being put first, so they are likely to be more forthcoming. They don’t feel ignored or like certain symptoms are being dismissed.
There Are Two Sides to Technology
Technology has helped to improve health, and it will continue to do so as more advancements are made.
There is no denying that technology can be bad. We are at a point where we sit more because we don’t have the need to go outside anymore. Socializing is possible online, and recreation is often spent watching TV shows and movies. ‘Experts’ tend to focus on all these negatives of technology, without really focusing on the ways that technology is helping us.
While a lot of the advancements have meant that doctors have it easier, they have helped us as patients too. We will find it easier to get a more accurate diagnosis, and the treatments are more likely to work. It’s easier to make changes to our lifestyle because technology has noted the warning signs that we are more likely to suffer from something if we stick to our current paths.
There are ways that technology helps us daily. Smartphone apps and small devices have led to the ability for us to track our health and any symptoms. Fitbit and pedometers track the steps that we take, encouraging us to do far more exercise than we usually would. Food tracking apps help us to keep our calorie intake under control or boost the amount of water that we drink. Symptom trackers make it possible to keep an eye on potential health problems.
We can also get in touch with doctors and nurses much more easily. We no longer have to pay a fortune to see our family doctor and clog up the waiting room, feeling like we are wasting someone’s time. Technology opens the doors to discussing symptoms online and getting advice immediately. This could make all the difference in getting the treatment we need.
And it’s not just about physical health. Technology opens the doors to mental and social support when it comes to living with a condition or caring for someone. There are support forums online and places to go to do our own research. We feel far more in control of our lives and our conditions, and we can focus on holistic approaches. Better mental health will help to improve our physical health, since we have a better chance of fighting infections.
Don’t just write technology off. This is something that really can improve our lives and our health. It just needs to be used in the right way.
Additional tips can be found at https://www.positivehealthwellness.com.
5 Important Audience Targeting Tips for Your SEO Campaign
You can consider your search engine optimization (SEO) campaign a success if you’re able to attract more people to your site organically. For example, if you start with 100 monthly organic visitors, and you grow that number to 1,000 over the course of several months, most optimizers would take pride and call that a victory.
But what about the quality of those visitors? How can you be sure that those 1,000 people have an interest in your brand, or a need for your products, to begin with?
The Importance of Audience Targeting
You may be caught in the line of thinking that more is always better. In the case of SEO, more is often better; each new person that comes to your site is another potential chance to secure a conversion, and more outright brand exposure to build your reputation.
So let’s say the average visitor has a 20 percent chance of being interested in buying your product. You earn 1,000 visitors, so you’ll have 200 potential visitors interested in your product.
Now imagine your average visitor has a 50 percent chance of being interested, but you’re only able to attract 800—now, you have 400 potential visitors interested in your product, making the scenario with less total traffic the more valuable one for your brand.
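The trade-off between volume and relevance is just arithmetic. As a quick sketch using the example's hypothetical figures:

```python
def interested_visitors(visitors: int, interest_rate: float) -> float:
    """Expected number of interested visitors = traffic volume x interest rate."""
    return visitors * interest_rate

# Scenario A: more traffic, lower relevance.
scenario_a = interested_visitors(1000, 0.20)  # 200.0

# Scenario B: less traffic, better targeting.
scenario_b = interested_visitors(800, 0.50)   # 400.0

# Better targeting wins despite lower total volume.
print(scenario_a, scenario_b)
```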
Audience targeting is all about increasing that relevance, so every gain in volume is more significant to your bottom line.
Strategies for Audience Targeting in SEO
So how can you get better at audience targeting in an SEO campaign?
1. Find the right niche.
Your first job is to find the right niche. You need to make sure the audience you’ve selected is the one most valuable to you, which may require you to make a switch.
For example, New Zealand entrepreneur Sam Ovens was forced to target American visitors because of their higher propensity to convert and show interest in his services. Do some market research, conduct surveys, and ultimately settle on an audience that has the highest likelihood of being interested in your brand.
2. Segment your buying stages.
Next, you’ll want to segment your audience based on different stages of the buying cycle and update your site accordingly, as Kissmetrics suggests.
For example, you may want to shift your content to focus on customers who are just starting to build awareness that a problem exists, or you might shift to content that caters to people ready to buy (such as buying guides). It all depends on who you’re targeting and what your customer goals are.
3. Discover appropriate long-tail keyword phrases.
Once you have a target audience (and buying cycle stage) in mind, generate a list of long-tail keyword phrases, and start whittling them down based on those criteria.
Use a tool like SEMrush to competitively research those phrases, and pinpoint key topics for development that appeal to your target audience. If you’re having trouble, consider conducting a survey of your target audience to determine what kinds of content they’d like to see from you in the future.
4. Implement social listening.
Next, you can tap into social listening to learn more about who your customers are and what they want to see. Monitor trending topics within those demographics, observe how they interact with other brands, and get a better sense of their values, desires, and dispositions.
5. Cater to specific search habits.
Finally, find out if your demographic has any telling search habits that can help you optimize your efforts to target them specifically. For example, Bing tends to attract older audiences and people with niche search needs, and some demographics disproportionately search using mobile devices, or tools like voice search. Learn these search habits and cater to them throughout the duration of your campaign. Don’t get trapped in thinking all forms of search are the same.
Other Possibilities
Note that you shouldn’t necessarily limit yourself to one audience for the duration of your campaign. If your brand targets multiple different demographics, it’s possible—and in some cases, advisable—to split your SEO efforts between them. Of course, if you’d like a more conservative system, you can start with one audience and expand to others as you gain prominence in that area.
Audience targeting doesn’t need to manifest in any one form, but it should be considered an integral part of any campaign. With higher-quality targets, any traffic gains you make will become more valuable, and your overall campaign ROI will multiply.
How are you targeting your current audience? What tools do you use to help you with targeting and demographic research?
Top 10 SEO Mistakes Web Designers Make When Designing a Website
There are tons of websites across the internet, some of which are quite attractive and informative. Research suggests that thousands of websites are created every day, but it’s a challenging task for web designers to make a website SEO friendly. There are rules and guidelines for designing a successful website, and it is a must for web designers to know the key elements of web design SEO.
Building a website can be daunting. The problem is that most designers forget the purpose of designing a website. They give priority to usability and practicality, which is good, but they should keep SEO in mind as well. In this post, I am going to highlight top 10 SEO mistakes web designers make when designing a website.
1. Ignoring title tags and meta descriptions
Whenever I see websites with their company name and logo as the title on each and every webpage, I wonder how they could make the mistake of ignoring title tags. If you want your website to be noticed by search engines, you should have unique title tags. Your page titles are usually shared on social media sites and used as the link text when the website is bookmarked. Hence, it is very important to have unique title tags.
Similarly, every webpage should have a unique meta description describing the webpage. Your homepage and other pages should have a custom description that will make people click on your listing in search results.
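As a sketch, a product page’s head might carry its own title and description rather than the site-wide defaults (the store name and copy here are invented for illustration):

```html
<head>
  <!-- Unique, descriptive title: not just the company name on every page -->
  <title>Leather Messenger Bags | Example Outfitters</title>
  <!-- Unique meta description shown in search results and social shares -->
  <meta name="description"
        content="Hand-stitched leather messenger bags with free shipping. Browse sizes, colors, and care tips.">
</head>
```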
2. Forgetting to unblock search engines
Forgetting to unblock search engines causes a major problem and happens more often than you imagine. Web designers usually block search engines from crawling during website maintenance or design. However, once the site is done, they forget to unblock the search engines so they can crawl the website. With lots of things going on, it is not unusual to make this mistake.
Web designers can avoid this by checking with Google Webmaster Tools’ robots.txt Tester to ensure that the site can be indexed. You can also check the robots.txt file, which is in the same place for all sites: domain.com/robots.txt (replace domain.com with your website), and make sure it is not blocking search engines. If you see the following, then immediately change it yourself or ask your designer to do it.
User-agent: *
Disallow: /
Note: You just need to remove that ‘/’.
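As a complementary check, the blanket block can be detected programmatically. This is a rough sketch rather than a full robots.txt parser: it only looks for a bare `Disallow: /` inside a `User-agent: *` group:

```python
def blocks_everything(robots_txt: str) -> bool:
    """Return True if a 'User-agent: *' group contains a bare 'Disallow: /'."""
    wildcard_group = False
    in_agent_run = False  # consecutive User-agent lines form one group header
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        field = field.lower()
        if field == "user-agent":
            if not in_agent_run:   # a new group starts here
                wildcard_group = False
            in_agent_run = True
            if value == "*":
                wildcard_group = True
        else:
            in_agent_run = False
            if field == "disallow" and value == "/" and wildcard_group:
                return True
    return False
```

Fetch your live domain.com/robots.txt and run its contents through a check like this after every launch.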
3. Neglecting local search
Local search is as important as any other SEO factor. If you are running a business that focuses on the local audience, you should consider local search. Web designers should include business name, address, phone number and other details on the website.
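One common way to expose those business details to search engines is structured data. This JSON-LD sketch follows the schema.org `LocalBusiness` vocabulary; the business details are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "postalCode": "12345"
  },
  "telephone": "+1-555-555-0100"
}
</script>
```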
4. Not installing analytics properly
This is one of the common mistakes that web designers make. They should be very careful to install the analytics code properly when they redesign or revamp a website. Switching analytics providers carelessly can cause significant gaps in the historical tracking of the website, and it is a must for web designers to avoid this.
5. Using new code and features
Websites should be updated on a regular basis to meet the requirements of changing algorithms. Not considering SEO while changing the code is a common mistake many web designers make while designing or revamping a website. There are valid reasons web designers might want to introduce new code and features to improve the functionality of a website.
The improved functionality of the website should not cause a dip in traffic. Hence, web designers should have a solid understanding of the website’s traffic. This will help them make an informed decision on improving functionality without affecting traffic.
6. Ignoring 301 redirects
A 301 redirect is a permanent redirect from one URL to another. Whether the web designer is switching domain names or changing content as part of a website redesign, they should ensure that the 301 redirects are right. This not only ensures that SEO value is transferred from the old URL to the new URL, but also keeps visitors from landing on broken pages.
Web designers can take help of SEO experts in this regard to ensure that the SEO of website remains unchanged while revamping or redesigning the website.
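As a sketch, a permanent redirect can be declared at the server level. This example uses nginx syntax, with invented domain names:

```nginx
# Redirect every request on the old domain to the same path on the new one.
server {
    listen 80;
    server_name old-domain.example;
    return 301 https://new-domain.example$request_uri;
}
```

Apache users can achieve the same with `Redirect 301` or `RewriteRule` directives in the site config or .htaccess file.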
7. Ignoring the structure of URL
Structuring the URLs of the website is very important when designing a website. The URLs should be short and relevant to the page’s content. For example, domain.com/about is good, while domain.com/?p=1 is bad. Short URLs make it possible to see where a click is taking visitors.
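Readable URL paths like domain.com/about are often generated from page titles. This slug helper is a minimal sketch; real sites usually also handle non-ASCII characters and duplicate slugs:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, URL-friendly path segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse punctuation/spaces into hyphens
    return slug.strip("-")

print(slugify("About Us"))                 # about-us
print(slugify("Spring Sale -- 20% Off!"))  # spring-sale-20-off
```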
8. Not using responsive design
Today, people use multiple devices to browse the internet. Hence, it is important to design a website that caters to the different devices users will be accessing your website with. Responsive websites are the preferred configuration for search engines, so if you want the website to deliver a great browsing experience, use responsive web design.
The good thing about responsive design is that all the URLs are the same across different devices. They serve up the same HTML code, which makes it easier for Google to crawl the webpages and retrieve the content.
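A responsive page typically starts from a viewport declaration plus CSS media queries. This is a minimal sketch; the class name is invented for illustration:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .product-grid { display: flex; flex-wrap: wrap; }
  /* Stack the grid on narrow screens instead of serving a separate mobile URL */
  @media (max-width: 600px) {
    .product-grid { flex-direction: column; }
  }
</style>
```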
10. Navigation problems
Improper navigation is one of the biggest mistakes that impact the SEO of your website. Adding multiple navigation elements on both sides of the web page is worse than having too many items in the main navigation bar. The design of website’s navigation has a great impact on the success or failure of the website.
Navigation should be descriptive. It is a great opportunity to indicate your relevance to search engines. Use labels that include popular keywords or phrases that will help your website rank high on search engine results pages. Also, make sure the most important pages (about, contact, services, etc.) are accessible from the homepage of the website.
If you made any of the above mistakes while designing the website, you should think about fixing them. When it comes to SEO, Google rewards user experience more than anything else. It is important to consider how you, as a visitor to your website, would want to find the information you are looking for.
Think of giving visitors the best information they are looking for in an easy way! Please let me know your thoughts in the comments, and do share this post with your friends if you find it useful.
5 Tips for Boosting Organic Traffic to Your Ecommerce Site
Increasing organic traffic to your ecommerce site can be a challenge; one that many (if not all) operations constantly face today. The entire SEO-marketing industry has evolved to help relevant websites rise to the top and thus net more organic traffic because of what they offer (service, product, content, etc.) that searchers find valuable. For most websites, organic traffic generally converts much better than traffic from social media and paid advertising options (offline channels notwithstanding, as they are much harder to qualify).
For the new and aspiring ecommerce store owners, “organic traffic” refers to the visitors who come to your website as a result of an unpaid search listing, or in other words, a keyword or phrase they typed was relevant enough to one of your pages that Google, Bing, or Yahoo displayed it in its search results.
Keep reading for five tips on boosting organic traffic to your ecommerce shop.
Identify Prospering Links
Backlinks are an essential component of valuable SEO. Sure, without a lot of links, you can still get traffic from search engines like Google if your content and on-page optimization are relevant enough for your targeted keywords and you face little competition. However, you’ll never be able to compete with the big dogs (highly searched keywords or a competitive industry) if you don’t create quality backlinks. If you really want to improve your organic traffic, you must know which links are helping you and which are hurting you.
Here are some features of a good backlink:
It is from a website relevant to your industry.
It comes from a long and detailed article — the higher quality the page, the more value the link will carry.
It’s from a webpage considered a trusted source with solid domain authority.
If you see that any of your links run counter to the points above, you can go ahead and consider it a faulty link that should be replaced or removed.
Identify Toxic Links
The main difference between good and bad backlinks is the quality of the website they are on. A lot of times it doesn’t take much to identify a faulty link, but sometimes they can be a little tricky. Here are some things to look out for when hunting for bad backlinks:
The link is from a website that is set up for the purpose of SEO back-linking. Meaning that you won’t find anything of value from the site content-wise, just a lot of links to websites intended to boost SEO (they don’t work).
Links from a country domain where your business has no target audience or operational presence.
No social activity (Likes, Shares, Retweets)
Keep Your Blog Relevant
If you run an ecommerce store that generates any amount of steady business, you are likely to run into questions from potential or current clients. Instead of replying individually to each question by email, write a detailed article covering the concerns and issues of your users. People tend to have the same types of questions, and usually turn to Google for help. If you have a well-thought out article that can address such a problem and perhaps resolve it, you are now positioning yourself as an expert in your industry — which is a great way to increase traffic to your site and your overall SERP ranking.
Contribute to Reputable Websites
An easy and effective way to promote your business’ website is to contribute to popular websites within your industry. This is how you can get your website to the front of a community that is already established and drive referring traffic as well as positive backlinks that will affect your SEO. However, this isn’t a simple “one size fits all solution”; you can’t just post to any old website, and in fact, posting on the low-quality or spammy sites could produce a negative impact. With that in mind, you should be very picky about where you choose to post and promote your website.
Stick to the Plan
From an ecommerce perspective, SEO should be a main priority. You should be in it for the long haul. As with content, all too often companies give up before their strategies and work have had time to pay off. This is what makes monitoring the success (or iterating on the results) of your campaigns just as important as any of the above tips. Depending on the platform your store is set up on, you may not need to hire a dedicated specialist even if SEO isn’t your strength; enterprise ecommerce solutions like Shopify Plus include functionality to assist with SEO for precisely that reason.
However, if you’re planning to run a successful ecommerce business for years to come, staying vigilant and up to date with the latest search trends and news in your industry would be wise. The same goes for weeding outdated SEO elements out of your website. Make sure you’re staying on top of these best practices in order to stay ahead of the curve.
SEO How-to, Part 7: Mapping Keywords to Content
When optimizing a large site for search engines, you need a plan of attack. Keyword mapping is the process by which you determine which keywords of the thousands identified in your keyword research will be assigned to each page for optimization. It’s also an excellent way to determine which pages to prioritize for optimization.
This is the seventh installment in my “SEO How-to” series. Previous installments are:
- “Part 1: Why Do You Need It?”;
- “Part 2: Understanding Search Engines”;
- “Part 3: Staffing and Planning for SEO”;
- “Part 4: Keyword Research Concepts”;
- “Part 5: Keyword Research in Action”;
- “Part 6: Optimizing On-page Elements.”
Think of your website as an army of pages fighting the competition for rankings in natural search. You wouldn’t have every fighter in an army attack the same target in the same way. Each would have a different target or way of approaching the target. Doubling up would leave some of the competitors’ targets free to win rankings without a fight, while other targets would have so many fighters trying to accomplish the same goal that they’d get in each other’s way. In both cases, you’re not using your army to its full potential.
The same is true of optimizing content. The keyword map is nothing more than a tool to ensure that you’re deploying your valuable keyword research data to your army of pages optimally. Assign a unique keyword target for each page on your site to ensure that all the valuable targets are covered and none of the pages are fighting with each other for rankings.
Start by outlining the pages on your site in a spreadsheet.
List every page you plan to optimize and its corresponding URL on the left. If this sounds tedious, consider that most people start by optimizing the major category pages. Those categories are probably listed in the header navigation on every page of your site. Try viewing the source of a web page and copying-pasting the header navigation into a document. Then you can remove the extraneous HTML coding around the page names and URLs, and you’ll be left with a list that you can paste into your spreadsheet.
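The copy-and-strip step above can be automated with Python’s standard library. This is a hedged sketch: the sample HTML is made up, and you would feed the parser your own page source instead.

```python
# Pull (page name, URL) pairs out of a site's header navigation using
# only Python's standard library. The nav_html sample is illustrative.
from html.parser import HTMLParser

class NavLinkParser(HTMLParser):
    """Collect (page name, URL) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        # Anchor text immediately following an <a href="..."> tag.
        if self._href and data.strip():
            self.links.append((data.strip(), self._href))
            self._href = None

nav_html = ('<nav><a href="/body-scrubs">Body Scrubs</a>'
            '<a href="/foot-scrubs">Foot Scrubs</a></nav>')
parser = NavLinkParser()
parser.feed(nav_html)
print(parser.links)  # a list ready to paste into the spreadsheet
```

The resulting list of name/URL pairs maps directly onto the left-hand columns of the keyword-map spreadsheet.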
Crawling your site with a tool that mimics search-engine-crawler behavior is another method of collecting page information. For a free option, try Link Sleuth. It’s older and now unsupported but has a useful basic feature set. Screaming Frog is a more complete crawler available as a free limited trial, or with a minimal annual subscription.
With the pages captured, turn to your analytics and layer in the visits and revenue data for each page. Then add the average ranking from the Google Search Console “Top Pages” report. If you’re not already familiar with the VLOOKUP formula in Excel, this is an excellent time to learn. It can automatically pull in the data for each page name or URL from an export of your data, saving you from having to match them up manually.
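If you prefer scripting to spreadsheets, the same VLOOKUP-style join can be done in a few lines of Python: index the analytics export by URL, then look each page up by exact match. The column names below are assumptions about what your exports contain.

```python
# Mimic an exact-match VLOOKUP: join an analytics export to the page
# list by URL. The CSV data is inlined here for illustration only.
import csv
import io

pages_csv = "url,page\n/body-scrubs,Body Scrubs\n/foot-scrubs,Foot Scrubs\n"
analytics_csv = "url,visits,revenue\n/body-scrubs,1200,3400\n/foot-scrubs,300,500\n"

# Build a lookup table keyed by URL, like the left column of a VLOOKUP range.
analytics = {row["url"]: row for row in csv.DictReader(io.StringIO(analytics_csv))}

merged = []
for row in csv.DictReader(io.StringIO(pages_csv)):
    stats = analytics.get(row["url"], {})  # exact match, like VLOOKUP(..., FALSE)
    row["visits"] = stats.get("visits", "")
    row["revenue"] = stats.get("revenue", "")
    merged.append(row)

print(merged[0])
```

In Excel the equivalent would be a formula such as `=VLOOKUP(A2, Export!A:C, 2, FALSE)` dragged down the column; the dictionary lookup above plays the same role.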
Lastly, assign keywords to pages based on the keyword research you’ve already done. For every page, choose the single most highly searched, relevant keyword to list in the “Primary Keyword” field. Then use your VLOOKUP formula to pull in the searches per month for that keyword. Choose a closely related secondary keyword as well, and include columns for additional keywords if your research is deep enough.
When assigning secondary keywords to support the primary keyword, make sure they are closely related. Just as each keyword needs one page to target it to maximize your ranking potential, each page should have only one keyword theme. That one-to-one ratio is very important.
It’s also important to choose a primary keyword that represents the totality of the content on the page. For example, in the image above, the last page shown contains “Body Scrub” products. In this case, “sugar scrub” is nearly twice as frequently searched in Google in the U.S. as “body scrub.” I’ve assigned the lower-valued “body scrub” as the primary keyword, though, because all the products on the page are body scrubs but not all are sugar scrubs. Sugar scrubs are a popular type of scrub, in addition to foot scrubs, foaming scrubs, and spa scrubs.
When the keyword map is completed, you’ll also be able to use it to prioritize where to start content optimization. The “searches per month,” “visits,” and “Google ranking” columns provide the data needed to choose pages to optimize that will improve your natural search performance. Pages that improve the most are ones that rank on the bottom of page one or the top of page two in search results, and that represent a higher number of untapped searches per month. On top of the search data, layer in your knowledge of the areas of your business that have the highest priority and the highest profit.
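The prioritization logic above can be sketched in code: favor pages in “striking distance” (roughly the bottom of page one or top of page two, positions 8 to 15 here) with a high number of untapped monthly searches. The position range and weights are illustrative assumptions, not a published formula.

```python
# A sketch of prioritizing pages for optimization, per the criteria above.
# The striking-distance range (8-15) and the 0.1 down-weight are assumptions.

def priority_score(avg_rank, searches_per_month):
    """Higher score = better candidate for content optimization."""
    if 8 <= avg_rank <= 15:
        return searches_per_month       # striking distance: full weight
    return searches_per_month * 0.1     # already ranking well, or too far back

pages = [
    {"page": "Body Scrubs", "rank": 11, "searches": 9000},
    {"page": "Foot Scrubs", "rank": 3, "searches": 12000},
    {"page": "Spa Scrubs", "rank": 45, "searches": 2000},
]
pages.sort(key=lambda p: priority_score(p["rank"], p["searches"]), reverse=True)
print([p["page"] for p in pages])  # highest-priority page first
```

Business knowledge still comes last: after scoring, re-order by the product areas with the highest priority and profit, as the article suggests.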
Enterprise search platforms, such as BrightEdge and Searchmetrics, do some of these calculations for you in predicting which pages to focus attention on. However, if you can’t pull all the data needed into your search platform, or if you can’t afford a search platform, the keyword map can be a good, manual method for prioritizing content optimization, as well.