If you’re pursuing a PhD in SEO, stop reading now! These SEO basics are designed to help you grow your website traffic and learn more about your audience without going into all the technical mumbo jumbo.
The truth is that SEO is based on common sense, not complexity. Through the years people have made SEO more complicated than it really is. We wrote this guide so you can apply the important SEO principles on your own.
“Life is really simple, but we insist on making it complicated.”
SEO is about providing great content for the visitor that answers their questions and gives them the information needed to fulfill searcher intent. And at the same time, making it easy for search engines like Google or Bing to crawl your site and understand what you’re offering. Get these two elements right and you’ll be well on your way to increasing search traffic and landing in the top SERPs.
Learning what your visitors are searching for, how to optimize your web pages for keywords and measuring success will be an ongoing process. Instead of worrying if you are doing SEO “right”, keep experimenting and discovering what works for you.
This non-scary guide will take you through the SEO basics in bite-sized sections. When you get the fundamentals of SEO down, you’ll start to see results.
For tips on landing page optimization, check out this article.
What are people looking for? SEO work usually starts with keyword research and competitor analysis. But before diving into how to conduct keyword research, it’s important to understand the intent of the search query.
What does the person searching expect to see in the results?
Informational intent: Searching for a specific answer to a question. For example, “taking care of a dog”, “how to do SEO”, etc.
Navigational intent: Searching for a specific website to navigate to it. For example, Facebook, Airbnb login, Amazon return policy and similar.
Transactional intent: Searching for a product or a service to buy. For example, “Country Brook Petz Premium Nylon Dog Collar” or “website SEO services”, etc.
Commercial investigation: Searching for a best-fitting product or a service. For example, “best dog collars”, “best keyword research tools” and similar.
By checking the current top 10 results of your keywords, you can identify the searcher’s intent. You can then create the right type of quality content that you’ll need if you want to rank in the top 10 results. What type of content is showing up on the results page?
Informational intent queries: Articles, blog posts, tutorials, videos and regular website pages like feature pages.
Navigational intent queries: The exact website pages being searched for.
Transactional intent queries: Most commonly provide specific product pages or product categories.
Commercial investigation queries: Often show review articles, comparison pages, listings, etc.
You probably already have a couple of ideas about how people search for your product, service or content. Even if you only have 1 target keyword or phrase, that’s a great way to start. You can use those initial ideas to generate even more keywords and phrases.
Think of a phrase your customers would use to find you, enter it into Google and you will start to see related keywords and phrases that might be relevant to you.
Let’s go through this process using the example phrase “how to take care of your dog” (informational intent query).
You can see below that once the phrase is typed in Google, we see a list of what people are searching for and even potential content marketing ideas for our website.
You can get more ideas by scrolling down to the bottom of the search results.
When you are on the search results pages, there are several things you can observe that will help you plan out your web pages.
Take a look at the search results page below with these questions in mind:
1. What keywords are being used in meta titles?
2. What kind of content is ranking in the top Google results?
Are there a lot of images ranking? If so, create an image gallery or insert more images in your content.
Are there a lot of videos ranking? Consider creating your own video or choose a topic that’s not dominated by video content.
Visit a few of the top-ranking pages and pay attention to the following:
What is the name of the article and page?
What are the keywords used in the headers of the page?
What kind of content does it include (related to search intent) - a 3000-word blog post, a product page, a product category, or a short landing page with a very specific CTA?
While visiting the top ranking websites based on your query, briefly analyze the types of sites to get a better sense of the competition.
Check what kind of websites and how big they are:
Is it a small business site or a blog with a couple dozen posts?
Is it a Wikipedia article?
Is it an Amazon product category page?
If you see that the top 10 results for your query are coming from giants like Amazon, eBay, Apple or Best Buy, it may be a better idea to explore less competitive keywords and focus your efforts on building traffic with several less popular long-tail KWs.
Here are 2 free tools to help you grasp the size of your competition:
1. SimilarWeb Chrome extension measures a website’s traffic. If a competitor’s website receives very little traffic, no value will be shown. Keep in mind that these are rough estimates meant for comparing your competition, not for precise calculations.
2. MozBar Chrome extension provides DA (domain authority) and PA (page authority) metrics that attempt to mimic Google’s algorithms as closely as possible and assign a score to your website or page. The higher the score, the more authoritative the website is considered. Websites like Wikipedia, Facebook, YouTube and Amazon will have scores close to 100, which makes competing with them difficult.
If you notice some websites with a score of 0-30, that’s a good indication that you have a higher chance to rank among them. Use this to get a general idea of your competition, not for precise calculations. DA/PA is not a ranking factor.
Finally, if the Google keyword suggestions and the search results don’t provide you with enough ideas, you can use the free tool Keyword Sheeter, which generates more related ideas based on your keyword.
Learn more about what people are searching for by finding forums, communities and online discussions related to your niche on sites like Reddit, Facebook groups, LinkedIn, Quora and many more. Check the most common questions and popular discussions. You’ll get more ideas for new content.
Once you have a set of keywords you want to target, the next step is to find out their search volume, which will help you to choose keywords that have the potential of driving traffic to your pages. Google’s Keyword Planner can help you find traffic numbers.
Note: if you are not running ads on Google, it will show rough estimates.
Another free tool is Chrome extension Keyword Surfer. It doesn’t have the option to bulk check keywords, but it will display search volume in Google’s search results.
Make sure you are grouping similar keywords. Otherwise, you’ll end up with 1 list containing hundreds of keywords with different intentions and content needs. You want to keep your content (web pages) focused on 1 specific set of keywords.
Free: Google Trends helps you analyze the popularity of top search queries in Google across various regions and languages.
Free: Answer The Public generates search insights based on your inserted search query.
Paid: Ahrefs is one of the best all-round SEO tools, especially for backlink, keyword and competitor analysis.
✅ Understanding the intent of keywords you want to target
✅ Identifying a set of keywords (based on intent and search volumes)
✅ Learning how your competition uses KWs
✅ Determining what types of content you need
When you are creating web pages, you’ll see an option for adding a meta title and description. This is the text that will be visible in the search results.
The meta title (1) is an important field, as it impacts how Google ranks your website and is the first thing potential visitors see when deciding whether or not to click on your search result. As a best practice, keep the meta title under 65 characters so it won’t get cut off in the search results. You can use a meta description checker to see how long your text is and how it’ll look on Google.
Also, always include your target keyword as close to the beginning of the title as possible. The meta title should describe the main purpose of your page.
The meta description (2) doesn’t impact how Google ranks your website but it does influence your visitor’s decision to click or skip your search result.
Keep it under 155 characters so it won’t be cut off.
Include your target keyword and try adding a couple of supporting keywords (keywords in the meta description that match the search query will be bolded).
Keep the text informative and useful to the reader. Describe in detail what the searcher will find on your page if they click, and always end with a CTA such as “Visit now”, “Read now!” or “Check to find out”.
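In HTML, both fields live in the page’s head. A sketch using our dog-care example (the exact wording is illustrative):

```html
<head>
  <!-- Meta title: under 65 characters, target keyword near the front -->
  <title>How to Take Care of a Dog: A Beginner's Guide</title>
  <!-- Meta description: under 155 characters, ends with a CTA -->
  <meta name="description" content="Learn how to take care of a dog: feeding, grooming and health tips for new owners. Read now!">
</head>
```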
It’s important to give your text a clear hierarchy using different heading levels. Always include the most important keywords at the beginning.
Here is an example of a header structure based on our previous example, “How to take care of a dog”:
H1 - How to take care of a dog?
H2 - Feeding a dog
H3 - How often should you feed a dog?
H3 - Best food for a dog
H3 - Food to avoid
H3 - …
H2 - Grooming a dog
H3 - Brushing a dog
H3 - Trimming a dog’s nails
H3 - ...
H2 - Dog’s health
H3 - Vaccination
H3 - Exercise
H3 - ...
H2 - Useful checklist for items to buy
This is just an example to show you that every page should be organized in a logical way, with the main topics as headers followed by supporting arguments or examples. It’s important not to get wrapped up with keyword stuffing when using keywords in headers. Try to always sound natural.
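In your page’s HTML, that outline maps onto heading tags. A trimmed sketch of the structure above:

```html
<h1>How to take care of a dog?</h1>

<h2>Feeding a dog</h2>
<h3>How often should you feed a dog?</h3>
<h3>Best food for a dog</h3>

<h2>Grooming a dog</h2>
<h3>Brushing a dog</h3>
<h3>Trimming a dog's nails</h3>
```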
Keeping your searcher’s true intent in mind, use your target keywords (in our example, keywords like dog care, dog care tips, take care of dogs) and supporting keywords (feeding dog, grooming dog, vaccinating dog and similar) throughout the text of your page.
That said, don’t overload your pages with keywords. As with your headings, keep the text natural and useful. Stuffing keywords doesn’t work and can even have a harmful effect.
Make sure your formatting remains consistent throughout the text. If your page is long with a lot of sections, consider adding a table of contents with anchor links at the beginning of the page. For better scannability, use bullet points or number lists where appropriate, and include images.
Optimizing your images for search engines is one of the basic SEO steps to follow for better chances to rank in Google.
When you add images, don’t forget to add alt text. The image file name and alt text should be descriptive. Use appropriate keywords based on your keyword research. It’s more valuable to keep it descriptive than to stuff KWs.
Image alt text helps not only search engines to better understand the content on your page, but also vision-impaired people who may be browsing your website.
To help give you an idea of what works best, let’s write a poor, average and good alt text for the below image of a super cute puppy.
- Poor alt text: dog
- Average alt text: a dog with a bone
- Good alt text: a pug puppy carrying a bone treat
You can name your image files based on the same logic as alt texts.
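In HTML, the “good” example above translates to a descriptive file name and alt attribute (the file name is hypothetical):

```html
<img src="pug-puppy-carrying-bone-treat.jpg"
     alt="a pug puppy carrying a bone treat">
```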
When writing your URL text (slug), keep it short and descriptive, and include the main keyword. Also, avoid using symbols and non-English letters:
Avoid URLs like:
https://yourdomain12345.com/blog/article-20200430-number-1223456
A good URL (using our dog example): https://yourdomain12345.com/blog/how-to-take-care-of-a-dog
Updating your content from time to time is another SEO fundamental, as it can have a big impact on your page rank.
1. Writing a new piece of content takes time, so check your old content first. You probably already have something that once performed well. By updating older content with relevant information, you have a higher chance of producing a well-performing piece again.
2. If you have content that ranks at the start of the 2nd search engine results page, a slight update to your meta title and a few additional paragraphs with relevant information might give you the boost you need to jump from the 2nd to the 1st page. That can mean a huge increase in traffic!
3. Some content by nature must remain fresh to rank. Let’s take our example of taking care of a dog. Feeding, grooming and caring for your dog are things that remain relatively the same over time. But things like “best dog collars” or “top Christmas gifts for your dog” will change over time because of trends, innovation, fashion and other factors. Also, notice that the intention of the queries is different as well.
Let’s analyze a search query that doesn’t need very fresh content to rank successfully: ‘taking care of dogs teeth’. The 1st result is an article from 2020:
There are a few articles that date back to 2018, and a few more recent ones from 2019 and 2020. Then you’ll see videos from 2019 and 2017, and the oldest from 2009!
The vast gap between publication dates is a good indication that content freshness is not as important for this topic. Probably because there isn’t much innovation in taking care of a dog’s teeth.
However, if we search for something like “best dog collars”, the publishing dates look very different.
All of the top 5 results are from 2021.
The search term “best dog collars” has a commercial investigation intention (researching to buy a product). There could be more innovation for this query, requiring more frequent updates.
For example, a newer player enters the market with environmentally-friendly collars that become a huge hit. Someone launches a technical collar with Bluetooth tracking that becomes popular.
Duplicate content is content that appears in more than 1 place, whether on different websites or within the same site. While quoting someone on your website is perfectly fine, if 80-90% of your content is copied from somewhere else on the web, you will likely have a problem.
Duplicate content can have a negative impact on your search engine rankings. All of the pages you want to rank should have unique content (including meta titles and meta descriptions).
Common causes of duplicate content include:
Copying descriptions or other related content from other sites for products that you plan to resell
Using the same descriptions for variations of the same product. E.g. for different sizes, colors etc…
Copying content from one section of your site to another
Having multiple languages on your website but forgetting to translate certain pages
You won’t get a penalty for duplicate content, but duplicate pages within your own website will compete with each other. And if you copy content that is already indexed and ranking elsewhere, you will have little chance of outranking the website that published it first.
An easy and quick way to find out if you have duplicate content (or someone copied content from you) is to use Google search. Use some unique text within your page, copy it and insert it “between quotation marks” in the search box.
What results pop up in Google? If there’s only 1 page with an exact match, you are good. If there are more pages with the same content, investigate further.
You can also use a duplicate content checker to see if any of your content is being copied on the web.
✅ Writing meta titles and descriptions
✅ Organizing content with headers
✅ Writing the main content of the page
✅ Optimizing images with alt text
✅ Inserting descriptive URL structure
✅ Keeping content fresh
✅ Removing duplicate content
This is the more technical part of the SEO guide. Don’t panic, we’ll make it easy to digest!
Technical SEO is important because here’s the truth: your content can be amazing, but if Google can’t see it, it won’t rank.
So let’s make sure your content is eligible for ranking.
When talking about website accessibility, you’ll hear a lot about crawling, done by a web crawler, spider or spiderbot.
It’s a process used by search engines to find your content and pass it further for indexing. Simply put, a crawler follows all the links it finds and then analyzes content on those pages (similar to how you would browse the internet).
There can be some limitations, which we will talk about below.
The robots.txt file is usually uploaded to your website’s root folder (if you have one, it can be found by entering https://yourdomain12345.com/robots.txt in your browser tab) and is used to instruct crawlers not to crawl your website or certain parts of it.
Have a look at these robots.txt examples:
Our robots.txt file is simple and straightforward
Youtube’s robots.txt file has a funny note
Google’s is quite long
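For reference, a minimal robots.txt could look like this (the blocked paths are hypothetical examples):

```
User-agent: *
Disallow: /admin/
Disallow: /thank-you/

Sitemap: https://yourdomain12345.com/sitemap.xml
```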
The first thing a crawler checks when coming to your website is the robots.txt file, which determines its crawling path. However, your pages can still be indexed if the crawler discovers a web page through a different website that links back to you (we will cover this in the next section).
If a person isn’t going to search for a specific page on your website using Google, it’s probably not worth it for search engines to crawl it.
For example: your website’s admin pages, a thank you page shown after a successful sign-up, or the thousands of unique pages generated by your website’s internal search. These serve no purpose being indexed on Google.
Sometimes mistakes happen and important pages get blocked. Especially when migrating to a new domain, CMS or when you make big changes to your website’s structure. Make sure your robots.txt file is in order.
Choose which crawlers you want to allow to crawl your site with a robots.txt generator. Tread carefully and inform yourself.
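If you want to double-check that an important page isn’t accidentally blocked, Python’s standard library can evaluate robots.txt rules for you. A small sketch (the rules and URLs below are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules (normally fetched from /robots.txt)
rules = """\
User-agent: *
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Admin pages are blocked; the blog stays crawlable
print(parser.can_fetch("*", "https://yourdomain12345.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://yourdomain12345.com/blog/how-to-take-care-of-a-dog"))  # True
```

Running a quick check like this after a migration or a big structural change can catch a blocked page before it falls out of the index.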
Noindex and nofollow tags are small snippets of code that are inserted in your website’s pages and can look like this if you inspect the code:
<meta name="robots" content="noindex, nofollow">
A Noindex tag is used when you want crawlers to crawl your pages but not index them.
A good example is when you have a couple of filter combinations for your products or services and they create hundreds or thousands of unique pages. They may not have any use being indexed in the search engines, but they link to other important pages within your website. Therefore it’s good practice to allow Google to crawl them but not index them.
Nofollow tags are used when you don’t want crawlers to “click” (visit) the page you are linking to. Examples include paid links, links in forum comments or under your blog posts, and other user-generated content. The reasoning is that those links may be low-value or even harmful, and Google may not like that.
Noindex/nofollow tags guidelines:
Use robots.txt if you want to stop crawlers from crawling your website
Use noindex tag if you want to stop search engines from indexing your pages
Use nofollow tag if you want to stop search engines from following certain links
Use noindex AND nofollow tags combined if you want to stop search engines from indexing your pages AND following certain links
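While the meta robots tag shown above applies to a whole page, nofollow can also be applied to an individual link. For example, a visitor-submitted link in a blog comment could be marked like this (the URL is a placeholder):

```html
<a href="https://example.com/some-page" rel="nofollow">visitor-submitted link</a>
```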
A sitemap helps crawlers to better understand the structure of your website and find new pages more quickly. If you’re migrating to a new domain, launching a new website, making big structural changes, or have a large website with a lot of new content, a dynamic sitemap is a very good practice.
You can also have separate sitemaps dedicated to images or videos if that is a big part of your website.
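A sitemap is an XML file, most commonly available at /sitemap.xml. A minimal sketch listing two of our example pages (URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain12345.com/</loc>
    <lastmod>2021-08-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain12345.com/blog/how-to-take-care-of-a-dog</loc>
    <lastmod>2021-08-01</lastmod>
  </url>
</urlset>
```

Most CMSs and website builders can generate and update this file for you automatically.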
Although still in the realm of SEO basics, your website’s architecture and internal links play an important role in helping search engines determine the importance of your pages, and have an impact on your website’s user experience.
Every page on a small to medium-sized website should be accessible within a maximum of 4 clicks.
The “closer” your page is to the homepage, the more important it gets in the eyes of search engines. The same logic applies to the visibility of your links.
For example, links in your website’s header carry more weight than links in the footer. Links in the comments under your blog posts carry even less.
Following SEO best practices, anchor texts should be well thought out and follow Google’s guidelines.
The structure of a small to medium-sized website could look similar to this:
If you have a deep blog, add categories to help organize by topics.
Make sure all your important pages have links pointing back to them so they are discoverable by search engines. And don’t forget to link to your older but relevant articles when writing new content. The more relevant links an article receives, the better chance it will have to rank higher.
You can use tools like Whimsical to help you with your SEO efforts by making charts to better see the bigger picture.
Redirects are most commonly used to automatically send crawlers and visitors from one page to another. Consider using redirects when changing your domain or a URL, or when removing pages from your website.
Over time, your website and its pages collect SEO value. They start ranking for keywords, receive backlinks from other sources on the internet, and start generating traffic.
When you’re changing domains or removing pages, it’s really important to save the value you accumulated over time and have redirects in place.
Imagine changing the physical location of your retail store and not telling anyone the new address. You would never do that! It’s the same with your website. Don’t forget to redirect your pages.
Two main redirects you should focus on are HTTP status code 301 (permanent redirect), used to indicate that a page has moved or been removed for good, and HTTP status code 302 (temporary redirect), used to indicate that a page is not accessible for a short period (a few days).
Use cases for 301 redirect:
When you move to a different domain
When you remove a product or a service from your website and do not plan to restock it
When you change the URL of a page
If redirecting to the same page is not possible, redirect it to the most similar category. For example, if you decide to remove your article about dog care, consider redirecting it to your “Dogs” category under “Animal care” in the provided example above.
302 redirect should only be used for a short period of time when a page is unavailable (a few days) and you plan on bringing that page online again.
Note: If the page is very low quality, provides no value to the visitor or search engines, it's perfectly fine to delete the page and return HTTP status code 404 (Page Not Found). No redirect is necessary in such a case.
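How you set up redirects depends on your platform; many CMSs and website builders have a redirect settings page. On an Apache server, for example, 301 and 302 redirects can be declared in an .htaccess file (the paths below are illustrative):

```
# Permanent: the article URL changed for good
Redirect 301 /blog/dog-care-tips /blog/how-to-take-care-of-a-dog

# Temporary: page offline for a few days
Redirect 302 /shop/dog-collars /shop/coming-back-soon
```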
What is an SSL certificate? SSL stands for Secure Sockets Layer, and an SSL certificate is a way of authenticating the identity of a website. It also enables a secure connection by encrypting the link between the web server and the web browser.
It’s crucial that your website is secure. An easy and free way to secure your site is to move from HTTP to HTTPS. Especially if your website has sensitive customer data, such as credit card details, addresses, emails or similar data.
Most hosting companies offer basic SSL certificates for free when you have a hosting package with them.
Website load speed is a very important ranking factor. Make sure your website loads as fast as possible.
There are plenty of effective and free tools that can measure your website’s loading speed and provide suggestions on how it can be improved. We recommend the official Google tool and GTmetrix.
The most common issues that may cause slower loading speeds:
Poorly optimized images
Excessive amount of code
JavaScript issues
Too many plugins
Too many ads
Disabled caching
In July 2019, Google began mobile-first indexing for all new websites. This means Google primarily uses the content of your mobile website to index and rank it.
Mobile device usage increases every year. There’s no question how important it is that your website is mobile-friendly (responsive). If you are using quality no-code website builders (like Ycode or MailerLite), your website is likely already optimized for mobile devices. It’s still recommended to check your site on all devices, just to be sure.
You can use this Google tool to find out whether your website is mobile friendly and get suggestions on how it could be improved.
Google sums up what structured data is very well: “Structured data is a standardized format for providing information about a page and classifying the page content; for example, on a recipe page, what are the ingredients, the cooking time and temperature, the calories, and so on.”
Schema.org is what defines how each element on the page (ingredients, cooking time, temperature and so on) should be marked in your code, in a way that’s understandable for different search engines.
Structured data is not a ranking factor, but it’s a perfect way to make your search results stand out. A few examples of how schema implementation can look in search results:
If you are running an ecommerce store, a star rating for your product pages:
If you are running a recipe website, a star rating, preparation time and picture for your recipes:
If you are running an event website, listings of your events:
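Structured data is most commonly added as a JSON-LD snippet in the page’s code. A sketch of Product markup for a hypothetical dog collar (all values are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Premium Nylon Dog Collar",
  "image": "https://yourdomain12345.com/images/dog-collar.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  }
}
</script>
```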
There are many other options that you can read more about, and learn how they can be implemented in the official Google structured data documentation.
As of August 9, 2021, Google retired its structured data testing tool and recommended the schema.org validator as a stable alternative. Read Google’s announcement here.
✅ Robots.txt file
✅ Noindex and nofollow tags
✅ Sitemap
✅ Website architecture and interlinking
✅ Redirects
✅ SSL certificate
✅ Website load speed
✅ Mobile-friendly website
✅ Structured data
Link building is the process of acquiring links from other websites that point back to yours. These links are known as backlinks or inbound links.
Inbound links are an important ranking factor for search engines. However, it’s important to focus on quality over quantity.
Having hundreds of links from low-quality (spam) websites that generate no traffic will give you no benefit. In some cases, that approach can hurt your ranking.
It’s much better to invest your time getting a few high-quality links from a relevant website that’s considered an authority, and can actually drive some traffic back to your website.
Guest posting: Write content to be published on other websites that links back to your site. Google is pretty good at identifying unnatural behavior, so make sure your content is relevant and not a shameless promotion. Find a website in your sector, make sure it’s a quality website that generates traffic, and pitch them on an idea that will help them as much as you. You can read more about guest posting in QuickSprout’s guide.
Broken links: Most websites link out to other sources on the web and some of those links get broken. Maybe the content was removed, the website moved to another domain and forgot to redirect, or perhaps the domain expired. If you have similar content that would be a good replacement for the broken link, contact the website’s administrator, inform them about the broken link and share the content that could replace it. You can check Backlinko’s guide about broken links, which also includes link building using Wikipedia.
Web directories: This is especially useful for local businesses. Getting listed on Yelp, Tripadvisor or other local listings that people use, can give your local SEO a little boost. If you are putting your business up in the directories, make sure they are high quality and people are using it. Don’t sign up just for the sake of a backlink.
If possible, always try to link to a specific page rather than a homepage. Having a few high-quality backlinks to your “How to take care of a dog” article will give you a much better chance to rank in the search results.
Free: Backlink Shitter - Insert your competitor’s website or web page to find the backlinks it has. This will give you more ideas on where to hunt for backlinks.
Paid: Ahrefs - Great for finding your competition’s links and broken backlinks.
Measuring SEO success is important to ensure you are moving in the right direction. Effective search engine optimization requires testing different things, seeing what works and what doesn't. If you don’t track your metrics, you won’t know if your tactics are working.
Here are a few metrics you should be keeping tabs on.
You can track your organic traffic with 2 free tools.
1. Google Search Console
You can check the following data:
Clicks (actual traffic to your website)
Impressions (how many times you appeared in the search results)
Average CTR (percentage of impressions that resulted in a click)
Average position (position of your website, based on highest position whenever it appeared in the search)
Filter the data by:
Search type (web, image, video)
Date
Search query
Specific landing page
Country
Device
Search appearance (types of different rich snippets)
New feature - Google Search Console insights.
2. Google Analytics
Navigate to Acquisition → All Traffic → Channels → Organic Search → Landing Page.
Google Analytics reporting is advanced. You can use a lot of different types of data and filter different variables.
To keep it simple, you can start by monitoring the following:
Users
New Users
Sessions
If you want to filter for a group of pages (for example, all pages under the /blog category) or a specific page (for example, an article), click Advanced, select Containing next to Landing Page and insert the part of the URL you want to filter by.
Based on your keyword analysis, you will select the most important keywords you want to rank for. Track the progress of these KWs and how the position changes over time.
The Google Search Console average position report is free to check, but it’s not very accurate.
If you are starting small, you can check the Google results manually and write down the results.
If you want a more convenient and precise solution where you set up keywords once, you’ll need a paid solution like Ahrefs or SEMrush. They’ll update your KW rankings automatically at your selected frequency, and you only need to come back to check the report.
You can use Google Analytics to track conversions and sales, and filter them by organic traffic. More details for setting up goals can be found here and details for setting up ecommerce tracking can be found here.
In short, you can choose to track many different conversions based on your website’s and specific page’s purpose.
Ideas for conversions to track:
Sign-ups for your newsletter
Sign-ups for a course
Sign-ups for webinar or podcast
Sign-ups for the trial version of your product
Sign-ups for a demo
Downloads of your ebook
Purchases of your product or service
In order to make sense of all the data you have gathered, it’s recommended you make a quick and simple report to see the global view of how your SEO efforts are paying off.
Google Data Studio is a free data visualization tool that enables you to create custom interactive dashboards and insightful reports on the fly.
Whether you are an SEO beginner, launching a new website, or optimizing an existing one, these SEO basics are meant to help you increase your organic search traffic.
Now you have an idea of how search engines work and how these SEO techniques can help you craft your SEO strategy and improve your overall digital marketing efforts.
By implementing the fundamental tactics of SEO, you’ll ensure that your website is crawled and indexed correctly by Google. What’s more, you’ll make more effective use of keywords to attract visitors, increasing the chances of them clicking on your search result and boosting your click-through rate (CTR).
Keyword ideas and analysis
Google itself
Keyword Sheeter - Generating keywords
Keyword Planner - Search volume and ideas
Keyword Surfer - Search volume and ideas
Google Trends - Search trends
Answer The Public - Ideas for content
Competitor analysis
SimilarWeb Chrome extension - Website details like traffic, bounce rate, demographics
MozBar Chrome extension - Comparing the strength of websites
Technical SEO
Google’s PageSpeed Insights and GTmetrix - Measuring website’s loading speed
Google’s Mobile-Friendly Test - Check website’s mobile compatibility
Link building
Backlink Shitter - Link building ideas from competitors
Measuring success
Google Search Console - Organic traffic, impressions, CTR and average position
Google Analytics - Organic traffic, conversion tracking
Allround tools
On-page technical SEO tools
Screaming Frog - One of the best tools for monitoring the more technical side of the website. It can help you with sitemaps, redirects, noindex and nofollow tags, checking multiple meta titles, descriptions and headers at once. It’s free up to 500 URLs, which will be enough if you have a small website.