We present a complete SEO glossary with all the terms related to the subject so that you can resolve your digital doubts.
Within SEO, there are many terms and concepts that we need to be clear about when optimizing and positioning our site or project in the best possible way.
A clear understanding of each term is essential. For this reason, in this section we will keep adding new terms and expressions, all related to SEO, so that you are always up to date.
1. Google algorithm
The Google algorithm is the search engine's method of ranking pages in response to a search. In other words, it decides whether you appear first, second, or on the second page.
This algorithm changes about 500 times a year and is challenging to keep track of. That is why it is preferable to know the essential updates, such as Panda and Penguin, how they affect SEO, and how to recover from them.
Anchor text
Anchor text is the visible text in a link or hyperlink that provides information about the content to which we want to direct the user and the search engines.
Search engines have improved over time and use more and more factors to build their rankings. One of these metrics or factors is the relevance of a link. The relevance of a link depends both on the authority of the page the link comes from and on its visible anchor text. Of course, the link should always be as natural as possible, or Google will treat it as a bad practice.
We can classify anchor text into the following types (see the examples after this list):
- Naked or no anchor text. Only a URL is displayed. For example, www.40defiebre.com
- Generic. Include words such as: "this blog," "click here," "this page."
- Keyword. Depending on whether we are interested in positioning one or another keyword, we use a different anchor text and choose the terms we want to highlight, such as "Link Building."
- Brand name. When it consists of text other than the previous types, usually to link to a brand, a website, etc. The link would be: "ratechpoint" or "storialtech."
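To make these types concrete, here is a minimal HTML sketch; the URLs and anchor texts beyond those mentioned above are purely illustrative:

```html
<!-- Naked: the URL itself is the visible text -->
<a href="https://www.40defiebre.com">www.40defiebre.com</a>

<!-- Generic: non-descriptive text such as "click here" -->
<a href="https://www.example.com/blog">click here</a>

<!-- Keyword: the term we want to rank for -->
<a href="https://www.example.com/link-building">Link Building</a>

<!-- Brand name: the anchor is a brand or site name -->
<a href="https://www.example.com">ratechpoint</a>
```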
2. Backlinks
Backlinks are the inbound links that point from other pages to your own. The number of backlinks pointing to your page is important because the more relevant the pages that link to you, the more visibility your website gains in the eyes of Google. Make sure they are natural and relevant links; always quality before quantity.
Black Hat SEO or negative SEO
In SEO, Black Hat refers to the attempt to improve the search engine positioning of a web page using unethical techniques or techniques that contradict Google's guidelines; in other words, "cheating." These practices are increasingly penalized by Google. Some examples of Black Hat SEO are:
- Cloaking
- SPAM in forums and blog comments
- Keyword stuffing
Keyword cannibalization
Keyword cannibalization occurs when several pages of a website compete for the same keywords, confusing the search engine, which cannot tell which page is the most relevant for that keyword, and causing a loss of positioning.
How is this solved? The easiest way is to focus each page on one or two keywords at most. If cannibalization cannot be avoided, create a main product page from which the pages for the different formats can be reached, and include in each of those pages a canonical tag pointing to the main product page.
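As a minimal sketch of that setup (the URLs are hypothetical), each format page would carry a canonical tag in its <head> pointing to the main product page:

```html
<!-- Placed on /product/large/, /product/small/, and any other format page -->
<link rel="canonical" href="https://www.example.com/product/" />
```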
3. Cloaking
Cloaking is a widely used Black Hat SEO technique that displays different content depending on whether it is a user or a search engine robot that reads it.
Google is very strict about this practice, and although years ago it may have produced results, forget it: it goes against what search engines are pursuing with their updates, a more natural, ethical, and user-focused SEO.
Duplicate content
Duplicate content occurs when the same content appears in multiple URLs. In principle, it is not a reason for a penalty unless a high percentage of your website consists of duplicate content. Having a few identical pages will not make Google angry with us, but avoiding them is a sign that we are on the right track.
Although it does not imply a penalty, it can generate a loss of positioning potential because search engines do not know which are the most relevant pages for a specific search.
CTR
The CTR (Click Through Rate) is the number of clicks a link receives relative to the number of times it is shown (impressions). It is always expressed as a percentage, and it is a metric typically used to measure the impact of a digital campaign.
How to calculate the CTR?
As we said before, the CTR is expressed as a percentage. It is obtained by dividing the number of clicks a link has received by the number of times users have seen it (impressions), then multiplying by 100.
Let's see an example: Let's imagine that we have a result in Google that has been seen 2000 times and that has obtained 30 clicks; our CTR would be calculated like this:
- CTR = (Clicks / Impressions) x 100 = (30/2000) x 100 = 1.5%
- CTR = 1.5%
4. Keyword density
Keyword density is the percentage of times a word (or series of words) appears in a text relative to the total number of words.
A few years ago, keyword density was one of the most critical factors in SEO positioning, as it was the method used by search engines (Google, Yahoo, Bing) to identify the main topic of a page.
However, SEO has changed. Google's guidelines recommend writing in the most natural way possible: you have to write for the user instead of for the search engine.
Although there are still people who recommend not exceeding a keyword density of 3%, there is no ideal percentage.
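As a purely illustrative calculation, in the same style as the CTR example above: if a keyword appears 9 times in a 600-word text, the density would be:
- Keyword density = (Occurrences / Total words) x 100 = (9/600) x 100 = 1.5%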
Canonical tag
The Canonical tag was introduced by Google, Yahoo!, and Bing in 2009 to solve the problem of duplicate or similar content in SEO.
If there is no canonical tag in your code on a set of pages with duplicate or similar content, search engines will have to decide which URL is best suited to what the user is specifically looking for. However, if we introduce this tag, we are the ones who tell Google and other search engines which is our favorite page. This will improve the indexing and positioning process of our website in SERPs.
Let's see an example: if our website is the platform from which we sell flats in the Chueca neighborhood of Madrid, and we have several pages with very similar content, we must choose the URL by which we want to position ourselves as canonical. This may be the one that has brought us the most traffic or brings the most significant benefit.
To use the canonical tag effectively in SEO, just follow these steps (a sketch of the resulting tag appears after the list):
- Choose which is the primary or canonical page.
- Decide which secondary page or pages could compete with the main one in the rankings.
- Add the canonical tag on the secondary pages, pointing to the main page, between "<head>" and "</head>."
- Put the canonical tag on the main page, pointing to itself, between "<head>" and "</head>."
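As a sketch of the result, assuming a hypothetical main URL at www.example.com/flats-chueca/, both the main page and its secondary pages would carry the same tag inside <head>:

```html
<!-- On each secondary page, and on the main page itself -->
<link rel="canonical" href="https://www.example.com/flats-chueca/" />
```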
Meta robots tag
The Meta Robots tag is an HTML tag used to tell search engines to treat a URL in a certain way.
This tag is necessary if we do not want our website to be indexed or ranked in search engines.
This function can also be performed through the site's robots.txt file.
The difference between using the Meta Robots tag and the Robots.txt file is as follows:
- With the meta robots tag, we tell Google that we do not want certain pages to be indexed, although we do allow bots to crawl them.
- With the robots.txt file, however, we tell bots not even to bother entering and crawling certain pages.
It is crucial that you take this difference into account. You will understand it better with an example:
Imagine that you have 2 URLs that you don't want to appear in the Google index.
URL 1: blocked by the robots.txt file
This URL will not be crawled, nor will it be indexed (a priori; never trust Google 100%).
URL 2: blocked with the meta robots tag
When blocked with the meta robots tag, this URL will not be indexed, but search engines will still be able to crawl it, which means all of its content will be analyzed and the links it contains to other pages can be followed.
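As a minimal sketch, URL 2 would carry a tag like the following in its <head>, while URL 1 would simply be listed in the robots.txt file with a rule such as `Disallow: /url-1/` under `User-agent: *` (the paths are hypothetical):

```html
<!-- URL 2: can still be crawled, but will not be indexed; its links can be followed -->
<meta name="robots" content="noindex, follow" />
```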
5. Google Panda
Google Panda is a change in Google's algorithm released in the United States in February 2011 and in Europe in April of the same year. At its release, it affected more than 12% of all search results.
The maxim with Panda, to avoid being penalized, is to make sure that your content is totally original and adds significant value to your users. Keep the page updated or even look for new formats to enrich what you offer the user. The most actionable metrics in this case will be the bounce rate, the CTR of your search results, the time on page, and the number of page views.
Google Penguin
Penguin is the official name for the Google algorithm update designed to fight webspam. This update was released in April 2012.
It focuses on the off-site factors of a website, rewarding sites whose link profiles contain links from high-quality domains and have not been manipulated, and punishing pages that violate Google's guidelines with unnatural link profiles, too many links from low-quality sites, etc.
It was Google itself that, from the beginning, decided that links pointing to a website were a sign that its content was relevant. Hence, everyone started building links galore. With Google Penguin, however, Google changed its tune.
The improvements implemented by the algorithm include better detection of low-value links: purchased links, links from article networks and directories, and basically any scheme that tries to manipulate your website's link profile. The best way to make sure Penguin does not penalize you is to stick to Google's guidelines and earn links passively through your content.
How does SEO change with Google Penguin?
- Natural links: links generated passively or through real value. Article syndication, spinning, hidden links, directories (free or paid), promotions that result in links, etc., are out.
- Variety of anchor text: It no longer makes sense to generate links whose anchor text is only the keyword you want to rank for. If Google detects a pattern that it does not consider natural, it can penalize you.
- Stay in your niche: The most valuable links are those from domains and pages in your sector or that cover related topics.
- Quality, not quantity: It is preferable to generate a few quality links rather than many of little value.
6. Keyword
A keyword (or keywords) refers to the terms through which we want to attract traffic to our website via search engines. You must consider some factors associated with keywords (abbreviated KW), such as competition, search volume, conversion, or even their potential as a branding tool.
The choice of one or another keyword will determine the strategy, the content of a page, the appearance of that keyword in texts and tags, and other SEO positioning factors.
Keyword stuffing
Keyword stuffing is a Black Hat SEO technique that consists of repeating keywords excessively within a text, with the misguided objective of giving those words more relevance. Google frequently penalizes this type of over-optimization.
To avoid any type of adverse action by Google, texts should always be written to provide value to the user and in the way that best suits your audience profile. If the text gives the reader helpful, original, and well-synthesized information, that will be a better signal for Google than any variation in the number of keywords in the text.
No percentage defines a perfect keyword density, and Google recommends above all naturalness.
7. Link baiting
Link baiting is the technique of attracting links organically by creating high-value content. One of the essential factors for search engine positioning is the number of links pointing to a given page.
Link Baiting intends for many users to link to content on our site. To do this, we must create original, relevant, and novel content, such as articles, videos, or infographics, that attract users' attention.
Link building
Link Building is one of the foundations of web positioning or SEO, which seeks to increase the authority of a page as much as possible by generating links to it.
The algorithms of most search engines, such as Google or Bing, are based on on-site and off-site SEO factors, the latter built on the relevance of a website, whose primary indicator is the links that point to it, or backlinks. There are other factors as well, such as the anchor text of the link, whether the link is followed or not, brand mentions, or links generated on social networks.
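For example, whether a link is followed or not is signaled with the rel attribute; a minimal sketch with a hypothetical URL:

```html
<!-- A normal (followed) link, which passes authority -->
<a href="https://www.example.com/guide">a useful guide</a>

<!-- A nofollow link, which asks search engines not to pass authority -->
<a href="https://www.example.com/guide" rel="nofollow">a useful guide</a>
```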
It is essential to remember that good content tends to be linked to naturally, so links are earned organically and with less work than through other methods.
Link juice
Link juice is the authority that a page transmits through a link. Google ranks web pages based on their authority and relevance; this authority is passed from one page to another through links, and this transmitted authority is what we call link juice.
To understand it, picture a web page as a large glass of juice (the web page) with several holes (links) punched in its base. A glass with a single hole will transmit all of its link juice through that one hole. If it has 10 holes, each hole will pass 10% of the total link juice, and so on.
Long-tail
The long tail is a statistical term that refers to the distribution of a population.
Let's suppose that your website attracts traffic through 100,000 keywords and that you focus on the 100 with the most visits. Around 20% of total traffic (depending on the nature of your website) corresponds to these terms, and the remaining 80% corresponds to terms with a very low number of searches each. So the vast majority of the traffic your website attracts comes through words that you are not analyzing and may not even know about.
We call long-tail those searches made with more specific terms that individually generate very little traffic but that together are the largest source of visits to a website. The term applies to other fields besides online marketing; it was popularized by Chris Anderson in a Wired article, citing examples of companies that have succeeded thanks to the business generated by their long tails, such as Amazon, Netflix, or Apple.
8. Meta Tags
Meta tags are pieces of information included in web pages that are not directly seen by the user. They are used to provide information to browsers and search engines, helping them interpret the page better, and they are written in HTML within the web document itself.
Meta tags have been important at the SEO level because of their ability to affect search engine behavior, providing information about what a page should rank for, giving a description of it, or blocking access or indexing by search engine robots.
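A couple of common meta tags, as a minimal illustrative sketch (the description text is invented), placed inside the <head> of the page:

```html
<head>
  <!-- Description that search engines may show in the results snippet -->
  <meta name="description" content="A glossary of the most common SEO terms." />
  <!-- Instruction for search engine robots -->
  <meta name="robots" content="index, follow" />
</head>
```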
Microformats
Microformats are a simple form of code that gives content semantic meaning so that machines can read it and understand our products or services.
If we add microformats to our website, Google can read this information and display it in the search results. It can include user ratings, photos, the author's name, video, audio, etc.
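As a minimal sketch using the microformats2 h-card vocabulary (the name and URL are invented):

```html
<!-- An h-card identifying the author of a piece of content -->
<div class="h-card">
  <span class="p-name">Jane Doe</span>
  <a class="u-url" href="https://www.example.com">example.com</a>
</div>
```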
9. Not provided
The term "not provided" is a term used in Google Analytics that identifies all "safe" traffic within Google, or what is the same, all traffic that comes from users who have logged into their Google account.
What happens to this data? What do I do with it? In this post, we go over the different options for interpreting it.
Off-site SEO
Off-site SEO is the part of SEO work that focuses on factors external to the page we are working on that nevertheless affect our site. It includes external links, social signals, mentions, and other metrics that reinforce the page's authority.
One of the most critical tasks of off-site SEO is link building, generating links that point to your page on external websites, with which Google will give it greater relevance.
On-site SEO
On-site SEO or on-page SEO is the set of internal factors that influence the positioning of a web page. They are the aspects we can change ourselves on our own page (a brief example follows the list), such as:
- The meta-information, such as the title or the meta description
- The URL
- The content
- The alt attribute in images
- The web structure
- The internal linking
- HTML code
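A brief, purely illustrative sketch of a few of these on-page elements (the file names and texts are invented):

```html
<head>
  <!-- The title tag, one of the most visible on-page signals -->
  <title>Flats for sale in Chueca, Madrid</title>
</head>
<body>
  <!-- A descriptive alt attribute on an image -->
  <img src="flat-chueca.jpg" alt="Living room of a flat in Chueca, Madrid" />
  <!-- An internal link with descriptive anchor text -->
  <a href="/flats-chueca/">More flats in Chueca</a>
</body>
```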
Optimizing on-site SEO is an essential process that every web page must take care of if it wants to appear in search results.
10. PageRank
PageRank is how Google measures the importance of a website; the search engine rates the value of websites on a scale of 0 to 10.
When a page links to another website, it transmits value, and this value depends on the PageRank of the linking page.
Currently, Google has stopped updating PageRank publicly, and no one can now see what score a website has in the eyes of the search engine.
However, although they continue to use it internally to establish their search results, it has less and less weight within the whole algorithm.
PageRank is determined by factors such as the number of links and domains pointing to the site, their quality, the age of the domain, etc.
11. Query
The English term "query" means doubt or question. When we talk about databases, query or query string is a request for data stored in said BB.DD., although generically, it can refer to any interaction. When we talk about search engines, a query is a term that we write in Google, which will later lead to a SERP.
12. Search Engine Ranking
Search engine ranking is your website's position on a search results page; that is, the position in which you appear in Google, Yahoo, Bing, etc., when a user performs a search.
To improve our positioning, we must use strategies and tools that help us optimize our website, increasing accessibility, usability, and content.
13. Schema Markup
Schema markup is a specific vocabulary of tags (or microdata) that you can add to your website's HTML code to provide more relevant information. This helps search engines understand your content better and deliver better results. It also improves how your page is displayed, with rich snippets that appear below the page title.
Schema.org is the reference website for this type of strategy, where you will find all kinds of hierarchies and ways to organize your content. But wait, what can I structure? Hundreds of things! There are many tags to structure content with today, and surely more will come over time: places, events, movies, books, recipes, people, etc. Also, to make it much easier, Google created its "Markup Helper." Very useful.
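As a minimal illustrative sketch using schema.org microdata (the recipe details are invented):

```html
<!-- A recipe marked up with schema.org microdata -->
<div itemscope itemtype="https://schema.org/Recipe">
  <h2 itemprop="name">Tortilla de patatas</h2>
  <span itemprop="author">Jane Doe</span>
  <meta itemprop="prepTime" content="PT30M" />
</div>
```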
SERP
SERP (Search Engine Results Page) refers to the results page of a search engine, such as Google or Bing.
It is the page that appears after conducting a search. It is where the results are displayed in order.
The more a website is optimized according to the quality criteria of the search engines, the more likely it will be to rank better in the SERPs.
Sitemap
A sitemap or site map is an XML document submitted to search engines. It gives them a complete list of the pages that make up a website so they can index pages their robots could not otherwise reach, because there are no direct links to them, they sit behind a form, etc.
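A minimal sitemap sketch (the URL and date are invented) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/flats-chueca/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
</urlset>
```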
Spinning
Spinning is a Black Hat SEO technique that refers to creating an article by reusing different original texts.
In this way, content generation is greatly accelerated. It can be done with software that automates the rewriting, or manually, making the results look like different texts by using synonyms or changing the order of the words.
Although this technique has been widely used, it falls squarely within Google's penalty factors. Since its now-famous Penguin update, Google detects these practices far more often.
14. White Hat SEO
White Hat SEO refers to the ethically correct techniques that comply with the guidelines set by search engines for positioning a website.
Its objective is to make a page more relevant to search engines. To achieve good White Hat SEO, there are some characteristics that you should take into account:
White Hat SEO is the most beneficial way to optimize the positioning of a website in the medium-long term.