DiccioSEOnario of the real digital language: a complete SEO dictionary with all the terms related to the subject, so you can resolve your digital doubts.
Within SEO there are many terms and concepts we need to be clear about in order to optimize and position our site or project as well as possible.
Interpreting a term correctly is essential. That is why, in this section, we will keep adding new SEO-related terms and "words" so that you are always up to date.
Google Algorithm
The Google algorithm is the system the search engine uses to rank pages for a given search; in other words, it decides whether you appear first, second, or on the second page of results.
This algorithm changes around 500 times a year, which makes it hard to keep track of. That is why it is better to understand the major updates, such as Panda and Penguin, how they affect SEO, and how to recover from them.
Anchor Text
Anchor text is the visible text of a link or hyperlink; it gives users and search engines information about the content the link points to.
Search engines have improved over time and use more and more factors to build their rankings. One of these factors is the relevance of a link, which depends both on the authority of the page the link comes from and on its visible anchor text. Of course, a link should always be as natural as possible, or Google will treat it as a bad practice.
We can classify anchor text into the following types:
- Naked, or without anchor text. Only the URL is displayed. For example: www.seomoves.com.au
- Generic. It uses words such as "this blog", "click here" or "this page".
- Keyword. When we want to rank for a particular keyword, we use it as the anchor text, choosing the terms we want to highlight, for example "Link Building".
- Brand name. When the anchor is a text different from the previous types and the objective is to link to a brand, a website, etc. The link would read: "Seomoves".
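As an illustration, the four anchor types above could be written in HTML like this (the URL and link texts are just examples):

```html
<!-- Naked: the bare URL is the visible text -->
<a href="https://www.seomoves.com.au">www.seomoves.com.au</a>

<!-- Generic: non-descriptive text -->
<a href="https://www.seomoves.com.au">click here</a>

<!-- Keyword: the term we want to rank for -->
<a href="https://www.seomoves.com.au">link building</a>

<!-- Brand name -->
<a href="https://www.seomoves.com.au">Seomoves</a>
```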
Backlinks
Backlinks are inbound links that point from other pages to your own. The number of backlinks to your page matters because the more relevant the pages linking to you, the more notoriety your website gains in the eyes of Google. Make sure they are natural, appropriate links; always quality over quantity.
Black Hat SEO or negative SEO
In SEO, Black Hat refers to the attempt to improve a web page's search engine ranking using unethical techniques that contradict Google's guidelines; in short, cheating. Google penalizes these practices more and more. Some examples of Black Hat SEO are:
- SPAM in forums and blog comments
- Keyword Stuffing
Keyword cannibalization
Keyword cannibalization occurs when several pages of the same website compete for the same keywords. This confuses the search engine, which cannot tell which page is the most relevant for that keyword, and it causes a loss of rankings.
How is this solved? The easiest way is to focus each page on one or two keywords at most. When that cannot be avoided, create a main product page from which the pages for the different variants are reached, and add a canonical tag on each of those pages pointing to the main product page.
Cloaking
Cloaking is a widely used Black Hat SEO technique that consists of showing different content depending on whether it is a user or a search engine robot reading the page.
Google comes down hard on this practice. Although it may have produced results years ago, forget it: it goes against what search engines are pursuing with their updates, namely a more natural, ethical and user-focused SEO.
Duplicate Content
Duplicate content occurs when the same content appears at multiple URLs. In principle this is not grounds for a penalty, unless a high percentage of your website is duplicated. Having a few duplicate pages will not make Google angry at us, but avoiding them signals that we are on the right track.
Although it does not imply a penalty, duplicate content can cause a loss of positioning potential, because search engines cannot tell which pages are the most relevant for a given search.
CTR (Click Through Rate)
The CTR (Click Through Rate) is the number of clicks a link gets relative to its number of impressions. It is always expressed as a percentage and is a metric normally used to measure the impact of a digital campaign.
How to calculate the CTR?
As we said before, the CTR is expressed as a percentage. It is obtained by dividing the number of clicks a link has received by the number of times users have seen it (impressions), and multiplying by 100.
Let's look at an example: imagine we have a result in Google that has been seen 2,000 times and has received 30 clicks. Our CTR would be calculated like this:
- CTR = (Clicks / Impressions) x 100 = (30 / 2000) x 100 = 1.5%
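The calculation above can be sketched in a few lines of Python (the figures are the hypothetical example from the text):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click Through Rate as a percentage: (clicks / impressions) x 100."""
    if impressions == 0:
        return 0.0  # no impressions, no measurable CTR
    return clicks * 100 / impressions

# The example from the text: 30 clicks out of 2,000 impressions.
print(ctr(30, 2000))  # 1.5
```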
Keyword Density
Keyword density is the percentage of times a word (or series of words) appears in a text relative to the total number of words.
A few years ago, keyword density was one of the most important factors in SEO, since it was the method search engines (Google, Yahoo, Bing) used to identify the main topic of a page.
However, SEO has changed: Google's guidelines now recommend writing as naturally as possible, that is, writing for the user rather than for the search engine.
Although some people still recommend not exceeding a keyword density of 3%, there is no ideal percentage.
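As a rough sketch, keyword density for a single-word keyword can be computed like this (a naive word count; real tooling also handles punctuation and multi-word phrases):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that match `keyword` (single word only)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) * 100 / len(words)

sample = "seo is about users not about seo tricks"
print(keyword_density(sample, "seo"))  # 25.0 (2 occurrences out of 8 words)
```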
Canonical Tag
The canonical tag was introduced by Google, Yahoo! and Bing in 2009 to solve SEO problems caused by duplicate or similar content.
If the canonical tag is not present in the code of a set of pages with duplicate or similar content, search engines have to decide on their own which URL best fits what the user is looking for. If we add the tag, however, we are the ones telling Google and the other search engines which is our preferred page. This improves the indexing and positioning of our website in the SERPs.
Example of a canonical tag: <link rel="canonical" href="http://www.miweb.com/principal/" />
Let's look at an example: if our website is the platform from which we sell flats in the Chueca neighborhood of Madrid, and we have several pages with very similar content, we must choose as canonical the URL we want to rank with. This could be the one that has brought us the most traffic or the one that brings the most benefit.
To use the canonical tag effectively in SEO, just follow these steps:
- Choose which page is the canonical or main one.
- Decide which secondary pages could compete with the main one in the rankings.
- On each secondary page, add a canonical tag pointing to the main page, between "<head>" and "</head>".
- On the main page, add a canonical tag pointing to itself, also between "<head>" and "</head>".
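The steps above can be sketched as follows (the example.com domain and paths are hypothetical):

```html
<!-- On each secondary page (e.g. a sorted or filtered variant), inside <head>: -->
<link rel="canonical" href="https://www.example.com/flats-chueca/" />

<!-- On the main page (/flats-chueca/), a self-referencing canonical: -->
<link rel="canonical" href="https://www.example.com/flats-chueca/" />
```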
Meta Robots Tag
The meta robots tag is an HTML tag used to tell search engines how to treat a URL.
This tag is necessary if we do not want our website to be indexed or ranked by search engines.
A similar function can also be performed through the site's robots.txt file.
The difference between using the Meta Robots tag and the Robots.txt file is as follows:
- With the tag, we tell Google that we do not want certain pages indexed, but we still let bots crawl them.
- With the robots.txt file, on the other hand, we tell bots not to bother entering and crawling certain pages at all.
This difference is important to keep in mind. You will understand it better with an example:
Imagine you have two URLs that you do not want to appear in Google's index.
URL 1: blocked using the robots.txt file
This URL will not be crawled, and so in principle it will not be indexed (a priori; never trust Google 100% :-P).
URL 2: blocked with the meta robots tag
This URL will not be indexed, but it can still be crawled by search engines. Its content will be analyzed, and search engines can follow its links to other pages.
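As a sketch, the two blocking methods look like this (the /private/ path is just an example):

```html
<!-- Meta robots tag, inside <head>: do not index this page,
     but its links may still be crawled and followed -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt, at the root of the site: bots are asked
# not to crawl anything under /private/ at all
User-agent: *
Disallow: /private/
```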
Google Panda
Google Panda is a change to the Google algorithm that was released in the United States in February 2011 and in Europe in April of the same year. On release, it affected more than 12% of all search results.
The golden rule for avoiding a Panda penalty is to make sure your content is completely original and provides real value to the user, to keep the page updated, and even to look for new formats that enrich what you offer. The most actionable metrics in this case are bounce rate, the CTR of your search results, dwell time and the number of page views.
Google Penguin
Penguin is the official name of the Google algorithm update designed to fight webspam. It was released in April 2012.
It focuses on a website's off-site factors, rewarding sites whose link profile is made up of high-quality, unmanipulated domains, and punishing pages that have violated Google's guidelines: unnatural link profiles, too many links from low-quality sites, and so on.
It was Google who decided from the beginning that links pointing to a website were a sign that its content was relevant. Hence, everyone set about generating links galore. With Penguin, however, Google changed its tune.
The improvements the update brings include better detection of low-value links: purchased links, links in article networks and directories, and basically any scheme that tries to manipulate your website's link profile. The best way to make sure Penguin does not penalize you is to respect Google's guidelines and earn links passively through your content.
How does SEO change with Google Penguin?
- Natural links, that is, links generated passively or through real value. Article syndication, spinning, hidden links, directories (free or paid), promotions in exchange for links, etc. are prohibited.
- Variety of anchor text: it no longer makes sense to generate links whose anchor text is always the keyword you want to rank for. If Google detects a pattern it does not consider natural, it can penalize you.
- Search within your niche: the most valuable links come from domains and pages in your niche, or ones that cover related topics.
- Quality, not quantity: it is better to generate a few quality links than many of little value.
Keyword
Keywords are the terms through which we want to attract traffic to our website from search engines. You must take into account several factors associated with keywords (abbreviated KW), such as competition, search volume, conversion, or even their potential as a branding tool.
The choice of one keyword or another will determine the strategy, the content of a page, where that keyword appears in texts and tags, and other SEO positioning factors.
Keyword Stuffing
Keyword stuffing is a Black Hat SEO technique that consists of using a keyword excessively within a text, with the misguided aim of giving that word more relevance. Google frequently penalizes this type of over-optimization.
To avoid any negative action from Google, texts should always be written to provide value to the user, in whatever way best suits your audience profile. If a text delivers useful, original and well-synthesized information, that will be a better signal for Google than any particular keyword count.
There is no percentage that defines a perfect keyword density; above all, Google recommends naturalness.
Link Baiting
Link baiting is the technique of attracting links organically by creating high-value content. One of the essential factors in search engine positioning is the number of links pointing to a given page.
Link baiting aims to get a large number of users to link to content on our site. To do this, we must create original, relevant and novel content, such as articles, videos or infographics, that catches users' attention.
Link Building
Link building is one of the fundamentals of web positioning, or SEO. It seeks to increase a page's authority as much as possible by generating links to it.
The algorithms of most search engines, such as Google or Bing, are based on on-site and off-site SEO factors. The latter rest on a website's relevance, whose main indicator is the links pointing to it, or backlinks. A number of other factors also count, such as the link's anchor text, whether the link is follow or nofollow, brand mentions, or links generated on social networks.
Keep in mind that good content tends to be linked naturally, so links arrive organically and with less effort than through other methods.
Link Juice
Link juice is the authority a page transmits through a link. Google ranks web pages based on their authority and relevance; this authority is transferred from one page to another through links, and the authority transmitted is what we call link juice.
To understand it, picture a web page as a big glass of juice with several holes (links) in its base. A glass with a single hole transmits all of its link juice through that one hole. If it has 10 holes, each one passes 10% of the total link juice, and so on.
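The glass-of-juice analogy reduces to a simple division (a deliberate simplification; real ranking algorithms weight links by many more factors):

```python
def juice_per_link(page_authority: float, outbound_links: int) -> float:
    """Authority passed through each link when it is split evenly."""
    if outbound_links == 0:
        return 0.0  # no holes in the glass, no juice flows out
    return page_authority / outbound_links

# A page with authority 100 and 10 outbound links passes 10 per link.
print(juice_per_link(100.0, 10))  # 10.0
```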
Long Tail
The long tail is a statistical term that refers to the shape of a distribution within a population.
Suppose your website attracts traffic through 100,000 keywords and you focus on the 100 that bring the most visits. Depending on the nature of your website, those terms might represent around 20% of total traffic, while the remaining 80% corresponds to terms with very low search volumes. In other words, the vast majority of the traffic your website attracts comes through terms you are not analyzing and do not even know.
That is what we call the long tail: searches with more specific terms that individually generate very little traffic, but that together are the largest source of visits to the web. The term applies beyond online marketing; it was popularized by Chris Anderson in a Wired article, which cited companies that have succeeded thanks to the business generated by their long tails, such as Amazon, Netflix or Apple.
Meta Tags
Meta tags are pieces of information included in web pages that are not directly visible to the user. They provide information that helps browsers and search engines interpret the page better, and they are written in HTML within the web document itself.
Meta tags have been important at the SEO level because of their ability to influence search engine behavior: indicating how pages should be ranked, providing a description of the website, or blocking search engine robots from accessing or indexing it.
Microformats
Microformats are a simple form of code that gives semantic meaning to content so that machines can read it and understand our products or services.
If we add microformats to our website, Google can read them and display the information in search results. This information may include user ratings, photos and author names, video, audio, etc.
Not Provided
"Not provided" is a term used in Google Analytics that identifies all "secure" traffic within Google; in other words, all traffic coming from users who are logged into a Google account.
What happens to this data, and what do we do with it? There are several options for interpreting it.
SEO Off-site
Off-site SEO is the part of SEO work that focuses on factors external to the web page we are working on that nevertheless affect our site, including external links, social signals, mentions and other metrics that reinforce the page's authority.
One of the most important off-site SEO tasks is link building: generating links to your page from external websites, which leads Google to give it greater relevance.
SEO On-site
SEO on-site or SEO on-page is the set of internal factors that influence the positioning of a web page. These are the aspects we can change ourselves on our page, such as:
- The meta information, such as the title or meta description
- The content
- The alt attribute of images
- The web structure
- Internal linking
- HTML code
Optimizing on-site SEO is an essential process for any web page that wants to appear in search results.
PageRank
PageRank is the way Google measures the importance of a website; the search engine scored the value of websites on a scale of 0 to 10.
When one page links to another, it transmits value, and that value depends on the PageRank of the linking page.
Google has now stopped updating PageRank publicly, so no one outside Google can see how well a website scores for the search engine.
However, although Google continues to use it internally in its search results, it carries less and less weight within the overall set of algorithm signals.
PageRank is determined by factors such as the number of links and domains pointing to the site, their quality, the age of the domain, and so on.
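To illustrate how link structure feeds into a score, here is a minimal power-iteration sketch of the originally published PageRank formula on a toy three-page graph (Google's current internal version is private and far more complex; the graph and function names here are our own):

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # Each page's new score: a base share plus the damped sum of
        # the scores its linkers pass on, split among their outlinks.
        rank = {
            p: (1 - d) / n
            + d * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return rank

# A links to B and C; B links to C; C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # C collects the most link weight here
```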
Query
The English term "query" means question or request. In the context of databases, a query (or query string) is a request for data stored in the database, although generically it can refer to any interaction with it. In the context of search engines, a query is the term we type into Google, a request that then leads to a SERP.
Search Engine Ranking
Search engine ranking is the position your website occupies on a search results page; that is, the position in which you appear in Google, Yahoo, Bing, etc. when a user performs a search.
To improve our ranking, we must use strategies and tools that help us optimize our website, improving its accessibility, usability and content.
Schema Markup
Schema markup is a specific tagging vocabulary (or microdata) that you can add to your website's HTML code to provide more relevant information. With it, search engines understand your content better and return better results. It can also improve the way your page is displayed, with rich snippets that appear below the page title.
Schema.org is the reference website for this type of markup, where you will find all kinds of hierarchies and ways to organize your content. But wait a minute: what can you structure? Hundreds of things! There is currently a wide variety of tags (places, events, movies, books, recipes, people, etc.) and more will surely appear over time. To make it much easier, Google also created its Structured Data Markup Helper. Very useful.
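As a sketch, structured data is often added as a JSON-LD block using the Schema.org vocabulary (the recipe, the author's name and the rating figures below are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Spanish omelette",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "ratingCount": "120"
  }
}
</script>
```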
SERP
SERP (Search Engine Results Page) refers to the results page of a search engine, such as Google or Bing.
It is the page that appears after performing a search, where the results are displayed in order.
The more a website is optimized according to the quality criteria of search engines, the more likely it is to rank better in SERPs.
Sitemap
A sitemap or website map is an XML document that is submitted to search engines. It gives them a complete list of the pages that make up a website, so they can index pages their robots could not otherwise reach, due to a lack of direct links, because they sit behind a form, and so on.
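A minimal sitemap following the sitemaps.org protocol looks like this (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/flats-chueca/</loc>
  </url>
</urlset>
```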
Spinning
Spinning is a Black Hat SEO technique that consists of creating an article by recycling different original texts.
It accelerates content generation in a simple way. It can be done with software that automates the rewriting, or manually, making texts appear different through synonyms and changes in word order.
Although this technique has been widely used, doing it automatically falls squarely within Google's penalty criteria. Since its now-famous Penguin update, Google detects these practices more and more often.
White Hat SEO
White Hat SEO comprises the ethically correct techniques that comply with the guidelines set by search engines for positioning a website.
Its objective is to make a page more relevant for search engines. To achieve good White Hat SEO, there are some characteristics you should take into account.
White Hat SEO is the most beneficial way to optimize a website's positioning in the medium to long term.