Growthworx Marketing Solutions

Carrying out an SEO analysis is an essential step when setting up a web positioning strategy, since it tells us where we are and what path we should follow. But a complete SEO analysis is not done in 5 minutes; it requires effort, time, and patience, and at certain points we must prioritize which modifications we want to make to our website. If we get impatient and rush the strategy, not only will we waste time, but we may also harm our web positioning.

In this post, we are going to walk through a real SEO analysis so that you can learn the essential parameters of web positioning, discover the state of your website, and see which tasks you must carry out to grow its organic traffic. Along the way, we will look at tools that make it easier to audit on-page SEO factors, the usability and user-experience issues that affect performance, and even external aspects of both your website and the competition.

General SEO analysis

1. Starting an SEO audit

When we talk about auditing, we are referring to analyzing the current SEO situation of our website, that is, finding out where we are and which steps we must follow to optimize it. Here we will carry out a practical analysis of the SEO company Growthworx Marketing Solutions, looking in detail at the elements that are affecting the web positioning performance of that page.

At the end of the post, we will have carried out a complete SEO analysis, and obtained a large list of tricks and actions that we can extrapolate to other websites to improve their performance.

2. Check the indexing of your site

Search engines carry out a series of steps to decide whether or not your page should be positioned. The first of these steps is crawling: robots (also called bots) visit your page and crawl its content. Once they have analyzed the content, you can move on to the next step, indexing.

In the indexing process, Google makes its first evaluations of your website, analyzing whether or not it should rank well according to different factors. Depending on whether you comply with these SEO factors, your page will either appear at the bottom of Google's results or achieve a good position for some keywords.

Therefore, verifying that the indexing of our website is adequate is the first step in a good SEO analysis. To verify it, we enter the following search in Google: site:yourdomain

Once the search is done, we see that Google has indexed a total of 36 pages, and that the first result is the home page. This confirms that the domain architecture is correct, since Google gives the highest priority to the home page, and therefore the PageRank of the domain will be distributed correctly.

It is convenient to write down every detail we observe during the SEO analysis. For example, we see that the majority of results are blog posts, and that the blog categories are also indexed, so we should evaluate whether it is beneficial for this type of page to be indexed or not.

We also see that the service pages (videos for companies, video editing, etc.) are indexed; these should be the pages with the highest visibility, since we are trying to reach the users who need these services. However, if we browse the site we find pages such as the contact page that are not indexed by Google. This is correct, since it is a page that does not provide value or relevant content to the user, so it is better not to index it and avoid having Google spend crawl budget on these pages.

If when analyzing your website you find that low-value pages, such as the legal notice, are indexed, you should point it out in your SEO analysis. Later I will tell you how to solve this type of "problem".

When carrying out this first general SEO analysis we discover the number of pages that Google has indexed, as well as architectural problems of our website.

3. Evaluate our identity

We use the concept of identity to refer to Google queries about our company or brand. It is very important that at the beginning of our positioning strategy we dedicate serious effort to this aspect, since it is the organic traffic with the highest intent: whoever searches for our brand already knows what they want and is ready to contract our product or service. For this reason, these are the most valuable searches, and we must concentrate much of our effort on them at the start of a project.

You are the owner of that company, product, brand, or information, and therefore you must have the highest visibility for those search terms: you must be the first result when a user searches for your name. Otherwise, we will be losing great "sales" potential and facing a big problem.

Therefore, the second part of our SEO analysis tries to assess our identity. To put it into practice, we will carry out searches related to the name of our company to see what results we obtain, but in case we have our products or concepts related to our brand, we should also analyze those searches.

When searching for our name on Google, Growthworx Marketing Solutions, we observe that although we appear first in the map pack, in the organic results, which are what interest us, we are second, so we should try to position our brand at number 1. As we have already mentioned, we must work hard on the SEO optimization of these terms, since the user's intent is very high: these are usually users who already know our services and are more likely to become clients.

4. The Google “Cache”

The cache is nothing more than the copies of pages that Google saves on its servers. The advantage of these copies is that certain elements are stored once you have accessed the page, allowing a faster load on future visits, which translates into a better user experience.

Apart from that stored version, we can see the last time Google cached our site, and based on that date estimate how "quickly" Google re-caches it. A relatively high caching frequency suggests that we have a "healthy" site.

To check the crawl frequency, we must enter the following search in the browser: cache:yourdomain


We see that the date of the last cache version is September 9th, which means that our page is cached quite frequently.

Another positive point of this cached version is that we can see the website with CSS and JavaScript removed, that is, the text-only version: the page as Google reads it. In this way, we can get an idea of how Google interprets our website and see whether the structure of the elements is ideal.

For a complete SEO analysis, the cache of each of the most important pages of a website should be checked. Using nothing but Google, and in a simple way, we learn how frequently Google visits, when it last cached our pages, and whether the text version has a correct structure.

5. Mobile Keyword Search

Currently, mobile usage has overtaken desktop usage, and naturally this also influences SEO. Google has confirmed that it uses different algorithms for ranking on mobile and on desktop.

Factors such as domain authority or quality content are still important for mobile positioning, but it is also necessary to offer an excellent user experience when browsing. These changes in Google's algorithms mean that when searching for keywords, the results are not the same on mobile phones and computers, so in a good SEO analysis, we must corroborate search results on both devices.

To perform mobile keyword searches, we can use the developer tools provided by Chrome or search directly on a mobile phone. For the test, we are going to search for the keyword "Corporate videos".


In this particular case, the results were identical: when searching for "Corporate Videos" on both devices, the site appears in the Top 3 of Google, just below the video results and one traditional organic result. This keyword analysis yields a very positive evaluation: if in desktop search landing on the 2nd page is synonymous with being invisible, on mobile the top results matter even more, and we must be within the first 5 results so that our website does not fall into oblivion.

If the results on mobile move us to positions below the Top 5, we must highlight it in our SEO analysis, since it could be said that we do not exist. Mobile web positioning requires more effort since Google gives us less space in which to show ourselves.

Analyze SEO On-Page

1. Titles and meta descriptions

Among the multiple factors that make up on-page SEO, titles and meta descriptions are essential when it comes to achieving and holding higher positions in search engine results.

The "title" tag is the main heading that Google displays in the snippet, and we can also find it at the top of the browser (in the tab); the meta description is the rest of the content that makes up the snippet. In summary, when we create a new page or a blog entry, there are 3 fundamental on-page SEO elements that cannot be forgotten: title, URL, and meta description.


The title is an essential factor and we must optimize it by including a keyword, since this will allow us to achieve better positions in the search results. Meta descriptions, however, do not directly influence positioning, but this does not mean we should neglect them. The keyword is no longer important in meta descriptions; instead, we have to find a way to get users' attention so that they click on our result. For this we can use calls to action, helping more users decide on our page and thus attracting a higher amount of traffic. Therefore, checking that all the important pages of our website have an optimized title and meta description is essential in our SEO analysis.

How can we check titles and meta descriptions?

At first glance, we can view the source code of the page (Ctrl + U), search for the title tag and the meta description, and check that both are optimized for SEO. In a good SEO analysis it is not enough to check the home page; we must also review the valuable pages and see that they comply with premises such as:

  • The title tag is present and contains keywords
  • That the titles are not duplicated on different pages
  • The title does not exceed 55 characters
  • There is a meta description with calls to action
  • The meta description is substantial but does not exceed the characters visible in the snippet (160)
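The checks in this list can be sketched as a small script. Below is a minimal example using only Python's standard library; the sample HTML, the 55/160 character limits taken from the checklist above, and the `audit_snippet` helper are illustrative assumptions, not part of any real tool:

```python
from html.parser import HTMLParser

# Limits taken from the checklist above (assumed, not Google-official constants)
TITLE_MAX = 55
META_DESC_MAX = 160

class SnippetParser(HTMLParser):
    """Collects the <title> text and the meta description of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_snippet(page_html):
    """Return (title, meta description, list of checklist violations)."""
    parser = SnippetParser()
    parser.feed(page_html)
    issues = []
    if not parser.title:
        issues.append("missing title")
    elif len(parser.title) > TITLE_MAX:
        issues.append("title exceeds 55 characters")
    if not parser.meta_description:
        issues.append("missing meta description")
    elif len(parser.meta_description) > META_DESC_MAX:
        issues.append("meta description exceeds 160 characters")
    return parser.title, parser.meta_description, issues

# Hypothetical sample page
sample = ('<html><head><title>Corporate Video Production | Growthworx</title>'
          '<meta name="description" content="Multimedia services for companies. Free estimate!">'
          '</head><body></body></html>')
title, desc, issues = audit_snippet(sample)
```

In a real audit, the same function would be run over the HTML of every valuable page, flagging any page that breaks one of the premises.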

Having an optimized title and meta description is very important; if we do not provide them, Google will auto-generate them, and we will be missing a great opportunity to invite users to click.

Looking again at the previous snippet, we see that work has gone into it, but even so, the meta description should be optimized, since it contains no calls to action. We should use more compelling information that attracts the user's attention, such as:

"Multimedia services for companies. Everything your company needs to grow. Free estimate!"

In addition, we should verify that there are no duplicate titles on other pages, since this is one of the most common errors found in domains and it matters for SEO: you end up fighting for the same keyword on two or more pages of your site, instead of optimizing them for different keywords and having more opportunities to get traffic and conversions.

To carry out these analyses of titles and meta descriptions in a faster and more professional way, we can use tools such as SEO Quake, which, from the diagnosis of the domain we indicate, shows us at a glance the title and meta description, as well as their character counts, so we avoid exceeding the limit that Google will display in search results.

To avoid an incomplete SEO analysis, we must check all the titles and meta descriptions of the valuable pages of the site we are auditing. For this, the most effective tool is Screaming Frog. This tool allows you to do a very complete analysis of your web page, letting you see at a glance factors such as missing meta descriptions, duplicate titles, or titles that exceed the limit, and in this way fix issues that can have a very positive impact on the web positioning of your site. In our article Screaming Frog Seo Spider: analyze Seo On Site you can learn all the utilities this tool offers.

2. Analyze the presence of Keywords in the content

For a long time now, raw keyword repetition has ceased to be an important positioning factor. Instead, optimizing the content for readers, enriching it with "synonyms" of the keyword, and keeping the keyword reasonably frequent in the text are what make your content more visible to Google. You should not fall into the trap of constantly repeating the same keyword: in current SEO, we embrace variety in the content so that Google takes us into account and considers us more relevant than our competition.

To evaluate the content of the website and the frequency with which keywords appear in the text, we can use the SEO Quake tool again; in this way we will avoid exceeding the keyword-to-text ratio beyond which Google could penalize us. This tool has a density option, which evaluates the content according to how frequently each word appears.

How do we interpret keyword density?

Once we see the most repeated words in the content, we have to confirm that the top one is the keyword we want to position. Here we see that the most repeated keyword is "Seo Melbourne", which appears up to 11 times, making up 3.26% of the content, and is also present in the title and the H1 tag, so it is evident that this page is focused on positioning the keyword "Seo Melbourne".

Although the density could be somewhat higher (never exceed 10%, or Google could penalize us for over-optimization), the optimization of the content is correct: the page has clearly been focused on one keyword, and the content was then written with the objective of positioning it on the first page of Google.
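The density calculation that tools like SEO Quake perform can be approximated in a few lines of Python. This is a rough sketch: the sample text and the counting convention (words consumed by keyword occurrences divided by total words) are assumptions for illustration, not SEO Quake's exact formula:

```python
import re

def keyword_density(text, keyword):
    """Count occurrences of a (possibly multi-word) keyword and its density.

    Density here = words consumed by the keyword matches / total words * 100.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == kw_words)
    density = (hits * n / total * 100) if total else 0.0
    return hits, density

# Hypothetical page copy
text = ("Our SEO Melbourne agency offers SEO audits. "
        "SEO Melbourne specialists improve rankings for Melbourne businesses.")
hits, density = keyword_density(text, "SEO Melbourne")
```

Running this over each valuable page gives a quick way to spot both under-optimized pages (the target keyword barely appears) and over-optimized ones that risk crossing the 10% threshold mentioned above.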

It is highly recommended to review in our SEO analysis the terms that are most repeated (producer, audiovisual, Zaragoza, etc.) to check whether they are relevant concepts for the website or whether we should optimize other terms instead.

It is important to take keyword density into account when positioning and to optimize it, although the truth is that Google's algorithms have become more "complex" and are betting on Latent Semantic Indexing.

3. Labels for headings (h1, h2, h3…)

The HTML h1, h2, h3, h4, h5, and h6 tags are very useful for structuring the content into headings hierarchically. When we create a blog article or a service page, we must establish a hierarchical order where the H1 tag is the main title, and the rest of the tags correspond to subtitles or subsections of it. Without going any further, you can analyze this page and you will see that there is a title (H1), SEO Analysis [Case Study], followed by subtitles that divide the content into different sections, all related to the article.

For SEO, we must bear in mind that the H1 tag must be unique, and throughout the content we should have at least h2 and h3 tags that allow Google to understand the different sections of the article. With this hierarchical structure, we help Google "read" the article. In summary, we need an h1 tag and subheadings for a correct structure, so that Google rewards us with good SEO results.

To verify what type of tag a title or subtitle is, we can right-click on that text and inspect the element, or use a tool such as the Web Developer extension, which shows us the complete tag structure of the page.
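As a rough illustration of what such tools check, here is a minimal stdlib-only Python sketch that extracts h1–h6 tags and flags a missing or repeated h1 and hierarchy jumps (an h3 directly under an h1, for instance); the sample outline and the `heading_issues` rules are illustrative assumptions:

```python
from html.parser import HTMLParser

class HeadingParser(HTMLParser):
    """Collects [level, text] pairs for h1..h6 tags in document order."""
    def __init__(self):
        super().__init__()
        self.current = None          # heading level we are currently inside
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.current = int(tag[1])
            self.headings.append([self.current, ""])

    def handle_endtag(self, tag):
        if self.current and tag == f"h{self.current}":
            self.current = None

    def handle_data(self, data):
        if self.current:
            self.headings[-1][1] += data.strip()

def heading_issues(headings):
    """Flag structural problems: h1 count != 1, or a skipped heading level."""
    issues = []
    h1s = [h for h in headings if h[0] == 1]
    if len(h1s) != 1:
        issues.append(f"expected exactly one h1, found {len(h1s)}")
    for prev, cur in zip(headings, headings[1:]):
        if cur[0] > prev[0] + 1:
            issues.append(f"level jump: h{prev[0]} -> h{cur[0]}")
    return issues

# Hypothetical page outline
sample = ("<h1>SEO Analysis [Case Study]</h1>"
          "<h2>General SEO analysis</h2><h3>Indexing</h3>"
          "<h2>On-Page factors</h2>")
parser = HeadingParser()
parser.feed(sample)
issues = heading_issues(parser.headings)
```

An empty `issues` list here corresponds to the "correct tag structure" verdict described below.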

On the website we are auditing, we observe a correct tag structure, although it could be improved by including an h3 tag under the h2 tag "Growthworx Marketing Solutions", and by optimizing some tags that provide no SEO value, such as "Your satisfaction", "The highest level", or "What they say about us".

Another aspect to consider is that the h1 tag should not match the title tag exactly. The h1 must include the keyword without being a replica of the title; a small modification adding a relevant term will help our positioning.

4. Image optimization

Image optimization is an SEO opportunity that 90% of the websites we audit are missing, so we must take advantage of those cracks that other websites forget in order to improve our positioning.

Since Google implemented Universal Search, the opportunities have increased: for many searches we see images directly in the results, and an image will always attract more attention than any text result, no matter how optimized that snippet is or how many calls to action it contains.

Therefore, if we can make our images appear in Google's organic results, we will reach a much higher amount of traffic than we would with a normal text result on the first page.

How do we get our images to position themselves?

The first step is to verify whether or not there are organic results of our images and then see if those images meet Google's requirements for positioning.

The first point can be verified with the search site:yourdomain, limiting the search to the Images section. We see that there are more than 100 indexed images from that domain, a good indicator.

The second point is to check that they are optimized. One of the most important factors when optimizing images is including keywords in the file name; to check this, we can try to save the image and see the name it was given. The images in question have names like "3d Canva design" or "video editing", so they are designed to position, a factor we must note in our SEO analysis.

In addition to the name, the other key factor when uploading images to the web is the alt (alternative text) attribute. It is an HTML attribute that provides more information to browsers, something Google values positively and that is considered very good SEO practice. Originally, this attribute was designed to make images accessible to people with visual difficulties and easier for browsers to identify; nowadays it is also an SEO practice used to achieve better positions. To check whether an image has this attribute, we simply inspect the element, or use tools such as the aforementioned Screaming Frog, which can analyze the entire domain at once and point out every image in which the alt attribute is absent.
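A crawler's missing-alt report can be approximated with a short stdlib-only Python sketch; the sample markup and file names below are hypothetical:

```python
from html.parser import HTMLParser

class ImgAltParser(HTMLParser):
    """Records the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # An absent alt and an empty alt="" are both flagged here
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

# Hypothetical page markup: one image optimized, two that need work
sample = ('<img src="3d-canva-design.jpg" alt="3d Canva design">'
          '<img src="banner.png">'
          '<img src="video-editing.jpg" alt="">')
parser = ImgAltParser()
parser.feed(sample)
```

Note that an intentionally empty `alt=""` is valid HTML for decorative images; a real audit would separate those from images that should carry descriptive text.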

5. Friendly URLs

Having friendly URLs on our website is not a heavyweight SEO factor, but it is still a small contribution to improving the SEO of your website. Also, a website with well-crafted URLs facilitates navigation, and therefore the user experience. It can even let users type our address directly without having to search for it, since they may be able to remember the URL.

If we go to a services page of the Growthworx Marketing Solutions website, we see that the URLs have been worked on: the characters have been shortened so that the URL does not become unwieldy, and keywords are included in it to facilitate its positioning.

Therefore, we will document it in our SEO audit since it is one more small step for our website to perform better.

SEO content analysis

1. Quality content

Google has constantly modified its algorithms, seeking to be as fair as possible when it comes to positioning, but one factor is still as essential as at the beginning, and it is none other than quality content. We need quality content on our website to tell Google what our site is about and which searches it should rank for; without quality content, Google cannot know who we are targeting or what our goal is.

Each of the pages that we want to position must have a minimum of 300 quality words oriented to a keyword; if the content is thin or is copied from some other site, Google can classify it as spam and it will never reach the top positions. With SEO Quake we can analyze content quickly and efficiently by accessing the density tab.

The tool counts 674 words on the home page of the website we are studying, but we must analyze whether those words are of quality. At a quick glance, we see that the content is oriented to an SEO company in Melbourne, so we can say the content of this page is correct. Viewing the rest of the service pages, we also see quality content, although content can always be optimized further and made a bit better.

For a comprehensive SEO analysis, we will study the content of all the valuable pages, and wherever the content falls short of quality, something more relevant should be written to optimize performance.

2. Subdomains

One of the elements not usually taken into account when performing an SEO analysis is checking whether the domain has a linked subdomain.

Subdomains usually hang from the root domain, but that alone is not enough for PageRank to flow: if our subdomain does not receive any link from the main domain, the PageRank transferred will be null, and positioning strength is lost. Therefore, it is important to identify whether a website has subdomains and whether they are correctly linked, and if not, to correct it to achieve good results.

When inspecting the website under study manually, we see no subdomain of any kind, but to make sure there is none, it is advisable to use a tool such as Screaming Frog, or to perform the search site:yourdomain.

After the Screaming Frog crawl we confirm that there are no subdomains. Checking whether they exist is a good first step toward analyzing whether the structure is correct and whether PageRank flows appropriately.

3. Duplicate content

Duplicate content is one of the most common mistakes that lead to poor web positioning. This practice is very common in e-commerce niches such as pharmacies, where those in charge of uploading products use the suppliers' descriptions to fill in the product sheets. If hundreds of stores have already published that same description before you, Google will never prioritize your store over them, since they have an earlier publication date and a domain with more authority.

To check whether your content is duplicated, search Google for an extract of the text in quotation marks: "Extract of text taken from the web". This way we not only check whether we have content copied from other websites, but also whether there is duplicate content within our own website, that is, whether there are canonicalization problems.
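Within a single site, the same idea can be roughed out in code: hash each normalized paragraph and flag any hash appearing on more than one page. A sketch with hypothetical page texts; the 40-character cutoff for ignoring short fragments is an arbitrary assumption:

```python
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Return groups of URLs sharing an identical paragraph.

    `pages` is a dict of {url: text}; paragraphs are split on blank lines
    and normalized (lowercased, whitespace collapsed) before hashing.
    """
    seen = defaultdict(set)
    for url, text in pages.items():
        for para in text.split("\n\n"):
            norm = " ".join(para.lower().split())
            if len(norm) < 40:      # skip boilerplate-sized fragments (assumed cutoff)
                continue
            digest = hashlib.sha1(norm.encode()).hexdigest()
            seen[digest].add(url)
    return [urls for urls in seen.values() if len(urls) > 1]

# Hypothetical product sheets sharing a supplier description verbatim
pages = {
    "/product-a": "Product A intro.\n\nThis supplier description is copied verbatim across many stores.",
    "/product-b": "Product B intro.\n\nThis supplier description is copied verbatim across many stores.",
}
dupes = find_duplicates(pages)
```

Each group returned is a set of pages competing with identical copy, exactly the situation the quoted-extract Google search exposes against external sites.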

Checking text extracts from the analyzed website, we certify that the content of the website is original and unique.

4. Canonical version

In line with the problem of duplicate content comes the "option" of having multiple versions of the same website, or in other words, duplicate content on different pages. The technical name for this problem is canonicalization. Below we will see 2 ways to correct it: one is the rel="canonical" tag, with which we indicate to Google what the original content of our website is, and the second is a 301 redirect to that original content.

A canonicalization failure happens very often when the Google bot interprets several versions of the same page (with www and without www), with the damage this entails, since it understands that the same content exists twice and does not know which version is the original. So we proceed to check that our domain behaves well in this regard, and we see that the www version is the main page and that entering the domain without www redirects to it, so the behavior is correct and there is no canonical error on the home page.

If we discover that Google treats both versions as independent content, we should designate one of them as preferred; we can set this in the Search Console configuration. If you do not know which domain to set as the main one, choose the one with the most results indexed in Google (search site:www.yourdomain and site:yourdomain).

If the "versioned" content we find on the web is not due to this problem, but because content is duplicated on some page, we can set a rel="canonical" tag to tell Google which is the original content, or the one with which we want to target a keyword.
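Checking for the tag programmatically is straightforward; a minimal stdlib-only sketch, where the sample markup and the expected URL are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Extracts the href of <link rel="canonical">, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

def canonical_ok(page_html, expected_url):
    """True if the page declares the expected canonical URL."""
    parser = CanonicalParser()
    parser.feed(page_html)
    return parser.canonical == expected_url

# Hypothetical service page pointing its canonical at the home page
sample = '<head><link rel="canonical" href="https://example.com/"></head>'
```

Running this over both pages competing for the same keyword would confirm whether the duplicate correctly defers to the page we want to position.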

In Growthworx Marketing Solutions we see that there are two pages aimed at the keyword "Seo Melbourne", so we should proceed to include the rel="canonical" tag on the service page to tell Google that we want to position the home page for that keyword.

Tools like SEO Quake tell us whether or not a page has this tag. In short, canonical versions are a huge problem and, in turn, a huge opportunity for positioning if we detect the canonicalization issue.

Website accessibility

1. Check the robots.txt

The Google bot has a limited amount of time (crawl budget) to dedicate to browsing your website, which depends on factors such as how large your website is or how often you modify it. We must therefore make the most of the time the Google robot spends on our site, steering it toward the pages that interest us and offer value to the user.

For this we use the robots.txt file, in which we indicate to Google which pages we do not want it to crawl. This file is found in the root directory of our domain's hosting, and we should optimize it in every domain on which we are going to carry out positioning work.

That is why, in an SEO analysis, we have to review this file, since it allows us to prevent Google from spending crawl budget on pages of our site that are not relevant. Very clear examples are the author pages of a blog or the checkout pages of an online store; these are pages we should block.

If we access the robots.txt file of the site, we see that it does exist and that pages such as the author pages are blocked, so we deduce that the file is being used correctly.
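Python's standard library can replay robots.txt rules directly, which is handy for verifying that valuable pages are not accidentally blocked. The rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking author archives and checkout pages
robots_txt = """\
User-agent: *
Disallow: /author/
Disallow: /checkout/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A service page we want crawled, and a low-value page we want blocked
allowed_service = rp.can_fetch("Googlebot", "https://example.com/corporate-videos/")
blocked_author = rp.can_fetch("Googlebot", "https://example.com/author/admin/")
```

In a live audit, `rp.set_url("https://yourdomain/robots.txt")` followed by `rp.read()` would load the real file instead of the inline string, and the same `can_fetch` calls would be run against the site's important URLs.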

When creating a website, it is very important that we take this file into account and block low-value pages, since it is honored by all search engines (Bing, Yahoo, Google, etc.).

2. User Agent

In the SEO world we can find two types of professionals: Black Hat practitioners (who use tricks to deceive Google and improve their positioning) and White Hat practitioners (who use practices that Google values positively).

One of those tricks to deceive Google is called cloaking, which consists of showing one version of the content to the user and a different, SEO-oriented version to Google. We are not in favor of this type of technique: it can work for a short period, but, as the saying goes, a liar is caught sooner than a lame man, and Google will end up discovering you.

However, some pages that have different versions for different countries, or variations by language, are configured so that the experience adapts depending on who enters the page. This should be viewed favorably by Google, since the objective is to improve the user experience, but the truth is that it can be interpreted as cloaking. We must therefore view our site as Google does and verify that the content is the same as in our browser; we achieve this by changing the User-Agent.

Chrome extensions like User-Agent Switcher allow us to see the content as if we were Google. If there is any visual problem on the page, we should note it in our SEO analysis; once this step is done, we have verified that there are no cloaking problems on our page.
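The same check can be scripted: fetch the page once with a browser User-Agent and once with Googlebot's, and compare the bodies. A sketch where `fetch` stands in for a real HTTP call (e.g. urllib with a custom User-Agent header) and both servers are simulated:

```python
def cloaking_check(fetch, url):
    """True if the page serves identical content to a browser and to Googlebot.

    `fetch(url, user_agent)` is any callable returning the page body as a string.
    """
    browser_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    googlebot_ua = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")
    return fetch(url, browser_ua) == fetch(url, googlebot_ua)

# Hypothetical server that serves everyone the same page (no cloaking)
def honest_fetch(url, user_agent):
    return "<html>same content for all</html>"

# Hypothetical cloaking server that shows Googlebot keyword-stuffed copy
def cloaking_fetch(url, user_agent):
    if "Googlebot" in user_agent:
        return "<html>stuffed keywords</html>"
    return "<html>normal page</html>"
```

Legitimate personalization (language, geolocation) can also make the bodies differ, so a real check would compare the main content blocks rather than the raw bytes.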

3. Check the sitemap.xml

Like the robots.txt, the sitemap is an indispensable file for our website: it makes the site's contents easier to navigate and easier for search engines to crawl.

If we do not have a sitemap.xml file, our positioning will not collapse, but it is a positive factor that Google values. So in our SEO analysis we will include verifying the sitemap; the most common route is yourdomain/sitemap.xml


Accessing this route, we see that the sitemap exists and lists the pages correctly: it is present and well configured.
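Reading the sitemap programmatically lets us compare its URLs against what Google has indexed. A stdlib-only sketch; the sample sitemap content is hypothetical:

```python
import xml.etree.ElementTree as ET

# The sitemaps.org namespace used by standard urlset sitemaps
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a urlset sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Hypothetical sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/corporate-videos/</loc></url>
</urlset>"""
urls = sitemap_urls(sample)
```

Note that large sites often publish a sitemap index pointing at several urlset files; this sketch handles only the simple single-file case.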

Is it essential for the indexing of our website?

Having a sitemap will not by itself push our website into or out of the Top 3 of Google, but it helps to properly structure our website so that Google crawls it correctly and SEO strength is transferred from one page to another.

4. Nofollow and noindex tags

Preventing the flow of "link juice" to some pages is very important so that our SEO strength is not subdivided among many links until it practically disappears. To avoid this, we can use the robots.txt as we have already seen, or include noindex and nofollow tags on those pages that we want to keep outside our linking strategy.

These tags are really useful, but we must avoid the mistake of blocking pages that may be relevant and obtaining the opposite of the intended result. Therefore, in our SEO audit we will check that no important pages are blocked and that the blocked ones, if any, are of little value.

We can quickly check this in the source code of the page, or we can use that tool we like so much, SEO Quake, which tells us whether a page carries noindex or nofollow tags.
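A quick source-code check like this is easy to script; a minimal stdlib-only sketch where the sample page is hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Reads the content of <meta name="robots"> into a set of directives."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives = {d.strip().lower() for d in content.split(",") if d.strip()}

# Hypothetical low-value page (e.g. a quote-request form) blocked from indexing
sample = '<head><meta name="robots" content="noindex, nofollow"></head>'
parser = RobotsMetaParser()
parser.feed(sample)
```

Running this over a crawl list and cross-checking `"noindex" in parser.directives` against a list of the site's important URLs is exactly the audit step described above: important pages must come back empty, low-value ones blocked.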

In this example, we see that the quote-request page, which hardly adds value, has been blocked.

5. 404 Errors

Finding all the 404 errors a website has is a very important step in an SEO analysis, since each page with that error is an opportunity we are missing, and with a little effort we can transform those 404 errors into conversion opportunities.

The best way to check 404 errors on a website is Screaming Frog: after entering our URL, the tool analyzes the website and detects all 404 errors as well as temporary and permanent redirects (302 and 301).


We see that there are no 404 errors, so everything is correct. If there were 404 errors on pages linked from other content, we should redirect them to a similar page so as not to lose the PageRank those URLs may have accumulated.

Regarding redirects, we note 9 cases with status code 301, that is, permanent redirects. A 301 redirect, being permanent, passes most of the PageRank of the page. In contrast, 302 redirects are temporary and are generally considered not to pass PageRank, so those pages would not transfer their authority; in this case, though, we did not find any.

Analyzing the 301 redirects we found in more depth, they occur because URLs with a trailing slash (/) redirect to the same page without it. In any case, we will proceed to correct this to save loading time, avoiding the server response time spent performing the redirect.

In short, both 404 errors and 301 and 302 redirects are things we must analyze and correct in order to pass the full PageRank conveniently.
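The triage of a crawl export can be sketched in a few lines; the `(url, status)` pairs below are hypothetical, and the action labels reflect the guidance in this section:

```python
from collections import Counter

def classify_statuses(crawl):
    """Tally crawl results by the action an SEO audit would take.

    `crawl` is a list of (url, status_code) pairs, the kind of export a
    crawler such as Screaming Frog produces.
    """
    buckets = Counter()
    for url, status in crawl:
        if status == 404:
            buckets["redirect to a similar page (404)"] += 1
        elif status == 301:
            buckets["fix the internal link to the final URL (301)"] += 1
        elif status == 302:
            buckets["make permanent or remove (302)"] += 1
        else:
            buckets["ok"] += 1
    return buckets

# Hypothetical crawl export: a trailing-slash 301 plus one broken link
crawl = [
    ("https://example.com/", 200),
    ("https://example.com/services", 301),   # redirects to /services/
    ("https://example.com/old-page", 404),
]
report = classify_statuses(crawl)
```

Grouping the export this way turns a raw status-code list into the to-do list the audit actually needs.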

SEO link strategy structure

The link structure of a website is very important in SEO matters, since through links we are passing an important part of the page rank from one page to another.

1. Number of outgoing links

Based on that, we must control the number of outgoing links to manage the PageRank of each page. As a guideline, a maximum of 50 links per page is usually established; with a greater number, the PageRank may be diluted excessively.
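Counting a page's outgoing links, split into internal and external, is simple to script. A stdlib-only sketch; the sample markup and host are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts internal vs external <a href> links relative to a site host."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative URLs have no host and therefore count as internal
        if not host or host == self.site_host:
            self.internal += 1
        else:
            self.external += 1

# Hypothetical home page markup
sample = ('<a href="/services/">Services</a>'
          '<a href="https://example.com/blog/">Blog</a>'
          '<a href="https://twitter.com/acme">Twitter</a>')
counter = LinkCounter("example.com")
counter.feed(sample)
```

Comparing `counter.internal + counter.external` against the 50-link guideline above flags the pages where PageRank may be diluted.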


On the Growthworx Marketing Solutions homepage we have a total of 37 links, of which 35 correspond to internal links and 5 to external links, so we comply with this guideline and would not need to limit the number of links.

If the number of links were much higher, we should limit them, either by removing links or by applying the nofollow attribute to prevent PageRank from passing through them. In this way we control where PageRank flows.
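To illustrate how a browser extension like SEOquake arrives at such counts, here is a small Python sketch that separates internal from external links using only the standard library; the HTML and hostname are made up for the example:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Count internal vs external <a href> links on a page."""
    def __init__(self, site_host: str):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href or href.startswith("#"):
            return  # skip empty hrefs and in-page anchors
        host = urlparse(href).netloc
        if host == "" or host == self.site_host:
            self.internal += 1  # relative or same-host links
        else:
            self.external += 1

# Hypothetical markup standing in for a real homepage.
html = """
<a href="/seo-services">SEO services</a>
<a href="https://growthworx.example/contact">Contact</a>
<a href="https://twitter.com/growthworx">Twitter</a>
"""
counter = LinkCounter("growthworx.example")
counter.feed(html)
print("internal:", counter.internal, "external:", counter.external)
```

Running the same count over every crawled page quickly surfaces templates that exceed the 50-link guideline.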

2. Use of anchor text

Both inbound links to our site and outbound links carry text, and that text is also relevant for SEO. Just as a link from the Zaragoza City Council is not the same as a link from your cousin's barely maintained blog with no authority, a link whose anchor text is "Click here" is not the same as one reading "SEO Company in Melbourne" if that is what our website is dedicated to.

The relevance a link transfers is higher the better the context in which we obtain it: for the keyword "SEO Melbourne", a link from an article about SEO in Melbourne passes much more relevance than one from a website that has nothing to do with ours. If we also accompany it with appropriate anchor text, that is, the right keywords, we will be receiving a highly relevant link.

Anchor text like "Find out more" or "Click here" adds absolutely nothing to our internal linking strategy, so we must be smart enough to use keywords in our anchor texts.

Analyzing the internal links of Growthworx Marketing Solutions with SEOquake, we can see each link along with its anchor text, and we find anchors such as "corporate SEO", "SEO Melbourne", "website design", and so on, so we can quickly confirm that the use of anchor text is correct.

It is always important to check these links in our SEO analysis, as they transfer relevance and authority.
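As a quick way to automate this check, the sketch below flags generic anchor texts against a small illustrative blacklist; the list and the sample anchors are examples, not an exhaustive rule set:

```python
# Illustrative blacklist of anchor texts that waste keyword relevance.
GENERIC_ANCHORS = {"click here", "find out more", "read more", "here", "learn more"}

def audit_anchor(text: str) -> str:
    """Flag anchor text that adds nothing to the internal linking strategy."""
    normalized = " ".join(text.lower().split())  # case- and whitespace-insensitive
    if normalized in GENERIC_ANCHORS:
        return "generic - replace with a keyword"
    return "ok"

anchors = ["SEO Melbourne", "Click here", "Website design", "Find out  more"]
for a in anchors:
    print(f"{a!r}: {audit_anchor(a)}")
```

Feeding the anchors exported from a crawl through a check like this highlights every wasted internal link in seconds.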

3. Nofollow links

Thanks to the rel="nofollow" attribute we can tell Google which links it should not follow, and therefore which links should not transfer authority. In this way we work on the PageRank side of SEO: we decide which links should pass link juice and which should not.

First of all, we must verify which links pass PageRank and which are blocked, to make sure there is no mistake, since we may be blocking links to pages that do provide value.

In this step we return to SEOquake's internal links report and check which links we are blocking. In our case we are not blocking PageRank to any page, so we could add nofollow to links such as those pointing to social networks or to the quote and contact pages, so that they do not absorb part of the homepage's PageRank. Since these are pages of little value in terms of relevance, blocking the flow to them would be a good idea.

What is the point of passing PageRank to domains like Facebook, Instagram, or YouTube? It makes no sense, so we can block the link juice from passing with the nofollow attribute.

The rel="nofollow" attribute makes it easy to build a proper internal link structure without removing links. In this way we increase the authority of the pages to which it suits us to pass relevance.
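This check can be scripted as well. The following sketch lists which links on a page are followed and which carry rel="nofollow", using sample HTML invented for the example:

```python
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Separate links that pass PageRank from those blocked with nofollow."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href:
            return
        rel = (a.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

auditor = NofollowAuditor()
auditor.feed('<a href="/seo-melbourne">SEO Melbourne</a>'
             '<a href="https://facebook.com/acme" rel="nofollow">Facebook</a>')
print("passes PageRank:", auditor.followed)
print("blocked:", auditor.nofollowed)
```

Comparing the two lists against the site's key pages quickly shows whether anything valuable is being blocked by mistake.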

Technical aspects of SEO

1. Analyze the speed of the website

Having a website that loads fast is essential today and should be an absolute priority in web development. A user immediately leaves a page if it loads slowly or the user experience is inadequate, with everything that entails: loss of traffic and therefore of revenue. The problem is amplified in online stores, where the site must be even more responsive; otherwise, users will simply go to another store whose pages load quickly and offer a positive experience.

In SEO terms, a slow site is also harmful, since Googlebot has limited time to analyze your site; slow pages mean it may not get to review all the pages you would like, limiting your website's SEO potential.

Another reason to take your site's speed into account is that it is already one of the parameters Google uses when ranking. If you achieve a faster loading speed, you will obtain better positions.

We can consider a loading time below 2 seconds to be optimal. To check the loading speed of our site we can use online tools such as GTmetrix, PageSpeed Insights, or Pingdom Tools. We will use the latter to find out the loading times of the website we are testing.


Entering the site in the online tool gives us a score of 87 with a load time of 3.11 seconds. This could be considered a reasonable load time, but the tool also provides a series of tips for optimizing the site's speed, since the load time can still be improved substantially.

One of the basic recommendations these tools usually make is image optimization, which is one of the main causes of slow sites and also one of the easiest and most effective to fix. Other issues, such as enabling compression or "Leverage Browser Caching", can be solved in the server configuration with the aim of optimizing loading.

Ultimately, a fast-loading website translates into a better user experience and better search engine performance.
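For sites served by Apache, the "Leverage Browser Caching" warning is typically addressed with mod_expires rules such as the following sketch; the MIME types and lifetimes are illustrative values that should be tuned per site:

```apache
# Illustrative .htaccess fragment; lifetimes are example values, tune per site.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Longer lifetimes reduce repeat-visit load times but delay how quickly returning visitors see updated assets, so the values are a trade-off.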

2. Analysis of the mobile experience

It is a fact that mobile traffic increases every day and is gaining ground on desktop; this alone is more than enough reason to analyze the mobile experience of a website independently.

This mobile analysis is not simply a way to verify that users reach a site optimized for these devices; it also has its reward in SEO terms, since a good mobile experience leads to better positions in search results. That is why it is very important to include this section in a good SEO analysis.

Google offers us a tool, the Mobile-Friendly Test, to analyze the mobile experience in a simple way:

After analyzing the URL, we see that the site is optimized for mobile. Even so, the tool warns us of a series of resources that could not be loaded, which we should review and note in our SEO analysis.

This tool is very useful because, if the site is not optimized for mobile, it tells us what the flaws in the mobile experience are and offers a series of recommendations to optimize mobile browsing.

If this test reveals that our website is not optimized, we are facing a big problem, because nowadays it is normal for a large part of web traffic to come from mobile devices, and we would be offering a bad experience to quality traffic.

On top of that, we must add the SEO potential we are wasting by not optimizing the mobile experience, so it is essential to run the mobile optimization test, document the findings, and later make the necessary adjustments to offer a good experience.
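A first automated check along these lines is looking for the responsive viewport meta tag, one of the basic signals of a mobile-optimized page; this minimal Python sketch does just that on sample markup:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Check for the responsive viewport meta tag, a basic mobile signal."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "viewport":
            self.has_viewport = True

# Hypothetical <head> fragment standing in for a real page.
checker = ViewportChecker()
checker.feed('<head><meta name="viewport" '
             'content="width=device-width, initial-scale=1"></head>')
print("mobile viewport tag present:", checker.has_viewport)
```

A missing viewport tag is only one possible failure, but it is the cheapest one to detect before running the full mobile test.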

3. Country versions

In the case of the website we are analyzing, we have already seen that it does not require any tag to declare languages or countries, but it is very likely that you will find projects that do require this prior analysis; we are talking about international SEO.

For correct international SEO, we must follow some guidelines so that Google indexes the content correctly and does not penalize us for cloaking or duplicate content.

To declare the language of a page, the best solution is the hreflang attribute, which allows us to relate the URL of a page to its respective language. Therefore, in an SEO analysis that requires this section, we should identify whether there are versions for different countries and look for the hreflang annotations.

As we have already mentioned, the website we are analyzing does not require this type of tag, but if you want to check for it, you simply have to look for it in the source code. In the SEO analysis we would note whether the website targets different languages, and whether it has or lacks the aforementioned tag.
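When hreflang versions do exist, they usually appear in the <head> as link elements. The following sketch collects them; the language codes and URLs are hypothetical:

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collect <link rel="alternate" hreflang="..."> annotations."""
    def __init__(self):
        super().__init__()
        self.versions = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.versions[a["hreflang"]] = a.get("href")

# Made-up country versions for illustration.
collector = HreflangCollector()
collector.feed('<link rel="alternate" hreflang="en-au" href="https://example.com/au/">'
               '<link rel="alternate" hreflang="en-us" href="https://example.com/us/">')
print(collector.versions)
```

In a real audit, each version found this way should also link back to the others, since hreflang annotations are expected to be reciprocal.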

Links, authority and SEO analytics

1. Check the incoming links

Many SEO professionals focus exclusively on on-page SEO, but external factors are just as important as internal ones. To achieve optimal SEO results we must work on both; otherwise there will always be a competitor who surpasses us and achieves better performance.

What external SEO factors improve the reputation of a domain?

This is where external links come in, an essential factor for our site to gain authority. In a good SEO analysis we must count the number of inbound links and assess their quality; as we have seen before, a link from a prestigious website is not the same as one from an abandoned or recently created Blogspot blog.

A good tool for discovering the inbound links to a website is Open Site Explorer, which, in addition to quantifying inbound links, gives us an authority score for our website.

In our case, it gives us a domain authority of 11 out of 100 and a page authority of 24 out of 100 for the homepage. Regarding links, it has detected 5, although it is true that it does not detect every link pointing to our site; for that we would need premium tools such as Majestic, or we can check the links Google has detected in Search Console.

In short, if we had to draw a general conclusion about this domain's external SEO, we would say that no specific work focused on generating authority has been done, since there are few inbound links. In the SEO analysis we would document that quality external links must be obtained.

A good web positioning strategy must take into account both internal and external SEO; if we neglect either of them, our strategy will not work.

2. Analyze SEO competition

Analyzing the competition in SEO terms mostly means analyzing our keywords and seeing what level of competition they have, in order to determine whether ranking for them is viable. We can divide the analysis of a keyword into 3 aspects:

  1. Relevance: we analyze whether the user's search is intentional and fits the service, product, or information we offer. In other words, if we are an SEO company in Melbourne and someone reaches our site looking for an SEO company in Los Angeles, the keyword is not relevant, even if it has traffic volume and generates visits, since that person will never become a customer. The relevance of a keyword is an essential point when determining its quality.
  2. Number of searches: we must attract quality traffic, but that is not enough; the traffic must also be large enough to have an impact on revenue. For example, "SEO Melbourne" may be a good keyword, but with "cheap SEO company in Melbourne that gets 1st ranking on Google" we would be talking about an excessively long-tail term, and it is evident from the length of the query that its traffic volume will be negligible. It may be relevant, because our SEO company does quality SEO and is located in Melbourne, but the search volume will be insignificant.
  3. Competition: finally, we may have found a relevant keyword with a large search volume, but we must analyze our competition and be realistic; if we are up against giant companies that spend millions to reach top positions, such as Amazon, we must accept that competing with them will be very difficult.

For these 3 reasons it is essential that we carefully analyze the keywords we want to rank for and the competition we will face.

On the website we are studying, it is evident that the aim is to rank as an SEO company in Melbourne, so we will analyze this keyword and interpret the results.

We see that there are 3,060,000 results and that some of our competitors are SEO MELBOURNE and IMPRESSIVE. A more precise way of identifying the real competitors is to use the intitle:"Seo Melbourne" search operator, which restricts the results to sites that have that exact phrase in the title. Proof of this is that we go from 11,800,000 to 148,000 results.

Analyzing these data, we can classify the competition as high. We should repeat this same exercise for all our keywords to determine where to put our effort: keywords that are feasible to rank for and that will give us good results.
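One common heuristic for combining search volume and competition into a single score is the Keyword Efficiency Index (monthly searches squared, divided by the number of competing results); the sketch below applies it to illustrative numbers, not real search data:

```python
def kei(monthly_searches: int, competing_results: int) -> float:
    """Keyword Efficiency Index: higher means more demand per competing page."""
    if competing_results == 0:
        return float("inf")
    return monthly_searches ** 2 / competing_results

# Illustrative figures only; the search volumes are invented for the example.
keywords = {
    "seo melbourne": (4400, 148_000),
    "cheap seo company melbourne first page": (10, 5_000),
}
for kw, (searches, results) in keywords.items():
    print(f"{kw}: KEI = {kei(searches, results):.2f}")
```

A score like this is only a sorting aid: it helps rank candidate keywords by effort-to-reward, but relevance still has to be judged manually.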

3. Measure your presence on social networks

Being present on different social networks cannot be considered a direct SEO factor, but it will allow us to strengthen our brand image and give greater reach to our publications, and this, in the long run, does become an SEO factor and helps grow the authority of our domain.

To measure a site's presence on social networks, we visit its profiles and check whether they are active and whether time is spent promoting content or services on them.

Looking at the Growthworx Marketing Solutions website, we see that they have a presence on several social networks (Facebook, Instagram, Twitter, Vimeo, and Google+), but they have hardly any followers and there is little or no activity.

If we do not create engagement on social networks, having them is a waste of time.

In conclusion, social networks do not directly help rankings, but they allow us to reach a wider audience, and as a result our brand increases its visibility.

4. Measure the results

Once we have carried out the complete SEO analysis, we cannot stop there. We may have achieved very good positioning results, but those must pay off monetarily; we must turn that traffic into profitability.

The best tool to measure our actions and see where users leave our website is Google Analytics. To check whether our website has the tracking code correctly installed, we can use the Google Tag Assistant extension. In our case, we see that the Growthworx Marketing Solutions website has an Analytics code installed correctly.

Knowing how to analyze the results is essential; it is not only about getting users to reach your website, but also about giving them a good experience when they arrive, seeing where on the site they get stuck or cannot find what they want, correcting it, and in this way translating that traffic into profitability.
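A rough scripted version of this check is simply searching the page source for known tracking snippets, similar in spirit to what Tag Assistant verifies; the patterns and the UA-XXXX-1 id below are placeholders for illustration, not an exhaustive list:

```python
import re

def has_analytics(html: str) -> bool:
    """Rough check for a Google Analytics / gtag snippet in page source.
    The pattern list is illustrative, not exhaustive."""
    patterns = [
        r"www\.googletagmanager\.com/gtag/js",
        r"google-analytics\.com/analytics\.js",
        r"gtag\('config'",
        r"ga\('create'",
    ]
    return any(re.search(p, html) for p in patterns)

# Placeholder snippet with a dummy measurement id.
page = ('<script async '
        'src="https://www.googletagmanager.com/gtag/js?id=UA-XXXX-1"></script>')
print("tracking code found:", has_analytics(page))
```

This only confirms the snippet is present; whether the data actually arrives is best verified in the Analytics real-time reports.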



"No other Seo in Melbourne provides so good case study as SeoMoves."

— Shan Evan
Spartans Warrior Zone.

"SeoMoves Case Study was super. I was so impressed."

— Ayman Sadiq
10 Minutes School.

"Sam's Case Study helped me to understand SEO and their team helped our biz a lot."

— Danny Vorhauer
Assemco Outsourcing Solutions.

A Team That Helps You Succeed

A Team That Helps You Climb Up!

Get In Touch Now with Growthworx Marketing Solutions.

We would love to hear from you!

p +61 03 9751 7903