What is SEO?
The first search engines emerged in the early 1990s. Website owners soon realized they could make real money online, and that to do so they needed to attract traffic. What was the best source of traffic? Search engines. So owners began to think about how they could reach the top positions … SEO was born!
But let's get to what matters, and the reason (I think) you are reading this.
Search engine optimization, often known by its English acronym SEO (Search Engine Optimization), is the process of improving the visibility of a website in the organic results of the various search engines. SEO is one of the disciplines that has changed the most in recent years. We only have to look at the large number of updates to Penguin and Panda, and how these have turned what was understood as SEO until recently around 180 degrees. Now SEO pursues what Matt Cutts himself calls "Search Experience Optimization", or in other words, "everything for the user".
Although a search engine relies on thousands of factors to rank one page above another, it could be said that there are two basic ones: authority and relevance.
Authority is basically the popularity of a website: the more popular it is, the more valuable the information it contains is considered to be. Search engines give this factor the most weight because it is based on users' own experience; the more a piece of content is shared, the more users have found it useful. Relevance is the relationship a page has to a given search. It is not simply that a page contains the search term many times (in the beginning it worked this way); rather, search engines rely on hundreds of on-site factors to determine it.
SEO can be divided into two main groups:
1. On page SEO
2. Off page SEO
On page SEO:
On-page SEO is concerned with relevance. It ensures that the site is optimized so that the search engine understands the main thing: its content. On-page SEO includes keyword optimization, loading time, user experience, code optimization, and the format of URLs.
Off page SEO:
Off-page SEO is the part of SEO work that focuses on factors external to the page we work on. The most important factors in off-page SEO are the number and quality of links, presence on social networks, mentions in local media, brand authority, and performance in the search results themselves, that is, the CTR our results earn in a search engine. Surely you are thinking that all this is very good and very interesting, but that you are really here to understand why you need SEO on your website and what benefits you will get if you integrate it into your online strategy.
Once we know what SEO is, we must distinguish whether or not we follow the search engine's "recommendations": Black Hat SEO or White Hat SEO.
Black Hat SEO: Black hat is the attempt to improve a web page's search engine ranking using techniques that are unethical or that contradict the search engine's guidelines. Some examples of Black Hat SEO are Cloaking, Spinning, SPAM in forums and blog comments, or Keyword Stuffing.
Black hat can provide benefits in the short term, but it is generally a risky strategy with no continuity in the long run, and it adds no value.
White Hat SEO: Consists of all those ethically correct actions that comply with the search engines' guidelines to position a web page in the search results. Since search engines give more importance to the pages that best respond to a user's search, White Hat comprises the techniques that seek to make a page more relevant for search engines by adding value for its users.
Why is SEO important?
The most important reason SEO matters is that it makes your website more useful for both users and search engines, which still cannot see a web page the way a human does. SEO is necessary to help search engines understand what each page is about and whether or not it is useful to users.
Let's look at an example to see things more clearly:
Suppose we have an e-commerce site dedicated to the sale of children's books. For the term «coloring pages» there are some 673,000 searches per month. Assuming that the first result that appears after a Google search gets 22% of the clicks (CTR = 22%), we would get about 148,000 visits per month.
Now, how much are those 148,000 visits worth? Well, if the average cost per click for that term is €0.20, we are talking about more than €29,000 per month. And this is only in Spain; if we have a business oriented to several countries, some 1.4 billion searches are carried out worldwide every hour. Of those searches, 70% of clicks go to organic results, and 75% of users never reach the second page. Taking all this into account, we see that there are a great many clicks per month at stake for the first result.
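The back-of-the-envelope calculation above can be sketched in a few lines of Python (all figures are the hypothetical ones from the example, not real market data):

```python
# Hypothetical figures from the example: monthly search volume for
# «coloring pages», first-position CTR, and average cost per click.
monthly_searches = 673_000
ctr_first_result = 0.22   # assumed CTR of the #1 organic result
avg_cpc_eur = 0.20        # assumed average cost per click

monthly_visits = monthly_searches * ctr_first_result
monthly_value_eur = monthly_visits * avg_cpc_eur

print(f"Estimated monthly visits: {monthly_visits:,.0f}")
print(f"Estimated monthly value: €{monthly_value_eur:,.0f}")
```

Plugging in the numbers gives roughly 148,000 visits worth around €29,600 per month, in line with the figures above.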
SEO is the best way for users to find you through searches in which your website is relevant. These users are looking for exactly what you offer them, and the best way to reach them is through a search engine.
How do search engines work?
The operation of a search engine can be summarized in two steps: crawling and indexing.
A search engine crawls the web with what are called bots. These travel through pages by following links, hence the importance of a good link structure. Just as any user would when browsing the content of the web, the bots go from one link to another and collect data about those pages, which they send back to their servers.
The crawl process begins with a list of web addresses from past crawls and sitemaps provided by other web pages. Once they access these websites, the bots look for links to other pages to go to them. Bots are especially interested in new sites and changes to existing websites.
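The link-following behavior described above can be sketched as a breadth-first traversal. This toy version uses a made-up in-memory "web" instead of real HTTP requests, and all URLs are purely illustrative:

```python
from collections import deque

# A toy "web": each URL maps to the links found on that page
# (all URLs here are invented for illustration).
toy_web = {
    "https://example.com/": ["https://example.com/books", "https://example.com/about"],
    "https://example.com/books": ["https://example.com/books/coloring"],
    "https://example.com/about": [],
    "https://example.com/books/coloring": ["https://example.com/"],
}

def crawl(seed):
    """Breadth-first crawl: start from a seed URL and follow links,
    visiting each page exactly once, as a search-engine bot does."""
    seen = {seed}
    queue = deque([seed])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)          # "collect data" about this page
        for link in toy_web.get(url, []):
            if link not in seen:   # only queue pages not yet visited
                seen.add(link)
                queue.append(link)
    return order

print(crawl("https://example.com/"))
```

A real bot would fetch each URL over HTTP and extract links from the HTML, but the traversal logic is essentially this: a queue of discovered pages and a set of pages already seen.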
It is the bots themselves that decide which pages to visit, how often and how long they will crawl that website, so it is important to have an optimal loading time and updated content.
It is very common that on a web page it is necessary to restrict the crawling of some pages or certain content to prevent them from appearing in search results. For this, search engine bots can be told not to crawl certain pages through the “robots.txt” file.
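For instance, a minimal robots.txt might look like this (the disallowed paths and the sitemap URL are purely illustrative):

```
# Hypothetical robots.txt: paths and sitemap URL are examples only
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```

Here all bots (`User-agent: *`) are asked not to crawl the two listed paths, while the `Sitemap` line points them to the pages we do want crawled.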
Once a bot has crawled a website and collected the necessary information, these pages are included in an index. There they are arranged according to their content, their authority and their relevance. In this way, when we make a query to the search engine it will be much easier to show us the results that are more related to our query.
Search engines were initially based on the number of times a word was repeated. When a query was made, they looked those terms up in their index to find which pages contained them in their texts, ranking highest the page that repeated them the most. Today they are far more sophisticated and base their indexes on hundreds of different factors: publication date; whether the page contains images, videos or animations; microformats; and so on. Above all, they now give priority to the quality of the content.
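That early, purely count-based ranking can be sketched in a few lines (the page ids and their texts are invented for illustration):

```python
# Naive early-search-engine ranking: count raw occurrences of the
# query term in each page's text (documents are made-up examples).
pages = {
    "page-a": "coloring pages for kids, free coloring pages to print",
    "page-b": "children's books and bedtime stories",
    "page-c": "the best coloring pages collection for children",
}

def rank_by_term_count(query, docs):
    """Return page ids sorted by how often the query term appears."""
    counts = {pid: text.lower().count(query.lower())
              for pid, text in docs.items()}
    return sorted(counts, key=counts.get, reverse=True)

print(rank_by_term_count("coloring pages", pages))
```

The page that repeats the term most wins, which is exactly why keyword stuffing worked back then and why modern engines moved to hundreds of signals instead.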
Once the pages are crawled and indexed, it is the algorithm's turn to work: algorithms are the computer processes that decide which pages appear earlier or later in the search results. After a search, the algorithms check the indexes. This way they determine which pages are the most relevant, taking into account the hundreds of ranking factors. And all of this happens in a matter of milliseconds.