
The Basics of Search Engine Optimization

“The beauty of SEO is that, instead of pushing a marketing message onto folks who don’t want to hear what you have to say, you can reverse-engineer the process to discover exactly what people are looking for, create the right content for it, and appear before them at exactly the moment they are looking for it. It’s pull vs. push.”

By Cyrus Shepard.

Search engine optimization (SEO) is the process of adjusting the structure and content of a web page to improve the site's visibility in the results of the different search engines.

People who perform search engine optimization tasks are often called SEO specialists or web positioners.


Some History

Website administrators and content providers began optimizing sites for search engines in the mid-1990s, as the engines started cataloging the early web.

In the beginning, all a webmaster had to do was submit the address of a page to the various engines, which would send a web spider to inspect the site, extract links to other pages, and return the collected information to be indexed.

The process involves a web spider belonging to the search engine, which downloads a page and stores it on the company's servers, where a second program, known as an indexer, extracts information about the page: the words it contains and where they appear, the relevance of specific words, and all the links the page holds, which are stored to be crawled later by the spider.
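
To make that crawl-and-index loop concrete, here is a minimal sketch in Python. It is an illustration only, built on the standard library with a placeholder seed URL; real crawlers add politeness rules, queues, deduplication, and persistent storage.

```python
# Minimal sketch of the spider/indexer loop described above.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageIndexer(HTMLParser):
    """Collects the words and outgoing links an indexer would store."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.words = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

    def handle_data(self, data):
        self.words.extend(data.split())

def crawl(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    indexer = PageIndexer(url)
    indexer.feed(html)
    return indexer.words, indexer.links  # links feed the next crawl round

# Placeholder URL; any reachable page works.
words, links = crawl("https://example.com/")
```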

Website owners began to recognize the value of having their pages well positioned and visible in search engines, which created an opportunity for practitioners of both white hat and black hat SEO techniques. According to the analysis of the expert Danny Sullivan, the term search engine optimization came into use in August 1997, coined by John Audette and his company, Multimedia Marketing Group, and documented on a page of the company's website.

The first versions of the search algorithms relied on information provided by webmasters themselves, such as the keywords in meta tags or index files submitted to engines. Meta tags offer a guide to the content of each page. Using metadata to index a page proved imprecise, however, because the words a webmaster put in the meta tags could misrepresent the actual content of the page.
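
As a hedged illustration of what those early engines read, this Python snippet pulls name/content meta tags out of a page's HTML; the sample markup is invented for the example:

```python
# Sketch: reading the meta tags early engines relied on.
from html.parser import HTMLParser

class MetaTagReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"]] = d["content"]

# Made-up sample page, not a real site.
sample = """<html><head>
<meta name="keywords" content="seo, search, optimization">
<meta name="description" content="An introduction to SEO basics.">
</head><body>...</body></html>"""

reader = MetaTagReader()
reader.feed(sample)
print(reader.meta)  # {'keywords': ..., 'description': ...}
```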

Because factors such as keyword density depended entirely on the webmaster, the first search engines suffered abuse and ranking manipulation. To serve their users better, search engines had to adapt so that their results pages showed the most relevant pages rather than unrelated ones. Given that the success and popularity of a search engine hinge on its ability to produce the most relevant results for any query, tolerating false results would push users toward other engines. Search engines responded by developing more complex ranking algorithms that took additional factors into account, making them harder for webmasters to manipulate.

Larry Page and Sergey Brin, graduate students at Stanford University, developed Backrub, a search engine based on a mathematical algorithm that rated the relevance of web pages. PageRank was the name of the number the algorithm computed, a function of the number and strength of incoming links. PageRank estimates the probability that a web page will be reached by a user who navigates the web at random, following links from one page to another. In effect, some links count more than others: a page with a higher PageRank is more likely to be visited by the random surfer. Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design, reportedly a byproduct of the founders not knowing HTML and limiting themselves to a search box and the company logo.
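
The random-surfer idea can be sketched in a few lines of Python. The toy three-page link graph, the damping factor of 0.85, and the fixed iteration count are assumptions for illustration, not Google's implementation:

```python
# Toy PageRank by power iteration over a made-up link graph.
# damping=0.85 models the surfer occasionally jumping to a random page.
def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_ranks = {}
        for page in graph:
            # Share of rank received from every page that links here.
            incoming = sum(
                ranks[p] / len(links)
                for p, links in graph.items()
                if page in links
            )
            new_ranks[page] = (1 - damping) / n + damping * incoming
        ranks = new_ranks
    return ranks

# Hypothetical three-page web: A and B link to each other and to C.
toy_web = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A"]}
print(pagerank(toy_web))
```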

Factors external to a page were considered alongside internal ones, which let Google avoid the kind of manipulation seen in search engines that ranked pages using on-page factors alone.

In 2000, Google launched the Google Toolbar, a browser toolbar that, among other things, displayed a public PageRank metric. The toolbar PageRank runs from 0 to 10, with 10 the maximum, a score very few websites reach. The public PageRank was updated periodically until December 2013, when its last update to date took place.

Although PageRank was harder to manipulate, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and those methods proved effective against PageRank as well. Many sites focused on exchanging, buying, and selling links, often on a large scale. Some of these schemes, known as link farms, involved creating thousands of sites for the sole purpose of generating spam links.

By 2004, search engines had incorporated a large number of undisclosed factors into their ranking algorithms to reduce the impact of link manipulation. In June 2007, Saul Hansell of The New York Times reported that search engines were using more than 200 factors. The main search engines, Google, Bing, and Yahoo, do not publish the algorithms they use to rank web pages. Some SEO practitioners have studied different approaches to search engine optimization and shared their findings. Patents related to search engines can also provide information for understanding them better.

In 2005, Google began to customize search results for each user: drawing on their previous search history, Google offered personalized results to signed-in users.

In 2007, Google announced a campaign against paid links that pass PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through the nofollow attribute on links. Matt Cutts, a well-known Google software engineer, announced that Googlebot would treat nofollow links differently, with the aim of preventing SEO service providers from using nofollow for PageRank sculpting.

The result was that using nofollow caused PageRank to evaporate. To work around this, some SEO engineers developed alternative techniques that replace nofollow tags with obfuscated JavaScript, allowing PageRank sculpting to continue. Other proposed workarounds included the use of iframes, Flash, and JavaScript.
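
As a sketch of what treating nofollow links differently means in practice, this snippet separates links a crawler would follow from those marked rel="nofollow"; the sample markup is invented:

```python
# Sketch: splitting followed links from rel="nofollow" links,
# in the spirit of what Googlebot began doing.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href")
        if not href:
            return
        if "nofollow" in (d.get("rel") or "").lower():
            self.nofollowed.append(href)  # passes no PageRank
        else:
            self.followed.append(href)

c = LinkClassifier()
c.feed('<a href="/a">A</a> <a rel="nofollow" href="/ad">Ad</a>')
print(c.followed, c.nofollowed)  # ['/a'] ['/ad']
```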

In December 2009, Google announced that it would use the search history of all users to shape search results.

Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more relevant and timely. Historically, webmasters had spent months or even years optimizing a website to improve its position. With the rising popularity of social networks and blogs, the leading engines changed their algorithms to let fresh content rank quickly in the results. In February 2011, Google announced the "Panda" update, which penalizes websites containing content duplicated from other sites and sources. Websites had long copied content from one another and benefited in the rankings by doing so, but Google implemented a new system that penalizes sites whose content is not unique.


In April 2012, Google announced the update “Penguin” whose goal was to penalize those sites that used manipulative techniques to improve their rankings.

In September 2013, Google announced the Hummingbird update, a change in the algorithm designed to improve Google’s natural language processing and semantic comprehension of web pages.

Organic positioning

Organic positioning is what a website earns spontaneously, without an advertising campaign behind it. It rests on the indexing performed for the search engines by applications called web spiders. During this indexing, the spiders crawl web pages and store the relevant keywords in a database.

The webmaster's interest lies in optimizing the site's structure and content, and in applying link building, link baiting, or viral content techniques that raise the site's visibility as mentions accumulate. The goal is to appear as high as possible in the organic search results for one or more chosen keywords.

The optimization is done in two ways:

On-page SEO: improvements to the content and technical improvements to the code.

Off-page SEO: seeks to improve the site's reputation through references to it, achieved mainly through natural links and social media.

Search engines usually show organic or natural results in one area, alongside paid results. Ranking in those paid areas requires paying for services such as Google AdWords or Microsoft adCenter, and is known as search engine marketing (SEM).

The AdWords service can be purchased per impression or per click.

A results page makes the difference between organic and paid positioning easy to see: it contains both the results Google returns naturally and those it shows as paid AdWords ads, the latter falling under SEM (Search Engine Marketing).

Techniques to improve positioning

The work involves changes to programming, design, and content that follow the guidelines search engines publish as good practices.


It is divided into internal and external positioning:

Internal positioning

These are improvements the web developer can apply to the site itself, in terms of content, appearance, accessibility, and other factors.

Responsive web design: since an algorithm change in April 2015, Google penalizes websites that are not adapted to mobile devices with a considerable drop in SERP position.

Making the site easy for search engine spiders to crawl is the first step. Search engine crawlers must be able to access the website in order to process it and show it in results. For that reason, the crawl budget directly influences positioning: the more frequently a site is crawled, and the more of its pages, the better its positioning.
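
One concrete piece of crawlability is robots.txt: before spending crawl budget on a page, a spider checks whether it may fetch it. A minimal sketch using Python's standard urllib.robotparser, with placeholder URLs:

```python
# Checking whether a spider may crawl a URL, per the site's robots.txt.
# example.com is a placeholder domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

if rp.can_fetch("Googlebot", "https://example.com/some-page"):
    print("Crawlable: this page can consume crawl budget.")
else:
    print("Blocked: the spider should skip this page.")
```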

Create quality content. Since 2015, Google has given growing weight to what it calls the user's web experience, which it can measure statistically once a site has been indexed. User experience relates above all to mobile adaptability, the content mentioned above, usability, and page load speed, among other factors.

Structuring and designing a web page with positioning in mind means paying attention to functionality, ease of access, and capturing the user's attention.

Create unique titles and relevant descriptions for the content of each page. Each page is a business card for the search engine. Titles and descriptions are the starting points from which search engines identify the relevant terms across the site.
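
A small sketch of the unique-titles advice: given a mapping of a site's pages to their titles (invented here for the example), flag any title reused across URLs:

```python
# Sketch: detecting duplicate <title> values across a site's pages.
from collections import defaultdict

# Hypothetical page-to-title mapping.
pages = {
    "/": "Acme Widgets | Home",
    "/pricing": "Acme Widgets | Pricing",
    "/about": "Acme Widgets | Home",  # duplicate: should be unique
}

titles = defaultdict(list)
for url, title in pages.items():
    titles[title].append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r} on: {', '.join(urls)}")
```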

Make the web as accessible as possible: limit content built in Flash, frames, or JavaScript. Such content keeps the robot from crawling the information on the various pages and sections; to a crawler, they are a flat space that cannot be navigated.

Internally link the pages of the site in an orderly and clear way. A sitemap in the code lets crawlers move through the site's sections in an orderly manner, improving its visibility. RSS files can also serve as sitemaps.
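
A minimal sketch of the sitemap idea: generating a standard sitemap.xml from a list of placeholder URLs with Python's built-in XML tools:

```python
# Sketch: writing a standard sitemap.xml for a few placeholder URLs.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urls = ["https://example.com/",
        "https://example.com/services",
        "https://example.com/contact"]

urlset = ET.Element("urlset", xmlns=NS)
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```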

Improve the user experience through design improvements and a lower bounce rate.

External positioning

These are techniques used to improve the site's visibility in online media. As a general rule, the aim is to earn online mentions in the form of links pointing to the website being optimized.

Get related websites to link to yours. To do this, it helps to search for the terms you expect to bring traffic to your site and study which of the results have complementary content. If, for example, you want to rank for the term "hairdressing madrid", it can be worthwhile to seek backlinks from hairdressers in other cities.

There are now hundreds of social networks, for example Hi5, Facebook, and Orkut, in which to participate and gain visits from new "friends". For Google, the social network with the greatest impact on SEO is Google+, which has overtaken Twitter and Facebook in importance.

Register in important directories like DMOZ and Yahoo!. Directories have lost much of their weight with search engines, but they remain a good starting point for earning links or getting your site crawled for the first time. Both require human review for inclusion, which ensures the quality of the added sites but also slows and complicates the process.

Register and participate in forums, preferably thematic forums related to your site's activity. Frequent participation must come with real, valuable contributions to be regarded as a qualified user; the key to gaining visits and improving positioning is the link to your website placed in your signature.

Write articles for other websites. Articles are a very powerful way to improve positioning and attract visitors. Where you can, write articles for a course, daily tips, or pieces on the usefulness of your site's product.

Traits of the search algorithms

  • Public features: those openly declared by the administrators of the algorithm. For example, we know that Google's algorithm has certain technical aspects that penalize certain practices by web administrators or content writers.
  • Private features: those kept secret by the creators or administrators of the algorithm, so that no one can see it in its entirety and derive a fraudulent positioning strategy from it.
  • Suspected features: secret traits discerned through the practice of the activity itself. They are not official, but in optimizing a page for search engines certain characteristics of the algorithms become apparent; for example, it has been found that, despite everything the creators say, what matters most to search algorithms is the creation of relevant, fresh, and useful content.

Link building

Link building is an SEO technique that consists of getting other web pages to link to the page you want search engines to consider relevant and rank higher. The links may be earned naturally, when other sites link without prior agreement, or artificially, when links are made to look as though they were earned naturally.

It rests on the idea that the number of inbound links a page receives is one of the factors weighed in ranking it, a concept that goes back to incoming links being one of the factors evaluated by Google's PageRank in 1999.

The advantages are:

  • The ability to measure demand, that is, how many people are searching for a given keyword
  • Effectiveness of the positioning
  • Positioning of the brand

Techniques

Directory registration: consists of registering the website in different directories, whether general or thematic. It means submitting the links to relevant directories and choosing the category that best fits the page. Since 2013, Google no longer takes directories into account, which has neutralized this strategy.

Article directories: consists of writing articles and publishing them in directories that, in exchange for the content, allow you to include links back to a website.

Bookmarking: saving the content you want to rank on the various social bookmarking sites.

Link baiting: one of the techniques most valued by search engines but one of the hardest to achieve, since an article only attracts hundreds of links if it genuinely adds value.

Link exchange: a good way to get links and one of the first to come into use. There are many kinds of exchanges and services.

Link buying: more effective than exchanging links but also more expensive. Under Google's official policy, this way of obtaining links is penalized.
