
Stop Chasing Links: How Artificial Intelligence Is Revolutionizing SEO

It has only been a couple of years since the almighty link ruled SEO. Links, after all, are the chains that hold the internet together; without them, there is no World Wide Web. The more links a website could gather pointing to its pages, the higher it would rank in the SERPs. For many years, this worked out well for businesses capable of building massive websites filled with all kinds of links pointing to internal pages. It did not take long, however, for the rest of the world to figure out that not all links were created equal. The entire SEO process was turned upside down, as Rand Fishkin noted a few months ago in his Whiteboard Friday, because those with less-than-honest intentions had spent years abusing the system by building useless, low-quality links.

The Link Business

Suddenly, links became big business. We could buy links, swap links, sell links and load our websites with internal links – all with stellar results. With search engines blindly ranking link-laden websites at the top of SERPs regardless of content, websites that had little relevance to their keywords could literally buy their way to the top. Something had to be done, and the people who designed the search engines knew it. Smarter algorithms were developed to stem the flow of link abuse and level the SEO playing field.

Leveling the Playing Field

When the search engines began ranking links differently based on the new algorithm, many people were caught off guard. Soon, however, the world began to realize that the change was actually good for everyone: legitimate businesses no longer had to compete with websites that had no relevance to search queries and no business appearing on the same results page. Link building suddenly made little sense as a standalone SEO technique and no longer ruled the internet like a tyrant. In fact, some incoming links could actually hurt a website, much like keyword stuffing and duplicate content.

Only Going after Links Can Hurt You

If you are still going after every link you can get, you may want to stop, take a deep breath and read the following closely. Thanks to recent changes, search engine optimization actually makes sense now, and that is the way it should be. Links are still an important part of a website’s SEO strategy, as long as they are used wisely. The real difference lies in what kind of links you use and how they work in conjunction with all other facets of SEO. A website is better served by focusing on quality traffic sources than by accumulating an abundance of low-quality links that can effectively damage its ranking. Below are examples of the kinds of links that can actually help attract quality, targeted traffic.

Reciprocal Links

Reciprocal links are links that you get from other websites by giving them a link in return. In the past, a link from any website would do, but now random link swapping can actually hurt your ranking. The trick is obvious enough: look for websites that have what you want, i.e. relevant content that is interesting, on topic and attracting plenty of traffic. The higher a website ranks, the better it is to swap links with it if you decide to go that route. Today’s search engines favor websites that link to similar websites, with only a few links to websites from other niches thrown in for good measure (and to show diversity).

Links from other websites leading to your website’s internal pages, and not just your homepage, are even more attractive to search engines. This is especially true if those internal-page links come from high-ranking websites that have been around for a long time. The older a high-ranking website is, the stronger the links it gives. Always avoid linking to a new website with poor content and low or nonexistent ranking.

The Authoritative Guest Link

Another way to obtain quality, targeted traffic is to find popular forums or blogs with topics related to your website’s niche, and leave comments or guest blog posts. Directories are particularly good for this, since they do not archive as quickly as a forum or a blog. However, there is one catch you must always be aware of: simply leaving your link at the bottom of a comment or blog post, along with your signature, is not enough. Write a small paragraph describing your expertise, explaining who you are and why you are qualified to comment or blog.

Anchor Text Linking

The algorithms of today’s search engines are much smarter than in the past. Text linking is no longer as strong as it used to be unless you make it stronger by using anchor text that is relevant and meaningful to the link it drives. Your website will only be as strong as its weakest link. With that in mind, any link you build should draw on the overall strength of the page it lives on and the strength of the page to which it points. Most people do not realize just how smart web crawlers have become; they actually read your content to see who you are and whether you are interesting and relevant, and that includes your anchor text. Not to be confused with exact-match anchor text building, your overall anchor story should revolve around your niche without becoming too repetitive. Keep it natural.

Social Linking as a Quality Traffic Resource

Facebook has really done the world a service by adding their new “Pages” function. It allows us to do so much more than we ever could with just a profile page, when it comes to getting our information to the world. With an astonishing half a billion users, it only makes sense to utilize the tools provided by Facebook to get your link to even a small percentage of them.

Twitter is also a great place to use a link to your advantage. Remember: it is not how often you tweet but the quality of the tweets and how they are used that counts. Always try to use much less than the maximum 140 characters allowed, since the link you place at the end of your tweet could be cut off in retweets unless you keep the whole message down to around 100 characters.

Internal Linking

If you are not taking advantage of your Frequently Asked Questions and About Us pages for internal links, you are missing out on a wonderful opportunity to impress search engines. Writing these pages in the same free style as the rest of your website and linking internally to your other pages is exactly what search engines are looking for. Luckily, making these pages interesting is not as hard as one might think.

Long Range Strategy

Links like these are necessary both to protect your current rankings and to obtain higher ones. Unfortunately, however, this is no longer enough. There are many aspects of SEO that now account for much of your website’s popularity. The days of link building and keyword stuffing are behind us. What is now necessary is a long-range search engine optimization strategy that builds upon the strength of not only your website, but also the websites in your niche that can help you along. Link building is about teaming up with people from your industry and earning their vote.

Keywords and Content

Having the right type and amount of links is just one aspect of SEO that the search engines now take into consideration when evaluating your website. In addition to links, they also assess keyword use. As with links, if you use too many of the wrong kind, they can hurt you. For this reason, it is very important to use keywords naturally. Knowing how many times you should use a keyword in relation to the total word count of a page is not as important as using keywords sparingly and creating interesting content that people want to read. It is far more important to have the right keyword in the right sentence than to stuff your page full of them.

Your Website’s True Potential

We are entering a new phase in SEO and page ranking that allows us to do more than just bring traffic to our websites. We can now reach out to specifically targeted users and attract visitors who want to find us and know more about what we have to offer. This is why we should strive for quality, targeted traffic rather than just links, assuming we want to fulfill the true potential of SEO. If you are still going after links regardless of where you find them, you are effectively damaging your website’s reputation and missing out on the real success that SEO can bring. And now allow me to suggest a different point of view on links.

Is It Possible That Google Is Heading Towards Artificial Intelligence?

Most of you are well aware of the change that is taking place in Google’s algorithm for assessing the relevance of various websites. Recently, an argument arose over whether Google had indeed revolutionized its algorithm or simply made standard periodic changes.

I’d like to include you in an interesting discussion I had several days ago with my immensely experienced colleagues, Emmanuel and Dynamic Search’s CEO Asher Elran. All of the ideas expressed below are purely hypotheses and by no means based on solid data. We gathered information, analyzed websites and came up with a claim which, we foresee, may cause quite a stir. We would appreciate your input regarding our theory, and if any of you care to dispute it, kindly explain why.

According to the model most widely used until now, the mere existence of incoming links, together with their volume and quality, served as evidence of the public’s interest in a website and formed part of its relevancy score. However, excessive artificial “link building” abused this model, prompting Google to invest in frequent, semi-daily measures to reassess the value of such links (to zero or near zero). I write semi-daily to emphasize that these measurements were not one-off, but recurring.

At the time, my colleague Branko Rihtman published interesting research about this phenomenon on his blog. In short, he wrote that Google assigns a temporary value to new links, and that it then used to take three to four weeks for the effect of all changes on the linked page to be calculated, at which point the temporary link score became a permanent one. Emmanuel claims that given the arbitrary nature of temporary link scores, Google’s tendency was to score generously and favorably, which would explain the quick rise of newer pages to the top of the SERPs and their rapid decline within three to four weeks.
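To make that hypothesis concrete, here is a minimal sketch of the scoring behavior described above. It is purely illustrative, not Google’s actual mechanism, and the grace period and score values are assumptions chosen for the example.

```python
# Toy simulation of the hypothesized temporary-then-permanent link scoring.
# All constants are assumptions made for illustration.

GRACE_PERIOD_DAYS = 28      # the three-to-four-week window mentioned above
TEMPORARY_SCORE = 0.8       # deliberately generous provisional value

def link_score(link_age_days, measured_quality):
    """Return the optimistic provisional score while the link is new,
    and the measured quality once the link has actually been evaluated."""
    if link_age_days < GRACE_PERIOD_DAYS:
        return TEMPORARY_SCORE
    return measured_quality

# A low-quality link looks strong at first, then drops once evaluated:
print(link_score(link_age_days=5, measured_quality=0.1))   # 0.8 -> page rises in the SERPs
print(link_score(link_age_days=40, measured_quality=0.1))  # 0.1 -> page falls back
```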

Nowadays, the situation is different: Google’s bots can scan and react to changes much more quickly. Think of any entirely new subject or keyword, such as “Gangnam Style” or “Harlem Shake”; these keywords did not exist before, yet they spread among internet users in a number of ways. It is likely, then, that Google’s “Penguin” algorithm worked in conjunction with “Panda” by way of artificial intelligence in order to learn as much as possible about the new subject and deliver relevant results. This artificial intelligence is likely embodied in the algorithm’s ability to research and learn new and old topics independently, and to provide estimated relevancy scores in the shortest possible time. Effectively, this means that the search engine that manages to ‘learn’ the quickest, and deliver the fastest and most relevant results, will prevail.

Under the new model, emphasis is placed on the satisfaction of Google’s clients: basically all internet users, even though not all of them use Google’s search engine. In the past three years, Google has developed tools to measure the behavior, preferences and satisfaction level of website visitors, even where information about each user in a given population is insufficient or incomplete. This information is accumulated and then analyzed in order to determine, by way of artificial intelligence, the parameters that should be used to calculate the relevancy score for each keyword separately, and perhaps for each type of website within the search results for a given phrase.

No longer is there a single algorithm riddled with assumptions inserted by its builders (for example, the assumption that the optimal keyword density is x%, so that the highest score on this criterion goes to pages with x% keyword density while higher or lower densities receive lower scores). The new model uses artificial intelligence: different algorithms for each search phrase, each able to independently learn the preferences of the visitor population. They study reality instead of approximating it.
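To illustrate the contrast, here is a minimal sketch under heavy assumptions: a fixed keyword-density heuristic of the kind described above, next to a tiny per-query model that learns from observed user satisfaction. The function names, feature names, learning rule and the 2% “optimal” density are hypothetical and exist only for this example.

```python
# Old-style heuristic vs. a toy per-query learned model (illustrative only).

def fixed_density_score(keyword_count, total_words, optimal_density=0.02):
    """Hard-coded assumption: pages score highest at an assumed 'optimal' keyword density."""
    if total_words == 0:
        return 0.0
    density = keyword_count / total_words
    # Score falls off the further the page drifts from the assumed optimum.
    return max(0.0, 1.0 - abs(density - optimal_density) / optimal_density)

class LearnedQueryModel:
    """One small model per search phrase, nudged by observed user signals
    (clicks, dwell time, etc.) rather than by assumptions baked in up front."""

    def __init__(self):
        self.weights = {}  # feature name -> learned weight

    def score(self, features):
        return sum(self.weights.get(name, 0.0) * value for name, value in features.items())

    def update(self, features, satisfied, lr=0.1):
        """Move weights toward the features of pages that satisfied users (simple delta rule)."""
        target = 1.0 if satisfied else 0.0
        error = target - self.score(features)
        for name, value in features.items():
            self.weights[name] = self.weights.get(name, 0.0) + lr * error * value

# Example: the model for a transactional query learns that a visible phone number matters.
model = LearnedQueryModel()
for _ in range(50):
    model.update({"has_phone_number": 1.0, "long_article": 0.0}, satisfied=True)
    model.update({"has_phone_number": 0.0, "long_article": 1.0}, satisfied=False)
print(model.score({"has_phone_number": 1.0, "long_article": 0.0}))  # approaches 1.0
```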

The above description may sound fundamental, but it is in fact abstract. This is also the main difficulty SEO professionals face in adapting to the new model: they no longer have tools to know what the algorithm may have learned, whereas in the past they were able to guess, more or less, what the algorithm’s builders had assumed.

An example of the change we are facing lies in the constant reference to a single algorithm. It is unlikely that there is only one algorithm, complete with assumptions inserted by its builders about the characteristics of a linked page or a linking page and how they represent the desires of visitors. Instead, it is likely that there are numerous algorithms, one for each search phrase, that are able to independently learn the preferences of the searching population. To illustrate, think of the phrase “Hamburger delivery”. When it is typed into a search window, the user’s purpose is most likely to obtain the telephone number of a delivery service. On average, he will only linger on the site for a few seconds (the time it takes to dial a telephone number), leaving the website with a grim 100% bounce rate. According to all known parameters, Google ought to conclude that such a website is irrelevant for the search term, when in fact the user got exactly what he wanted. We believe that, with the new artificial-intelligence-based algorithm, Google will learn user behavior for this phrase and other similar phrases, and conclude that users in many different places, speaking many different languages, act the same way in this situation. This, therefore, is acceptable, foreseeable behavior.
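As a minimal sketch of that idea, assuming dwell time is one of the behavioral signals involved, short visits could be judged against a baseline learned for each query rather than against one global bounce-rate threshold. The queries, numbers and helper function below are hypothetical.

```python
# Illustrative only: per-query behaviour baselines instead of a global bounce-rate rule.
from statistics import mean

# Hypothetical observed dwell times (seconds) for clicks on results for each query.
observed_dwell = {
    "hamburger delivery": [6, 8, 5, 7, 9],             # users grab a phone number and leave
    "history of the hamburger": [180, 240, 90, 300],   # users stay and read
}

def visit_looks_satisfied(query, dwell_seconds, tolerance=0.5):
    """A visit counts as normal for this query if it is within tolerance of the query's own baseline."""
    baseline = mean(observed_dwell[query])
    return dwell_seconds >= baseline * tolerance

print(visit_looks_satisfied("hamburger delivery", 7))        # True: normal for this query
print(visit_looks_satisfied("history of the hamburger", 7))  # False: would signal dissatisfaction
```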

The task at hand, then, is to bring about customer satisfaction. Links have no value unless they serve this task. If we are confident that our website fulfills a need or desire among visitors, but content on another website would also be useful to them and increase their satisfaction, then it is only natural to place a link to that content.

Thus far I have described what ought to be done under the new algorithm. Naturally, if we are the only ones who do it, we will see higher relevancy scores than our competitors and rise in the SERP rankings. However, what if 30 or even 300 competing websites also choose to invest in superior, useful content, encouraging other website owners to link to them?

Under the old model, the top score was achieved by whoever reached the highest sum when multiplying the number of links by their quality. This meant that lower-quality links could be offset simply by increasing their number. Under the new model, the sole criterion is quality, and the competition will be over the level of quality necessary in each niche to surpass the competitors. Website owners will have to compete over who can hire the most talented (and most expensive!) content creator. I doubt we will still hear complaints such as “how can such a terrible website be ranked at number 1?”
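A toy comparison makes the difference plain; the scoring functions and numbers below are hypothetical, chosen only to illustrate the volume-versus-quality trade-off described above.

```python
# Illustrative only: how volume could offset quality in the old model but not in the new one.

site_a_links = [0.25] * 40   # forty low-quality links
site_b_links = [0.75] * 6    # six high-quality links

def old_model_score(links):
    """Old model: quality and volume trade off, so the total sum decides."""
    return sum(links)

def new_model_score(links):
    """Hypothesized new model: only quality matters, piling on more links adds nothing."""
    return max(links) if links else 0.0

print(old_model_score(site_a_links), old_model_score(site_b_links))  # 10.0 vs 4.5  -> site A wins
print(new_model_score(site_a_links), new_model_score(site_b_links))  # 0.25 vs 0.75 -> site B wins
```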

Author Bio: Asher Elran is a practical software engineer and the founder of Dynamic Search™, enthusiastic about all things involving creative marketing, CRO, SEM, and killer content. Follow me on Twitter at @DynamicSearch.
