Content IS king after all!!

Imagine you run a search with your favorite search engine for "keyword phrase 1" and you visit the first result (1st URL), but you return immediately to the search results; in search engine terms, you bounce back. You then click on the 2nd URL and never return to the search results. Let's say the exact same thing happens with 1000 other users searching for "keyword phrase 1". What should that search engine conclude? The 2nd URL is a better fit for the 1st position for "keyword phrase 1". The 1st URL might not even belong on the first page of the results for that search!!

This is a very simple scenario, but it illustrates the concept. Imagine you are running a search engine and you want to keep your customers (searchers) happy. That would mean giving them the most relevant results as high in the search results as possible, so they can click them as soon as possible. Wouldn't you want to utilize the past customer satisfaction of each particular search in order to improve subsequent searches? I would. If I could utilize the searchers' behavior, I would be able to improve my search results over time. That is what the major search engines do, or at least ought to do.

Google uses this concept at least in their personalized search results, which are available to any Google user who has a Gmail, AdSense or AdWords account and specifically enables them through the Google Search History service. With this service, Google effectively rearranges the search results to match your preferences and past behavior.

Neither the concept nor the news that search engines might be using this is new to us. It has been discussed quite extensively in webmaster forums among search engine optimization professionals. Click-through rates are usually discussed as part of the same subject. Click-through rate is the percentage of times a listing (a URL in the search results) gets clicked out of the number of times it appears.
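To put that definition in code, here is a minimal sketch; the function name and the numbers are made up for illustration:

```javascript
// Click-through rate as defined above: clicks divided by impressions,
// expressed as a percentage.
function clickThroughRate(clicks, impressions) {
  if (impressions === 0) return 0;       // avoid division by zero
  return (clicks / impressions) * 100;   // percentage of appearances clicked
}

// A listing shown 1000 times and clicked 50 times has a 5% CTR.
console.log(clickThroughRate(50, 1000)); // → 5
```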

How much does each one affect the position of each URL?

Lately, Google's search results (at least) don't seem to be steady. I see domains jump 10, 20 or even more positions up or down in the course of minutes or hours. High search engine positions are not the privilege of a few established domains anymore, it seems. I suspect it could be the effect of click-through and bounce rates being taken into account.

Google is recording click-throughs

At the very least, Google is recording the click-through rates of the search results. Do any search on Google and dig into the source code. Each search result link gets a mousedown event that calls a JavaScript function "clk", passing it the "href" (the URL) and a number corresponding to the URL's position in the results. The first result gets a mousedown event like this:

onmousedown="return clk(this.href,'','','res','1','')"

and the 99th result gets this mousedown event:

onmousedown="return clk(this.href,'','','res','99','')"

When the user clicks on the link, the clk function issues a request to a complex tracking URL, which, as I tested, gets called simultaneously with the HTTP request for the search result itself. Of course, when JavaScript is disabled on the client's computer no JavaScript can run, but according to web usage statistics, 95% or more of users have JavaScript enabled.
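To illustrate the mechanism, here is a rough sketch of how a search page could log clicks this way. The "/log_click" endpoint and its parameter names are my own invention for the example, not Google's actual tracking URL:

```javascript
// Build a logging URL carrying the clicked href and its result position,
// roughly what the "clk" handler appears to do on mousedown.
// "/log_click" is a hypothetical endpoint, not Google's real one.
function buildClickLogUrl(href, position) {
  return '/log_click?url=' + encodeURIComponent(href) + '&pos=' + position;
}

// In a browser one would attach it roughly like this:
// link.onmousedown = function () {
//   new Image().src = buildClickLogUrl(this.href, 1); // fire-and-forget beacon
// };

console.log(buildClickLogUrl('http://example.com/', 1));
// → /log_click?url=http%3A%2F%2Fexample.com%2F&pos=1
```

The image-beacon trick lets the logging request fire in parallel with the navigation to the result, which matches the simultaneous requests I observed.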

How about bounce rate?

I also looked for cookies set in order to record the time a searcher stayed at the visited URL, but did not find any (maybe I did not look hard enough). Thinking about it, the exact time is not really needed in order to form an opinion about a URL visit. A subsequent visit to another URL from the same search results would mean that the first URL was not satisfactory enough. The time between subsequent URL visits can also be an indicator of the level of satisfaction.

Let's make our original example a bit more detailed to make this point clearer. Assume the example user above clicks on the 1st URL at 12:00PM, then clicks on the 2nd URL at 12:01PM, then on the 3rd URL at 12:10PM and on the 4th URL at 12:11PM. By itself this might not mean anything, but if similar behavior is recorded by, let's say, 1000 different (unique) users, one can see that the 2nd URL kept the visitors longer, which probably means it was more relevant to the "keyword phrase 1" search. Again, this is a very simple example, but it serves as an illustration.
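The inference in that example can be sketched in a few lines. The data mirrors the timestamps above, and the rule (the gap to the next click approximates the time spent on the previous result) is of course my assumption, not a known Google formula:

```javascript
// Infer "time spent" per result from a user's click timestamps alone:
// the gap between clicking result N and result N+1 approximates the
// time spent on result N. Times are minutes past 12:00PM.
function dwellTimes(clicks) {
  const out = {};
  for (let i = 0; i < clicks.length - 1; i++) {
    out[clicks[i].url] = clicks[i + 1].minute - clicks[i].minute;
  }
  // The last click has no follow-up, so its dwell time is unknown.
  return out;
}

const session = [
  { url: 'url1', minute: 0 },   // clicked at 12:00PM
  { url: 'url2', minute: 1 },   // 12:01PM → url1 held the user 1 minute
  { url: 'url3', minute: 10 },  // 12:10PM → url2 held the user 9 minutes
  { url: 'url4', minute: 11 },  // 12:11PM → url3 held the user 1 minute
];

console.log(dwellTimes(session)); // → { url1: 1, url2: 9, url3: 1 }
```

Aggregated over 1000 such sessions, url2's consistently longer dwell time would stand out without any cookie on the visited site.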

What does Google do with these statistics?

Lately there is a lot of turmoil in search engine positions. No site can keep a steady position for long. Positions change even within the hour. I have seen this with client and personal sites competing for very competitive, or even moderately competitive, terms. What is happening then? I believe Google can NEVER be sure which specific URLs are the only ones fit for the first or even second page of the results, so it keeps changing the rankings. It allows other sites into the top results, giving all of them an equal chance to be clicked by searchers and have their click-through and visited-time (or bounce) rates compared with the rest of the URLs. In theory, this sounds to me like a parameter in the Google algorithm that gives a fair chance to more URLs. Mind you, links are still relevant, and I believe they always will be, since without them Google would not even know a site exists.

Improve your rankings
With these in mind here is what I think is important to improve your search engine rankings.

a) Good and relevant link titles and snippets in the search results
The first impression a searcher has of your site is what Google presents in the search results. Usually Google creates a listing that consists of the URL's TITLE tag text and a small snippet taken from your site's content that usually contains all or part of the search phrase. If Google does not find such a snippet to present, it will go to DMOZ and fetch the site's listing as the snippet. If Google does not find a DMOZ listing, it will use the page's META description instead. Therefore it is in the webmaster's hands what Google will display in the results. Nice, short and relevant TITLEs are usually better than long spammy ones, because a searcher usually reads them before clicking the link to your site; users have gotten smarter with the web and usually won't click on TITLEs that read like spam (everything and their dog is in them). Also, if a webmaster makes sure to include the phrases they want to be found for, I am sure Google will display those instead of the usually less relevant DMOZ listing.

b) Fast load, good presentation and structure, and relevant content on the web page
The second impression a searcher has of your site is its immediate appearance. But even before that, how fast your site loads might be a deciding factor in keeping a visitor on your site. Studies show that a user usually takes only a fraction of a second to decide whether to spend their time on your site or go back. Web surfers don't have all day, so your page had better load fast. Once a visitor decides to stay, your content becomes the deciding factor that will keep them or make them bounce back.

Content IS KING after all

These factors were always important in order to keep visitors happy and get them to bookmark your site and visit it again. In light of the click-through and visit-time data that search engines might be using in their ranking algorithms, they become even more important to your site's improvement.

Here I should also mention that this discussion covered how Google records, and I believe handles, this user behavior data, but it is very probable that the other major search engines utilize such data as well (please bring it forward if you have any such evidence).

This system can be manipulated!!!
Any system with a finite set of rules that can be easily deduced can be manipulated. This does not mean, though, that the search engines won't take measures against it. If they actually have such a system in place, they probably have ways to check whether a request is man-made and comes from a unique computer. Such anti-fraud systems were probably needed in many other applications the search engines operate, and they should already have them in place.

Please share your thoughts or findings regarding the use of click-through and bounce statistics.



"The 1st URL might not even be fit for displaying in the first page of the search results of that search!!" That is true, but even in Google's personalized search results, the first result for a phrase won't be sent to the second page if you never click it and only click the nine others, so that still seems far away.
I did not know that Google uses JavaScript to track the clicked results, but I guess that is a natural thing for Google to do.
Good post!!

I agree with you, CONTENT IS KING. Websites should be designed for the visitors and not only for search engines. A content-rich website relevant to the business is what matters most from an SEO aspect as well. A content-rich, well-built website does attract visitors, and it helps your business reach new heights.

I started blogging about 2 months ago; writing good semantic content got me several first positions in Google.

As far as I know, it is off-page optimization that matters most, rather than on-page optimization. Of course the content is important, but the quality of backlinks is very important. Or is it changing?

I don't pretend to know the exact percentage by which off-page or on-page factors matter to the search engines; all I am saying is don't just focus on off-page optimization.

The links are helpful for your site to be found in the first 100 results, but the exact position you hold seems to be determined by click-through and bounce rates, and that seems to be more important lately...

And frequently updated, original content is even better. :)

Contents for Human, Links for Search Engines...

Good post! Content has become the number one element people overlook when it comes to optimizing pages for search engines. It doesn't matter how well written your meta tags, titles, etc. are if your content is horrible.

Good job and keep up the good work.

Good to see like-minded individuals posting sound advice.


But imagine a scenario where a user is searching for information on a specific topic. He searches for "phrase 1" and clicks the first result, reads the page, returns to the search results, and clicks on another result to get more info. This does not mean that the user is not satisfied by the first page; it just means that he needs more information.

It would be rather unfair on Google's side to lower the ranking of the first result then, because it may be the most useful and well built. However, as you say, shuffling the rankings and giving maximum exposure to all sites is a good idea the search engines should consider and include in their ranking algorithm (measure the amount of attention each site gets when it is at a specific position, and rank it accordingly).

However, the search engines would then take on a very social-bookmarking kind of style: the site most clicked by users will move up, just like with Digg, for example.

In your scenario the person who clicks the first result stays and reads the page; thus it's not an immediate bounce back, and it could be counted as positive feedback. I imagine feedback being weighted by the amount of time a person stays to read a page, with the most negative effect coming from an immediate bounce (no stay on the page, which signals irrelevance).

I can imagine a threshold stay (measured in seconds) after which the user is deemed to have found the site "relevant"; if the threshold is not passed, the site is voted "irrelevant". And of course the threshold time is not a set value (some topics require more attention).
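As a sketch of that threshold idea; the 30-second cutoff below is an arbitrary number I picked for illustration, and as noted, a real system would vary it per topic:

```javascript
// A visit longer than the cutoff counts as a "relevant" vote,
// anything shorter as "irrelevant". The threshold is an assumption.
function classifyVisit(staySeconds, thresholdSeconds) {
  return staySeconds >= thresholdSeconds ? 'relevant' : 'irrelevant';
}

console.log(classifyVisit(120, 30)); // → relevant
console.log(classifyVisit(3, 30));   // → irrelevant
```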

As I said, this is just a simple scenario, and only the Google engineers know the exact algorithm, but I am quite convinced something like the above is integrated into the Google algorithm.

Yes! I always hear and say that content is soooo important!

This theory, if it is in use, has a flaw though, or at least something it would have to account for.

Namely: if a page has a clear layout and loads quickly, content will be easy to find. On a website with a poor or busy layout, where the content might be hard to find even though it is there, a user might spend more time just searching for it.
On the other hand, a visitor will probably leave because they cannot find the content quickly. If they find another website where the content is easier to find, they will prefer that one. It is my experience that visitors will usually compare sites and choose the one where the content is easiest to find, even if you offer the exact same content.

I doubt a search engine would use such a way of ranking sites. It is too error-prone. I imagine it would look at your markup and the content before anything else. Would it look at return visitors? I don't know; a site may contain just the right information for the purpose, and a visitor would need no more information from that site. For example, how to make blueberry jam. They might come back to look up how to make strawberry jam at some point, but that could theoretically be years later.

K.I.S.S. If your website has content that is interesting to the visitor, then they will find you. Making it any more complicated than it is will just increase the amount of fraud going on.

Also, a search engine doesn't need links from other websites to find you; as you stated, you can register your website with the search engine.
Personally, I don't get the whole links-from-other-sites business. It is the cause, in my opinion, of all the link bloat on a lot of websites. Your blog has a very nice layout, but look at the number of blogs that have a scraped content snippet somewhere buried between a ton of Google ads (or any other search engine's, for that matter) and a million and one links in the navigation bar that I am not even sure the owner of the blog can follow, let alone the visitor.

In contrast, major newspaper sites do this very well: content, a generally clear navigation structure, relevant articles, and a few large ads from respectable companies. People are so used to this from print newspapers that they accept it pleasantly, and I think they will actually take in the ads instead of clicking away right away.

There is much to be learned. Web design could really benefit from the classical style of layout that has been established over thousands of years, instead of trying to create the next new hype that clutters the screen even more with yet another button that will track your every move. Just ditch the whole link-importance business and the web would probably be a cleaner place. Daydreaming...

Thanks for stopping by and commenting on the article.

As I already mentioned (in a different context), every system is bound to have flaws. I am a firm believer that the major search engines utilize click-through rates and bounce rates. To what extent, I could not tell, but the theory above is just a basic principle which could probably be set up to compensate for what you are mentioning.

All you say about links is true; they have been used and abused, but it's not easy to reverse the flow of a river, is it?

The internet's basic currency is the link, whether Google dictates this or not. We all have to respect it, try not to abuse it, and link only to valuable stuff; but on the other hand, small-time bloggers deserve a little bit of the advertising pie and will include links in their pages because someone pays them to. I do too. Money does shape this industry...

About design and layout I also agree: much to be learned from classical layout, but new ways also have to be discovered, because a web page does not usually have a SET WIDTH and HEIGHT...

Hi! Being an SEO, I always prefer building a site for the visitors and not for the search engines. I think there are two things that will automatically drive traffic to your website: one is a visually appealing website, and the other is the content. I think if you feed visitors quality content they will keep coming back, and as search engines also like fresh and relevant content, it will escalate your site to the top of the SERPs. So the bottom line is "Content Is The King".

As a fellow writer, I like the content sections.
It is important to provide insightful information for fellow readers.

I have always been told this, and you have further consolidated my view. Not only is content good search-engine-wise but of course human-user-wise. Can't go wrong really, and there's no better way to generate traffic.

Couldn't agree with you more. I do believe the time a visitor spends on a site plays a major factor. I'm always thinking of ways to excite my readers into staying. Sometimes an interesting JavaScript test, an addictive game and good old-fashioned content will give your visitors the reasons they need to stay and revisit. The question a website owner needs to ask is, "what would make me stay or revisit...?".

Content is KING!
I hate going to a site that has nothing to do with what I am searching for. It is a big waste of time... I want my answer ASAP!
If people are not concentrating on quality content to keep visitors coming back, then what are they doing? I know... wasting their time.


Hello all,
I agree that "content is king", but behind the content, clean and search-engine-friendly HTML is like the queen.
I always try to avoid too many graphics, like scripted menus etc.,
use alt text where possible,
use text instead of pictures, and
keep the HTML as clean and short as possible, so your content makes up the highest possible percentage of the entire code.

This is my advice to people still thinking about HOW to do it.

Another thing:
Google can always record the Google page you are on, so when a search is discontinued, Google knows the result for the phrase was in one of the 10 results it gave.
So when Google mixes the rankings a bit every time, it can see which web pages give the best content,
because every time a search is discontinued, a specific page was shown on the last search page.
For a brand new site I always say: do a search on all the key phrases you want to be found for
(after Google has indexed the site, of course).
Start at page 1, go to 2, 3, 4, 5 etc. till you find your site.
Close the window there.
Repeat this for every key phrase.
This will tell Google that the keywords for that specific content were found in the 10 results of the search page where your site showed every time.
This will bring new sites higher ranks, because Google is still testing the new site to decide on ranking positions.
This would be effective in the first one or two months from the moment your site was indexed by the Google bot.
The last site I made using this method and search engine approach was indexed by Google a week ago, and the most important keywords went to page 1 within a week.
Other phrases I wanted to be found for are in the first 10 pages.
Now it's the content that has to prove it's king.
What I mean by this is: you can get top listed after the first index, but to stay there, the content and the way the content reads is crucial.

Most important:
- Relevant, good content.
- Clean, readable content.
- A high content-vs-script ratio.
- Use key phrases where you can in your HTML, like in alt texts.
- Don't make your keyword density too high.
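The content-vs-script point can be roughly measured. This sketch strips tags with a crude regex (fine for an illustration, not a real HTML parser) and compares visible text length to total markup length:

```javascript
// Ratio of visible text to total HTML: higher means more of your code
// is actual content. The regex strip is a deliberate simplification.
function contentRatio(html) {
  const text = html.replace(/<[^>]*>/g, '').replace(/\s+/g, ' ').trim();
  return text.length / html.length;
}

const lean = '<p>Hello world</p>';
const bloated = '<div class="wrapper"><div class="inner"><span>Hi</span></div></div>';
console.log(contentRatio(lean) > contentRatio(bloated)); // → true
```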

It doesn't matter if a website reached the top of the search engines if, when you click on it, an empty site appears. Most people decide whether they will surf that website or go back to the previous page in less than a second, so things like colors, design, graphics and fonts can make a strong impression on people and get them to actually read and analyze the content.

"Content is king."

It sounds good in principle. Produce a truly great piece of content, and you’ll get all the links you could ever hope for.
Maybe it worked too, several years ago. The Web used to be a fairly quiet place compared to what it is now, and it was easier for people to notice great blog posts.
But not anymore.
Now great is no longer good enough. The Web is full of so much remarkable content that bloggers don’t have enough time to read it all, much less link to it.
If you want links now, you need to be more than great. You need to be connected.