Monday, October 31, 2016

Webmaster Tools For SEO Optimization

A webmaster is the person who has control of a website; he can also be called an admin. Usually the webmaster is the person responsible for optimizing the site. Since Google rolls out frequent updates to provide better results for users, we must be very careful about our content. Google provides Webmaster Tools to help get our pages crawled and indexed - an approach sometimes described as 'controlled SEO'. To access the webmaster tools just click here.

Google Webmaster Tools


 Through this panel we can find our SEO defects. This post helps you do controlled SEO using your blog posts. First you need to switch to the '.com' version of the URL: change '.in' to '.com/ncr', which temporarily stops Google from redirecting you to the regional version of the site. Next we need to add our property. Copy the blog URL with the '.com' extension into the text box provided, then proceed with the Add button. Now our root URL is added. To crawl it, look at the left-hand side for the Crawl option, click on it, then select 'Fetch as Google' and click the 'Fetch' button. After that a green tick mark with 'Complete' is shown beneath the window. Now click the 'Submit to index' button, and a new window appears. There we need to verify that we are an individual and check the box 'Crawl only this URL' to index the base URL. Proceed with the Go button. For the inner links you need to select the option 'Crawl this URL and its direct links'. Now your posts are crawled by the Google bots, and if we search for the title we can see them.

Thursday, October 27, 2016

On Page SEO Tips

Earlier we saw how the evolution of SEO took place. Now we can look at the different key areas of on-page SEO.
A webpage is made up of HTML code and conventions. An HTML page consists of two major areas: the head section and the body section. If we take a little extra care with both of these sections, we can earn the page a good rank. First of all, let's check the head section.

Tips in <head></head> section:

First of all, in the head section we need to add a title. The title is the first clue that tells the bots what a webpage is about, so we need to use relevant and simple titles. We can use around 5-6 words for the title, and it should be limited to 50-60 characters. We should avoid all-uppercase titles, use correct spelling, and avoid grammar mistakes. Phrases that include the focus keyword give a better result. For example:

<title>Computer Accessories Online</title>

This might be used as the title for a computer accessories store that provides online shopping.

If we use the same title for different webpages, it is treated as keyword cannibalization. When such a situation occurs, the bots become confused, and it can lead to lower visibility. If we give long titles, the search snippet truncates them with dots, so the user can't read the full sentence.

Next, we need to focus on the meta description. The word `meta` suggests that it reveals the inner meaning of a webpage: it briefly describes the contents of the page to the bots. We need to limit it to 155-160 characters, because the search snippet is only about 1024px wide. Here also we need to take care of spelling mistakes and grammatical errors. Our meta description should be unique - unique in the sense that we should not reuse it across pages, and we should not copy the meta description from a competitor's site. The meta description must be small, precise and simple for the user. We can use our focus keywords in the meta description, but in limited numbers.
 Here is an example of a meta description:

<meta name="description" content="Leading Computer sales and servicing outlet of Dell computers - Kochi">

meta description in a snippet

Here the focus keywords 'computer sales and servicing' are used efficiently. If we forget to give a meta description, the bots collect words from the content of the webpage and display those instead, or they may grab the description from an open directory project like dmoz.org.

Another important tag used in on-page SEO is the 'robots' meta tag. By default, all the pages of a website are indexed and followed by the bots. If we don't want the bots to index a particular webpage or follow its links, the robots meta tag with the 'noindex' and 'nofollow' values is used.
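For instance, to keep a single page out of the index, the robots meta tag goes in the head section of that page (the 'noindex, nofollow' values are the standard directives):

```html
<head>
  <!-- Tell the bots not to index this page and not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```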

Now we can move to the body section.

Tips for <body></body> optimization:

In the body section, first of all we need to take care of the <h1> tag. Usually this tag is used to emphasize the main heading. It must be short and simple, and we should use this tag only once. If we forget to give an <h1></h1> tag, the bots will check for the presence of an <h2></h2> tag instead. Here also, unique tag implementation is the first and primary law: if we use it multiple times, the bots become confused and tend to move our webpages into the sandbox. We must also take the utmost care when adding links to our webpage, because links work as fuel for visibility. If the anchor text of a link is the same as our focus keyword, the bots understand it as a reference from our webpage to that keyword, which makes our efforts vain. So we need to use other words as the anchor text of a link.
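A minimal sketch of these body-section tips (the heading, URL and anchor text here are invented for illustration):

```html
<body>
  <!-- Exactly one <h1>, short and simple -->
  <h1>Computer Accessories Online</h1>

  <!-- Anchor text varied instead of repeating the focus keyword -->
  <p>Browse our latest <a href="https://example.com/keyboards">keyboard collection</a>.</p>
</body>
```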
Then we move on to the content section, where we have the following tips for content optimization. To optimize the content we need to use the focus keywords inside it, but within a limited range: the best practice is around 2-3 keywords per 100 words of content. If we don't use the keywords inside the content, the bots can't recognise the relationship between the title and the content. If we wish, we can put the keywords in bold letters to help the user identify them. Also, we need to add a space after every comma and period.

While adding images we should use relevant names for them, which means the filename should match the content. If we have more than one word in the filename, add hyphens for separation. We need to add the alt attribute of the image tag for better visibility, and we should place the image very near to the related content.
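Putting those image tips together (the filename and alt text below are made up for illustration):

```html
<!-- Hyphen-separated, descriptive filename plus an alt attribute -->
<img src="dell-laptop-accessories.jpg" alt="Dell laptop accessories on display">
```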

Wednesday, October 26, 2016

Search Engine Optimization Unveiled


After the emergence of the World Wide Web, the world started to murmur the word 'internet'. People searched for various ways to build their own web pages, and a huge outbreak of web pages developed at that time. It could be called a second industrial revolution, because people started to publish products online: a previously unknown way of marketing.
In those days Larry Page was in discussion about his research project. He eventually arrived at the idea of a search engine which now acts as your best personal assistant: GOOGLE. Even though the domain google.com was registered on September 15, 1997, Google was made a company on September 4, 1998.


Larry Page Google Founder

After forming the company, their responsibilities grew. At that time the search results were delivered as an email after 24 hours. There were also other web search engines like AltaVista, Yahoo, Archie and so on. September 11, 2001 changed the performance of Google to a great extent. After the September 11 attacks, people started to search for the topic widely, but unfortunately Google couldn't produce relevant information. When they went through an analysis, they found that most of the web pages at that time were not crawlable. For Google search to work, web pages need to be easily crawlable by the Google bots, or spiders.
After many heated arguments, Google decided to publish the Search Engine Optimization Guide to make webmasters aware of page optimization. Webmasters then started to implement these techniques when building their sites, which enabled the Google bots to provide better results.

WORKING OF GOOGLE BOTS

Let's talk about Google bots. They are actually algorithms that crawl web pages. While crawling the pages, they collect some information about them; this is called caching, which means preparing a snapshot of the webpage. These details are then arranged according to the information each page holds; this is called indexing of web pages.