What Your Website Absolutely Needs

This section will go over some of the most important elements that a page needs if it hopes to get high search engine rankings. Make sure that you go through this whole section very carefully, as each of these elements can have a dramatic impact on the rankings that your website will ultimately achieve.

Don't focus just on the home page, keywords, and titles.

The first step to sales is making sure that customers who visit your site find the products they were looking for. Of course, search engine optimization and better rankings can't keep a customer on your site or make them buy. Once the customer has arrived, ensure that he gets interested in your products or services and stays around, and motivate him to buy by providing clear and unambiguous information. If you happen to sell more than one product or service, provide all the necessary information about each, perhaps by keeping the information on separate pages. By providing suitable and easily visible links, the customer can navigate to these pages and get the details.

Understanding Your Target Customer

If you design a website you think will attract clients, but you don't really know who your customers are and what they want to buy, it is unlikely you will make much money. A website business is an extension of, or replacement for, a standard storefront. You can send email to your existing clients and ask them to complete a survey, or ask them while they are browsing your website. Ask them about their choices. Why do they like your products? Do you discount prices or offer coupons? Are your prices consistently lower than others'? Is your shipping cheaper? Do you respond faster to client questions? Are your product descriptions better? Are your return policies and guarantees better than your competitor's? To know your customer, you can check credit card records or ask your customer to complete a simple contact form with name, address, age, gender, etc. when they purchase a product.

Does your website give enough contact information?

When you sell from a website, your customers can buy your products 24 hours a day, and they may be from other states, thousands of miles away. Always provide contact information, preferably on every page of your website, complete with mailing address, telephone number, and an email address that reaches you. People may need to contact you about sales, general information, or technical problems on your site. Have your email forwarded to another address if you do not check your website mailbox often. When a customer wants to buy online, provide enough payment options, such as credit card, PayPal, or another online payment service.

In the field of search engine optimization (SEO), writing a strong homepage that will rank high in the engines and read well to your site visitors can sometimes present a challenge, even to some seasoned SEO professionals. Once you have clearly identified your exact keywords and key phrases, the exact location on your homepage where you place those carefully researched keywords will have a dramatic impact on the end results of your homepage optimization.

One thing we keep hearing most people say is that they don't want to change the looks, or more especially the wording, of their homepage. Understandably, some of them went to great lengths and invested a lot of time and/or money to make it the best it can be. Being the best it can be for your site visitors is one thing. But is it the best it can be for the search engines, in terms of how your site will rank?

If you need powerful rankings in the major search engines and at the same time want to successfully convert your visitors and prospects into real buyers, it's important to write your homepage the proper way the first time. Always remember that a powerfully optimized homepage pleases both the search engines and your prospects. By randomly inserting keywords and key phrases into your old homepage, you might get good rankings, but at the same time you might jeopardize your marketing flow. That is a mistake nobody would ever want to make with their homepage.

Even today, there are still some people who will say you can edit your homepage for key phrases without re-writing the whole page. There are important reasons why that strategy might not work.

Your homepage is the most important page on your web site

If you concentrate your most important keywords and key phrases in your homepage many times, the search engines will surely notice and index it accordingly. But will it still read easily, and will the sentences flow freely, to your real human visitors? There is a good chance that it might not. As a primer, having just 40 or 50 words on your homepage will not deliver the message effectively. To be powerful and effective, a homepage needs at least 300 to 400 words for maximum search engine visibility and effectiveness.

One way to do that is to increase your word count with more value-added content. This often means rewriting your whole homepage all over again. The main reason for this is that you will probably never have enough room to skillfully work your important keywords and key phrases into the existing body text of your homepage. A full re-write may not please your boss or marketing department, but it is often necessary and highly advisable to achieve high rankings in the engines, while at the same time having a homepage that will please your site visitors and convert a good proportion of them into real buyers.

The Acid Test

Here is the acid test that will prove what we just said is right: carefully examine the body text of your existing homepage. Then attempt to insert three to five different keywords and key phrases, three to four times each, somewhere within the actual body of your existing page. In doing so, chances are you will end up with a homepage that is next to impossible to read and understand.

One mistake some people make is to force their prospects to wade through endless key phrase lists or paragraphs in an attempt to describe their features and benefits, usually because they are also trying to please the search engines at the same time. Writing a powerful and effective homepage around carefully defined keywords and key phrases is a sure way to drive targeted traffic to your web site and keep visitors there once you do.

If some people still say re-writing a homepage takes too much time and costs too much money, think of the cost of losing prospective clients and the real cost of lost sales and lost opportunities. In the end, writing a strong homepage that achieves all your desired goals will largely justify the time and effort you invest in the re-write.

We discussed the importance of the homepage. This section presents a recommended layout for your homepage in order to make it as search engine friendly as possible. This is where you set the theme of your site. Let's suppose the primary focus of your site is online education. You also have secondary content that is there as alternative material for those not interested in online education, and other content that you would like to share with your visitors, for example book reviews, humor, and links.

The top of your homepage, as discussed earlier, is the most important part. This is where you set the keywords and theme for the most important part of your site, the thing you really want to be found for.

Step By Step Page Optimization

Starting at the top of your index/home page, the layout might look something like this:

(After your logo or header graphic)

1)   A heading tag that includes a keyword or keyword phrase. A heading tag is bigger and bolder than normal body text, so a search engine places more importance on it because you are emphasizing it.

2)   Heading sizes range from h1 to h6, with h1 being the largest text. If you learn to use just a little Cascading Style Sheet code, you can control the size of your headings. You could set an h1-sized heading to be only slightly larger than your normal text if you choose, and the search engine will still see it as an important heading (see the sketch after this list).

3)   Next would be an introduction that describes your main theme. This would include several of your top keywords and keyword phrases. Repeat your top 1 or 2 keywords several times, and include other keyword search terms too, but make it read in sentences that make sense to your visitors.

4)   A second paragraph could be added that gets more specific, using other words related to online education.

5)   Next you could put a smaller heading.

6)   Then you'd list the links to your pages, ideally with a brief description of each link using keywords and keyword phrases in the text. You also want to have several pages of quality content to link to. Repeat that procedure for all your links that relate to your theme.

7)   Next you might include a closing, keyword-laden paragraph. More is not necessarily better when it comes to keywords, at least after a certain point. Writing "online education" fifty times across your page would probably get you caught for trying to cheat. Ideally, somewhere from 3% to 20% of your page text would be keywords; on a 400-word page, for example, 5% works out to about 20 keyword occurrences. The percentage changes often and is different at each search engine. The 3-20 rule is a general guideline, and you can go higher if it makes sense and isn't redundant.

8)   Finally, you can list your secondary content of book reviews, humor, and links. Skip the descriptions if they aren’t necessary, or they may water down your theme too much. If you must include descriptions for these non-theme related links, keep them short and sweet. You also might include all the other site sections as simply a link to another index that lists them all. You could call it Entertainment, Miscellaneous, or whatever. These can be sub-indexes that can be optimized toward their own theme, which is the ideal way to go.
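As a minimal sketch of point 2 above (the class name and sizes are illustrative assumptions, not a standard), a few lines of CSS can make an h1 heading render only slightly larger than body text while keeping its weight with the engines:

<style>
/* Render this h1 only slightly larger than normal body text */
h1.page-title { font-size: 120%; }
</style>

<h1 class="page-title">Online Education Courses and Degree Programs</h1>
<p>Welcome to our online education resource...</p>

The tag is still an h1 as far as a search engine is concerned; only its on-screen size has changed.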

Now you've set up the all-important top of your page with a strong theme. So far so good, but this isn't the only way you can create a strong theme, so don't feel compelled to follow this exact formula. This was just an example to show you one way to set up a strong site theme. Use your imagination; you may come up with an even better way.

One Site – One Theme

It's important to note that you shouldn't try to optimize your home page for more than one theme. Multiple themes just end up weakening each other when you do that. By using simple links to your alternative content, a link to your humor page can get folks where they want to go, and you can then write your humor page as a secondary index optimized toward a humor theme. In the end, each page should be optimized for the main topic of that page or site section.

Search engine optimization is made up of many simple techniques that work together to create a comprehensive overall strategy. This combination of techniques is greater as a whole than the sum of its parts. Skipping any small technique that is part of the overall strategy will subtract from the edge you'd gain by employing all the tactics.

Affiliate Sites & Dynamic URLs

In affiliate programs, sites that send you traffic and visitors have to be paid on a per-click basis or by other parameters (such as the number of pages visited on your site, duration spent, transactions, etc.). The most common contractual understanding revolves around payment per click or click-through. Affiliates use tracking software that monitors such clicks using a redirection measurement system. The value of affiliate programs in boosting your link analysis is doubtful, but it is generally felt that they do no harm. They do provide you visitors, and that is important. In the case of some search engines, redirects may even count in favor of your link analysis. Use affiliate programs, but do not treat them as a major optimization strategy.

Several pages in e-commerce and other functional sites are generated dynamically and have a "?" or "&" sign in their dynamic URLs. These signs separate the CGI variables. While Google will crawl these pages, many other engines will not. One inconvenient solution is to develop static equivalents of the dynamic pages and keep them on your site.

Another way to avoid such dynamic URLs is to rewrite them using a syntax that is accepted by the crawler and is also understood as equivalent to the dynamic URL by the application server. The Amazon site shows dynamic URLs in such a syntax. If you are using the Apache web server, you can use Apache rewrite rules to enable this conversion.
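As a minimal sketch, assuming a hypothetical catalog.php script that takes an id parameter, an Apache rewrite rule along these lines presents the crawler with a static-looking URL while the server still runs the dynamic page:

# In the site's .htaccess file, with mod_rewrite enabled
RewriteEngine On
# Serve /products/123 from the real dynamic URL /catalog.php?id=123
RewriteRule ^products/([0-9]+)$ /catalog.php?id=$1 [L]

Your pages would then link to /products/123 rather than /catalog.php?id=123, so the spider never encounters the "?" at all.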

One good tip is to prepare a crawler page (or pages) and submit it to the search engines. This page should have no text or content except for links to all the important pages that you wish to be crawled. When the spider reaches this page, it will follow all the links and pull all the desired pages into its index. You can also break the main crawler page up into several smaller pages if it becomes too large. The crawler will not reject smaller pages, whereas larger pages may get bypassed if the crawler finds them too slow to spider.
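A minimal sketch of such a crawler page (the page names are hypothetical) is nothing but a bare list of links:

<html>
<head><title>Site Map</title></head>
<body>
<a href="online-courses.html">Online Courses</a><br>
<a href="course-fees.html">Course Fees</a><br>
<a href="book-reviews.html">Book Reviews</a><br>
<a href="contact.html">Contact Us</a>
</body>
</html>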

You do not have to be concerned that search results may throw up this "site-map" page and disappoint the visitor. This will not happen, as the "site-map" has no searchable content and will not get included in the results, while all the other pages will. We found that the site wired.com had published hierarchical sets of crawler pages. The first crawler page lists all the category headlines; these links lead to a set of links with all the story headlines, which in turn lead to the news stories.

Page Size Can Be A Factor

We wrote above that spiders may bypass long and "difficult" pages. They have their own time-out characteristics or other controls that help them come unstuck from such pages, so you do not want such a page to become your "gateway" page. One tip is to keep the page size below 100 KB.

How Many Pages To Submit

You do not have to submit all the pages of your site. As stated earlier, many engines restrict the number of pages you can submit. A key page, or a page that has links to many inner pages, is ideal, but you should submit some inner pages too. This ensures that even if the first page is missed, the crawler still gets to access the other pages, and all the important pages through them. Submit your key 3 to 4 pages at least. Choose the ones that have the most relevant content and keywords to suit your target search string, and verify that they link to the other pages properly.

Should You Use Frames?

Many websites make use of frames on their web pages; in some cases, more than two frames are used on a single web page. The reason most websites use frames is that each frame's content comes from a different source. A master page known as the "frameset" controls the process of combining content from different sources into a single web page, making it easy for webmasters to pull multiple sources together. This, however, has a huge disadvantage when it comes to Search Engines.

Some of the older Search Engines do not have the capability to read content from frames. These only crawl through the frameset page instead of all the web pages, and consequently the web pages inside the frames are ignored by the spider. There is a tag known as "NOFRAMES" (its content is ignored by frames-capable browsers) that can be inserted into the HTML of these pages, and spiders are able to read the information within the NOFRAMES block. In practice, however, many NOFRAMES blocks contain no links to other web pages, which means the search engines cannot crawl past the frameset and therefore ignore all the content-rich web pages that the frameset controls.
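As a minimal sketch (the page names and wording are hypothetical), a frameset page whose NOFRAMES block gives spiders both readable text and links to follow might look like this:

<html>
<head><title>Online Education Courses</title></head>
<frameset cols="20%,80%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
  <body>
  <p>We offer online education courses for working professionals.</p>
  <p><a href="courses.html">Course catalog</a> |
     <a href="reviews.html">Book reviews</a> |
     <a href="contact.html">Contact us</a></p>
  </body>
  </noframes>
</frameset>
</html>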

Hence, it is always advisable to have web pages without frames as these could easily make your website invisible to Search Engines.

Making frames visible to Search Engines

We discussed earlier the prevalence of frames-based websites. Many amateur web designers do not understand the drastic effect frames can have on search engine visibility. Such ignorance is compounded by the fact that some Search Engines, such as Google and Ask.com, are actually frames-capable: Ask.com spiders can crawl through frames and index all the web pages of a website. However, this is only true for a few Search Engines.

The best solution, as stated above, is to avoid frames altogether. If you still decide to use frames, another remedy is to use JavaScript. JavaScript can be added anywhere and is visible to Search Engines, and it can give spiders a path to your other web pages even if they do not recognize frames.

With a little trial and error, you can make your frame sites accessible to both types of search engines.

Robots.txt – More Than A Little Useful

We discussed the ROBOTS tag briefly earlier. Let us now look at the robots.txt file in a little more detail.

Sometimes we rank well on one engine for a particular key phrase and assume that all search engines will like our pages, and hence that we will rank well for that key phrase on a number of engines. Unfortunately, this is rarely the case. All the major search engines differ somewhat, so what gets you ranked high on one engine may actually lower your ranking on another.

It is for this reason that some people like to optimize pages for each particular search engine. Usually these pages would only be slightly different, but this slight difference could make all the difference when it comes to ranking high.

However, because a search engine spider crawls through a site indexing every page it can find, it might come across your search-engine-specific optimized pages. Because they are very similar, the spider may think you are spamming it and will do one of two things: ban your site altogether, or severely punish you in the form of lower rankings.

The solution in this case is to stop specific Search Engine spiders from indexing some of your web pages. This is done using a robots.txt file, which resides in your web space.

A robots.txt file is a vital part of any webmaster's battle against getting banned or punished by the search engines for designing different pages for different search engines.

The robots.txt file is just a simple text file, as the file extension suggests. It is created using a simple text editor like Notepad or WordPad; complicated word processors such as Microsoft Word will only corrupt the file.

You insert certain directives into this text file to make it work. This is how it is done:

User-Agent: (Spider Name)
Disallow: (File Name)

The User-Agent is the name of the search engine's spider, and Disallow is the name of the file that you don't want that spider to index, written as a path from your root directory (hence the leading slash in the example below).

You have to start a new batch of directives for each engine, but if you want to list multiple disallowed files, you can list them one under another. For example:

User-Agent: Slurp   # PositionTech's spider
Disallow: /xyz-gg.html
Disallow: /xyz-al.html
Disallow: /xxyyzz-gg.html
Disallow: /xxyyzz-al.html

The above code stops PositionTech's spider from crawling two pages optimized for Google (gg) and two pages optimized for AltaVista (al). If it were allowed to spider these pages as well as the pages made specifically for it, you would run the risk of being banned or penalized. Hence, it's always a good idea to use a robots.txt file.

The robots.txt file resides in your web space, but where exactly? The root directory! If you upload the file to a sub-directory, it will not work. If you want to disallow all engines from indexing a file, simply use the * character where the engine's name would usually be. Beware, however, that the * character won't work on the Disallow line.
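For example (the file names here are hypothetical), the following keeps every compliant spider away from two pages while leaving the rest of the site crawlable:

User-Agent: *
Disallow: /draft-page.html
Disallow: /test-page.html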

Here are the names of a few of the big engines:
Excite – ArchitextSpider
AltaVista – Scooter
Lycos – Lycos_Spider_(T-Rex)
Google – Googlebot
Alltheweb – FAST-WebCrawler

Be sure to check over the file before uploading it, as you may have made a simple mistake, which could mean your pages are indexed by engines you don't want indexing them, or, even worse, that none of your pages are indexed.

Another advantage connected to the robots.txt file is that, because most spiders request it before crawling, your server logs of those requests tell you which spiders or agents have accessed your web pages. This gives you a list of the host names and agent names of the spiders, and even very small search engines get recorded. Thus, you know which Search Engines are likely to list your website.

Most Search Engines scan and index all of the text on a web page. However, some Search Engines ignore certain words known as stop words, which are explained below. Apart from this, almost all Search Engines ignore spam.

STOP Words

Stop words are common words that are ignored by search engines when processing a search for a key phrase. This is done in order to save space on their servers and to accelerate the search process.

When a search is conducted, the search engine excludes the stop words from the query, replacing each of them with a marker, a symbol substituted for the stop word. The intention is to save space: the search engines are able to store more web pages in the space saved, while still retaining the relevancy of the search query.

Besides, omitting a few words also speeds up the search process. For instance, if a query consists of three words, the Search Engine would generally make three runs, one for each word, and then display the listings. However, if one of the words can be omitted without changing the search results, it can be excluded from the query, and the search consequently becomes faster. In the query "the history of the internet", for example, dropping the stop words "the" and "of" leaves only "history" and "internet" to be searched.

Some commonly excluded “stop words” are:

  • after
  • also
  • an
  • and
  • as
  • at
  • be
  • because
  • before
  • between
  • but
  • for
  • from
  • however
  • if
  • in
  • into
  • of
  • or
  • other
  • out
  • since
  • such
  • than
  • that
  • the
  • there
  • these
  • this
  • those
  • to
  • under
  • upon
  • when
  • where
  • whether
  • which
  • with
  • within
  • without

Image Alt Tag Descriptions

Search engines are unable to view graphics or distinguish text that might be contained within them. For this reason, most engines read the content of the image ALT tags to determine the purpose of a graphic. By taking the time to craft relevant yet keyword-rich ALT tags for the images on your web site, you increase the keyword density of your site.

Although many search engines read and index the text contained within ALT tags, it’s important NOT to go overboard in using these tags as part of your SEO campaign. Most engines will not give this text any more weight than the text within the body of your site.
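As a minimal sketch (the file name and wording are hypothetical), a relevant, keyword-aware ALT description looks like this:

<img src="online-education-course.jpg"
     alt="Students taking an online education course from home"
     width="300" height="200">

The ALT text describes the image honestly while still carrying the page's key phrase.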

Invisible text is content on a web site that is coded in a manner that makes it invisible to human visitors but readable by search engine spiders. This is done to artificially inflate the keyword density of a site without affecting its visual appearance. Hidden text is a recognized spam tactic, and nearly all of the major search engines detect and penalize sites that use it.

Tiny text, a related technique, is the practice of placing text on a page in a very small font size. Pages that are predominantly heavy in tiny text may be dismissed as spam, or the tiny text may simply not be indexed. As a general guideline, avoid pages where the font size is predominantly smaller than normal, and make sure that you're not spamming the engine by repeating keyword after keyword in a very small font. If your tiny text is a copyright notice at the very bottom of the page, or your contact information, that's fine.
