Implementation is the process of applying your findings and proposed solutions to the website in question. If you are doing SEO for a client you should prepare a formal document listing the specific changes you recommend to achieve on-page benefits. It should cover all sections and typically show the "Before" and "After" states so that the reader can judge whether he or she can accept the changes. Clients will often negotiate some changes to the content, but be careful that the changes don't ruin the page relevancy (which is entirely possible).
If the website is powered by a Content Management System (CMS) then you could be modifying its templates and uploading the textual content through the CMS.
FTP is used to move files from your PC to a remote website or computer containing the website files. From an SEO perspective a common trap when using FTP is to forget to define certain file extensions as ASCII (text), with the result that they are transferred as Binary. Another trap is forgetting to make scripts executable (on a Linux server) and certain text files writable.
A directory listing showing file permissions at left
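The permission fix itself is simple to script. A minimal sketch in Python (the file names are invented for illustration) showing the two octal modes an FTP client would typically set:

```python
import os
import stat

# Create two placeholder files standing in for a CGI script and a data file
open("search.cgi", "w").close()
open("counter.txt", "w").close()

# Make the script executable: rwxr-xr-x (octal 755)
os.chmod("search.cgi", 0o755)

# Make the text file writable: rw-rw-rw- (octal 666)
os.chmod("counter.txt", 0o666)

# Inspect the resulting permission bits
print(oct(stat.S_IMODE(os.stat("search.cgi").st_mode)))   # 0o755
print(oct(stat.S_IMODE(os.stat("counter.txt").st_mode)))  # 0o666
```

Most FTP clients expose the same change via a right-click "File permissions" dialogue, which sends a SITE CHMOD command to the server.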
You will also use FTP with certain sitemap submission services, e.g. Google, that require you to place a unique file on the server to authenticate yourself as the owner. FileZilla is a good FTP client.
Build an XML Sitemap – Important!
These days there are billions of web pages and not enough spiders to index them so you can help the spiders out (and improve your chances of being indexed) by building and submitting an XML Sitemap in the correct format.
Here is the layout of a sample sitemap containing just one URL, supplied by http://www.sitemaps.org, which is sponsored by Microsoft, Google and Yahoo! and contains some more details.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
</urlset>
There are many tools, both online and downloadable, to build a sitemap, so you need not be too concerned about the technicalities. An online sitemap generator is at www.xml-sitemaps.com/ and another is at sitemap.xmlecho.org/sitemap/. To conserve bandwidth the online tools may only crawl a few thousand pages, so if you run a huge site you should use an offline tool. A free downloadable tool is at enarion.net/google/phpsitemapng/ or gsitecrawler.com, but you can find many more if you do a search.
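These tools are black boxes, but the format is simple enough to generate by hand. A minimal sketch in Python (the URL list is invented for illustration):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal XML sitemap for the given list of page URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # escape() protects characters such as & that are illegal in raw XML
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

sitemap = build_sitemap(["http://www.example.com/",
                         "http://www.example.com/product-list.html"])
print(sitemap)
```

Optional elements such as lastmod, changefreq and priority can be added per URL as shown in the sample layout above.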
GSiteCrawler in operation
Submit the Sitemap
You submit the sitemap to the major search engines. Some blogging and CMS software does this automatically and you can do it via the robots.txt file (covered later), but you can always submit a sitemap manually. All these engines will require that you verify ownership of the submitted site by uploading a unique file or by adding a special meta tag.
- Google: www.google.com/webmasters/
- Live Search: webmaster.live.com/webmaster/WebmasterManageSitesPage.aspx
- Yahoo!: siteexplorer.search.yahoo.com/submit
- Ask.com: submissions.ask.com/ping?sitemap=SitemapUrl
Note that "SitemapUrl" should be substituted with the URL-encoded address of the actual sitemap, e.g. submissions.ask.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
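The ownership verification mentioned above can be done by uploading a file or, in Google's case, by adding a verification meta tag to the head of the home page. The tag takes roughly this form (the content token is issued per site by Webmaster Tools; the value below is a placeholder):

```html
<meta name="verify-v1" content="your-site-token-here" />
```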
Once you have prepared the sitemap, add the sitemap auto-discovery directive to the robots.txt file as follows:
Sitemap: http://www.example.com/sitemap.xml
Caution for New Sites
It is all very well to make the Sitemap entry in the robots.txt file, but new sites are not known to search engines, so there could be a long delay before the first spider comes along and finds the robots.txt file. A manual submission of the XML sitemap is therefore advisable.
URL submission and removal
Submitting an XML sitemap or obtaining a link from an existing site should satisfy the need to be crawled, but it doesn't take much time to submit the home page URL to the main search engines. Remember – being fully crawled doesn't improve your ranking; it merely ensures that the search engine is fully aware of your page URLs.
Submission to Search Engines
These days it is not necessary to submit a URL to a search engine because it will invariably pick up a link to a new site from an existing known site and subsequently crawl and index it.
Of course for a brand new site someone (the SEO) needs to organise that initial link. This is best done from a same-topic page that displays a good PageRank (as high as you can get) as this indicates regular visits by Googlebot. (Don’t assume that the high PR will flow to the new site on account of a single link.)
What to Submit?
- You should submit the main URL e.g. http://www.example.com/
- If the search engine accepts it, try to submit a deep link, e.g. http://www.example.com/product-list.html. For large sites obtaining deep links is very important.
Submission to Directories
Directories represent one-way links to your website, so it is a no-brainer to get such links. However, get such links gradually, say over the course of many months. Getting a large number of one kind of link in a short space of time is not a good signal of quality. Spread the link love – get a directory link, then an article link, then a blog entry and so on.
This is a checklist for submission (especially paid directories):
- Is this a specialist (vertical) directory e.g. Florida travel?
- Do the listings pages where your link will actually appear show high Toolbar PR (not just the front page, which may be full)?
- Does the site: operator show listings pages and are they largely free of Supplemental Results (Google)?
- Do the directory’s backlinks include high TrustRank sites? (Yahoo will show them first)
Don’t do reciprocal links unless the directory is a vertical in your niche.
Deprecation of Directory Links
Nowadays a lot of sites that sell PR-passing links have been penalised by Google, to the point of having inner pages removed from the index or showing no PR. Be careful not to waste your money buying links from directories unless you expect to get human visitors from them.
Right and Wrong Ways to Link
You must learn how to obtain links the right way and how to recognise and avoid worthless links. There is a lot of outdated information that you must discard.
Understand these principles and your linking task will be much easier:
- The general principle is that you are penalised (or rewarded) for whom you link to, not for who links to you.
- Good links are often unsolicited and come from "authority" sites about the same topic covered by the destination of the link.
- Links from off-topic sites – say an SEO company gets a link from its client who is a plumber – do not help.
Anchor Text and Linking
This is a linking principle too, but is important enough to get its own paragraph. The actual words used to make the link are the anchor text. The name comes from the HTML code used to make the link ("a" = anchor). For example, compare the following:
<a href="http://www.trainsem.com/">Click Here</a>
<a href="http://www.trainsem.com/">SEO Training</a>
The second link is more valuable than the first as it improves the ranking of trainsem.com for the anchor text keyphrase (and more so if the link comes from another SEO training company site).
The better links often come from signatures in forum posts, where I try to use slightly different anchor text each time. Forums often use "BBcode" for user-generated content, where the link is coded as such:
[url=http://www.trainsem.com/]SEO Training[/url]
What Is an Authority Site?
Google has been experimenting with SiteLinks shown below. The authority site shows additional links below the first search result. It is not known what causes a site to be selected as an authority but we can speculate that such a site has a lot of high TrustRank inbound links among other factors such as usage data and age of domain name.
SiteLinks suggest that this is an authority site.
Reciprocal links are often worthless and potentially damaging. They involve agreements that require both websites to link to each other. In early 2006 Google changed its algorithm and suddenly reciprocal links were devalued. Analysis suggests that reciprocal links from off-topic sites were worthless e.g. a jewellery company’s links page linking to an SEO company. Links between on-topic sites e.g. a photography competition site and a camera manufacturer site were not penalised. Many users of reciprocal link scripts don’t know this and they continue to seek such links. Amateur SEO companies often do this.
Google warns us not to buy links. This is a curious situation. You can go to the Yahoo! Directory, to Business.com or to another well-known directory and pay for an annual listing in some category. Isn't this a paid link? Yes and no. Yahoo! charges to review your submission, not necessarily to list it.
Free-for All (FFA) Links
Run away from FFAs! These are directories that have no human editorial control and your submission is live immediately. Therefore your business site’s link could be next to some adult or gambling site.
It is common for sites that have changed URL to place a meta refresh redirect to the new address. This is code placed in the head section of the HTML page.
<META HTTP-EQUIV="Refresh" CONTENT="0;URL=http://www.othersite.com">
This code means that the visitor is taken to the other URL after a zero-second delay. This is not advisable – it may be viewed as fooling the search engines, and the site could be penalised. Use a 301 (permanent) redirect instead.
When a website is taken offline for maintenance you should think about its effect on visitors, both human and spiders. For example you might want to redirect them temporarily to another website. For any short-term requirement you use a Temporary Redirect, which presents a "302" numeric response from the server. Search engine spiders and algorithms understand this and do not try to remove the old address from the index. For Apache-based web servers you can effect this with the following line placed inside a text file called .htaccess in the web root directory:
Redirect temp / http://www.newsite.com/
The above redirects the entire site.
You can also redirect a single page ("302" and "temp" are interchangeable):
Redirect 302 /old.html http://www.newsite.com/temporary.html
You can redirect a directory:
Redirect 302 /images http://www.newsite.com/pictures/
In PHP you can place the following in a file:
header("Location: http://www.newsite.com/somedirectory/filename.html");
As the name implies a permanent redirect tells search engines to forget the current (old) address and to replace it with the new one. Such an instruction delivers a â€œ301â€ server header response. If a spider encounters a 301 instruction a few times it stops visiting the old address and goes to the new address.
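For Apache servers the .htaccess mechanism shown earlier works the same way; in the Redirect directive "permanent" and "301" are interchangeable, e.g.:

```apache
# Permanently move a single page (example hostnames)
Redirect permanent /old.html http://www.newsite.com/new.html

# Or permanently move the entire site
Redirect 301 / http://www.newsite.com/
```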
In PHP you can place the following in a file:
header("Location: http://www.newsite.com/", true, 301);
Reference: RFC 2616
You should write content that will attract unsolicited links. One such source is the social media and networking sites such as Digg, or bookmarking sites such as Ma.gnolia.net or del.icio.us. It isn't easy for a manufacturer of, say, industrial greases to get into Digg, but you can always set up a page – for example, of safety tips – that would be worth linking to by sites dealing with machinery.
TrustRank and Worthless Links
The TrustRank-based algorithms are believed to pass on Trust through links. Therefore a link from CNN confers CNN's endorsement and trust value on the site it links to. Let's say that CNN page has a PageRank of 7.
Compare it with some other privately owned site that sells links – a few directories do this. It has a PR7 page too. At one time PR7 pages were largely identical in their "link punch". You might find that this link confers less value than the CNN one. There isn't any easy way to make such a comparison, but people have begun to notice that their purchased links are giving them a PR boost but not a ranking boost. High PR but low traffic isn't fun.
An HTML attribute used in a link called rel="nofollow" instructs the search engines (notably Google) not to pass on any PageRank to the destination of the link. It also prevents the spiders from following the link. Blog software usually does this to links in comments. If you have the PageStatus extension for Firefox you can see nofollow links highlighted in pink. You do not want such links other than for human visitors.
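For reference, a nofollow link looks like this in the page source (hypothetical URL):

```html
<a href="http://www.example.com/" rel="nofollow">Some link</a>
```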