For the audit, I chose to look at the National Tiger Sanctuary. This organization is a non-profit rescue that provides permanent homes to both exotic and domestic animals. I have run AdWords campaigns for this site, so I thought it would be beneficial to conduct a site audit. The first thing I looked at was the indexing status: I searched site:nationaltigersanctuary.org in Google.
The search returned roughly 252 results.
I am not sure how many pages the website actually has, so that number could be good or bad. The 252 indexed pages are fine if that is roughly how many pages the site contains. However, if the site has a couple thousand pages but only 252 show up in the index, that points to a duplicate content problem.
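One way to get the "how many pages does the site actually have" number is to count the URLs in the site's XML sitemap and compare that to the indexed count. Below is a minimal sketch using a made-up three-URL sitemap in place of the real file (the sitemap URL and contents here are assumptions, not pulled from the actual site):

```python
import xml.etree.ElementTree as ET

# A tiny sample sitemap. A real audit would fetch the live file,
# e.g. from nationaltigersanctuary.org/sitemap.xml (assumed path).
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.org/</loc></url>
  <url><loc>https://example.org/animals/</loc></url>
  <url><loc>https://example.org/donate/</loc></url>
</urlset>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
root = ET.fromstring(sitemap_xml)
total_pages = len(root.findall(f"{NS}url"))  # pages the site claims to have
indexed = 252  # count from the site: search above

# A large gap between the two numbers is what would signal trouble.
print(f"{indexed} indexed of {total_pages} in sitemap")
```

If the sitemap count came back far above the indexed count, that would confirm the indexing concern described above.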
The next step was to run a crawler on the website. I looked at crawl depth, which shows how many clicks it takes a crawler to reach a page from the home page; any number over 4 is considered a problem. Of the 100 pages the crawler scanned, the highest crawl depth was just 2. There are, however, some issues with the robots.txt file. Problems there can cause pages on the website to be blocked from search engine crawlers, so whoever runs the site should review the file manually and make sure it is not restricting access to important sections of the website.
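Both checks above are easy to sketch in code. Crawl depth is just a breadth-first search from the home page, and Python's standard library ships a robots.txt parser. The link graph and robots.txt contents below are invented for illustration, not taken from the actual site:

```python
from collections import deque
from urllib.robotparser import RobotFileParser

# Hypothetical link graph: page -> pages it links to.
# A real audit tool would build this by crawling the live site.
links = {
    "/": ["/animals/", "/visit/", "/donate/"],
    "/animals/": ["/animals/tigers/", "/animals/lions/"],
    "/visit/": ["/visit/tours/"],
    "/donate/": [],
    "/animals/tigers/": [],
    "/animals/lions/": [],
    "/visit/tours/": [],
}

def crawl_depths(graph, start="/"):
    """BFS from the home page; depth = clicks needed to reach each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in graph.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return depths

depths = crawl_depths(links)
print("deepest page:", max(depths.values()), "clicks")  # prints: deepest page: 2 clicks

# Check a sample robots.txt against every page we found.
robots_lines = """User-agent: *
Disallow: /wp-admin/
Disallow: /search/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_lines)
for page in depths:
    if not rp.can_fetch("*", page):
        print("blocked by robots.txt:", page)
```

In this toy graph every page sits within 2 clicks of home and nothing important is disallowed, which mirrors the healthy crawl-depth result reported above.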
There are also some pages flagged for duplicate content. After looking at them, I do not believe it is a huge deal, since the pages are simply tagged with "National Tiger Sanctuary" at the end of their titles.
Here is the overview section of the audit for the National Tiger Sanctuary: