Jon Clark is an Internet Marketing Consultant specializing in Pay Per Click Marketing (PPC), Search Engine Optimization (SEO) and Search Engine Marketing (SEM). This search marketing blog serves as a backdrop for educational search marketing conversation as well as a resource for cutting-edge (and free) Internet Marketing tools to make all of our lives as search marketers easier!
I’ve had numerous requests to post a synopsis of what makes organic search engine optimization the preferred route when building site traffic. While organic SEO doesn’t easily lend itself to a quick description (the complexity is what sets it apart from antiquated SEO methods), I’ve put together a brief overview.
Organic SEO recognizes that older SEO models don’t work anymore. Instead of aiming for an overnight (and temporary) boost, the organic model uses keyword analysis, strategic inbound linking, on-page SEO and a campaign tailored to each client website in order to reach a high position within 90 days and have the site stabilized and firmly entrenched in that higher position within 180 days.
Google recently posted quick and easy tips for the holiday rush and, after reading back over it, I quickly realized that these tips aren't just for the holiday season, nor should they be recommendations limited to Google search.
As a result, I have compiled a list of quick and simple tips for websites to increase traffic - whether it's to make the sale online or to increase foot traffic to your brick-and-mortar location, your web presence is a critical part of your business plan. The tips below are fast, free, and can make a big difference in your organic SEO! Best of all, they were inspired by Google. :)
Verify that your site is indexed (and is returned in search results)
Check your snippet content and page titles with the site: command [site:example.com] -- do they look accurate and descriptive for users? Ideally, each title and snippet should be unique in order to reflect that each URL contains unique content. If anything is missing or you want more details, you can also use the Content Analysis tool in Google's Webmaster Tools. There you can see which URLs on your site show duplicate titles or meta descriptions.

Label your images accurately
Don't miss out on potential customers with poor or missing alt text. Good 'alt' text and descriptive filenames help search engines better understand images, so make sure you change non-descriptive file names [001.jpg] to something more accurate [NintendoWii.jpg]. Image Search has the potential to send significant search traffic, so you should take advantage of it.
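As a quick sketch (using the hypothetical filename from above), a descriptive filename plus good alt text might look like this:

<img src="NintendoWii.jpg" alt="Nintendo Wii console with controller">

The engines can't 'see' the image itself, so the filename and alt text are the main clues they get about its content.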
Know what the search engines know (about your site)
Check for crawl errors and learn the top queries that bring traffic to your site through Webmaster Tools. You can use Google's diagnostics checklist or MSN's crawl issues tool to aid in this process.
Have a plan for expiring and temporary pages
Make sure to serve accurate HTTP status codes. If you no longer sell a product, serve a 404. If you have changed a product page to a new URL, serve a 301 to redirect the old page to the new one. Keeping your site up-to-date can help bring more targeted traffic your way and provide an overall more useful visitor experience.
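For sites running Apache, a minimal sketch of the redirect case in an .htaccess file might look like this (the product URLs are hypothetical):

# Product moved to a new URL: send a permanent 301 redirect
Redirect 301 /products/old-widget.html http://www.example.com/products/new-widget.html

A page that simply no longer exists should be left to return a 404 - just make sure your server isn't 'soft-404ing' by returning a 200 status along with an error page.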
Increase foot traffic too
If your website directs customers to a brick-and-mortar location, make sure you claim and double check your business listing in Google Local, Local.com, MSN's Local Listing and Yahoo Local.
Usability 101
Test the usability of your checkout process with various browsers and ensure the coding is valid. Ask yourself if a user can get from product page to checkout without assistance. Is your checkout button easy to find?
Tell the search engines where to find all of your web pages
If you upload new products faster than the search engines crawl your site, make sure to submit a sitemap and include 'last modification' and 'change frequency' information. A sitemap can point searchbots to your new or hard-to-find content. Google's Webmaster Tools, MSN's Webmaster Central and Yahoo!'s Site Explorer all allow for sitemap submissions. You can read a quick question and answer on Google sitemaps for more details. Additionally, be sure to utilize robots.txt to funnel the search bots toward content you want crawled and away from content that should not be indexed.
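If you're creating the file by hand, a bare-bones sitemap entry with the 'last modification' and 'change frequency' fields might look like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/nintendo-wii.html</loc>
    <lastmod>2008-11-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Save it as sitemap.xml at the root of your site before submitting it to each engine.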
So, that's all for now. I know there are many, many more tips, but these should help capture some low-hanging fruit. Have your own tips? Feel free to add them in the comments section!
During day 2 of SES Chicago, Sharad Verma, Senior Product Manager of Yahoo! Search Technology, broke down how Yahoo! uses links to determine your website's crawling, indexing and ranking in the Yahoo! SERPs. All of the stages below relate to inbound links (IBLs) to your site.
1) No Links to your site = NO Existence; Yahoo! will Never find your site
2) Few Links to your site = Minor discovery, NO index
3) Some Links to your site = Crawl of your site, NO index
4) Some More Links to your site = Crawl of your site, Index of your site, NO Ranking in the SERP
5) Enough Links to your site = Ranking in the Top 10 of the Yahoo! SERP
6) LOTS of Links to your site = Helps you rank for your keywords and the keywords found in the anchor text of your IBLs in the Yahoo! SERP
Now, I know the question everyone is asking: Sharad never got into the actual number of links that correspond with each stage above. However, if you are an SEO or actively build links, you can fully understand the value of the items above. The most important takeaway? There is no doubt about whether this is fact or fiction, as the information came directly from Yahoo!'s own Sharad Verma.
I recently posted a useful SEO Flowchart that Aaron Wall over at SEO Book had put together. While folks found the SEO flowchart useful, it was more of a 'high level' look at the SEO process. The guys over at SEO Book have now put together a VERY useful keyword research flowchart that digs down into the deepest of detail. You can see the chart below - and feel free to stop by the SEO Book site for a larger image and PDF download.
USA Today recently asked Google's Matt Cutts for tips to help sites rank in the search engine. Cutts offered up 5 tips, plus a word of advice on implementing them. Here they are:
Spotlight your search term on the page. If you want to be found for your keyword, make sure that term is on the page you want to rank. The term should appear at the top as well as be peppered throughout your copy.
Fill in your "tags." The two most important tags are Title and Description because they are what is displayed in the search results.
Get other websites to "link" back to you. This is one of the most important of the 100 factors Google considers when determining search engine rankings.
Create a blog and post often. This can help you get links and generate new content.
And that word of advice? Don't overdo it. In other words, don't stuff your pages full of keywords in order to 'cheat' your website into better rankings.
What do you think of Cutts' advice? Leave a comment!
A few months ago I wrote a blurb about robots.txt - what it is and how to use it. Google Webmaster Central just released their own Robots.txt Generator. To access the tool, log in to your Google Webmaster Tools account, then click on the Tools menu option on the left-hand side of the screen after you select one of your verified sites. You'll see a "Generate robots.txt" link among the tool options. That's what you want.
The tool lets you craft specific rules for these particular Google crawlers:
Googlebot
Googlebot-Mobile
Googlebot-Image
Mediapartners-Google
AdsBot-Google
After you make your file, upload it to the root directory of your web site. If you don't know what that is, find someone who does! This is important!
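As a rough example, a generated file that keeps Googlebot-Image out of a hypothetical /photos/ folder while leaving everything else open to all crawlers might look like this:

User-agent: Googlebot-Image
Disallow: /photos/

User-agent: *
Disallow: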
For more about Google's webmaster tools, be sure to check out the quick start guide they offer.
There are many things to consider when beginning the SEO strategy for your website and each page within it. Aside from in-depth keyword research, the title you assign to each page is arguably the most important consideration. Andy MacDonald, CEO of Swift Media UK, has written a wonderful guest post at SEOptimize offering a number of tips to follow:
There are several considerations when coming up with your page titles. Here are some of the key factors to consider:
Unless you’re Microsoft, don’t use your company name in the page title. A better choice is to use a descriptive keyword or phrase that tells users exactly what’s on the page. This helps ensure that your search engine rankings are accurate.
Try to keep page titles to less than 50 characters, including spaces. Some search engines will index only up to 50 characters; others might index as many as 150. However, maintaining shorter page titles forces you to be precise in the titles that you choose and ensures that your page title will never be cut off in the search results.
Don’t repeat keywords in your title tags. Repetition can come across as spam when a crawler is examining your site, so avoid repeating keywords in your title if possible, and never duplicate words just to gain a crawler’s attention. It could well get your site excluded from search engine listings.
Consider adding special characters at the beginning and end of your title to improve noticeability. Parentheses (()), arrows (<<>>), asterisks (****), and special symbols like ££££ can help draw a user’s attention to your page title. These special characters and symbols don’t usually add to or distract from your SEO efforts, but they do serve to call attention to your site title.
Include a call to action in your title. There’s an adage that goes something like, “You’ll never sell a thing if you don’t ask for the sale.” That’s one thing that doesn’t change with the Web. Even on the Internet, if you want your users to do something you have to ask them to.
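Putting a few of these tips together, a hypothetical page title might look something like this:

<title>Buy Blue Widgets Online - Free Next Day Delivery</title>

It leads with a descriptive keyword phrase, includes a call to action, skips the company name, and comes in under 50 characters.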
Being a relative newbie to the SEO industry myself, I absolutely wish I had these resources at my disposal from day one. That being said, first things first! I want to say congratulations on your new job venture. Trust me, SEO is an addiction! Once you start SEO, you just can’t stop.
Now, if you are that green to the industry, you are probably wondering “Where the hell do I start?” Trust me; we have all felt that way! But don’t worry - Stephen Peron over at the e-Visibility Blog put together a list of SEO 101 sites, forums, links, tools and more. Here is the low-down!
SEO News Sites
You have to read every day! If you do not like reading, then I recommend you walk over to your boss right now and say “I quit.” It is important to stay updated on what is going on in the industry, because it is always changing.
Hopefully this points you in the right direction. Good luck, and most importantly, remember to have fun. If you stop having fun, then change your career.
For all of you who are experts, please add anything I may have missed in the comments below and I will update the list.
I found myself using Matt McGee’s post How to SEO Your Site in Less Than 60 Minutes as a quick reference from time to time for my SEO efforts - basically a checklist of sorts to ensure I had covered all my bases. It was a great write-up that was very useful to many people, judging from the comments and social buzz it received. I took the liberty of adding a few additional notes of my own; however, if there is anything I missed or overlooked, please add it in the comments below.

Contents
A: Homepage
B: Site
C: External

SEO Checklist
A1: Homepage - www.domain.com
1. Check for redirects and canonicalization issues
2. Choose http://domain.com or http://www.domain.com
3. Redirect domain.com/(index|main).(html|htm|php|cfm|asp) to domain.com
Apache redirects and editing .htaccess files: domain.com to www.domain.com

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,NC]
www.domain.com/index.php to www.domain.com

RewriteCond %{THE_REQUEST} ^[^/]*/index\.php [NC]
RewriteRule . / [R=301,L]

A2: Homepage – Navigation
1. Check for image, drop downs, javascript, image maps vs text navigation. Text is the best option.

A3: Homepage – Content
1. How much text is present? The more the better.
2. Check for keyword density in homepage content: http://www.ranks.nl/tools/spider.html
3. Check for use of H2 tags and bold fonts (light/appropriate use is good on keywords)
4. There should be a sitemap present
5. Do a select all (ctrl + A) to find potentially hidden text
6. Check to see how search engines will view your site with SEO Browser. Make sure everything is crawlable.
B1: Site – Meta Tags
1. Check Title tags. Are they using keywords and are they formatted correctly?
Brand authority formatting: Brand Name or Domain | Keyword, Keyword & Keyword
Non-brand authority formatting: Keyword, Keyword & Keyword | Brand Name or Domain
2. Check Descriptions for keywords and composition. Make sure the description gets to the point and speaks to the purpose/content on its respective page in the first couple sentences.
3. Make sure the keyword tag contains around 5 – 10 keywords. That range is all you really need.
4. Make sure there are no duplicate meta tags anywhere, site wide.
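Pulling those four checks together, a well-formed set of meta tags for a hypothetical page (using the non-brand authority title format) might look like this:

<title>Blue Widget Repair and Parts | ExampleWidgets.com</title>
<meta name="description" content="Step-by-step blue widget repair guides plus a full range of replacement parts. Find the fix you need in minutes.">
<meta name="keywords" content="blue widgets, widget repair, widget parts, widget guides, replacement widgets">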
B2: Site – URL Formatting
1. Check URL formatting. Dynamic URLs are bad, and URLs that are too long will be truncated in Google SERPs.
2. URLs should contain keywords separated by hyphens.
3. Hyphens are preferable to underscores.
4. Keywords in URLs should match the content contained within the page they lead to.
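For example (both URLs hypothetical), the first of these is far friendlier to users and crawlers than the second:

http://www.example.com/blue-widget-repair/
http://www.example.com/index.php?page=37&cat=12&session=8f3a2c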
B3: Analytics
1. Make sure you have some sort of analytics installed. It doesn’t have to be Google Analytics, but do remember that every page within the site should contain the analytics tracking code.

B4: Site – Links
1. Links should contain keywords
2. Links should contain titles utilizing keywords
3. Anchor text, link keywords, link title, and the page being linked to should be relevant to one another.
4. Site linking structure should be cyclical. There should be no dangling pages.
5. Use Xenu Link Sleuth to check for broken links
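As a quick sketch of points 1–3 (all names hypothetical), a well-formed internal link might look like this:

<a href="/blue-widget-repair/" title="Blue widget repair guides">blue widget repair guides</a>

The anchor text, the title attribute, and the URL all reinforce one another and match the content of the destination page.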
B5: Site – nofollows (advanced)
1. nofollow TOS, Privacy Policy, or other pages that don’t contribute to your site’s ranking.
2. If you know how to link funnel correctly, this should be done. I haven’t written anything on this yet, but you can consult Slightly Shady SEO or Andy Beard.

B6: Site – Robots.txt
1. Check for a robots.txt file. Does one exist?
2. See what’s being blocked and what’s not.
3. Make sure it’s written correctly (consult Sebastian’s Pamphlets for the best advice).

B7: Site – Duplicate Content
1. Make sure there is no duplicate content within your site.
2. Make sure there is no duplicate content on other domains. You can use CopyScape to check for dupe content.
B8: Site – PDF Files
1. Does the site contain PDF files? If so, these can be optimized with new titles, keywords, and comments. Use Adobe Acrobat Professional to edit PDFs.
B9: Site – Images
1. Images can have ALT attributes. Make sure to utilize these appropriately with keywords. When implemented, your site may gain traffic from image search engines like Google Image Search.

C1: External – Indexation
1. Perform a site:domain.com search on Google, Yahoo and MSN. Compare what’s being indexed and what isn’t.
* Install the FireFox extension Search Status by Craig Raw. You’ll be able to easily perform this operation, plus many other functions, with the Search Status plugin.

C2: External – Backlinks
1. Perform a backlink count with the Search Status plugin.
2. You may also want to install Joost de Valk’s backlink checker plugin for FireFox to check the anchor text of your backlinks within Yahoo Site Explorer or Google’s Webmaster Tools.
So that’s about all I can think of for the time being. If I forgot anything please submit your additions to this checklist in the comments below.
Starting a new company is extremely hard, which is probably why most businesses fail within the first couple of years.
Challenges such as marketing and hiring the right staff are some of the major issues that even good managers struggle with.
Beyond that, starting a website is no small challenge either! It can be just as hard or even harder! As if things weren't difficult enough, Google gives people an extra hurdle which is sometimes called the "sandbox".
When you launch a new site, Google doesn't trust it at all. Even if the BBC launched a new site it would start off without any trust and would receive very little traffic.
As the site attracts links from other sites, it gradually earns enough trust to start ranking for some long tail search terms. Think of links as 'votes' that tell the Search Engines what your site is about and that it is trustworthy.
If the site gains enough trusted links it may start to rank highly for competitive keywords (i.e. keywords with lots of PPC advertisers), but this can take up to 24 months.
You are probably thinking that this is a harsh move by Google, but with the sheer volume of sites being launched it needs to have some method of making sure only the really good ones reach the top.
Luckily, e-consultancy put together a couple of simple initial rules to follow to ensure your new website gets off on the right foot and reaches the top quicker.
1. Your first links
When you launch your brand new site the first few links can make all the difference. Linking to it from other trusted sites right out of the gate can dramatically shorten the time it takes to build trust. Conversely, starting a site off with low quality links is suicide.
Submit to a couple of trusted directories such as Business.com and the Yahoo! Directory and maybe link to the site from some of your other company's websites (link from the news sections to appear more natural).
2. Spin-off sites
Sometimes a new site is a spin-off from a particular section of an existing site. For example, a car insurance site might have a caravan insurance section but then decide to launch a dedicated caravan insurance website.
In this case, the best method would be to create the new site on the new domain and not allow Google to access it. Then, once the site is ready, you would 301 redirect the old pages to the new pages and hope that the rankings and trust from the old domain pass across.
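One simple way to keep Google out while you build is a catch-all robots.txt on the new domain - something like the sketch below - which you would then remove at launch, when the 301 redirects go in place:

User-agent: *
Disallow: /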
3. Building trust
Most people try to get relevant links to their sites, which are great for improving rankings in a particular niche but not quite as good for building trust. The ideal links for building trust are from major blogs and news sites.
These can also be quite relevant as usually the article linking to you is related to your niche - even though the rest of the site isn't.
Sites that receive a lot of attention from the mainstream press almost always start ranking a lot more quickly than sites that don't have the benefit of a large PR budget.
So, I was browsing through one of my favorite blog sites, Search Engine Land, and stumbled upon a pretty cool posting.
It is always a struggle to explain SEO and its intricacies to clients with little to zero knowledge, and even more so when you get down to the minute details - such as anchor text and why it should mention a keyword instead of your company name.
Below is a quick illustration that can help explain anchor text to anyone!
Having relevant keywords as the anchor text of a link is extremely beneficial to an SEO campaign. Because search engines depend on links as a sort of pathway around the Internet, anchor text helps categorize what a destination page will be about as well as what it may be relevant for in a search result.
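In plain HTML terms (both links hypothetical), the difference looks like this:

<a href="http://www.example.com/widgets/">blue widgets</a>
<a href="http://www.example.com/widgets/">Acme Inc.</a>

The first anchor tells the engines the destination page is about blue widgets; the second only tells them a company name.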
This week's infographic demonstrates what anchor text is and how it can positively affect your rankings:
Graphic by Elliance, an eMarketing firm specializing in results-driven search engine marketing, web site design, and outbound eMarketing campaigns. The firm is the creator of the ennect online marketing toolkit. The Search Illustrated column appears Tuesdays at Search Engine Land.
I submitted a Sitemap, but my URLs haven't been [crawled/indexed] yet. Isn't that what a Sitemap is for?
If it doesn't get me automatically crawled and indexed, what does a Sitemap do?
Will a Sitemap help me rank better?
If I set all of my pages to have priority 1.0, will that make them rank higher (or get crawled faster) than someone else's pages that have priority 0.8?
Is there any point in submitting a Sitemap if all the metadata (<changefreq>, <priority>, etc.) is the same for each URL, or if I'm not sure it's accurate?
I've heard about people who submitted a Sitemap and got penalized shortly afterward. Can a Sitemap hurt you?
Where can I put my Sitemap? Does it have to be at the root of my site?
Can I just submit the site map that my webmaster made of my site? I don't get this whole XML thing.
Which Sitemap format is the best?
If I have multiple URLs that point to the same content, can I use my Sitemap to indicate my preferred URL for that content?
Does the placement of a URL within a Sitemap file matter? Will the URLs at the beginning of the file get better treatment than the URLs near the end?
If my site has multiple sections (e.g. a blog, a forum, and a photo gallery), should I submit one Sitemap for the site, or multiple Sitemaps (one for each section)?
Robots.txt files are often mentioned as being an important foundation of a search friendly web site. To site owners and small businesses who are new to search marketing, the robots.txt file can sound daunting. In reality, it's one of the fastest, simplest ways to make your site just a little more search engine friendly.
What is Robots.txt?
Robots.txt is a simple text file that sits on the server with your web site. It's basically your web site's way of giving instructions to search engines about how they should index your web site.
Search Engines tend to look for the robots.txt file when they first visit a site. They can visit and index your site whether you have a robots.txt file or not; having one simply helps them along the way.
All of the major search engines read and follow the instructions in a robots.txt file. That means it's a pretty effective way to keep content out of the search indexes.
A word of warning. While some sites will tell you to use robots.txt to block premium content you don't want people to see, this isn't a good idea. While most search engines will respect your robots.txt file and ignore the content you want to have blocked, a far safer option is to hide that premium content behind a login. Requiring a username and password to access the content you want hidden from the public will do a much more effective job of keeping both search engines and people out.
What Does Robots.txt Look Like?
The average robots.txt file is one of the simplest pieces of code you'll ever write or edit.
If you want to have a robots.txt file for the engines to visit, but don't want to give them any special instructions, simply open up a text editor and type in the following:
User-Agent: *
Disallow:
The "User-Agent" part specifies which search engines you are giving the directions to. Using the asterisk means you are giving directions to ALL search engines.
The "disallow" part specifies what content you don't want the search engines to index. If you don't want to block the search engines from any area of your web site, you simply leave this area blank.
For most small web sites, those two simple lines are all you really need.
If your web site is a little bit larger, or you have a lot of folders on your server, you may want to use the robots.txt file to give some instructions about which content to avoid.
A good example of this would be a site that has printer-friendly versions of all of their content housed in a folder called "print-ready." There's no reason for the search engines to index both forms of the content, so it's a good idea to go ahead and block the engines from indexing the printer-friendly versions.
In this case, you'd leave the "user-agent" section alone, but would add the print-ready folder to the "disallow" line. That robots.txt file would look like this:
User-Agent: *
Disallow: /print-ready/
It's important to note the forward slashes before and after the folder name. The search engines will tack that folder on to the end of the domain name they are visiting.
That means the /print-ready/ folder is found at www.yourdomain.com/print-ready/. If it's actually found at www.yourdomain.com/css/print-ready/, you'll need to format your robots.txt this way:
User-Agent: *
Disallow: /css/print-ready/
You can also edit the "user-agent" line to refer to specific search engines. To do this, you'll need to look up the name of a search engine's robot. (For instance, Google's robot is called "googlebot" and Yahoo's is called "slurp.")
If you want to set up your robots.txt file to give instructions ONLY to Google, you would format it like this:
User-Agent: googlebot
Disallow: /css/print-ready/
How do I Put Robots.txt on my Site?
Once you've written your robots.txt file to reflect the directions you want to give the search engines, you simply save the text file as "robots.txt" and upload it to the root folder of your web site.
Friendly Disclaimer :) Before I go too much further, I should point out that there are 200+ factors to be considered when doing a full optimisation campaign. Every industry and website is different, so a technique that might work well for one business may not have the same results for another.
On top of that, it’s also important to understand that whilst applying the steps in this article will help make the search engines crawl and index your site more effectively, you’ll still need to incorporate off-site techniques like link building and utilizing social media and other forms of online and off-line marketing etc. to achieve the best possible results.
1. Content, Content, Content and Content!
The most important part of any website is its content. Without interesting and compelling content, you won’t be able to gain inbound links and, more importantly, you won’t be able to make many sales.
The more content you have, the more authoritative you’ll appear to your clients, and the more information the search engines will have to index.
It may seem daunting trying to come up with new things to write about and add to your site on a regular basis, but it can be easier than you think. Try starting a blog or a regular newsletter. If you’re really struggling, you can always delegate the task to a staff member or hire a professional copywriter. It will be well worth it in the end.
2. Navigation Structure
Keep your navigation structure as simple and text-based as possible. While shiny, animated navigation buttons might look nice, they can make it hard for the search engines to effectively crawl your site. What’s the point in having a flashy looking website if it never gets indexed and no one can find it? More importantly, why spend the money on a site that cannot be found by search engines and, as a result, humans?
If it’s important in your industry to have a site with all the ‘bells and whistles’, then at least use a text-based navigation menu in the footer of your pages, try to use breadcrumbs throughout the site, and be sure to have a sitemap.
It’s also a great idea (if possible without making it look spammy) to incorporate some of your keywords into the anchor text in your footer navigation menu. For instance, if your site is about blue widgets, instead of making the link back to your index page say ‘home’ you could change it to say ‘blue widgets homepage’.
One final side-note on navigation - when linking back to your index page, make it link back to the full URL of your site instead of index.htm, as this can help prevent canonicalization issues.
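Putting those last two tips together, a footer link sketch (using the hypothetical blue widgets example) might look like this:

<a href="http://www.example.com/">blue widgets homepage</a>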
3. Don’t Flash
As mentioned above, having a Flash website might look very impressive - but what’s the point if no one will ever find it when they do a search?
4. Title Tags
Title tags are one of the most important on-site SEO tools a webmaster has at their disposal.
Here are some Do’s and Dont’s for structuring your title tags:
Don’t use the same tags on every page
Don’t put your company name (unless your main purpose is branding) at the start of your title tag
Do put one or two keywords in your title tag, but don’t overdo it, otherwise your site will look spammy to both humans and search spiders - just make sure the tag is both readable and relevant to your site
5. Meta Description
Strictly speaking, this tag has no SEO benefit - but it is still important. Why? Because a good description can be the difference between someone clicking on your site and clicking on a competitor’s that may appear above or below you in the SERPs.
Be sure to include a call to action! I often tell search marketers to think of their meta-description as an Adwords ad with additional character space.
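To illustrate (the copy is hypothetical), a description written like a well-crafted ad might read:

<meta name="description" content="Huge range of blue widgets in stock with free next-day delivery. Compare models and order online today!">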
6. Heading Tags
Well-optimized heading tags (H1, H2, H3, etc.) not only help the search engines determine what a page is about, they can also make your page more readable by breaking up the topics into appropriately titled sections.
Try to mix things up so that you don’t have the same keywords mentioned in your Title Tag and Heading Tags.
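A quick sketch of a well-structured page (hypothetical content) might look like this:

<h1>Blue Widget Repair Guides</h1>
<h2>Fixing a Cracked Casing</h2>
<h2>Replacing Worn Bearings</h2>

Note how each H2 varies its wording rather than repeating the H1’s keywords verbatim.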
7. Keyword Density
There have been many statistics thrown around over the years about the ideal number of times to mention your keywords throughout the body of your text. I personally don’t aim for a certain percentage and instead encourage my clients to come up with great content that focuses on their business’ core products and services. By doing this, you’ll naturally incorporate your keywords throughout the text without it looking overdone.
If you want to go one step further, you can also highlight your keywords by making them bold or in italics, but once again, it’s important not to overdo this as it can make great content start to look spammy.
8. Image Tags
Use the ALT attribute to apply meaningful descriptors to your images. This can help your site rank for image searches and makes the site more usable if people are viewing your page via a mobile browser and have images disabled.
9. Keywords in Domain and URL
Having your keywords mentioned in your domain name or in the names of your pages can be a great way to ensure people use your keywords in the anchor text when they link to you; plus, they will appear in bold in the SERPs when someone searches for those terms.
That said, it’s far more important to have a memorable, concise domain than a name consisting of 20 keywords held together with hyphens with a .net at the end (e.g. www.this-is-not-an-ideal-looking-domain-name.net).
10. Be sensible about everything
All the points above can help, but you can also easily go overboard by using the same keywords over and over again in your page names, titles, headers, meta tags, in bold throughout your content etc. etc.
Just use your common sense - try and read through your page objectively (or as a customer) and if it flows well and looks good you really shouldn’t have anything to worry about. If you have trouble being objective about your own site, get a friend or family member to have a read and give you feedback.