SEO
Domain Name buying tip: Know your phonetics and morphology
Feb 3rd
Anyone buying a domain who is concerned about competitors or domain name squatters sneaking in should consider:
1. The homophones (i.e. same pronunciation) for the prefix (if any), root and suffix (if any) of the domain name (e.g. fone and phone)
2. The plural form or singular form for nouns (e.g. -tune and -tunes)
3. Other forms of verbs (e.g. -stack and -stacking (gerund) or -stacked (past participle))
4. The hyphenated form for compound words
I’ve paid 3x for a plural version of a domain that someone grabbed after I’d just bought the singular form. Unfortunately, consumers tended towards the plural form when recalling the name, which made the purchase necessary. That’s another strategy domain squatters may use against you when you purchase a domain name – they see the registration change, know that the name is worth something to someone, and so they immediately grab the plural form.
A good domain name finding tool would include phonetic and morphological tools to help you find domains where all relevant forms are available (startup idea for someone).
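No such tool exists in this post, but as a rough sketch of the idea, a variant generator could start from a few naive rules (a plural/singular swap and the post’s own fone/phone homophone) and list candidates to check for availability. Everything below, including the class and method names, is my own illustration, not a real product:

using System;
using System.Collections.Generic;

class DomainVariantSketch
{
    // Hand-rolled homophone map for illustration only (the post's own fone/phone example).
    static readonly Dictionary<string, string> Homophones =
        new Dictionary<string, string> { { "phone", "fone" } };

    static IEnumerable<string> Variants(string name)
    {
        yield return name;                                                 // the form you bought
        yield return name.EndsWith("s") ? name.TrimEnd('s') : name + "s";  // naive plural/singular swap

        foreach (var pair in Homophones)
            if (name.Contains(pair.Key))
                yield return name.Replace(pair.Key, pair.Value);           // homophone substitution
    }

    static void Main()
    {
        foreach (var variant in Variants("phonestack"))
            Console.WriteLine(variant + ".com");                           // check each of these for availability
    }
}

Real phonetic matching would need something like Soundex or Metaphone, proper pluralization needs a morphological stemmer rather than bolting an ‘s’ on the end, and hyphenated forms of compound words belong on the list too.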
Web site crawler and link checker (free)
Jan 13th
In a previous post I provided a utility called LinkChecker that is a web site crawler and link checker. The idea behind LinkChecker is that you can include it in your continuous integration scripts and thus check your web site either regularly or after every deployment; unlike a simple ping check, this one will fail if you’ve broken any links within your site or have SEO issues. It will also fail just once for each site change and then pass again the next time you run it. This feature means that in a continuous integration system like TeamCity you can get an email or other alert each time your site (or perhaps your competitor’s site) changes.
As promised in that post, a new version is now available. There are many improvements under the covers, but one obvious new feature is the ability to dump all the text content of a site into a text file. Simply append -dump filename.txt to the command line and you’ll get a complete text dump of any site. The dump includes page titles and all visible text on the page (it excludes embedded script and CSS automatically). It also excludes any element with an ID or CLASS that includes one of the words “footer”, “header”, “sidebar”, “feedback” so you don’t get lots of duplicate header and footer information in the dump. I plan to make this more extensible in the future to allow other words to be added to the ignore list.
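LinkCheck’s own filtering code isn’t published, but the general technique is simple enough to sketch. Assuming the HtmlAgilityPack library (my choice here, not necessarily what LinkCheck uses), stripping script, CSS and the ignored header/footer/sidebar/feedback elements might look roughly like this:

using System;
using System.Linq;
using HtmlAgilityPack;

class TextDump
{
    static readonly string[] IgnoreWords = { "footer", "header", "sidebar", "feedback" };

    // Returns the visible text of a page, minus script/style elements and
    // any element whose id or class contains one of the ignore words.
    static string ExtractVisibleText(string html)
    {
        var doc = new HtmlDocument();
        doc.LoadHtml(html);

        var toRemove = doc.DocumentNode.Descendants()
            .Where(n => n.Name == "script" || n.Name == "style" ||
                        IgnoreWords.Any(w =>
                            n.GetAttributeValue("id", "").ToLowerInvariant().Contains(w) ||
                            n.GetAttributeValue("class", "").ToLowerInvariant().Contains(w)))
            .ToList();

        foreach (var node in toRemove)
            node.Remove();

        return HtmlEntity.DeEntitize(doc.DocumentNode.InnerText);
    }

    static void Main()
    {
        Console.WriteLine(ExtractVisibleText(
            "<html><head><title>Demo</title></head><body><div class='header'>skip</div><p>Keep this.</p></body></html>"));
    }
}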
One technique you can use with this new ‘dump’ option is to dump a copy of your site after each deployment and then check it into source control. Now if there’s ever any need to go back to see when a particular word or paragraph was changed on your site you have a complete record. You could, for example, use this to maintain a text copy of your WordPress blog, or perhaps to keep an eye on someone else’s blog or Facebook page to see when they added or removed a particular story.
Download the new version here:- LinkCheck <-- Requires Windows XP or later with .NET4 installed, unzip and run
Please consult the original article for more information.
LinkCheck is free; it doesn’t make any call-backs and doesn’t use any personal data. Use at your own risk. If you like it, please make a link to this blog from your own blog or post a link to Twitter, thanks!
Continuous Link and SEO Testing – Announcing LinkCheck2
Sep 21st
First there was Continuous Integration, then there was Continuous Deployment, now there’s Continuous Testing.
Testing can (and should) be integrated throughout your web site development process: automated unit testing on developers’ machines, automated unit testing during the continuous integration builds, and then further automated testing after your continuous deployment process has deployed the site to a server.
Sadly, once deployed, most sites get only a cursory test through a service like Monastic that pings one or more URLs on your site to check that the site is still alive.
BUT, how do you know if your site is still working from a user’s perspective or from an SEO perspective? Serious bugs can creep in from seemingly small changes that aren’t in code but in the markup of a site, and these are often not covered by any of the aforementioned tests. For example, a designer editing HTML markup for your site could accidentally break the sign-up link on the main entry page, or the page you had carefully crafted to be SEO-optimized around a specific set of keywords could accidentally lose one of those words, drop in the search engine rankings and send your traffic down. Would you even know if this had happened?
Based on a small test I ran on some local startup web sites, the answer appears to be ‘no’. These sites often had broken links and poorly crafted titles (from an SEO perspective). Of course they could have used any of the many SEO services that can check your site to see if it has broken links or poorly crafted titles and descriptions (e.g. seomoz.com), but that’s often a manual process and there’s no way to link such tests into your existing continuous integration process.
What would be nice would be if you could include a ‘Continuous Link and SEO test’ on your Continuous Integration Server. This test could be triggered after each deployment and it could also run as a scheduled task, say every hour, to check that your web site is up and that all public pages are behaving correctly from a links and SEO perspective. It would also be nice if there was some way to get a quick report after each deployment confirming what actually changed on the site: pages added, pages removed, links added, links removed.
This is what my latest utility ‘LinkCheck2’ does. It’s a Windows command-line application that produces a report, and it will set an error code if it finds anything amiss. You can run it from the command line for a one-off report or call it from your continuous integration server. The error code can be used by most CI servers to send you an alert. If you are using the changes feature, you’ll get an alert when something changes, and on the next run it will automatically clear.
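Most CI servers can run a command-line step directly and fail the build on a non-zero exit code; if you do need to wrap the call yourself, a minimal sketch looks like this (the URL argument shown is an assumption for illustration, not a documented option):

using System;
using System.Diagnostics;

class RunLinkCheck
{
    static int Main()
    {
        // The URL argument here is assumed for illustration.
        var info = new ProcessStartInfo("LinkChecker.exe", "http://example.com")
        {
            UseShellExecute = false
        };

        using (var proc = Process.Start(info))
        {
            proc.WaitForExit();
            // A non-zero exit code means broken links, SEO issues or site changes;
            // returning it makes the CI build step fail and trigger an alert.
            Console.WriteLine("LinkCheck exit code: " + proc.ExitCode);
            return proc.ExitCode;
        }
    }
}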
LinkCheck2 also includes the ability to define a ‘link contract’ on your site. This is a meta tag you add to a page to say ‘this page must link to these other pages’. LinkCheck2 will verify that this contract has been met and that none of your critical site links have been dropped by accident when someone was editing the markup.
At the moment LinkCheck2 checks all links and performs a small number of SEO tests (mostly around the length of titles). If there is interest in this tool I may expand the SEO capabilities, please send me your feedback and requests.
Use of LinkChecker.exe is subject to a license agreement: in a nutshell: commercial use is permitted, redistribution is not. Please contact me for details.
A WordPress plugin to link pages by ID but still have SEO friendly permalinks
May 27th
WordPress provides several ways to link to other pages or posts on your blog or web site, but none of them was good enough for what I wanted so I wrote this plug-in.
Requirements:-
- 1) Link pages so that even if the page changes its permalink path while you are constructing your site, links from other pages don’t break.
- 2) Use full SEO-friendly permalinks throughout the site; never show a user a URL like ?p=145
To use it, you find the page or post ID (you can see it in the URL of any page you are editing), then add a shortcode like [permalink id=123 text="Link text"]. If you omit the link text it will use the linked page’s title.
With this in place you are free to change any page’s permalink during the development process, or even to develop the site under a different domain and then move it to the final domain, and nothing breaks in the process! Of course you shouldn’t go changing permalinks after you’ve deployed your site: that would break inbound links, including URLs that Google has cached in its search results. During the development process, though, this plugin can help ensure all the links keep working even as you make major changes to the structure of your site.
I also added a [childpages id=3] shortcode which gives you a list of pages under a particular page in the navigation structure of your site.
Here’s the code; save it to a .php file in your plug-ins directory and activate it.
<?php
/*
Plugin Name: Ian's Page Links extensions
Plugin URI: http://blog.abodit.com/wordpress
Description: Allows you to link to pages using just the page or post ID [permalink id=3 text='xxx'] also provides a way to list child pages under a given page: [childpages id=5]
Version: 1.0
Author: Ian Mercer
Author URI: http://blog.abodit.com
License: GPL2
*/

// [childpages id=5] - list the pages that sit under the given page in the navigation structure
function ian_childpages($atts) {
    extract(shortcode_atts(array('id' => 8), $atts));
    $children = wp_list_pages('title_li=&child_of=' . $id . '&echo=0');
    return '<ul>' . $children . '</ul>';
}

// [permalink id=3 text='xxx'] - link to a page or post by ID; uses the target page's title if no text is given
function ian_permalink($atts) {
    extract(shortcode_atts(array('id' => 3, 'text' => 'Missing'), $atts));
    if ($text == 'Missing')
        $text = get_the_title($id);
    return '<a href="' . get_permalink($id) . '">' . $text . '</a>';
}

add_shortcode('childpages', 'ian_childpages');
add_shortcode('permalink', 'ian_permalink');
?>
Thinking about how Search Engine Marketing (SEM) can be applied to other areas of business
Apr 24th
One thing that struck me while I was building the SEO Keyword Search and Mapping tool was that keyword analysis can reveal a lot about what people are thinking about and what they are looking for. SEO keyword analysis is the largest-scale unprompted-recall survey you can possibly run. You can apply it to almost any industry and it gives you a detailed picture of what the customers of that industry really want.
Of course, prior to the launch of the SEO Keyword Search and Mapping tool it was pretty hard to actually see that picture, but now it’s quite easy.
So what can you do with this new-found understanding of your customers?
Well, obviously you can create AdWords campaigns around it – that’s why Google provides the information to you in the first place. And clearly you can craft better landing pages with more keyword-friendly titles, headings, body content and images. But why stop there? Here are four more things you can do with the information:
1. Plan your next blog post using it – make sure you’ve covered all the topics people are asking about for your industry
2. Create a digital sign using the keywords that are most interesting to people who visit your retail locations
3. Use them in Twitter campaigns
4. Use them to craft better direct marketing messages whether email or postal
SEO Keyword Search and Mapping Tool now available
Apr 22nd
Today marks the release of my SEO Keyword Mapping tool at http://seokeywordsearch.com.
This released version lets you create a keyword map for any combination of up to three search phrases.
SEO Myths
Apr 20th
There’s a post over on Search Engine Land that has a good list of SEO Myths and things to avoid: http://searchengineland.com/36-seo-myths-that-wont-die-but-need-to-40076
There are however two points with which I take issue:
“14. It’s important for your rankings that you update your home page frequently (e.g. daily.) This is another fallacy spread by the same aforementioned fellow panelist. Plenty of stale home pages rank just fine, thank you very much.”
Yes, there are stale pages that rank highly, but that doesn’t mean updating your site regularly isn’t going to boost your search engine rankings; the stale pages may be at the top of their category for other reasons. If you aren’t #1 on the keywords you care about then I highly recommend adding fresh, relevant content to your site on a regular basis. I’ve observed competing sites leap up two to ten places in the rankings when they updated their site and then gradually drop back over the following days or weeks. Google does care about recency, so this isn’t such good advice as written.
“36. Great Content = Great Rankings. Just like great policies equals successful politicians, right?”
I’m not sure what the author means by this point. Content and in-bound links form the backbone of SEO efforts. So don’t stop adding great content to your site!
ASP.NET MVC SEO – Solution Part 1
Mar 7th
In a previous post I explained some of the issues with ASP.NET MVC when trying to implement an SEO-optimized web site. In this post I’ll begin to explore some possible solutions.
Step 1: Master View – some additions
First let’s make it easy to set the meta description, page title, meta keywords and canonical URL by adding the following to the head section of the master view:
<head id="Head1" runat="server">
    <title><%=ViewData["PageTitle"]%></title>  <%-- This gets wrapped here, so it sees a title tag and doesn't emit two --%>
    <%=ViewData["PageDescription"]%>           <%-- These are wrapped elsewhere so they vanish if not set --%>
    <%=ViewData["PageKeywords"]%>              <%-- These are wrapped elsewhere so they vanish if not set --%>
    <%=ViewData["CanonicalUrl"]%>              <%-- These are wrapped elsewhere so they vanish if not set --%>
    <meta name="robots" content="noodp" />     <%-- Don't use Open Directory Project descriptions --%>
Note how we do not wrap the canonical URL and meta tags around the ViewData values here in the view (even though that might seem the more correct place to do it). The complete tag is built elsewhere and stored in ViewData, so that when a value is not set the entire tag disappears from the page instead of rendering a tag with an empty string in it. The title tag, however, is needed on every page, so that one is done ‘properly’ in the view.
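The ‘wrapped elsewhere’ part isn’t shown in this post (the canonical URL gets an attribute-based treatment in the next one), but to make the pattern concrete, a base-controller helper along the following lines would do it. The class and method names are my own placeholders, not the author’s actual code:

using System.Web;
using System.Web.Mvc;

public abstract class SeoControllerBase : Controller
{
    // Store the complete tag (or nothing at all) in ViewData so the master
    // view can emit it verbatim and empty tags never appear on the page.
    protected void SetPageDescription(string description)
    {
        if (!string.IsNullOrEmpty(description))
            ViewData["PageDescription"] =
                "<meta name=\"description\" content=\"" + HttpUtility.HtmlAttributeEncode(description) + "\" />";
    }

    protected void SetCanonicalUrl(string url)
    {
        if (!string.IsNullOrEmpty(url))
            ViewData["CanonicalUrl"] =
                "<link rel=\"canonical\" href=\"" + HttpUtility.HtmlAttributeEncode(url) + "\" />";
    }
}

Since <%= %> writes its value out unencoded, the string stored in ViewData renders as real markup, which is exactly what this pattern relies on.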
In tomorrow’s post I’ll show how we can set the Canonical URL using an attribute.
How SERP rankings change over time – SEO
Mar 4th
Being an engineer at heart I like to measure things and take them apart to see how they work. Search engines, however, are somewhat opaque – they use hundreds of different algorithms to create Search Engine Results Pages (SERPs), and those algorithms are a closely guarded secret and the subject of much debate and investigation in the SEO community.
So, since I wanted to learn more about SEO, I decided to track and graph the top 100 ranked entries for a particular keyword over several weeks, and here is the result. Click to enlarge it.
Some observations:
(i) volatility is clearly much lower among the top-ranked sites – moving one position from 4 to 3 is going to be a lot harder than moving from 94 to 93.
(ii) some sites can have stunning leaps for a short time but then get reset back to their former position – presumably getting the ‘recent content’ lift but then failing to capitalize on it with more frequent updates.
(iii) search engine optimization (SEO) is not something you can do once and forget, it’s something you need to stay on top of with frequent updates to your site and constant in-bound link building efforts.
I plan to compare this chart against some other keywords I’ve been tracking to see how they compare. I also want to track each site to determine what caused the particularly stunning leaps or falls in the rankings and learn from it. Maybe a chart like this could help identify which keywords are more volatile than others and therefore which ones you can make most progress against.
It’s early days – I only generated the first chart a few hours ago – so stay tuned for updates on this project.
ASP.NET MVC meet SEO; SEO meet ASP.NET MVC
Mar 3rd
Whilst ASP.NET MVC is clearly the best thing to hit .NET web development in a long time, it seems like the framework itself is somewhat challenged when it comes to SEO. For starters, the concept of a page has all but disappeared – sure, you can have a ViewPage, but there’s no code associated with it. And sure, you have ASP.NET Routing so you can do anything you like with routes, but the catch-all route {controller}/{action}/{id} is as much a liability as it is a benefit: it catches things you really didn’t want it to catch and all too easily generates routes you really didn’t want to generate.
Convention over configuration is nice and all that, but sometimes a bit of configuration is necessary to bring your house into order, especially when the convention doesn’t allow things you really want for SEO.
So let’s take a look at all the things we really want to be able to do when creating an SEO friendly web site and see how we can get ASP.NET MVC to handle them.
For SEO we need:-
1. The ability to define a canonical URL for a page, to use that canonical URL whenever we generate a route, and to include it in the page header to tell search engines that this is the canonical URL for that page.
2. The ability to define multiple alternate URLs for a page. Plans change and your site changes too, but you don’t want 404 errors: you want the user to land on the same page even if you changed the URL, for example to improve its SEO keyword content. Ideally you’d 301-redirect these legacy URLs, but having them at least display the right page, with the canonical URL in the header, is good enough.
3. The ability to use hyphens in URLs. Since controllers are classes and actions are methods, and the convention is to use their names as parts of the URL, this isn’t supported out of the box (see the sketch after this list).
4. The ability to define title, meta description and meta keywords tags for a page in such a way that you can enforce rules around them, such as requiring every public page to have a title tag, ensuring that the length of the title tag is reasonable, or ensuring that your product name is on the end of every title tag.
5. The ability to build a sitemap.xml file that we can submit to Google or Bing containing every URL that we want them to index.
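To make point 3 a little more concrete before the follow-up posts: the crudest workaround is simply to map each hyphenated URL to its controller and action explicitly. The route and page names below are assumptions for illustration only:

using System.Web.Mvc;
using System.Web.Routing;

public static class SeoRoutes
{
    public static void Register(RouteCollection routes)
    {
        // Hypothetical page: /link-checker served by HomeController.LinkChecker.
        // Mapping it explicitly lets the public URL keep its hyphens even though
        // C# class and method names cannot contain them.
        routes.MapRoute(
            "LinkCheckerPage",
            "link-checker",
            new { controller = "Home", action = "LinkChecker" });

        // The default catch-all route stays last so it doesn't shadow the SEO routes.
        routes.MapRoute(
            "Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}

You would call SeoRoutes.Register(RouteTable.Routes) from Application_Start in Global.asax. Listing every page by hand obviously doesn’t scale, which is where the more general scheme promised in the follow-up posts comes in.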
In my next few posts I’ll explain how we can overcome all of these shortcomings of ASP.NET MVC to create a great SEO-friendly web site.
Stay tuned!