Wednesday, December 28, 2011

SEO Book.com

Website Auditor Review: A Full-Featured On-Page Optimization Tool

Posted: 28 Dec 2011 09:11 AM PST


Website Auditor is one of the 4 tools found in Link-Assistant's SEO Power Suite; it is Link-Assistant's on-page optimization tool.

We recently reviewed 2 of their other tools, SEO Spyglass and Rank Tracker. You can check out the review of SEO Spyglass here and Rank Tracker here.

What Does Website Auditor Do?

Website Auditor crawls your entire site (or any site you want to research) and gives you a variety of on-page SEO data points to help you analyze the site you are researching.

We are reviewing the Enterprise version here; some options may not be available if you are using the Professional version.

In order to give you a thorough overview of a tool we think it's best to look at all the options available. You can compare versions here.

Getting Started with Website Auditor

To get started, just enter the URL of the site you want to research:

website-auditor-enter-url

I always like to enable the expert options so I can see everything available to me. The next step is to select the "page ranking factors":

wa-select-page-factors

Here, you have the ability to get the following data points from the tool on a per-page basis (a rough sketch of collecting a few of these yourself follows the list):

  • HTTP status codes
  • Page titles, meta descriptions, meta keywords
  • Total links on the page
  • Links on the page to external sites
  • Robots.txt instructions
  • W3C validation errors
  • CSS validation errors
  • Any canonical URLs associated with the page
  • HTML code size
  • Links on the page with the nofollow attribute
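
To make the per-page factors concrete, here is a rough sketch of pulling a few of them for a single URL with Python's requests and BeautifulSoup libraries. This is just an illustration of the idea, not Website Auditor's actual implementation, and example.com is a placeholder:

    import requests
    from bs4 import BeautifulSoup

    def page_factors(url):
        r = requests.get(url, timeout=15)
        soup = BeautifulSoup(r.text, "html.parser")
        desc = soup.find("meta", attrs={"name": "description"})
        canonical = soup.find("link", rel="canonical")
        links = soup.find_all("a", href=True)
        return {
            "url": url,
            "status": r.status_code,              # HTTP status code
            "title": soup.title.string.strip() if soup.title and soup.title.string else "",
            "meta_description": desc["content"] if desc else "",
            "canonical": canonical["href"] if canonical else "",
            "total_links": len(links),            # total links on the page
            "nofollow_links": sum("nofollow" in (a.get("rel") or []) for a in links),
            "html_size": len(r.text),             # HTML code size in characters
        }

    print(page_factors("https://example.com/"))   # example.com is just a stand-in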

Your next option is to select the crawl depth. For deep analysis you can certainly select no crawl limit and tick the option to find unlinked-to pages in the index.

wa-step-3

If you frequently want to go nuts with the crawl depth, I'd suggest looking into a VPS to house the application so you can run it remotely. Deep, deep crawls can take quite a while.

I know HostGator's VPSs as well as a Rackspace Cloud Server can be used with this, and I'm sure most VPS hosting options will allow for it as well.

I'm just going to run 2 clicks deep here for demonstration purposes.
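
To make the crawl-depth setting concrete, here is a minimal breadth-first crawl that stops two clicks from the start URL. Again, this is only a sketch of the idea (using requests and BeautifulSoup), not how Website Auditor itself is built:

    from urllib.parse import urljoin, urlparse
    import requests
    from bs4 import BeautifulSoup

    def crawl(start_url, max_depth=2):
        seen, queue = {start_url}, [(start_url, 0)]
        host = urlparse(start_url).netloc
        while queue:
            url, depth = queue.pop(0)
            try:
                html = requests.get(url, timeout=15).text
            except requests.RequestException:
                continue
            yield url, depth
            if depth >= max_depth:
                continue                          # don't follow links past the limit
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                link = urljoin(url, a["href"]).split("#")[0]
                if urlparse(link).netloc == host and link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))

    for url, depth in crawl("https://www.example.com/", max_depth=2):
        print(depth, url)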

Next up are the filtering options. Maybe you only want to crawl a certain section or sections of a site. For example, maybe I'm just interested in the auto insurance section of the Geico site for competitive research purposes.

Also, for e-commerce sites (or any site, for that matter) you may want to exclude certain URL parameters to avoid mucked-up results. There is also an option (see below) where you can have Website Auditor treat pages that are similar but carry odd parameters as the same page.
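
Both filtering ideas are easy to picture in code: keep only URLs under the section you care about, and strip noisy query parameters so near-duplicate URLs collapse to one page. A quick sketch; the parameter names (sessionid, sort, ref) and the /auto-insurance/ path are just hypothetical examples:

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    NOISY_PARAMS = {"sessionid", "sort", "ref"}   # hypothetical parameters to ignore

    def keep(url, section="/auto-insurance/"):
        # Only crawl the section of the site you care about.
        return urlsplit(url).path.startswith(section)

    def normalize(url):
        # Drop noisy parameters so similar URLs count as the same page.
        parts = urlsplit(url)
        params = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISY_PARAMS]
        return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(params), ""))

    print(keep("https://www.example.com/auto-insurance/quote?sessionid=abc"))    # True
    print(normalize("https://www.example.com/auto-insurance/quote?sessionid=abc&page=2"))
    # https://www.example.com/auto-insurance/quote?page=2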

Another option I like is pulling up just the blog section of a site to find the posts that are most popular in terms of links and social media shares. Whatever you want to do in this respect, you do it here:

wa-step-4-filtering-options

So here, I've included all the normal file extensions and extension-less files in the report, and I'm looking for everything under their quote section (as I'm researching the insurance quote market).

The upfront filtering is one of my favorite features because it excludes unnecessary pages from the crawl, so I quickly get exactly what I'm looking for. Now, click Next and the report starts:

wa-step-5-searching

Working With the Results

Another thing I like about Link-Assistant is the familiar interface shared by all 4 of their products. If you saw our other reviews, you'll recognize the results pane below.

Before that, Website Auditor will ask you about getting more factors. When I do the initial crawl I do not include stuff that will cause captchas or require proxies, like cache dates and PR. But here, you can update and add more factors if you wish:

wa-more-factors

Once you click that, you are brought to the settings page and given the option to add more factors; I've specifically highlighted the social ones:

wa-social-factors

I'll skip these for now and go back to the initial results section. This displays your initial results and I've also highlighted all the available options with colored arrows:

wa-results-pane-large

Your arrow legend is as follows:

  • Orange - You can save the current project or all projects, start a new project, close the project, or open another project
  • Green - You can build a white-labeled Optimization report (with crawl, domain, link, and popularity metrics plugged in), analyze a single page for on-page optimization, update a workspace, selected pages, or the entire project for selected factors, rebuild the report with the same pages but different factors, or create an XML sitemap for selected webpages (see the sketch after this list).
  • Yellow - Search for specific words inside the report (I use this for narrowing down to a topic)
  • Red - Create and update Workspaces to customize the results view
  • Purple - Flip between the results pane, the white-label report, or specific webpages selected for metric updates
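
On the XML sitemap option mentioned in the green item above: Website Auditor generates the file for you, but for reference, a sitemap for selected pages boils down to something like this (a sketch of the output format, not the tool's own code):

    from xml.sax.saxutils import escape

    def sitemap(urls):
        lines = ['<?xml version="1.0" encoding="UTF-8"?>',
                 '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
        for url in urls:
            lines.append("  <url><loc>%s</loc></url>" % escape(url))
        lines.append("</urlset>")
        return "\n".join(lines)

    print(sitemap(["https://www.example.com/", "https://www.example.com/quote/boat/"]))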

Workspaces for Customizing Results

The Workspaces tab allows you to edit current Workspaces (add/remove metrics) or create new ones that you can name whatever you want and that will show up in the Workspaces drop-down:

wa-workspaces

Simply click on the Workspaces icon to get to the Workspaces preference option:

wa-workspaces-options

You can create new workspaces, edit or remove old ones, and also set specific filtering conditions relative to the metrics available to you:

wa-eric-workspace

Spending some time upfront playing around with the Workspace options can save you loads of time on the backend with respect to drilling down to either specific page types, specific metrics, or a combination of both.
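
A Workspace is essentially a saved set of columns plus filter conditions. If it helps to picture it, here is a sketch of applying two hypothetical conditions (over-long titles and missing meta descriptions) to crawl rows; the row data is made up:

    rows = [
        {"url": "/quote/boat/", "title_length": 72, "meta_description": ""},
        {"url": "/quote/auto/", "title_length": 55, "meta_description": "Get an auto quote."},
    ]

    conditions = [
        lambda row: row["title_length"] > 65,      # title likely truncated in the SERPs
        lambda row: not row["meta_description"],   # missing meta description
    ]

    flagged = [row for row in rows if any(cond(row) for cond in conditions)]
    print(flagged)                                 # only the /quote/boat/ row matches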

Analyzing a Page

When you go to export a Website Auditor file (you can also just control/command + a to select everything in the results pane and copy/paste to a spreadsheet) you'll see 2 options:

  • Page Ranking Factors (the data in the results pane)
  • Page Content Data

You can analyze a page's content (or multiple pages at once) for on-page optimization factors relative to a keyword you select.

There are 2 ways you can do this. You can highlight a page in the Workspace, right click and select analyze page content. Or, you can click on the Webpages button above the filter box then click the Analyze button in the upper left. Here is the dialog box for the second option:

wa-analyze-page-content

The items with the red X's next to them denote which pages can be analyzed (the pages just need to have content; you'll often see duplicates for /page and /page/).

I want to see how the boat page looks, so I highlight it and click Next to get to the area where you can enter your keywords:

wa-keywords-content-analysis

Enter the keywords you want to evaluate the page against (I entered boat insurance and boat insurance quotes), then select which engine you want to evaluate the page against (this pulls competition data in from the selected engine).

wa-choose-engines

The results pane here shows you a variety of options related to the keywords you entered and the page you selected:

wa-analysis-results

You have the option to view the results by a single keyword (insurance), by multi-word keywords (boat insurance), or both. Usually I'm looking at multi-word keyphrases, so that's what I typically select, and the report tells you what percentage of a specific on-page factor the keyword makes up.

The on-page factors are:

  • Total page copy
  • Body
  • Title tag, meta description, and meta keywords
  • H1 and H2-H6 (H2-H6 are grouped)
  • Link anchor text
  • % in bold and in italics
  • Image text

Website Auditor takes all of that and spits out a custom Score metric, which is meant to illustrate which keyword is most prominent, on average, across the board.
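
Link-Assistant doesn't publish how the Score is calculated, but the underlying idea is simple enough to sketch: work out what share of each on-page factor the keyphrase accounts for and average the results. A back-of-the-envelope version with made-up page text, not the tool's actual formula:

    def keyword_share(keyword, text):
        # Percentage of the words in `text` that belong to occurrences of the keyphrase.
        words = text.lower().split()
        kw = keyword.lower().split()
        if not words:
            return 0.0
        hits = sum(words[i:i + len(kw)] == kw for i in range(len(words) - len(kw) + 1))
        return 100.0 * hits * len(kw) / len(words)

    factors = {
        "title": "Boat Insurance Quotes | Example Insurer",
        "h1": "Boat insurance made simple",
        "body": "Get a boat insurance quote online in minutes from Example Insurer.",
    }

    shares = {name: keyword_share("boat insurance", text) for name, text in factors.items()}
    score = sum(shares.values()) / len(shares)
    print(shares, round(score, 1))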

You can create a white-label report off of this as well, in addition to being able to export the data the same way as the Page Factor data described above (CSV, HTML, XML, SQL, Cut and Paste).

Custom Settings and Reports

You have the option to set both global and per project preferences inside of Website Auditor.

Per Project Preferences:

  • Customer information for the reports
  • Search filters (extensions, words/characters in the URL, etc)
  • Customizing Workspace defaults for the Website reports and the Web page report
  • Setting up custom tags
  • Selecting default Page Ranking Factors
  • Setting up Domain factors (which appear on the report) like social metrics, traffic metrics from Compete and Alexa, age and IP, and factors similar to the Page Factors but for the domain
  • XML publishing information

Your Global preferences cover all the application-specific stuff like:

  • Proxy settings
  • Emulation settings and Captcha settings
  • Company information for reports
  • Preferred search engines and API keys
  • Scheduling
  • Publishing options (ftp, email, html, etc)

Website Auditor also offers detailed reporting options (all of which can be customized in the Preferences area of the application). You can get customized reports for both Page Factor metrics and Page Content Metrics.

I would like to see them improve access to the reporting a bit. The reports look nice and are helpful, but customizing the text or inputting your own narrative is done via a somewhat arcane dialog box, which makes it hard to recover if you screw up the code.

Give Website Auditor a Try

There are other desktop on-page/crawling tools on the market and some of them are quite good. I like some of the features inside of Website Auditor (report outputting, custom crawl parameters, social aspects) enough to continue using it in 2012.

I've asked for clarification on this but I believe their Live Plan (which you get free for the first 6 months) must be renewed in order for the application to interact with a search engine.

I do hope they consider changing that. I understand that some features won't work once a search engine changes something, and that is worthy of a charge, but tasks like pulling a ranking report or executing a site crawl shouldn't be lumped in with that.

Nonetheless, I would still recommend it; the product is good and the support is solid, but I think it's important to understand the pricing upfront. You can find pricing details here for both their product fees and their Live Plan fees.

Tuesday, December 27, 2011

SEO Book.com

SEO Lemons

Posted: 27 Dec 2011 04:11 PM PST

Sharing is caring!

Please share :)

Embed code is here.

SEO Market for Lemons.

You can embed the above graphic on your website here.

Have feedback? Please contribute in the comments.

Thanks to John Andrews for highlighting the above industry trend.

Thursday, December 22, 2011

SEO Book.com

SEO Spyglass Review: A Brand New Link Source

Posted: 22 Dec 2011 07:09 AM PST

SEO Spyglass is one of the 4 tools Link-Assistant sells, both individually and as part of their SEO Power Suite.

We did a review of their Rank Tracker application a few months ago and we plan to review their other 2 tools in upcoming blog posts.

Key Features of SEO Spyglass

The core features of SEO Spyglass are:

  • Link Research
  • White Label Reporting
  • Historical Link Tracking

As with most software tools there are features you can and cannot access, or limits you'll hit, depending on the version you choose. You can see the comparison here.

Perhaps the biggest feature is also the newest: they recently launched their own link database, a couple of months early and in beta, as the tool had been largely dependent on the now-defunct Yahoo! Site Explorer.

The launch of a third or fourth-ish link database (Majestic SEO, Open Site Explorer, and Ahrefs rounding out the others) is a win for link researchers. It still needs a bit of work, as we'll discuss below, but hopefully they plan on taking some of the better features of the other tools and incorporating them into their own.

After all, good artists copy and great artists steal :)

Setting Up a Project for a Specific Keyword

One of my pet peeves with software is feature bloat which in turn creates a rough user experience. Link-Assistant's tools are incredibly easy to use in my experience.

Once you fire up SEO Spyglass you can choose to research links from a competing website or links based off of a keyword.

Most of the time I use a competitor's URL when doing link research, but SEO Spyglass doubles as a link prospecting tool as well, so here I'll pick a keyword I might want to target: "SEO Training".

The next screen is where you'll choose the search engine that is most relevant to where you want to compete. They have support for a bunch of different countries and search engines, and you can see the breakdown on their site.

So if you are competing in the US, you can pull data for the top-ranking site from the following engines (only one at a time):

  • Google
  • Google Blog Search
  • Google Groups
  • Google Images
  • Google Mobile
  • YouTube
  • Bing
  • Yahoo! (similar to Bing of course)
  • AOL
  • Alexa
  • Blekko
  • And some other smaller web properties

I'll select Google. The next screen is where you select the sources you want Spyglass to use for grabbing the links of the competing site it finds on the preceding screen:

So SEO Spyglass will grab the top competitor from your chosen SERP and run multiple link sources against that site (I would love to see some API integration with Majestic and Open Site Explorer here).

This is where you'll see their own Backlink Explorer for the first time.

Next you can choose unlimited backlinks (Enterprise Edition only) or you can limit it by project or search engine. For the sake of speed I'm going to limit it to 100 links per search engine (the ones selected on the previous screen) and exclude duplicates (links found in more than one engine) just to get the most accurate, usable data possible.
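
Excluding duplicates just means collapsing the same backlinking URL when more than one engine reports it. A quick sketch with made-up data:

    found = [
        ("Google", "http://blog.example.org/post-1"),
        ("Bing",   "http://blog.example.org/post-1"),       # same backlink, different engine
        ("Blekko", "http://forum.example.net/thread/42"),
    ]

    unique = {}
    for engine, url in found:
        unique.setdefault(url.rstrip("/").lower(), engine)  # keep the first engine that found it

    print(len(unique), "unique backlinks out of", len(found), "found")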

When you start pinging engines, specifically Google in this example, you routinely will get captchas like this:

On this small project I entered about 8 of them and the project found 442 backlinks (here is what you'll see after the project is completed):

One way around captchas is to pay someone to run this tool for you and enter them manually, but for large projects that is not ideal, as captchas will pile up and you could get the IP temporarily banned.

Link-Assistant offers an Anti-Captcha plan to combat this issue; you can see the pricing here.

Given the size of the results pane it is hard to see everything but you are initially returned with:

  • an icon of what search engine the link was found in
  • the backlinking page
  • the backlinking domain

Spyglass will then ask you if you want to update the factors associated with these links.

Your options by default are:

  • domain age
  • domain IP
  • domain PR
  • Alexa Rank
  • Dmoz Listing
  • Yahoo! Directory Listing
  • On-page info (title, meta description, meta keywords)
  • Total links to the page
  • External links to other sites from the page
  • PageRank of the page itself

You can add more factors by clicking the Add More button. You're taken to the Spyglass Preferences pane where you can add more factors:

You can add a ton of social media stuff here, including popularity on Facebook, Google+, page-level Twitter mentions, and so on.

You can also pick up bookmarking data and various cache dates. Keep in mind that the more you select, especially stuff like cache dates, the more likely you are to run into captchas.

SEO Spyglass also offers Search Safety Settings (inside of the preferences pane, middle of the left column in the above screenshot) where you can update human emulation settings and proxies, both to speed up the application and to help avoid search engine bans.

I've used Trusted Proxies with Link-Assistant and they have worked quite well.
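
The gist of the human emulation and proxy settings is pausing a random, human-looking interval between queries and routing them through a different IP. A bare-bones sketch of that idea (the proxy address is a placeholder, and this is not Link-Assistant's implementation):

    import random
    import time
    import requests

    PROXIES = {"http": "http://user:pass@proxy.example.com:8080",
               "https": "http://user:pass@proxy.example.com:8080"}

    def polite_get(url):
        time.sleep(random.uniform(5, 15))   # randomized, human-looking delay between requests
        return requests.get(url, proxies=PROXIES, timeout=30)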

You can't control the factors globally; you have to set them for each project, but you can configure Spyglass to only offer you specific backlink sources.

I'm going to deselect PageRank here to speed up the project (you can always update later or use other tools for PageRank scrapes).

Working With the Results

When the data comes back you can do a number of things with it. You can:

  • Build a custom report
  • Rebuild it if you want to add link sources or backlink factors
  • Update the saved project later on
  • Analyze the links within the application
  • Update and add custom workspaces

These options are all available within the results screen (again, this application is incredibly easy to use):

I've blurred out the site information as I see little reason to highlight the site here. But you can see where the data has populated for the factors I selected.

In the upper left-hand corner of the application is where you can build the report, analyze the data from within the application, update the project, or rebuild it with new factors:

All the way to the right is where you can filter the data inside the application and create a new workspace:

Your filtering options are seen to the left of the workspaces here. It's not full-blown filtering and sorting, but if you are looking for some quick information on specific link queries, it can be helpful.

Each item listed there is a Workspace. You can create your own or edit one of the existing ones. Whatever factors you include in the Workspace are what will show up in the results pane.

So think of Workspaces as your filtering options. Your available metrics/columns are:

  • Domain Name
  • Search Engine (where the link was found)
  • Last Found Date (for updates)
  • Status of Backlink (active, inactive, etc)
  • Country
  • Page Title
  • Links Back (does the link found by the search engine actually link to the site? This is a good way of identifying short term, spammy link bursts; see the sketch after this list)
  • Anchor Text
  • Link Value (essentially based on the original PageRank formula)
  • Notes (notes you've left on the particular link). This is very limited and is essentially a single Excel-type row
  • Domain Age/IP/PR
  • Alexa Rank
  • Dmoz
  • Yahoo! Directory Listing
  • Total Links to page/domain
  • External links
  • Page-level PR
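
The Links Back column mentioned above boils down to re-fetching the backlinking page and confirming it still contains a link to your domain. A minimal sketch of that check, using requests and BeautifulSoup (not Spyglass's own code):

    from urllib.parse import urlparse
    import requests
    from bs4 import BeautifulSoup

    def links_back(backlink_page, target_domain):
        try:
            html = requests.get(backlink_page, timeout=15).text
        except requests.RequestException:
            return False                    # page gone or unreachable: treat as no link back
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            if urlparse(a["href"]).netloc.endswith(target_domain):
                return True
        return False

    print(links_back("http://blog.example.org/post-1", "example.com"))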

Most of the data is useful. I think the link value metric is a bit overrated, though: in my experience, links that showed 0 link value in the tool often clearly benefited the site they ended up linking to.

PageRank queries in bulk will cause lots of captchas, and given how out of date PR can be, it isn't a metric I typically include on large reports.
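
Spyglass doesn't document exactly how Link Value is computed, but since it's described as being based on the original PageRank formula, here is a minimal power-iteration sketch of that formula for reference (an illustration only, not the tool's implementation):

    def pagerank(links, d=0.85, iters=50):
        # links: dict mapping each page to the list of pages it links to.
        pages = list(links)
        n = len(pages)
        pr = {p: 1.0 / n for p in pages}
        for _ in range(iters):
            new = {p: (1.0 - d) / n for p in pages}
            for p, outs in links.items():
                if not outs:                       # dangling page: spread its score evenly
                    for q in pages:
                        new[q] += d * pr[p] / n
                else:
                    share = d * pr[p] / len(outs)
                    for q in outs:
                        new[q] += share
            pr = new
        return pr

    print(pagerank({"a": ["b"], "b": ["a", "c"], "c": []}))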

Analyzing the Data

When you click on the Analyze tab in the upper left you can analyze in multiple ways:

  • All backlinks found for the project
  • Only backlinks you highlight inside the application
  • Only backlinks in the selected Workspace

The Analyze tab is a separate window overlaying the report:

You can't export from this window but if you just do a control/command-a you can copy and paste to a spreadsheet.

Your options here:

  • Keywords - keywords and ratios of specific keywords in the title and anchor text of backlinks
  • Anchor Text - anchor text distribution of links
  • Anchor URL - pages being linked to on the site and the percentages of link distribution (good for evaluating deep link distribution and pages targeted by the competing site as well as popular pages on the site...content ideas :) )
  • Webpage PR
  • Domain PR
  • Domains linking to the competing site and the percentage
  • TLD - percentage of links coming from .com, .net, .org, .info, .uk, and so on (a quick sketch of computing these percentages follows the list)
  • IP address - links coming from IPs and the percentages
  • Country breakdown
  • Dmoz - backlinks that are in Dmoz and ones that are not
  • Yahoo! - same as Dmoz
  • Links Back - percentages of links found that actually link to the site in question
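
The anchor text and TLD breakdowns referenced above are simple frequency counts. Here is the idea over a few made-up backlink rows:

    from collections import Counter
    from urllib.parse import urlparse

    backlinks = [
        {"anchor": "seo training", "page": "http://blog.example.org/post-1"},
        {"anchor": "seo training", "page": "http://news.example.co.uk/item"},
        {"anchor": "click here",   "page": "http://forum.example.net/thread/42"},
    ]

    def percentages(values):
        counts = Counter(values)
        total = sum(counts.values())
        return {value: round(100.0 * n / total, 1) for value, n in counts.items()}

    print(percentages(b["anchor"] for b in backlinks))    # anchor text distribution
    print(percentages(urlparse(b["page"]).netloc.rsplit(".", 1)[-1] for b in backlinks))  # TLDs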

Updating and Rebuilding

Updating is pretty self-explanatory. Click the Update tab and select whether to update all the links, the selected links, or the Workspace-specific links:

(It's the same dialog box as when you actually set up the project)

Rebuilding the report is similar to updating except updating doesn't allow you to change the specified search engine.

When you Rebuild the report you can select a new search engine. This is helpful when comparing what is ranking in Google versus Bing.

Click Rebuild and update the search engine plus add/remove backlink factors.

Reporting

There are 2 ways to get to the reporting data inside of Spyglass: a quick SEO Report tab and the Custom Report Builder:

Much like the Workspaces in the prior example, there are reporting template options on the right side of the navigation:

It functions the same way as Workspaces do in terms of being able to completely customize the report and data. You can access your Company Profile (your company's information and logo), Publishing Profiles (delivery methods like email, FTP, and so on), as well as Report Templates in the settings option:

You can't edit the ones that are there now except by playing around with the code used to generate the report. It's kind of an arcane way to do reporting, as you can really hose up the code (everything below the variables in red is raw HTML).
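
The templates are basically variables dropped into raw HTML. If you'd rather build a report outside the tool, plain string templating does the same job; the variable names and numbers below are made up and are not Spyglass's actual template syntax:

    from string import Template

    TEMPLATE = Template("""<html><body>
    <h1>Link report for $domain</h1>
    <p>$total_links backlinks from $total_domains domains.</p>
    <p>Top anchor text: $top_anchor</p>
    </body></html>""")

    html = TEMPLATE.substitute(domain="example.com", total_links=442,       # placeholder values
                               total_domains=187, top_anchor="seo training")
    print(html)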

You can create your own template with the following reporting options:

  • Custom introduction
  • All the stats described earlier on this report as available backlink factors
  • Top 30 anchor URLs
  • Top 30 anchor texts
  • Top 30 links by "link value"
  • Top 30 domains by "link value"
  • Conclusion (where you can add your own text and images)

Overall the reporting options are solid and offer lots of data. It's a little more work to customize the reports but you do have lots of granular customization options and once they are set up you can save them as global preferences.

As with other software tools you can set up scheduled checks and report generation.

Researching a URL

The process for researching a URL is the same as described above, except you already know the URL rather than having SEO Spyglass find the top competing site for it.

You have the same deep reporting and data options as you do with a keyword search. It will be interesting to watch how their database grows because, for now, you can (with the Enterprise version) research an unlimited number of backlinks.

SEO Spyglass in Practice

Overall, I would recommend trying this tool out. If nothing else, it is another source of backlinks which pulls from other search engines as well (Google, Blekko, Bing, etc).

The reporting is good and you have a lot of options with respect to customizing specific link data parameters for your reports.

I would like to see more exclusionary options when researching a domain, like the ability to filter out redirects and sub-domain links. It doesn't do much good if you want a quick competitive report but a quarter or more of it comes from something like a subdomain of the site you are researching.

SEO Spyglass's pricing is as follows:

  • Purchase a professional option or an enterprise option (comparison)
  • 6 months of their Live Plan for free
  • Purchase of a Live Plan required after 6 months to continue using the tool's link research functionality.
  • Pricing for all editions and Live Plans can be found here

In running a couple of comparisons against Open Site Explorer and Majestic SEO, it was clear that Spyglass has a decent database but needs more filtering options (sub-domains mainly). It's not as robust as OSE or Majestic yet, but that's to be expected. I still found a variety of unique links in its database that I did not see in the other tools.

You can get a pretty big discount if you purchase their suite of tools as a bundle rather than individually.

Wednesday, December 21, 2011

AboutUs Weblog

Tempt readers with your headline, feed them with your content

Posted: 20 Dec 2011 10:43 AM PST

Image of Clowns

Photo courtesy of Simon Forsyth on Flickr

Writers and marketers wear many hats.

Teacher, persuader, motivator, advisor, and entertainer. Which hat are you wearing today?

Deciding that can help you craft your article or blog post so people will want to read it.

Jonathan Morrow of Copyblogger speaks to your plight in Sex, Lies and the Art of Commanding Attention. He recommends using powerful words in your headlines to wake up readers and tempt them to dive into your article.

He emphasizes that certain words – sex, lies, death, money, dangerous – elicit emotional responses and command attention. But that attention is worthless unless your topic words are also in the headline.

Give your readers meat, with some fat for flavor. Continue marbling your substantive content with tasty fat – entertaining bits – to keep them reading right through to the end.

In a subsequent blog post with an equally catchy headline, Mr. Morrow recommends sleeping with your reader…metaphorically speaking. You need to be as in tune with your readers’ deepest hopes and fears as with your lover’s.

You may think that’s not possible if you’re writing for, say, General Motors. But it is – car buyers are worried about paying car loans, concerned about how much they’ll spend on gas, worried about their children’s safety. They also care about their image, about how their car adds to (or detracts from) that image. Plus, there’s the sheer pleasure of owning and driving a shiny new car.

Someone writing for General Motors needs to think about all of that – and, says Morrow, once you figure out your readers’ emotions, write about nothing else.

So, choose your hat – a daring, gorgeous, smart, funny, silly or stylish one with sequins and frills – get into bed with your readers, and get writing!