Friday, February 09, 2007

Keyword Metrics 3.0?

Here at SEO Research Labs, we've been doing keyword research for years. We've already changed the way we report data twice. You can see a description of the current reports here.

It's time to redesign the reports again, and develop a new set of metrics. I'm posting our list of factors to consider here on the blog, to invite your comments on what we should do next.

  1. Search volume forecasting: since the beginning, we have relied on search volume data from Wordtracker for our reports. Over the past two years, we've been monitoring the accuracy of Wordtracker vs. Keyword Discovery, based on actual search count data obtained from partners' pay-per-click ad campaigns.

    We still believe that Wordtracker is the most accurate source when it comes to forecasting the number of searches, but the gap is very narrow at this point, now that Keyword Discovery's premium database has come online.

    One decision we need to make is whether to offer a choice of data sources, or to include data from both Wordtracker and Keyword Discovery in our reports. My first impulse is to pay what it costs to include both, but some folks may find the "dueling data sources" confusing.

    The other big decision is whether to forecast search volumes on the major search engines, based on market share data. The public sources of market share data report a very wide range of scores - depending on who you believe, Google's market share may run anywhere from 28% to 70%. To me, that says "useless information" (the sketch at the end of this item shows just how wide that spread makes a forecast). My first impulse is to eliminate the market share forecasting altogether, since we can't be certain of its accuracy.

    Your thoughts, please.
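
    Here's a quick sketch of the problem. Only the 28% and 70% figures come from the published market share estimates; the monthly search volume is made up for illustration:

    ```python
    # Why market-share-based forecasting falls apart when the share
    # estimates disagree this badly. The monthly search volume is a
    # made-up example; 0.28 and 0.70 are the published extremes.

    monthly_searches = 10_000            # hypothetical total across all engines
    low_share, high_share = 0.28, 0.70   # the range of Google share estimates

    low_forecast = monthly_searches * low_share
    high_forecast = monthly_searches * high_share

    print(f"Google forecast: {low_forecast:,.0f} to {high_forecast:,.0f} per month")
    # => Google forecast: 2,800 to 7,000 per month -- a 2.5x spread
    ```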

  2. Click-through traffic forecasting: we have included a traffic forecasting tool in our reports for some time, and I have always found it useful. However, this report is often a source of confusion, because its correct use is not well understood. We made some adjustments to the methodology early on, to make this a more "pessimistic" forecast, and we've updated the documentation to explain things better.

    The forecast is based on extensive data mining with real web sites. I've been running this analysis every 6 months (it's an expensive process). The last time we ran it, the range of expected click-through rates for specific ranked positions was so broad that I am just about ready to drop this report.

    Since this tool is already based on market share estimates, my first impulse is to stop trying to forecast, and instead offer a tool that lets users input their own estimates, perhaps pre-populated with "default values" from Netratings and our own data mining effort. (A rough sketch of what that tool might look like follows this item.)

    Your thoughts, please.
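
    For what it's worth, here's the general shape of the thing. The default click-through rates per position below are placeholders I made up for illustration, not our actual data mining results; in the real tool they'd be pre-populated from Netratings and our own data:

    ```python
    # Sketch of a user-adjustable traffic forecast. The DEFAULT_CTR values
    # are made-up placeholders, not real data mining results; the idea is
    # that users can override any of them with their own estimates.

    DEFAULT_CTR = {1: 0.20, 2: 0.12, 3: 0.08, 4: 0.06, 5: 0.05}

    def forecast_clicks(monthly_searches, position, ctr_overrides=None):
        """Estimate monthly click-throughs for one ranked position."""
        ctr_table = dict(DEFAULT_CTR)
        if ctr_overrides:
            ctr_table.update(ctr_overrides)  # user-supplied estimates win
        return monthly_searches * ctr_table.get(position, 0.0)

    print(forecast_clicks(5_000, 1))             # default => 1000.0
    print(forecast_clicks(5_000, 1, {1: 0.10}))  # pessimistic => 500.0
    ```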

  3. Competitive landscape - how many competitors? Right now, we use three metrics to report on the number of competitors: the total # of matches for the search term, the total # of "in title" matches (search term in the title tag), and the # of matches with the search term in both the title and the anchor text of inbound links (title+anchor).

    I still like these metrics, but I am not completely happy about how we have to collect the data. Google's API allows us to collect this data fairly easily, although we are not able to collect data for more than 100 search terms in each report. We have frequent requests to collect more data, but it's just not possible.

    We've also included a "KEI" calculation for each search term, based on these metrics. My first, second, third, fourth, and fifth impulses are to stop reporting KEI in any form, since it is completely useless. We included it two years ago because a lot of SEO clients expected to see it, but if the market hasn't gotten any smarter about this since then, maybe we need to stop helping people stay ignorant.

    Other than dropping KEI, my first impulse is to leave this report alone. My second impulse is to stop collecting the data ourselves, and build something into the spreadsheet that would let users input their own Google API key and collect their own data (a rough sketch of the collection logic follows this item). Would the trade-off be worth it? Would users be willing to wait 2-3 hours to collect the data? Would we end up with a tech support nightmare? Would requiring Microsoft Excel be a problem?

    Your thoughts, please.
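
    For reference, here's roughly how the three counts and the KEI calculation fit together. The result_count function is a hypothetical stand-in for a Google API lookup, and the KEI formula shown is the common Wordtracker-style definition (popularity squared over competition), not necessarily the exact variant in our reports:

    ```python
    # Sketch of the three competition metrics plus KEI. result_count() is
    # a hypothetical stand-in for a Google API call that returns the
    # number of results matching a query.

    def result_count(query):
        raise NotImplementedError("stand-in for a Google API lookup")

    def competition_metrics(term):
        return {
            "total":        result_count(f'"{term}"'),                           # all matches
            "in_title":     result_count(f'intitle:"{term}"'),                   # title tag
            "title_anchor": result_count(f'intitle:"{term}" inanchor:"{term}"'), # title + anchor
        }

    def kei(monthly_searches, competing_pages):
        # Common KEI definition: popularity squared over competition.
        # (This is the metric we're inclined to stop reporting.)
        if competing_pages == 0:
            return float("inf")
        return monthly_searches ** 2 / competing_pages
    ```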

  4. Pay-per-click bids: we've included bids from Yahoo/Overture, collected by Wordtracker, for years. This has never been perfect, because the search term Yahoo uses may not match the exact query found in Wordtracker, and sometimes we get no data at all on certain terms. Now it's academic: Yahoo no longer publishes this data, so we're dropping this report.

    One of the challenges with PPC reporting is that we can't (OK, won't) "steal" data. We won't screen scrape in violation of a search engine's TOS. So, unless there is a legal way to obtain the data, we aren't going to do it. With that said, it still might be possible to add another metric, for PPC competition. Would it be useful if we could report on the number of advertisers? Any other metrics out there?

    Your thoughts, please.

  5. Link competition: this is a big report right now. We take the top 10 results from Google for the 100 most popular search terms, and we present a backlink count from Alexa, which is based on the number of web sites linking in to a ranked site. Sound complex? It's not, but it's a pain to explain.

    A recent hiccup with Alexa has forced us to reconsider this report, even though I like the number they give us. I'm not sure we can rely on them, and I also think it's just too much information for the average user to digest.

    My first and last impulse is to turn this into a "top sites" report. In other words, we're going to do it... Since we're pulling the Google rankings for 100 search terms, we can "stack rank" the top performing sites in the market fairly well (see the sketch after this item). The current plan for this report is to show the top 100 sites, based on their overall presence in the search results for the 100 most popular search terms.

    Along with each site (listed by domain), we would show the total presence (# of times a page from the site appeared in the SERPs), the breadth of presence (# of unique URLs that appeared in the SERPs), the # of incoming links reported by Yahoo, and the # of sites linking in according to Alexa.

    We can also show the same data for the client's URL in this report, so that users can compare their own presence to the top sites.

    Your thoughts, please.
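
    Here's a sketch of the stack ranking I have in mind. The input format (top 10 URLs per search term) is an assumption for illustration, and the Yahoo and Alexa link counts would be looked up separately:

    ```python
    # Sketch of the "top sites" stack ranking: aggregate the Google top 10
    # across all 100 search terms, then rank domains by presence. The
    # input format ({search_term: [url, ...]}) is assumed for illustration.

    from collections import defaultdict
    from urllib.parse import urlparse

    def top_sites(serp_results, limit=100):
        presence = defaultdict(int)     # total appearances per domain
        unique_urls = defaultdict(set)  # distinct ranked URLs per domain

        for term, urls in serp_results.items():
            for url in urls:
                domain = urlparse(url).netloc
                presence[domain] += 1
                unique_urls[domain].add(url)

        ranked = sorted(presence, key=presence.get, reverse=True)
        return [
            {"domain": d,
             "total_presence": presence[d],   # times any page appeared
             "breadth": len(unique_urls[d])}  # unique URLs that appeared
            for d in ranked[:limit]
        ]
    ```
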
Thanks!
Dan

Wednesday, February 07, 2007

World's Dumbest Link Farmer

"Smart Link Building," he calls it... he charges $1500 for 20 one-way links, embedded in articles that he will create, and post on his web sites. We're assured that these are all on different class C IP addresses, different hosting... at first glance, he sounds like a spammer, but at least a reasonably competent spammer.

There's no way I'm going to buy into it, but I keep reading the sales letter, and he links to example articles on his own site. I start to think maybe the guy has a screw loose, because if those are real examples, then he's exposing the identity of several clients. (As it turns out, they are real examples.)

But it gets worse. At the bottom of the sales letter, he has a list of the domains, IP addresses, and actual links into the article directories. He has actually published the location of every site in his link farm... this is hilarious. Go ahead and give the search engines a map to your link farm, dude. That's awesome.

My advice, when you see a scheme like this, is to just walk away. My advice, when you see a scheme like this run by the world's dumbest link farmer, is to run, not walk.

Sorry, Jim. Congrats on "winning" a #1 ranking for elursmebble n7v or whatever, but I think this proves you don't need to have the slightest clue to win an SEO contest.

Wow.