Blogging and SEO

Getting your content out through a blogging system is much more effective for SEO than publishing through any other content management system, because there is a vast network of new web crawlers out there competing on many levels to pick up and carry your content. Yahoo, Google and Technorati are all racing to pick up new blog posts first.

News aggregation sites specializing in verticals are competing to pick up content on particular subjects. You can see this in technical topics like Java and Ajax right now, but the trend will trickle into every industry, as it is the lazy way to create those “authority sites” that are so important for SEO. In the past, spammers were doing “search scraping” to accomplish much the same thing, but mostly to have pages on which to hide AdSense.

Today, RSS feeds are being combined to make much more professional and sensible sites. But the same dynamic applies: they are attempting to intermediate communication. Not long ago, the concept of disintermediation was a buzzword, due to manufacturers’ sudden ability, thanks of course to the Internet, to sell directly without going through distributors or retail outlets. Today, content is a product, often being given away for free by bloggers who put entire articles into news feeds, and that is an invitation to intermediate. Just as with spamming, there is little to no cost or risk, and there is a great deal of upside.

The answer is simply to set your blogging software to include only a portion of each post in the RSS feed. Whenever you post, all the blog crawlers will pick up the new content, but the only full copy of the search-optimized content will exist on your site. The news aggregation sites will get your small paragraph, which is fine, because it becomes a highly desirable non-reciprocal link.
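
To make this concrete, here is a minimal sketch of the idea in Python, as if you were generating the RSS items yourself rather than flipping a switch in your blogging software. The helper names and the 300-character cutoff are my own assumptions, not any particular platform’s API:

```python
import re
from xml.sax.saxutils import escape

TEASER_LENGTH = 300  # arbitrary cutoff; tune to taste

def make_teaser(html_body, limit=TEASER_LENGTH):
    """Strip markup and truncate a post body at a word boundary."""
    text = re.sub(r"<[^>]+>", "", html_body)  # crude tag stripper
    if len(text) <= limit:
        return text
    return text[:limit].rsplit(" ", 1)[0] + "..."

def rss_item(title, permalink, html_body):
    """Emit one RSS 2.0 <item> carrying only the teaser.
    The full, search-optimized copy lives only at the permalink."""
    return (
        "<item>"
        f"<title>{escape(title)}</title>"
        f"<link>{escape(permalink)}</link>"
        f"<description>{escape(make_teaser(html_body))}</description>"
        "</item>"
    )

print(rss_item("Blogging and SEO",
               "https://www.example.com/blog/blogging-and-seo",
               "<p>Getting your content out through a blogging system...</p>"))
```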

Blogger SEO advantage confirmed

Well, the HitTail site ended up being in the Google SERPs today for the first time. This is concrete evidence that a previously non-existent domain can go from registration to being served in results in just about 2 weeks — and that using Blogger as an on-site blogging tool provides some advantage. I’ll be watching to see if it sticks. The page that got picked up was the blog index page.

To find the page, I searched on the site’s tagline in quotes. It’s hard to find in the results if you don’t use the quotes. And the results are cycling between at least 2 sets of results, one of which doesn’t yet have the blog index page. That shows I’m spotting this just as the new results are coming live.

The Google “site:” command shows that only that one page, the blog index, has been picked up and indexed. The homepage has been recognized, but not yet indexed.

And if you’re logged into your Gmail account when you do these searches, you get additional data! It shows the exact posting time: 5:39 PM. Unfortunately, that was 5:39 PM on February 2nd, five days ago. I speculate that this extra detail comes from the data Google has about the page through Blogger, since Google owns Blogger.

Collecting Contact Information

Well, it’s been a few days without a HitTail post. Tuesday ended up being an office/client day. I’ve got an interesting post in the works on search-friendly URLs and the concept of the weakest link in the chain in search optimization. Yesterday, I got myself set up on a VoIP phone at home that ties back to the office Avaya IP Office phone system. So from home, I can make outgoing calls as if I were on an office phone. Between these two items, and catching up on sleep lost over the past week, the last two days haven’t been my most productive. I plan on making up for that starting now.

I got a blog post out this morning that really starts to describe what the HitTail site and system is all about. A lot of my time has gone into blogging rather than building the app. The reason is that the thought-work needs to be done first, to ensure that I’m doing the right work. But I’ve really got to move forward fast now. More and more people will be coming in as a result of my blogging, and I want to start building the list of beta testers. That makes the contact info app the most important piece initially. And since I’m collecting contact info, I might as well integrate it with the Connors sales lead management and online conversation system. That’s my goal for today.

I’ve got to get both the BabyStep tutorial that I just produced and the application itself onto the site. I want it wrapped in the site’s look and feel, as minimal as it may be right now. So, what I do is finish the template in the content management system. OK, done. The Spider Spotter app now automatically takes on the look of the rest of the site, as will any new applications that I develop.

First, put a blank page from the CMS under each main category link. OK, done. Now that you have semi-permanent navigation, move it over to the Blogger template. OK, done. The rest of your work for today is activating the Sign Up Now and Start Conversation systems.

Now, it’s time to focus on contact info collection. There are two forms of contact info collection (three if you include the blog subscribe list, which I’m leaving to Bloglet for now). Time allowing, I’d use the Google API to program a list mailer myself, so it is more tightly integrated. But the 80/20 rule applies. The two main forms of contact info collection are “Sign Up Now” and “Start Conversation”. Each process starts with just the email address. That way, minimal contact info is acquired, along with some idea of the prospect’s intent, based on which submit button they used. The principle here is: capture fast. Even if the prospect takes no follow-up action, you are now in a position to proactively follow up with them.

The submit will come in through the same program either way. The “Sign Up Now” path will just give a parting message, thanking them and explaining the next step. It will also give one more opportunity to talk with Connors about PR. The “Start Conversation” submit will bypass all this and drop you directly into the sales lead management/discussion system. It’s sort of like a private one-to-one blog. Technically, it’s one-to-many, with you the prospect being the “one”, and Connors the organization being the “many”.
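
Here is a rough sketch, in Python, of how a single handler might branch on the two submit buttons. This is purely illustrative; the field names, file format and messages are my own assumptions, not the actual system:

```python
import csv, datetime

def save_lead(email, intent, path="leads.csv"):
    """Capture fast: record the address and intent before anything else."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.datetime.now().isoformat(), email, intent])

def handle_contact_submit(form):
    """Single entry point for both forms; branch on the submit button used.
    `form` is a dict of POSTed fields; the names are illustrative."""
    email = form.get("email", "").strip()
    if "@" not in email:
        return "Please enter a valid email address."

    save_lead(email, intent=form["submit"])

    if form["submit"] == "Sign Up Now":
        # Parting message, next step, and one more chance to talk PR.
        return "Thanks! We'll be in touch about the beta shortly."
    # "Start Conversation" bypasses all that and drops the prospect
    # straight into the one-to-many discussion system.
    return "REDIRECT: /conversation?email=" + email

print(handle_contact_submit({"email": "prospect@example.com",
                             "submit": "Start Conversation"}))
```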

Every one of these discussions that comes in needs to be treated as priority #1. It could be the next Amazon, Priceline or Vonage. It could also be one of the influential early adopters of technology, the people who will be pivotal in making HitTail erupt in buzz. Or it may just be a curious tire kicker, whom we will need to quickly identify. I’m sure that launching a public discussion forum is in the near future, so we can efficiently communicate with the entire HitTail audience about the product and about next-generation public relations issues in general.

The Evolution of Search Marketing

Search marketing is about to evolve, brought to you by the New York public relations firm that launched Amazon.com, Priceline.com and Vonage. Connors Communications is about to do for natural search what AdWords and Yahoo! Search Marketing have done for paid search—create a logical system that any marketer can use to bring in more qualified traffic. Except this time, instead of jumping into the bidding frenzy over prohibitively expensive keyword inventory, you tap into the effectively infinite supply of free keyword inventory in what’s coming to be known as the long tail of search.

Go to Google or Yahoo! and perform a few of the searches that you believe are important to your business, organization or cause. Does your site appear? If not, you’re experiencing the anxiety that’s driving companies to paid keyword campaigns, because after a brief investigation, you discover that the alternative of cracking natural search is just too difficult. You run into the shadowy world known as search engine optimization (SEO), full of dodgy websites and conflicting advice.

It’s easy to become intimidated and just put your resources where your peers and the early majority of marketers have put theirs: paid search. Paid search has become mainstream, and demand for the most lucrative keywords has driven costs to sometimes prohibitive levels, reducing the advantage held by early adopters. The real deals in keyword inventory now exist where demand is lowest, and where the work of figuring out exactly what those keywords are is hardest. Wired magazine writer Chris Anderson has done a great deal to popularize the long tail concept, and I recommend you read his paradigm-shifting article.

At some point, every marketer worth his or her salt wonders whether focusing on the infinite supply of cheap keywords that exist in the long tail of search may in fact be worth it. They cost so little and can pay off so big. As searches become more specific, the customer often becomes more qualified and valuable. Yes, it’s actually a flaw in the paid keyword business model. But the answer is not a paid campaign, where managing tens of thousands of keywords becomes self-defeating: rather, it’s a way of doing natural search that any marketing department can handle in-house. Demand now exists for a mainstream alternative to paid search.

HitTail is a rote process of identifying the most likely keywords to target based on your existing traffic, automatically suggesting topics for writing. You then incorporate this writing into your website or blog, getting into the habit of friction-free publishing. Once the wheels are greased and you’re comfortable rolling out new subject matter, it becomes an addictive process easily on par with what some marketers jokingly call CrackWords. Feedback on the effectiveness of your writing comes quickly, more writing topics are suggested, and the process accelerates.

This is not a spamming system, as all the writing comes from you. The premise is that any market or industry has plenty to talk about, and it is your obligation to become the voice of authority in your market space. In fact, with HitTail, you kill two birds with one stone, solving your corporate blogging strategy and natural search simultaneously. HitTail makes a blogging strategy many times more effective than it could be alone, by virtue of being the ideal “writing coach,” one that understands your existing performance and where your biggest gains can be made.

HitTail is not an analytics package, nor is it a conversion tracker, for that is already available to everyone for free in the form of Google Analytics. Instead, HitTail is a very lightweight tracking system built specifically for natural search optimization. There are no reports to customize or mountains of data to wade through. We will not cause paralysis through analysis. Instead, we go directly to actionable data. Just plug in the HitTail tracking code, and recommendations start rolling in. The more you act on them, the better the new recommendations become.
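
The kernel of any such tracking system is pulling the search phrase out of the referrer of each incoming natural-search hit. Here is a minimal sketch of that one step in Python, assuming the query-string parameters the major engines use as of this writing (q for Google and MSN, p for Yahoo); it is an illustration of the idea, not HitTail’s actual code:

```python
from urllib.parse import urlparse, parse_qs

# Query-string parameter carrying the search phrase, per engine
# (based on each engine's URL format at the time of writing).
ENGINE_PARAMS = {
    "google.": "q",
    "search.msn.": "q",
    "search.yahoo.": "p",
}

def search_phrase(referrer):
    """Return the search phrase from a search-engine referrer, or None."""
    parsed = urlparse(referrer)
    qs = parse_qs(parsed.query)
    for host_fragment, param in ENGINE_PARAMS.items():
        if host_fragment in parsed.netloc and param in qs:
            return qs[param][0]
    return None

print(search_phrase("http://www.google.com/search?q=long+tail+of+search"))
# -> 'long tail of search'
```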

As you follow the recommendations, your natural search hits go up, and more data is available to HitTail. You don’t pay for these hits. They just work for you on a 24/7 basis with no campaign maintenance. If you stop HitTail, the hits continue. They’ve become part of the fabric of your company—a permanent asset. But if you continue with HitTail, success builds on success, and your site begins to build towards a sort of critical mass. The goal is nothing less than to dominate the category. With a little time and creativity, perhaps you can use HitTail to achieve the sort of category dominance that Connors Communications has helped bring to its clients time and again, before opening the system to the public.

How Quick Into Google Search Results?

So, how quickly will a brand new domain show up on Google? I should have been checking day-to-day, but today is January 31, and the site was technically launched on January 16th. That’s just over two weeks, the period of time traditionally quoted as the minimum for a Google sweep. So, now’s a good time to do a quick review. Thankfully, the domain name is a totally made-up name, and I can do some very insightful tests here.

I decide to search blogsearch.blogger.com first to ensure that at least the blog posts are in the specialized blog-content engine. It produces 28 results, including the first test post made on HitTail.blogspot.com (now defunct). Every post was made since January 25. So, in one week, every blog post has been included in the blog search.

Next, I search on HitTail in the default search, and I see one result. It’s the domain with no title or description. This is what’s often referred to as the Google sandbox. We can see that Google is aware that the domain exists, but is not producing any of the site’s content in the results. We see in the spiderspotter app that the first visit by GoogleBot was January 25th, the same day I started blogging.

From the 25th to today is exactly one week. In seven days, we have gone from a previously unknown site to the domain being findable, but collapsed down to one page, with no actual page content in the results. How recent is this? A quick search on a couple of different Google datacenters reveals that even this one-page listing exists on only a couple of datacenters, and is non-existent in others. So I am indeed catching it during the process of propagation, and we have our undisputed evidence that a site can go from zero to listed in some form in one week. Have I avoided the Google sandbox penalty altogether?

And finally, we check for specific quoted content from the first blog post. I know it won’t show, but I’m at least running the test for the sake of completeness. So, it’s one week to show up at all, and somewhat longer before content appears. After content appears, the results tend to “dance around” (the famous “Google Dance”) until the data has propagated across all datacenters.

Another factor affecting how quickly the results settle down is something people don’t talk about much. The Google patent from March of last year revealed that Google is very sensitive to how much a site has changed from one visit to the next. That is to say: how much of the site has changed? How many new links have been established to the site? When a site is brand new, every few pages you add constitute a significant percentage of the overall site. So, Google is seeing a very volatile site, and the results are correspondingly volatile. Therefore, when a page is first discovered, it goes into what I think of as a moving window of opportunity. I believe new pages get this extra relevancy boost so Google can see whether they have the potential for gangbusters, fad-like success.

Fad-like success? Fads, I believe, overrule the traditional rules of slow organic growth. These are pages that somehow become massively popular: everyone starts linking to them, passing them around in email, finding them due to events in the news, and so on. If a page does suddenly become massively popular, Google sees this, because they’re quietly recording click-through data, similar to how DirectHit did back in the day. But DirectHit’s system, subsequently merged into Ask Jeeves, was ultimately defeated, because by touting what they were doing, they invited click-stuffing abuse. Google, on the other hand, not only doesn’t advertise its click-through tracking, but uses very clever JavaScript to keep it from even looking like it’s occurring. It’s not evil. It’s just smart. And if a site goes gangbusters, a totally organic pattern is created that is difficult to fake, because there are hundreds of links from non-related sites, and thousands of click-throughs from disparate IPs that couldn’t possibly be under one person’s control. This fad traffic pattern then “buoys” that page’s relevancy in future searches. This is just speculation based on observations, but it stands to reason that certain relevancy criteria can outweigh the others when those criteria are both particularly difficult to fake and wildly out of balance with the rest.

Anyway, what are my conclusions? This test addresses the following questions…

  • How long does it take to go from zero to being in Google results at all? One week.
  • How long does it take to go from zero to being in Google results in a meaningful way? The verdict is not in, but expected soon. Stay tuned.
  • How long does it take to go from zero to being in Google results in the stabilized, decent fashion necessary to drive sales? We won’t know for three to six months.
  • How long does it take to go from zero to being viewed as a healthy, growing site worthy of regular, predictable inclusion of new content? Well, that’s the purpose of HitTail!

Benchmark Keywords Spanning Many Years

This post is about keyword benchmarking for search optimization. After a recent update to the Google search results, nicknamed “Jagger,” last November, my personal domain dropped off the first page of results for my name, “Mike Levin”. The photographer of the same name maintained his #1 position. My site went four pages in, but a bunch of other pages that also referred to me moved onto the first page of results, including my Blogger profile, the profile on my employer’s site, and my SearchEngineForums staff profile. In other words, I went from holding position #2 with a personal domain that has been around for a very long time, to holding three lower positions on the first page, while losing the listing for my personal domain.

Admittedly, I haven’t kept my personal site very updated, and the sites linking to it might be somewhat dubious, since I’ve been in SEO circles since day one. SearchEngineForums is not only one of the oldest search-oriented forums on the Web, but one of the oldest Web forums, period. It was started by Jim Wilson, who has since passed away. WebmasterWorld has mostly taken its place in spirit. Over the years, I’ve worked as something of an intrapreneur (rather than an entrepreneur) at companies like Prophet 21 (since bought by Activant) and Scala Multimedia. There have been certain benchmarks over the years that have helped me gauge what was going on with search.

Searching on my own name was, of course, one of them. And the Google Jagger update was significant in that newer sites, but not too new, suddenly had an edge over long-standing sites, which you might call stale. But another benchmark I occasionally monitor is the term “distribution software”. It was relatively easy to conquer across all the engines of the day, and it has sustained its positions remarkably over the years. So, it was with great interest that I watched when the new purchasers of Prophet 21, and its awesome 3-letter domain p21.com, forwarded the Prophet site to an Activant third-level domain. I don’t think the third-level domain had been around for very long, but the Activant site had. So, would it incur the sandbox penalty? Would it maintain its across-the-board top positions? Was Activant unwittingly walking away from one of its potentially most valuable acquisitions and assets?

The answer is that the Google juice transferred over from www.p21.com to distribution.activant.com very smoothly, at least for the benchmark keyword that I still monitor. The sandbox penalty had been evaded by using a sub-domain of a long-standing second-level domain. If you search on “distribution software” on Google, Yahoo, MSN and Ask Jeeves, you will find Activant as the VERY TOP result in three of the four engines, and #3 in Ask Jeeves (which still shows the old domain).

This teaches us several lessons. Across-the-board fortified results of the sort I achieved (with help from a fellow named Steve Elsner) are transferable. The transfer can occur in a relatively short period of time (a matter of months). A sub-domain can quickly acquire a great deal of clout—probably more quickly than a newly registered domain, given the new Jagger reality. And when I left P21 back in 1999, I left the Web pieces in some very good hands; someone at Activant took a gamble that paid off and taught me some important lessons about the SEO landscape as it exists at this particular instant.

Over time, a great deal of evidence mounts up that such-and-such a site is relevant on such-and-such a topic. These breadcrumb trails (mostly link topology) point back to hardwired domain names. So, changing a domain name is serious business.

I have another situation similar to the one above, but there the transfer of considerable existing-site clout was to a brand-new domain name. This was December of 2004, before anyone knew newly registered domains were about to have the wind taken out of their sails. The site appeared in the top results in Google on its keywords within four months of launch, right on schedule and in line with our time estimate to the client. But then it dropped out and didn’t come back.

The client cringed. We cringed. We applied about as much “upward pressure” as we possibly could without crossing ethical boundaries. I was convinced we were worse than stuck in the sandbox, because we had held the positions for quite some time and lost them. Then, news broke of the Jagger update. I totally understood the reasoning, and did what I always do when that happens: I metaphorically climbed into the heads of the Google engineers and rummaged around in there for a while. The insight: if a domain was registered specifically for spamming, it would only be registered for a year. If a site survives past that 1-year boundary, then bam! You’re out of the penalty.

So, I gave the client the time estimate based on the new domain launch. I laid out their options, and the risks of sticking it out or bailing to the old domain name too soon. They took our advice and stuck it out to get past that 1-year point, and it paid off. I nailed the time estimate of when the sandbox/Jagger penalty would lift down to the week. It was 1 year and 3 weeks after they launched the new site.

One of my claims to fame in SEO circles in the early years was my mission to conquer a two-word keyword combo that landed squarely in the crosshairs of Macromedia, Apple, and a number of other companies: “multimedia software”. I achieved similarly fortified results on this two-keyword combo as I did with “distribution software”, and over the years, it has continued to hover around position #5. And although the term multimedia is so ’80s, it is also highly competitive—maybe not in bidding, but certainly in how many products I had to push down. After I nailed the two-word combo, I moved on to just the single term “multimedia”. I drove that sucker almost up to page one before I moved on to my next ventures. I also worked the term “digital signage”, which was MUCH easier, since it was a bit more off the beaten track. It has remained one of my benchmark keywords for taking the pulse of the search landscape.

At Connors Communications, my job is really cut out for me. It’s a PR firm, and doesn’t have the ace up its sleeve that both P21 and Scala had—a product and a user base. Yes, a product and user base are two of the most valuable tools for SEO. With a product, you can offer a free downloadable version, which triggers the viral marketing effect like little else. Everyone adds you to the download sites, and you suddenly have both inbound links AND buzz. But you also have a user base who, for better or for worse, are going to talk about you in forums, blog about you, and link to you on their websites (sometimes even other corporate sites, if your product is corporate). It’s even better if you have a network of dealers, distributors and legacy users, which Scala did. It was mostly a matter of directing momentum—or as Sun Tzu would say, throwing rocks on eggs. SEO for Scala was quite easy.

But Connors is a PR firm, which is a service. By nature, it can only serve a small number of clients at any one time. And no matter how talented the Connors crew is (and they are VERY talented, having launched Amazon, Priceline, and most recently, Vonage), it is still just a PR company without the advantages of a product or installed user base. So what is the hook I can hang my hat on from an SEO perspective? What will my benchmark keywords be for Connors? And how do I leverage all the zillions of search hits I’ll be generating for them with SEO if we can’t take everyone onboard simultaneously as clients?

The answer to “what keywords” is “pr firm”, for which we’ve risen to page one in MSN, page two in Google and page two in Yahoo. This serves as a beachhead for other keyword combos (more on the beachhead concept in later posts), and shows that the methodologies I developed are not only fortified across time (P21 and Scala), but also work across industries. So, the next step is to productize an aspect of the PR industry that is exciting to everyone, and can seem in many ways like a downloadable product. Once again, enter HitTail.

Added WiFi Hotspots, and Paying Less

So today I joined the ranks of the wireless warriors. I was on a $60/mo T-Mobile plan that got me 600 peak hours, unlimited nights & weekends, plus unlimited Internet and downloads (over the phone only). But now, with my shiny new Averatec that everyone thinks is an Apple iBook, I have the itch to walk NYC, sitting down in any Starbucks to do my work. So after about a half hour of talking to a helpful T-Mobile rep named Sidney, I came up with a combo that gets me everything I want.

I’m now spending $10 less per month, and I have unlimited T-Mobile WiFi hotspot access from my laptop as well. What I gave up were 90% of my peak hours and evenings. After I got off the phone with them, I suspended my XP laptop at home, walked over to the closest Starbucks, and connected. A remote desktop session that I had running to my PC at the office came right back, even though it was a different WiFi network and the laptop had been in suspend mode. Things have really improved.

Blog pinging and Pingomatic

I plan on understanding a lot more about how pinging works in blogging systems. I’ve built blogging systems before, but that was before all this pinging stuff was going on, so nothing on that system becomes part of the blogosphere proper. I just submitted HitTail at the Pingomatic site, and a lot can be learned just from that process—not the least of which is simply the names of the different pinging services. Pingomatic even shows you the feedback from each ping.

I don’t know if Pingomatic is using Web Services or simulating a webpage submit. But if this isn’t an application built for Web Services, I don’t know what is. For posterity, and for later review, I did a screen capture. Now that I did this ping, HitTail is really going to be in the blogosphere, because who knows what happens as a result of a one-time ping. There will very likely be discovery-bots sent out, and automatic revisits without pinging by proactive news gatherers. And for the sake of interest, here’s the spider visitation within the first half-hour of doing this blog-ping. Some of these are crawlers that I haven’t seen on HitTail before. Hmmmm…
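
For the record, the underlying protocol is tiny: a single XML-RPC call. Here is a sketch in Python of what a direct weblogUpdates ping might look like; the endpoint URL and the response shape are what the ping services commonly document, so treat the whole thing as illustrative rather than gospel:

```python
import xmlrpc.client

def ping(blog_name, blog_url, server="http://rpc.pingomatic.com/"):
    """Send a weblogUpdates.ping, the same XML-RPC call the ping
    services relay to aggregators. Typically returns a dict like
    {'flerror': False, 'message': '...'} on success."""
    proxy = xmlrpc.client.ServerProxy(server)
    return proxy.weblogUpdates.ping(blog_name, blog_url)

print(ping("HitTail", "http://www.hittail.com/"))
```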

TrackBack, Link Farms, Jagger Update and Blogger

How much do I miss the TrackBack feature by going with Blogger? Not at all. Why? New information shows just how unhelpful, and perhaps even damaging, it can be. Perhaps Blogger intentionally never implemented TrackBack, knowing what was coming down the pike from parent company Google—especially in light of the Jagger update from last November. Reciprocal links were penalized—or at least stopped delivering as much value as they used to. If every link you receive is reciprocated, you’re a link farm—at least as far as the Web topology you’re creating is concerned.

And when dealing with this aspect of TrackBack, it’s generally all-or-nothing: you can use TrackBack or turn it off. The second strike against TrackBack is, of course, spam. People link to you and send a TrackBack ping specifically to get a link from your page, even if their site is totally unrelated. The reciprocal links with unrelated sites go even further toward creating that terrible link farm topology. There’s a thin line between the organic pattern created by a genuinely popular site and the pseudo-organic pattern created by link farms.

My money is on Google getting better and better at recognizing these automatic cross-link patterns. And like every other spam trap, there’s some sort of threshold. Stay below that threshold, and you’re golden. Go over that threshold, and your site is flagged for human review, or possibly even automatic banning.

The real way these blogging software companies should implement TrackBack is to get rid of the silly pinging and TrackBack codes. Blog posts don’t need a unique identifier; the permalink page has a URL, and that’s unique enough. The code system is too geeky, and can be gamed through automation. Analytics-like tracking systems built into blogs should simply recognize people following links. If a hit comes from a first-time referrer, the system should send a crawler out to check the validity of the page (not all referrers are accurate), and put that link into an inbox queue in the blog user interface. The person running the blog can then visit each of the sites and make a human evaluation of whether it’s worthy of receiving a link back. If it is, they checkbox it.
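
Here is a sketch, in Python, of how that referrer-driven alternative might work. All of the names are mine, and the “validity check” is just a naive fetch that confirms the referring page really contains a link back to you:

```python
import urllib.request

def page_links_to(referrer_url, my_url, timeout=10):
    """Crude validity check: fetch the referrer and confirm it
    actually contains a link to us (referrer headers can lie)."""
    try:
        with urllib.request.urlopen(referrer_url, timeout=timeout) as resp:
            return my_url in resp.read().decode("utf-8", errors="replace")
    except OSError:
        return False

seen = set()   # referrers we've already processed
inbox = []     # queue awaiting human review in the blog UI

def record_hit(referrer, permalink):
    """Called on every page view. First-time, verified referrers go
    into the inbox; a human later checkboxes the worthy ones."""
    if not referrer or referrer in seen:
        return
    seen.add(referrer)
    if page_links_to(referrer, permalink):
        inbox.append(referrer)
```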

This has a number of advantages. First, the human checking process will block spam. Second, it will pick up many more referrers than the TrackBack system in its current form, which requires action on the part of the person linking to your blog. This information is already being passed back and forth; why not use what’s already there? Third, it serves as a sort of competitive intelligence gatherer for the blogger, who gets to see all referring links to their blog as a matter of interest, without any obligation to link back.

The time has come, the Walrus said, to speak of many things. The do’s and don’ts of SEO, of tracking-back and pings.

An addendum to this post, moments after I published it: in going into Blogger’s settings, I discovered the “Backlink” feature. It sounds like it’s implemented much like I imagined. No codes are necessary. You just turn it on. So, I did (to get the experience). If I think it’s starting to create a link-farm pattern, it gets turned off, pronto. It will be interesting to see what happens. It says that it uses the “link:” feature, which makes me think that the referring site has to be in the Google index, and perhaps even have passed whatever criteria they use to reduce the number of actual results reported by it. That would perhaps deal with the spam issue, if the site linking to the post needs to have, say, a PR of over 4.

Under-Promise, Over-Deliver

I woke up remarkably early, all things considered, and sent out an email informing my team I’d be taking Monday on the HitTail project. I have such momentum, and am so ready to wire up the main homepage to be ready for the first visitors, that I’d be crazy to go into the office, engage, and risk putting it off for another week. My to-do list for today looks like this…

  1. Create the template files.
  2. Put placeholder files in location for FAQ, SEO Best Practices, Why Sign Up.
  3. Fix the navigational links to point to these new placeholder files.
  4. Put the new navigational links into the Blogger templates.
  5. Figure out how the babystep tutorials are going to be linked in.
  6. Link in the first tutorial, and the respective spiderspotter app.
  7. Connect the submit form to lead management.
  8. Start putting content on the placeholder pages.

The work that I need to do in my second round of intensive focus includes…

  1. Final thought-work on the actual HitTail app.
  2. Creating the HitTail app.
  3. Giving a flavor for its power directly on the MLT homepage.
  4. Start communicating with the people who signed up early—probably create a public forum, so I can efficiently communicate with all of them at once, and they can communicate with each other.
  5. Ensuring that the conversations that are developing into Connors new client prospect opportunities are being handled properly.

One pitfall to avoid is getting sidetracked acting on the information that the Spider Spotter app is revealing to me. For example, the Bitacle bot has been trying to retrieve my atom.xml file from the wrong location. I realized the path I set to the XML feed in my Blogger settings was incorrect. I fixed it, but also realized I had an absolutely fascinating app to write: one that measures the time between me submitting a blog entry and spiders requesting that page or the data feed.
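
Here is a sketch of that measurement in Python, under the assumption that you log the publish time yourself and can match crawler requests by URL and user-agent; everything here is hypothetical scaffolding:

```python
import datetime

publish_times = {}  # permalink -> datetime when I hit "publish"

def note_publish(permalink):
    publish_times[permalink] = datetime.datetime.utcnow()

def note_crawl(permalink, user_agent):
    """Call from the tracking code on every request. Reports how long
    after publishing each spider first fetched the page or feed."""
    published = publish_times.get(permalink)
    if published and "bot" in user_agent.lower():
        delta = datetime.datetime.utcnow() - published
        print(f"{user_agent} reached {permalink} after {delta}")

note_publish("/blog/under-promise-over-deliver")
note_crawl("/blog/under-promise-over-deliver",
           "Mozilla/5.0 (compatible; Googlebot/2.1)")
```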

I think I’ll make a list of lists that I need on the HitTail site.

  • Apps that I need to write (which will also become tutorials)
  • Markets, industries and technologies that I want to target
  • People that I need to reach out to (the influencers), and the message I need to deliver to each, based on their interests
  • Topics for the SEO Best Practices section
  • Questions for the FAQ section
  • Topics that I intend to blog about
  • Pitfalls to avoid

Perhaps the biggest pitfall of all is over-promising. There is little as damaging as building up expectations, only to let people down. I run the risk of over-promising to two different audiences: Connors (specifically, Connie), and the people who sign up early for HitTail. I have to start with a small but potent kernel. HitTail will be modest in its reach, but designed to strike a fundamental chord—one that’s in the tornado’s path. There’s no need to over-promise, because that small kernel is enough on its own, and I have to focus on over-delivering that one small piece.

That’s very Web 2.0 thinking, by the way. Because everything interoperates relatively easily, people can write mash-ups based on your app. Each person writing a mash-up is likely to have way more expertise in their problem domain than I do, so what they write USING my service is better than what I could write alone. My role then becomes putting out a few sample mash-ups to stimulate everyone’s imaginations.

The HitTail app will be one of the first Web Services for SEO. Hopefully, it will have the same attractiveness as tag clouds, blogrolls, bookmarks, and all the other things that are serving as mash-up fodder and material for blog templates. Of course, Google Maps is the ultimate mash-up service, and I will continue to use it for inspiration. But no over-promising!