SEO Secret Sauce: The Prolific Writer

Not long ago, I was a prolific writer—when I was not expected to be. Now that I HAVE to be a prolific writer, I find it more difficult. First, at Prophet 21, before it was acquired by Activant, I delved into SEO and was one of the most active posters on the then-dominant Search Engine Forums (SEF). It has since been replaced by Webmaster World for the geeked-out, in-the-know SEO crowd. Everything was new then, I wasn’t in the SEO business, and I felt compelled to share—in part out of altruism, but also out of ego, and to stick it to the SEO secret-holders. Anyone holding their cards too close to their chest is a bullshitter, I felt. So, I said it like it was, and was one of the first people to lay out precisely why keyworded previous/next arrows were disproportionately effective in SEO—they pander to Google’s PageRank algorithm WITHIN the site. Since then, and perhaps consequently, both MovableType and WordPress have instituted keyworded prev/next arrows.

Next, I decided that if I was such hot shit at SEO, I would put my money where my mouth was and go for a slice of the pie. Instead of affiliate programs, I opted for a straight cut of worldwide gross revenues of a technology company in an emerging market—digital signage. I hitched my wagon to the Scala star, and rode it for about four years. I ran into an unfortunate sysadmin during this experience, and never really received the programming support I needed to take my work to its next level. So, while I made some very nice money proving my concepts, I could never mainstream it into the next big thing as I had planned. And this was still back circa 2001, right during the Bubble Burst, and before everyone woke up to the significance of search and the power of arbitraging other people’s traffic. But I was a prolific writer at Scala, relentlessly blogging on the topic of digital signage, much to the chagrin of Scala’s competitors, such as WireSpring and WebPavement. I remained prolific until August 2004, when I joined Connors and NEEDED to write.

So at Connors, I focused on delivering my secret sauce to Connors clients only! I got a touch of the paranoia that I humorously observed in my SEO counterparts while at Prophet 21 and Scala. And I started wondering whether I could ever really write again about the secret sauce. Webmaster World was the place to be to impress peers and feed one’s ego. But building influential personal blogs such as Battellemedia and Micropersuasion was the way to build audience and attract new prospective clients. But what could I write about? This was also a problem with speaking opportunities. I was such a back-room SEO guy that all I had to say was stuff that gave away the family jewels—tidbits like the power of prev/next arrows!

Meanwhile, things are changing at an ultra-rapid pace. Ruby on Rails comes along, and the concept of an agile framework is on people’s radars. There goes the special edge of my “generalized system” (where apps spring into being virtually before the product spec is written). Next, Google buys Urchin and makes Web analytics free. There goes the possibility of selling what I built as an analytics package. The world is catching up with me on all fronts but one: how to know WHAT keywords to optimize for the biggest return. And along comes Chris Anderson with his fateful 2004 online article about the long tail distribution curve and how it applies to search.

The SEM people (those who sell clicks) rapidly jumped on the bandwagon, and said, “Hey, up the keywords! It doesn’t cost you any more, and it may help.” This is a good premise, but the nuance is that it works even better when you don’t pay a cent! That’s right! Writing about an obscure but promising topic and putting the content in optimized format on your website simultaneously optimizes for ALL search engines. You don’t have the overhead of another keyword in your campaigns to manage. You don’t have to pay if a click occurs. And it’s a permanent addition to your website, working for you 24×7.

Two metaphors come into play: the snowball effect, and the iceberg principle. I will probably write about both of these separately because they are such important topics. But the snowball effect states that if you keep adding just the right content to your site at a consistent rate, your site will increase in effectiveness as it grows in size and search-appeal. It’s “a-peelin’ off a layer of someone else’s search traffic.” At any given moment, there is a fixed amount of search traffic occurring on the planet, and we’re in a battle of capturing it before the next guy does.

Secondly, the iceberg principle states that if someone searched past that magical 3-page limit and STILL found your site on a given keyword (or keyword combo), then you MIGHT just be seeing the tip of the iceberg. If you could somehow tweak your page onto the first page of results for that keyword in the search engine result pages (SERPs), then you might receive tons more traffic on that term. That determined searcher actually volunteered competitive intelligence to you and dropped it in your lap to use or not. HitTail is about using it! It is a powerful tool in the hands of a prolific writer.
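Incidentally, that page-depth signal sits right in the referrer string. Here’s a minimal sketch (my own illustration, not HitTail’s actual code, and the page size and threshold are assumptions) that flags searchers who clicked through from beyond the 3-page limit by reading the start parameter Google puts in its result URLs:

```python
from urllib.parse import urlparse, parse_qs

def deep_search_terms(referrer, page_size=10, page_threshold=3):
    """Return (keywords, result_page) when the referrer shows a search
    that clicked through from beyond page_threshold, else None."""
    query_string = parse_qs(urlparse(referrer).query)
    keywords = query_string.get("q", [None])[0]
    if keywords is None:
        return None
    # Google encodes the result offset in the "start" parameter:
    # start=40 with 10 results per page means page 5.
    start = int(query_string.get("start", ["0"])[0])
    page = start // page_size + 1
    if page > page_threshold:
        return (keywords, page)
    return None

# A searcher who dug all the way to page 5 before clicking through:
hit = deep_search_terms(
    "http://www.google.com/search?q=digital+signage+roi&start=40"
)
```

Every hit this function flags is a keyword combo that survived a determined searcher’s patience—exactly the tip-of-the-iceberg terms worth promoting to page one.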

And now that Connors Communications has gone public with its secret sauce for SEO, and I’m free to talk about it—watch out! The prolific writer is back. And just as I spelled out the power of prev/next arrows with keywords, I’m going to be blowing the lid off of as many closely held techniques of SEO as I can, in order to level the playing field and create some opportunity for Connors. So, what opportunity is there once all the “process” secrets of SEO are disclosed? Simple! They’re not easy to implement—especially at large companies. The old 3 C’s of the Internet used to be content, community and commerce. The new 3 C’s of SEO are consensus, clarity and confidence. In other words, for large companies to stand even a minute chance in this blogging world of ours, they need to build consensus between the marketing and IT departments. The projects that are going to be carried out need to be spelled out in crystal-clear fashion (clarity), and everyone, especially the IT department, needs to be completely confident that it can be done without breaking everything or costing a sultan’s ransom in infrastructure overhauls.

And I know the prolific writer is back, because I have to discipline myself to stop there. I really could go on forever.

Long Tail Keywords

So, HitTail is finally undergoing a soft launch. My plan is to get a few thousand users on relatively low-volume sites. People who already have high-volume sites will hardly need the extra boost that HitTail provides. It’s people who are just establishing themselves that will benefit most. People launching new websites will actually have the ability to watch every search hit as it occurs! In a way, it’s sad, but it is also a strategic advantage, because HitTail will show you exactly where your content needs further refinement — the long tail keywords!

One of the important principles of HitTail is that it does not work in a vacuum. An existing site with a little bit of content built up is a perfect starting point, because HitTail will inform you of keywords that are on the edge of breaking out. But if you think you should be coming up on a diversity of terms and you’re not, you have the chicken-and-egg problem, and it is time to float a few test balloons.

There is no easier publishing mechanism for floating test balloons than Blogger. The beautiful part of Blogger is that it can plant your entire blog into a subdirectory of an existing site, if you have FTP access. The advantage there is that the snowball effect you are trying to achieve is on a per-domain basis, usually taking into account all three levels of the domain name. So, using TypePad achieves little for a corporate site, because you can’t plant it within your own domain. You can with MovableType and WordPress, but you would need an unusual level of access to the corporate webserver. Better to FTP it into place.

Test balloons usually entail taking a writing recommendation under the Suggestions tab of HitTail, and only investing a half-hour to an hour of writing time on that topic. One of the dirty secrets of SEO is that your main objective is achieved merely by using blogging software and getting your title tag correct. You can almost leave the page blank and get SOME results. But it’s better to put out a quality page with quality content. The reason it’s a test balloon is that it will result in MORE suggestions being issued in HitTail. There will be word combinations you never anticipated, with a word from the top of your page combining with a word on the bottom of your page, and some very determined searcher going many pages into the engine and finding you. This registers with HitTail, and begins to solve the chicken-and-egg problem.

For those people who have been lucky enough to discover the soft launch of HitTail, feel free to get started. Put the HitTail code on every page of your site through the template system. With Blogger, it is a breeze. Just copy and paste it as-is into your template somewhere between the body tags—but not between special Blogger tags. Seeing the results should be almost instantaneous. Take the first reasonable suggestion, and make a new blog post based on it.

Writing for SEO

Have you ever gotten into the mood where you are just a writing machine? If so, you are one of the most valuable people in marketing today. Just about anyone can plagiarize off the Web, and change a few words here or there. And low-priced outsourced employees can put target keywords together in new sentences and combinations for AdWords campaigns. But the plagiarized pages will eventually be filtered as spam, and the AdWords campaigns will eventually stop running. Only original content, containing valuable and preferably timeless information, will endure.

Such are the pages that achieve the largest double-whammy objective: acquiring new links. New inbound links that occur spontaneously without link-begging or link-exchange are like gold. They play to the Google BackRub formula, but they also leave a trail of references that actual Web travelers can follow back to your site, search engines aside. As long as they are not reciprocal, and you’re receiving links from an average distribution of people around the world, your site will be achieving this critical objective in the most desirable organic fashion.

And the whole process begins with writing—and writing at a decent rate. Why? Because search engines are biased to consider newly discovered content and float it temporarily as a test balloon. That means new content always receives a temporary boost, and explains why new site launches/re-launches are so often accompanied by unsustainable natural search gains. The effect wears away in a few months. Google’s patent application from March of 2005 showed us that Google actually considers the “delta” information between one full site crawl and the next.
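To make that “delta” idea concrete, here’s a back-of-the-envelope sketch (my own illustration, not anything from the patent itself) comparing per-page checksums between two crawls. On a brand-new site, even a couple of new posts make the changed fraction enormous:

```python
def crawl_delta(previous, current):
    """Fraction of pages in the current crawl that are new or changed
    since the previous crawl, comparing per-URL checksums."""
    changed = {url for url, checksum in current.items()
               if previous.get(url) != checksum}
    return len(changed) / len(current)

# A tiny, brand-new site: one edited page plus one new post means
# two-thirds of the whole site changed between crawls.
old_crawl = {"/": "aaa", "/about": "bbb"}
new_crawl = {"/": "aaa", "/about": "ccc", "/blog/post-1": "ddd"}
delta = crawl_delta(old_crawl, new_crawl)
```

A thousand-page site adding the same two pages would show a delta under one percent, which is why steady publishing matters more, proportionally, in a site’s earliest days.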

New writing goes into this set of fresh content. And if you’re publishing with blogging software, you have the additional boost of the new-post ping system and the whole additional set of crawlers and news feeders that it brings. This is part (but not all) of the reason blogging software brings an unfair SEO advantage over crusty old content management systems that never had search friendliness as part of their criteria. And a company that hires a professional “writing cannon” as a member of its marketing team will be very well served. Their work, instead of just transient press releases, will become a permanent part of the tapestry that is the Web, and their work will be put to work 24×7 generating new prospects.

Google Trends

Until today, Alexa has been the place to go to get a semi-authoritative and free gauge on traffic. But with the release of Google Trends, there is another must-see site. It is along the lines of BlogPulse, but much more authoritative, because it is tied into the most used search box in the world. It is, however, interesting to note that it has a similar problem to Alexa, in that all data is relative (not absolute). In other words, they’re not giving actual traffic numbers. They’re just comparing relative search volumes across times and between words. This is so similar to Alexa’s relative rankings that I cannot help but feel it is a general strategy they use to keep themselves from being called to task on their numbers.

Whenever you’re looking for hard and fast numbers on Web traffic, based either on keywords or on site visitation, you have to turn to such companies as comScore and Hitwise, each with its own dubious methods. comScore essentially runs software on enough individuals’ PCs to get an average cross-sampling. Alexa, too, runs special software, in the form of its toolbar. Hitwise has the superior method of cutting deals with ISPs to get a direct and pure sampling, though one potentially skewed by region due to ISP service areas.

Alexa is the only one so far that has turned around and put its data out for popular general use. But their numbers can be challenged if put in absolute terms. The solution is to make it relative, and I am never done explaining their Traffic Rank, Reach and Page Views. Of these numbers, Reach is the funniest. They call it reach “per million viewers”. They are essentially saying “parts per million”, measuring the traffic that visits a site as one would measure ammonia levels in a fish tank. The trick is, you can’t tell how much ammonia is in the fish tank without being given the total volume of water. So, you can’t know how much traffic is reaching a site in absolute terms unless you’re given the total number of Internet users.
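If you like the arithmetic spelled out, here it is in a few lines of Python. The one-billion-user figure below is invented, which is exactly the point: you have to supply the size of the tank yourself before “reach” means anything absolute.

```python
def absolute_reach(reach_per_million, total_internet_users):
    """Convert a 'reach per million users' figure into an absolute
    visitor count -- only possible once you know the total 'volume
    of water in the tank', i.e. the number of Internet users."""
    return reach_per_million * total_internet_users / 1_000_000

# Hypothetical: a reach of 250 per million, against an assumed
# one billion Internet users worldwide.
visitors = absolute_reach(250, 1_000_000_000)
```

Change the assumed user base and the “absolute” number swings with it, which is precisely why Alexa never has to defend it.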

Similarly, while the Google Trends tool is amazingly useful in comparing keyword traffic on a word today versus last week, or between keywords A and B, it’s not telling you traffic on a keyword in absolute terms. This is still a process of piecing it together from the clues.

Superior Search Optimization Suggestion Tool

I cannot help but be disappointed each time I try a new analytics package. They are all concerned with Top-5 this, Top-10 that. They, by definition, distill all the data down into reports that show you only the most important data, ignoring all the granular information, where the long-tail goodies exist. This, by definition, makes traditional analytics packages a poor choice as search engine optimization tools. But the alternative seems worse—being overwhelmed with log file data. Either way, paralysis is the result.

HitTail specifically addresses data analysis for natural search optimization by keeping the entire data granularity, but skimming off only the bits that are important for SEO. As a result, you can easily look at the details of every single search hit that led to your site. You can even go as far as clicking to reproduce each and every search engine hit, reproducing the first page of your visitors’ site experiences.

Log files need to be deleted, or else they will fill your hard drives. Analytics packages have similar limitations. Neither keeps all the cumulative data that represents the life of your website from a natural search optimization standpoint. When did your first search hit occur? When did the first search hit on each new term occur? What was the latest new term to lead to your site? No data is left there to be mined. You essentially lose all of your memory after each reporting cycle. It is therefore impossible with today’s tools to create a superior search optimization suggestion tool that uses all the available competitive intelligence off of your own site! Enter HitTail.
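Here’s a sketch of the kind of cumulative memory I’m talking about (purely illustrative, not the actual HitTail internals): keep the first-seen date of every search term forever, instead of losing it at each reporting cycle.

```python
from datetime import datetime

def first_seen_terms(hits):
    """Remember, for every search term, the first time it ever led a
    visitor to the site -- the cumulative memory that log rotation
    and report-cycle analytics throw away."""
    first_seen = {}
    for timestamp, term in hits:
        first_seen.setdefault(term, timestamp)  # keep earliest only
    return first_seen

hits = [
    (datetime(2006, 1, 25, 9, 0), "hit tail"),
    (datetime(2006, 1, 26, 14, 30), "long tail seo"),
    (datetime(2006, 1, 27, 8, 15), "hit tail"),  # repeat term, ignored
]
first = first_seen_terms(hits)
```

Because nothing is ever summarized away, questions like “when did this term first find me?” stay answerable for the life of the site.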

Blogging and SEO

Getting your content out in a blogging system is much more effective for SEO than by any other content management system, because there is a vast network of new web-crawlers out there competing on many levels to pick up and carry your content. Yahoo, Google and Technorati are competing to pick up all blog posts first.

News aggregation sites specializing in verticals are competing to pick up content on particular subjects. You can see this in technical topics like Java and Ajax right now, but the trend will trickle into every industry, as it is the lazy way to create those “authority sites” that are so important for SEO. In the past, spammers were doing “search scraping” to accomplish much the same thing, but mostly to have pages on which to hide AdSense.

Today, RSS feeds are being combined to make much more professional and sensible sites. But the same dynamic applies. They are attempting to intermediate communication. Not long ago, the concept of disintermediation was a buzzword, due to manufacturers’ sudden ability, thanks of course to the Internet, to sell directly without going through distributors or retail outlets. Today, content is a product, often being given out for free by bloggers who put entire articles into news feeds, and it is an invitation to intermediate. Just as with spamming, there is little to no cost or risk, and there is a great deal of up-side.

The answer is simply to set your blogging software to only include a portion of your post in the RSS feed. Whenever you post, all the blog crawlers will pick up the new content. The only full copy of the search-optimized content will exist on your site. And the news aggregation sites will get your small paragraph, which is fine because it will become a highly desirable non-reciprocal link.
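Most blogging software has a summary-feed setting that handles this for you, but the idea is simple enough to sketch as a hypothetical excerpt function (not any particular platform’s implementation):

```python
def feed_excerpt(post_text, max_chars=300):
    """Keep only the first paragraph (capped at max_chars) of a post
    for the RSS feed; the full optimized copy stays on your site."""
    first_para = post_text.split("\n\n")[0]
    if len(first_para) <= max_chars:
        return first_para
    # Cut at a word boundary and signal there's more to read.
    return first_para[:max_chars].rsplit(" ", 1)[0] + "..."

teaser = feed_excerpt(
    "A short teaser paragraph.\n\n"
    "The full, search-optimized body of the post stays home on the site."
)
```

The aggregators get the teaser and a link; your site keeps the only full copy of the content, plus a non-reciprocal link for every pickup.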

Blogger SEO advantage confirmed

Well, the HitTail site ended up being in the Google SERPs today for the first time. This is concrete evidence that a previously non-existent domain can go from registration to being served in results in just about 2 weeks — and that using Blogger as an on-site blogging tool provides some advantage. I’ll be watching to see if it sticks. The page that got picked up was the blog index page.

To find the page, I searched on the site’s tagline in quotes. It’s hard to find in the results if you don’t use the quotes. And the results are cycling between at least 2 sets of results, one of which doesn’t yet have the blog index page. That shows I’m spotting this just as the new results are coming live.

The Google “site:” command shows that only that one page, the blog index page, has been picked up and indexed. The homepage has only been recognized, but not indexed.

And if you’re logged into your Gmail account when you do these searches, you get additional data! It shows the exact time it was posted: 5:39 PM. Unfortunately, that was 5:39 PM on February 2nd, which was 5 days ago. I speculate that this comes from the additional data Google has available about the page from Blogger, since Google owns Blogger.

Collecting Contact Information

Well, it’s been a few days without a HitTail post. Tuesday ended up being an office/client day. I’ve got an interesting post in the works on search-friendly URLs and the concept of the weakest link in the chain in search optimization. Yesterday, I got myself set up on a VoIP phone at home that ties back to the office Avaya IP Office phone system. So from home, I can make outgoing calls as if I were on an office phone. Between these two items, and catching up on sleep lost over the past week, the last two days haven’t been my most productive. I plan on making up for that starting now.

I got a blog post out this morning that really starts to describe what the HitTail site and system is all about. A lot of my time has really gone into blogging, and not building the app. The reason for that is that the thought-work needs to be done to ensure that I’m doing the right work. But I’ve really got to move forward fast now. More and more people will be coming in as a result of my blogging, and I want to start building the list of beta testers. That makes the contact info app the most important, initially. And since I’m collecting contact info, I might as well integrate it with the Connors sales lead management and online conversation system. That’s my goal for today.

I’ve got to get both the BabyStep tutorial that I just produced, and the application itself, onto the site. I want it wrapped in the site’s look and feel, as minimal as it may be right now. So, what I do is finish the template in the content management system. OK, done. The Spider Spotter app now automatically takes on the look of the rest of the site, as will any new applications that I develop.

First, put a blank page from the CMS under each main category link. OK, done. Now that you have semi-permanent navigation, move it over to the Blogger template. OK, done. The rest of your work for today is activating the Sign Up Now and Start Conversation systems.

Now, it’s time to focus on contact info collection. There are two forms of contact info collection (three if you include the blog subscribe list, which I’m leaving to Bloglet for now). Time allowing, I’d use the Google API to program a list mailer myself, so it is more tightly integrated. But the 80/20 rule applies. The two main forms of contact info collection are “Sign Up Now” and “Start Conversation”. The process of each starts with just the email. That way, minimal contact info is acquired, along with some idea of intent, based on which submit button they used. The principle here is: capture fast. Even if no follow-up action is taken on the part of the prospect, you are now in a position to proactively follow up with them.

The submit will come in through the same program either way. The “Sign Up Now” will just give a parting message, thanking them and explaining the next step. It will also give one more opportunity to talk with Connors about PR. The “Start Conversation” submit will bypass all this and drop you directly into the sales lead management/discussion system. It’s sort of like a private one-to-one blog. Technically, it’s one-to-many, with you the prospect being the “one”, and Connors the organization being the “many”.

Every one of these discussions that comes in needs to be treated as priority #1. It could be the next Amazon, Priceline or Vonage. But it could also be influential early adopters of technology, and the people who will be pivotal in making HitTail erupt in buzz. And it also may just be the curious tire kickers, whom we will need to quickly identify. I’m sure that launching a public discussion forum is in the near future, so we can efficiently communicate to the entire audience for HitTail, and general next generation public relations issues.

The Evolution of Search Marketing

Search marketing is about to evolve, brought to you by the New York public relations firm that launched Amazon, Priceline, and Vonage. Connors Communications is about to do for natural search what AdWords and Yahoo! Search Marketing have done for paid search—create a logical system that any marketer can use to bring in more qualified traffic. Except this time, instead of jumping into the bidding frenzy over prohibitively expensive keyword inventory, you tap into the infinite supply of free keyword inventory in what’s coming to be known as the long tail of search.

Go to Google or Yahoo! and perform a few of the searches that you believe are important to your business, organization or cause. Does your site appear? If not, you’re experiencing the anxiety that’s driving companies to paid keyword campaigns, because after brief investigation, you discover that the alternative of cracking natural search is just too difficult. You run into the shadowy world known as search engine optimization (SEO) full of dodgy websites and conflicting advice.

It’s easy to become intimidated, and just put your resources in the same place as your peers, and the early majority of marketers are—paid search. Paid search has become mainstream, and demand for the most lucrative keywords has driven costs to sometimes prohibitive levels, reducing the advantage held by early adopters. The real deals in keyword inventory now exist where demand is lowest, and the work of figuring out exactly what those keywords are is hardest. Wired magazine writer Chris Anderson has done a great deal to popularize the concept, and I recommend you read his paradigm-shifting article.

At some point, every marketer worth his or her salt wonders whether focusing on the infinite supply of cheap keywords in the long tail of search may in fact be worth it. They cost so little and can pay off so big. As searches become more specific, the customer often becomes more qualified and valuable. Yes, it’s actually a flaw in the paid keyword business plan. But the answer lies not in a paid campaign, where managing tens of thousands of keywords becomes self-defeating, but rather in a way of doing natural search that any marketing department can handle in-house. Demand now exists for a mainstream alternative to paid search.

HitTail is a rote process of identifying the most likely keywords for targeting based on your existing traffic, automatically suggesting topics for writing. You then incorporate this writing into your website or blog, getting into the habit of friction-free publishing. Once the wheels are greased, and you’re comfortable rolling out new subject matter, it becomes an addictive process that’s easily on par with what some marketers jokingly call CrackWords. Feedback on the effectiveness of your writing is quick, more writing topics are suggested, and the process accelerates.

This is not a spamming system, as all the writing comes from you. The premise is that any market or industry has plenty to talk about, and it is your obligation to become the voice of authority in your market space. In fact, with HitTail, you kill two birds with one stone, solving your corporate blogging strategy simultaneously with natural search optimization. HitTail makes a blogging strategy many times more effective than it could be alone, by virtue of being the ideal “writing coach,” understanding your existing performance, and where your biggest gains can be made.

HitTail is not an analytics package, nor is it a conversion tracker, for that is available to everyone for free already in the form of Google Analytics. Instead, HitTail is a very lightweight tracking system built specifically for natural search optimization. There are no reports to customize or mountains of data to wade through. We will not cause paralysis through analysis. Instead, we go directly to actionable data. Just plug in the HitTail tracking code, and recommendations start rolling in. The more you act on them, the better the new recommendations become.

As you follow the recommendations, your natural search hits go up, and more data is available to HitTail. You don’t pay for these hits. They just work for you on a 24/7 basis with no campaign maintenance. If you stop HitTail, the hits continue. They’ve become part of the fabric of your company—a permanent asset. But if you continue with HitTail, success builds on success, and your site begins to build towards a sort of critical mass. The goal is nothing less than to dominate the category. With a little time and creativity, perhaps you can use HitTail to achieve the sort of category dominance that Connors Communications has helped bring to its clients time and again, before opening the system to the public.

How Quick Into Google Search Results?

So, how quickly will a brand new domain show up on Google? I should have been checking day-to-day, but today is January 31, and the site was technically launched on January 16th. That’s just over 2 weeks, the period of time traditionally quoted as the minimum for a Google sweep. So, now’s a good time to do a quick review. Thankfully, the domain name is a totally made-up name, and I can do some very insightful tests here.

I decide to search the specialized blog-content engine first, to ensure that at least the blog posts are in it. It produces 28 results, including the first test post made on a now-defunct test domain. Every post was made since January 25. So, in one week, every blog post has been included in the blog search.

Next, I search on HitTail in the default search, and I see one result. It’s the domain with no title or description. This is what’s often referred to as the Google sandbox. We can see that Google is aware that the domain exists, but is not producing any of the site’s content in the results. We see in the Spider Spotter app that the first visit by GoogleBot was January 25th, the same day I started blogging.

From the 25th to today is exactly 1 week (7 days). In seven days, we have gone from a previously unknown site to the domain being findable, but collapsed down to 1 page, and no actual page content in the results. How recent is it? A quick search on a couple of different Google datacenters reveals that even just this 1-page listing is only on a couple of datacenters, and non-existent in others. So I am indeed catching it during the process of propagation, and we have our undisputed evidence that a site can go from zero to listed in some form in 1 week. Have I avoided the Google sandbox penalty altogether?

And finally, we check for specific quoted content from the first blog post. I know it won’t show, but I’m at least running the test for the sake of completeness. So, it’s 1 week to show up at all. And it’s some time longer before content appears. After content appears, the results tend to “dance around”, nicknamed the “Google Dance”, before the data has propagated across all data centers.

Another factor affecting the results settling down is something people don’t talk about much. The Google patent from March of last year revealed that Google is very sensitive to how much a site has changed from one visit to the next. That is to say: How much of the site has changed? How many new links have been established to the site? And when a site is brand new, every few pages you add constitute a significant percentage of the overall site. So, Google is seeing a very volatile site, and the results are correspondingly volatile. Therefore, when a page is first discovered, it goes into what I think of as a moving window of opportunity. I believe new pages get this extra relevancy boost to see if they have the potential for gangbusters, fad-like success.

Fad-like success? Fads, I believe, overrule traditional rules of slow organic growth. These are pages that somehow become massively popular and everyone starts linking to, passing around in email, finding due to events in the news, etc. If a page does suddenly become massively popular, Google sees this, because they’re quietly recording click-through data, similarly to how DirectHit did back in the day. But DirectHit’s system, subsequently merged with Ask Jeeves, was ultimately defeated, because by touting that they were doing this, they invited click-stuffing abuse. Google, on the other hand, not only doesn’t advertise click-through tracking, but uses very clever JavaScript to keep it from even looking like it’s occurring. It’s not evil. It’s just smart. And if a site goes gangbusters, there is a totally organic pattern created that is difficult to fake, because there are hundreds of links from non-related sites, and thousands of click-throughs from disparate IPs that couldn’t possibly be under one person’s control. This fad traffic pattern then “buoys” that page’s relevancy in future searches. This is just speculation based on observations, but it only stands to reason that certain relevancy criteria can outweigh the others when those criteria are both particularly difficult to fake and wildly out of balance with the rest.

Anyway, what are my conclusions? This test proves…

  • How long does it take to go from zero to being in Google results at all? 1 week.
  • How long does it take to go from zero to being in Google results in a meaningful way? Verdict not in, but expected soon. Stay tuned.
  • How long does it take to go from zero to being in Google results in the stable and decent fashion necessary to drive sales? We won’t know for three to six months.
  • How long does it take to go from zero to being viewed as a healthy, growing site worthy of regular, predictable inclusion of new content? Well, that’s the purpose of HitTail!