It’s my website’s 12th anniversary

Earlier this month (on August 7th to be exact), my website turned 12. Yes, I’ve been writing and publishing online since 2000!

I also got two unexpected gifts around the anniversary date. One of my articles got featured on the home page of WordPress.com (in the Freshly Pressed section). It was the second time I was featured, so it felt pretty good. (This was my first featured post.)

Another nice surprise was to see my stats counter fly past 1,000,000 views.

That’s over 1 million views since January of 2010, when I moved my site to WordPress.com. I had been self-hosting a WordPress install prior to that, and before that, my website went through several iterations of static HTML designs and a home-grown blogging engine written in ASP, running off an Access database (it was surprisingly stable). By my own guesstimate, the total number of views on my website since 2000 is somewhere around 10-12 million, if not a lot more. In previous years, I used to get about 3,000 visits (not just views) per day.

Web traffic is but one measure of a website’s value. I’m much more pleased by your interactions with my articles and your messages to me, particularly the ones where you tell me how much you like the content I publish here. I put a lot of effort into my posts, as I have from the start, and I’m glad when people see the quality.

So, a hearty thank you to all of you who are subscribed, particularly to those of you who are long-time readers! I hope you’ll stick around for another 12 years; I know I will! Cheers! 🙂


TechCrunch is now at WordPress.com

When did TechCrunch make the move to WordPress.com? I took a look at the top blogs today, and it was listed there, which means it’s no longer self-hosted; it’s literally at WordPress.com, under their VIP hosting program.

I did a quick search of their site, but they say nothing about their migration. I can imagine it was a grueling piece of work given the sheer size of the site and their various content embeds, like CrunchBase. With Google’s help, I saw that CenterNetworks wrote a post on 2/8 where they asked and got confirmation that TC is indeed hosted at WordPress.com, so it looks like they migrated sometime in late January or early February 2010.

I completed my own migration to WordPress.com (albeit not under their VIP program) on January 31st, and support from WP was very hard to come by during my migration. Perhaps they were busy at work on the TC migration?

Back when TechCrunch was a smaller operation, they were hosted at Media Temple, and continually ran Media Temple’s banners on the site. Then they moved to the RackSpace cloud, presumably after they outgrew Media Temple’s Nitro service. Apparently RackSpace no longer sufficed, for whatever reason.

What I do know is that WP’s own VIP hosting program is a compelling choice for those who need that kind of horsepower. Pricing begins at $500/month, with a one-time setup fee of $1,500, and your site will pretty much be able to handle any kind of traffic that comes its way.

Run WordPress by itself or cached?

I’ve been running an experiment for the past three months. I wanted to see how well WordPress would do if I ran it by itself, without any sort of caching. So far, so good.

About four months ago, my web server kept getting pummelled into the ground almost daily, and I couldn’t figure out why. After researching the issue, I found the prevailing opinion siding with the need for a caching plugin. People were complaining that WordPress just isn’t well optimized and must be run with the aid of such a plugin, otherwise higher levels of traffic will bring the web server down. Trouble was, I already ran my WP install cached, using WP Super Cache, and had been doing so for over a year, yet my server still went down. (I should specify it had only recently started going down.) What was I to do?

I posted a message in the WP forums asking why WordPress doesn’t generate static files. Were there any plans to do so in the future? To my surprise, Matt Mullenweg (WP’s founder) replied to my post and told me that while there are caching plugins out there, WordPress.com doesn’t run any, and they’re doing just fine hosting millions of blogs. Others chimed in as well, and their replies got me to make the following changes:

  1. Made the switch to a VPS (Virtual Private Server) with SliceHost. Four months later, I’m still very happy about that move.
  2. Doubled the RAM on my web server (to 512MB from 256MB).
  3. Turned off WP Super Cache and started running my site by itself.

I made these changes one at a time, so I could see why my server kept having issues. Switching to a VPS host was good and needed, but for my traffic levels, it wasn’t enough. Doubling the RAM was also good and needed, and while the new RAM is enough for now, I’d still be having problems if I hadn’t also disabled my caching plugin.

Here’s where I think the crux of the caching/non-caching issue lies: it’s the load placed on the server as cached versions of the pages get created. Normally, that’s a non-issue. But as I monitored my server carefully, I discovered that it went down only as it started to get indexed heavily by search engines. Their bots visited my site in spurts, with traffic peaking, then falling back down. They spawned multiple threads, over ten at times, following links and slurping up the content. It’s when bot traffic peaked that an incredible load was placed on the web server. The caching plugin kept generating cached versions of pages that hadn’t already been cached, RAM and CPU demand increased to unsustainable levels, and the server went down.
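If you want to spot these bot spurts in your own logs, tallying requests per user agent is enough to see who is hammering the server. A minimal sketch against a made-up log excerpt (the file path and combined log format are assumptions; point it at your real access log):

```shell
# Hypothetical excerpt of an Apache combined-format access log.
# On a live server this would be something like /var/log/apache2/access.log.
cat > /tmp/access_sample.log <<'EOF'
66.249.66.1 - - [01/Nov/2008:10:00:01 +0000] "GET /post-1 HTTP/1.1" 200 5120 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Nov/2008:10:00:02 +0000] "GET /post-2 HTTP/1.1" 200 4200 "-" "Googlebot/2.1"
72.30.0.5 - - [01/Nov/2008:10:00:03 +0000] "GET /post-3 HTTP/1.1" 200 3900 "-" "Yahoo! Slurp"
10.0.0.7 - - [01/Nov/2008:10:00:04 +0000] "GET /post-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
EOF

# Split each line on double quotes; field 6 is the user-agent string.
# Count requests per agent, most active first.
bot_counts=$(awk -F'"' '{print $6}' /tmp/access_sample.log | sort | uniq -c | sort -rn)
echo "$bot_counts"
```

Run during a spurt, the crawler user agents jump straight to the top of the list, which is exactly the pattern I saw before each outage.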

No amount of tweaking the Apache and MySQL config files helped with this sort of scenario, or at least it didn’t help me. You see, the difference between peak traffic levels with search engines vs. people is that people will go to a single article or a group of articles that are in demand. A caching plugin works great for those sorts of situations. There’s a limited number of pages to worry about caching, and those pages get served up time and time again. The load is acceptable. When a search engine bot starts indexing your site, it’ll call up any and all available pages that it can find. That can place a huge load on the web server as it scrambles to serve up those pages and build static versions for the caching plugin. I believe that it’s too much for most medium-sized servers to handle, and they will usually go down.
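For reference, the tweaking I mean mostly comes down to Apache’s prefork process limits and keep-alive behavior. The values below are purely illustrative for a 512MB VPS running mod_php (they are not my actual settings; each Apache child with mod_php can easily use 20-30MB of RAM, so size MaxClients against your own memory footprint):

```apache
# Illustrative prefork settings for a low-memory VPS -- tune to your server.
<IfModule prefork.c>
    StartServers          2
    MinSpareServers       2
    MaxSpareServers       5
    MaxClients           12
    MaxRequestsPerChild 500
</IfModule>

# Short keep-alives free up child processes faster under bursty traffic.
KeepAlive On
KeepAliveTimeout 3
```

Even with limits like these in place, a bot crawl that forces cache generation on every page can still pin the CPU, which is why tuning alone didn’t save me.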

In my case, disabling the caching plugin and making sure no traces were left in the .htaccess file were the only things that helped. Now, I might have up to four different search engine bots crawling my site, each spawning multiple threads, and my server will usually not go down. Sure, there are times when the server will get dangerously low on RAM, and will be unresponsive for 5-10 minutes, but that’s an acceptable scenario for me. And if I should all of a sudden get huge amounts of people traffic to a post, it’s possible that the web server will also become unresponsive, at least for a time. But the great thing about running WordPress by itself is that Apache will usually take care of itself. As the requests die down, Apache will kill the extra threads, the available RAM will go back up again, and the server will recover nicely. That wasn’t possible while I ran the caching plugin. When it went down, it stayed down, and that was a problem.
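Checking for those leftover traces is easy to script. WP Super Cache wraps its mod_rewrite rules in BEGIN/END marker comments, so you can strip the whole block and verify nothing remains. A sketch on a throwaway copy (the marker names match what the plugin used at the time, but verify against your own .htaccess, and back it up first):

```shell
# Hypothetical .htaccess with both the plugin's block and the normal WP rules.
cat > /tmp/htaccess_sample <<'EOF'
# BEGIN WPSuperCache
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule .* - [L]
</IfModule>
# END WPSuperCache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
EOF

# Delete everything between the WPSuperCache markers, inclusive (GNU sed).
sed -i '/# BEGIN WPSuperCache/,/# END WPSuperCache/d' /tmp/htaccess_sample

# Verify no traces remain; grep -c prints 0 when nothing matches.
leftover=$(grep -c "WPSuperCache" /tmp/htaccess_sample || true)
echo "$leftover"
```

The standard WordPress rewrite block is left untouched, which is what you want: only the caching rules go.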

I realize that what works for me may not work for others. I have not tested what happens with WP Super Cache on a larger server, for example one with more RAM. It’s possible that the larger amount of RAM there will offset the greater demand placed on the server as it builds static versions of the pages, although I’m not sure what to say about the CPU usage. That also peaked as the caching plugin went crazy. Not sure how that’ll work on a more powerful server.

WP Super Cache has some options that allow you to cache more pages and keep them cached for longer periods of time. Perhaps fiddling with those options would have allowed me to keep running the plugin, but I wanted to see how things stood from the other side of the fence. Like I said, so far, so good. Caveats aside, running WordPress by itself was the cure for my persistent web server outages.


Sorry about the growing pains

My sites were out of commission yesterday afternoon, evening, and part of the night as well. Each outage lasted anywhere from half an hour to a couple of hours, and drove home very clearly this message: I need a web server upgrade.

While it was certainly frustrating to see my sites go down, and to see that no matter how much I tuned Apache or MySQL, I couldn’t meet the traffic demands, it’s also encouraging to see that I first outgrew shared hosting plans, then outgrew a small dedicated server, and now have had to upgrade to a more powerful dedicated server. My site stats show this same trend. Traffic levels have been growing steadily throughout this year and even more in the last few months. October in particular has been rough on my little web server.

Yesterday, Google and Yahoo were indexing my sites, on top of the usual, fairly heavy traffic. I started having serious performance issues during the afternoon, which led to a small outage. Google got done with my sites after that, but Yahoo kept going, and Cuil, the new search engine on the block, joined the party as well. Cuil is known for taxing web servers heavily when it indexes sites, and it was merciless on me last night. It, together with Yahoo, brought my server down and kept it down for close to one and a half hours.
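One partial mitigation worth mentioning is the non-standard Crawl-delay directive in robots.txt. Yahoo’s Slurp honors it; Googlebot ignores it (Google’s crawl rate is set through Webmaster Tools instead), and by most accounts Cuil’s crawler, Twiceler, was hit-or-miss about respecting it. A sketch (the user-agent names are the ones those crawlers advertised at the time):

```
# robots.txt -- ask cooperating crawlers to wait 10 seconds between requests.
User-agent: Slurp
Crawl-delay: 10

User-agent: Twiceler
Crawl-delay: 10
```

It’s a polite request rather than an enforcement mechanism, so it wouldn’t have prevented last night’s outage on its own, but it can take the edge off a heavy crawl.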

I got it back up and re-tuned Apache and MySQL with Chris Johnston’s help, but at some point during the night, it went down hard and stayed down. When I woke, I decided enough was enough. It was high time I upgraded.

Thanks to my awesome hosting company, SliceHost, I was able to double the specs of my previous server in less than two hours. Before noon today, my little web server morphed into a larger, more powerful one that can handle the current traffic levels with ease. We’ll see how long it can keep up before I need to upgrade again. You can help there: I don’t mind at all if I have to upgrade again in the near future, should my traffic levels warrant it.

Thank you for sticking around!


Google should offer web hosting

Google has been tinkering with the idea of immediate indexing of changed website content for some time. Its current solution is to have you place certain files and bits of code inside your site that notify Google’s bots when your content has changed. But there’s a much simpler solution: why not offer web hosting? With content already on its servers, Google wouldn’t have to go out over an Internet connection to index it; it could do so much faster in-house.

Granted, this sort of decision from the juggernaut may run afoul of web hosting companies, but it’s a thought. If they’re giving away 2.5 GB of space for Gmail, why can’t they give away the same amount for web hosting, or offer paid plans?