Tag Archives: linux

I am looking for cheap, shared web hosting because the downtime and customer support I’m getting with JustHost is intolerable (and well documented). I’m willing to pay about $10/month for hosting, and I don’t ask for much other than reasonable uptime. I have a blog (ardamis.com) that gets about 1000 visits per day, a few other sites that barely get any traffic, and a small photo gallery that I share only with family and friends. I don’t stream or make available for download any video or audio files, but I do want to be able to upload all of my personal photos and use my web host as an off-site backup. I have an account with Dropbox, but I have gigs of photos, so I would have to purchase extra storage, and I kinda want to keep the Dropbox stuff separate. I’ve also considered using Google Drive, but I have enough photos that I would need to purchase additional space.

I’m currently paying $108/year for hosting and an additional $19.99/year for a dedicated IP address. I’m willing to pay a little more, but not terribly much more.

Before JustHost, I was a GoDaddy customer for something like 5 years, and there was almost no downtime. Customer service was incredibly, surprisingly good, and the reps were always knowledgeable and effective. I left because I didn’t like the proprietary admin panel that GoDaddy had developed and wanted to keep my domain registrar and my web host at separate companies. But really, there was nothing wrong with GoDaddy’s hosting at the time, as far as I could tell.

The problem with researching new web hosts is that unbiased information (if it’s out there) is buried under tons of completely untrustworthy garbage sites that sell reviews and rankings. If you look for personal recommendations based on experiences with multiple hosts, it is extremely hard to tell from the obscure bulletin board threads whether the posts come from shills or actual customers. Even if one finds posts that appear to be from genuine customers, their descriptions of their experiences are usually subjective, anecdotal, and not comparative.

(As a little aside, BlueHost, HostGator, HostMonster, and JustHost are all owned by the same corporate parent, Endurance International Group. An outage affecting one can affect them all, as described in the Mashable article: Bluehost, HostGator and HostMonster Go Down. So, take these things into consideration if you are trying to choose between them.)

This post, then, is just my notes on what I’ve found while researching my next web host.

No Host

One option would be to not go with a hosting company at all and just use Amazon EC2 to self-host my blog. I have seriously considered doing this, but by some accounts, it’s actually more expensive to run an EC2 instance than purchase shared hosting. And I don’t really want to become my own linux administrator. As much as I enjoy occasionally tinkering with Ubuntu, I just want my sites to run smoothly and for someone to keep the server up-to-date and to fix problems for me quickly if something breaks.

BlueHost

Everyone has heard of Bluehost. They’re huge. Their tagline is “Trusted by Millions as the Best Web Hosting Solution”. So there you have it. Maybe you want to be one of their millions of customers.

According to the builtwith.com profile for ardamis.com, the site’s hosting provider is not JustHost but BlueHost!

Their website has terribly low production values, for being such a huge company. They have a crappy stock photograph of a dude with a headset providing customer support, below the text “We specialize in customer service. Call or Chat!” I do not want to live chat with this dude, or anyone else. I probably don’t need any customer service from my hosting company unless they screw something up, so prominently featuring your support phone numbers makes me wonder if they get tons of calls.

They do have cPanel, which is nice, as I am wary of proprietary admin panels after using GoDaddy for years.

They also offer unlimited domains, unlimited storage, unlimited bandwidth, unlimited email accounts, and a free domain, as well as custom php.ini files and SSH access. And they do all of this for just $6.95/month for 1 year.

It actually sounds pretty decent, on paper. But one does get the sense that they are completely driven by the bottom-line, and that your site is going to be jammed into an already crowded server.

HostGator

HostGator is the other huge discount web hosting company, competing with BlueHost and GoDaddy for what I picture to be the same sort of confused customers or WordPress blog/Google Adsense scammers.

As with BlueHost, I’m immediately turned off by the website, which is just ugly as all get out and also conspicuously promotes Live Chat support with a stock photograph of a girl with a (wired) headset. The pricing for the hosting plans is also a bit misleading, as all of the pricing shown is discounted 20%, a discount that is only valid for the first invoice.

Another thing I really dislike is that there are three tiers of shared hosting, named “Hatchling Plan”, “Baby Plan”, and “Business Plan”. I just can’t see myself signing up for the ridiculously named “Hatchling Plan” or “Baby Plan”. Which is probably by design so that people like me upgrade to the more respectable and grown-up sounding “Business Plan”.

They do offer a single free website transfer for shared hosting, whether or not the site being transferred uses cPanel.

Lithium Hosting


I first heard about Lithium Hosting while reading the Ars Technica article How to set up a safe and secure Web server, which also mentioned A Small Orange.

Their site looks pretty good, but it’s a little bit too template-like, and I felt that a quick glance on themeforest.net would turn up about a dozen hosting reseller templates for which the layout and stock placeholder text is an almost exact match. For example, the tagline on their home page is “Why we’re not the best host in the world.” Yeah, that sort of false modesty smells just a bit too contrived here. I was almost ready to sign up with Lithium Hosting, but the cookie-cutter stuff gave me enough pause to keep looking.

One of the things that I didn’t like is that they offer a free-month coupon code, but it doesn’t work when you configure your cart to pay a year at a time. Another thing I didn’t like was the cost of a dedicated IP address: $5 to set up and then $3 per month ($36/year) thereafter. It seems like the price of dedicated IP addresses has increased dramatically in the months since all the breathless news reports about the world running out of IPv4 addresses. They also charge an extra $60/year for shell access via SSH. That’s pretty shameful.

Bailing out of the cart just before purchase doesn’t cause a window to pop up and lure you back with a discount, as I fully expected to happen.

I checked out their Facebook page, and the most recent post was from a guy whose website was down. Two other recent posts were about downtime.

At the end of the day, Lithium Hosting just seems more like a hosting reseller itself than a company that will be around for years.

A Small Orange


I first heard about A Small Orange while reading the same Ars Technica article How to set up a safe and secure Web server that mentioned Lithium Hosting.

My first impression of the site was that it looked pretty much exactly as I wanted my hosting company to look. Again, I’m being pretty superficial here, but I want to be happy with my choice, and if the company’s outward appearance is cruddy then I’m not going to be as satisfied. I want to be certain that I made the best choice, and a crappy, thrown-together-looking site injects a small amount of doubt. The site clearly sets out the costs of the different hosting plans, and I knew I would be looking at shared hosting first. There are five tiers of shared hosting, the least expensive being $35/year for 250 MB storage and 5 GB bandwidth. Basically, the only differentiating factor between the different levels of shared hosting is the amount of storage and bandwidth. The standard features for all shared hosting accounts include:

  • Unlimited Parked and Addon Domains
  • Unlimited Subdomains
  • Unlimited POP3/IMAP Mail Accounts
  • Unlimited MySQL Databases
  • Unlimited FTP Accounts
  • Automated Daily Backups
  • cPanel
  • Automatic Script Installation (Softaculous)
  • Jailed shell upon request
  • FTP and SFTP access
  • Cron jobs for scheduled tasks
  • 99.9% uptime guarantee

A Small Orange uses CloudLinux as the server OS.

OK, yes, there is still a Live Chat button, but it’s this inconspicuous little green tab at the left side of the window that states “live help”. That’s it.

I checked out their Facebook page for recent posts, and they were almost entirely positive, with a good amount of interaction from the admin. The page claims that the company is home to 45,000 web sites, which might be exactly the size of host I’ve been looking for without even knowing it.

One of the things that ultimately convinced me to go with them was the Inc.com write-up of CEO Douglas Hanna in America’s Coolest College Start-Ups 2012 and the Duke Chronicle article Hanna makes juicy profits with A Small Orange. Hanna worked at HostGator for two years in customer service, so one could reasonably expect that he knows what customers want in affordable hosting.

I have also found a ton of coupon codes for A Small Orange (just Google it) for either $5 or 15% off your order.


I just picked up an old Dell Precision 690 workstation, which I intend to develop into a file server, a Windows IIS server, and an Ubuntu LAMP server. This monster was built in 2006, but it still has some neat specs and tons of capacity (7 PCI slots, 4 hard drive bays, etc…), should I want to expand further.

Dell Precision 690

Dell Precision 690 Workstation

The main specs

CPU: Dual Core Intel Xeon 5060 3.2GHz, 4M Cache, 1066 MHz FSB
RAM: 2GB DDR2 PC2-5300, CL=5, Fully Buffered, ECC, DDR2-667
HD: SAS Fujitsu MAX3073RC 73GB, 15000 RPM, 16MB Cache
Video: Nvidia Quadro NVS 285 PCI-Express, 128MB

This is not a normal tower

Right away, the size of this thing suggests it isn’t a normal tower. It comes up to about my knee and weighs 70 lbs. It feels like it’s made with heavier gauge steel than the typical chassis, but that may be me projecting.

I immediately shopped around for more RAM, obviously. 2GB seems a little thin, even by 2006 standards, considering how high-end everything else is. The mainboard has 8 slots and supports up to 32GB, but I figure 6GB is a safe place to start.

The workstation has three enormous fans, like, big-as-your-hand big. Running it with the chassis open causes some sort of thermal protection system to kick in and spin the fans up to the point that they blow stuff on the floor halfway across the room.

The CPU has a big, passive heat sink with six copper pipes and sits between two of those fans. I’m tempted to buy a second CPU, but I’ll hold off.

I’m still on the fence about the SAS drive. It should be super fast, but I’m a little spoiled by the SSD in my machine at work, so it’s hard to get excited about a mechanical drive, even one running at 15,000 RPM.

The Nvidia Quadro card is also fanless, and has a bizarre DMS-59 connector. An adapter converts the DMS-59 connector into two DVI outputs.

Attempting to run the W3C Link Checker against //ardamis.com/ returns an error message.

Error: 406 Not Acceptable

This is what the W3C says about the 406 HTTP status code:

406 Not Acceptable
The resource identified by the request is only capable of generating response entities which have content characteristics not acceptable according to the accept headers sent in the request.


In other words, the W3C Link Checker requests the web page and tells the web server that, by the way, it can only accept responses in certain formats. The web server then regrets to inform the requester that it cannot fulfill the request, because it cannot return a response that would be acceptable to the requester. It does this in the form of a 406 Not Acceptable status code, and the W3C Link Checker then outputs this error.

Other W3C apps, like Unicorn – W3C’s Unified Validator and the W3C HTML Validator don’t seem to be sending the same HTTP headers. (But I did note that there were a few small issues preventing the home page from passing the test, which I then fixed.)

Ardamis runs on WordPress, with a custom theme originally developed years ago from the Kubrick theme and a handful of plugins (as more completely described at the colophon page). I tinker with the site, from time to time, trying to speed it up or what-have-you. But no amount of tinkering seemed to resolve this problem. Over the course of a few months, I’d try various changes to the site to see if there was something I could do to fix it. I had pretty much convinced myself that it was going to be an issue for my web host when, miraculously, after I made some changes to the .htaccess file and my theme and disabled one of the plugins (none of which, as far as I can tell, should affect the HTTP headers), the Link Checker began working.

In the results page for www.ardamis.com, it lists some of the headers used:

Settings used:

  • Accept: text/html, application/xhtml+xml;q=0.9, application/vnd.wap.xhtml+xml;q=0.6, */*;q=0.5
  • Accept-Language: en-US,en;q=0.8
  • Referer: sending
  • Sleeping 1 second between requests to each server

I’m not sure what I did to make this work, or even if it was something I did. But further troubleshooting would have involved disabling all plugins, trying a different theme, and then ruling out WordPress entirely.

File compression is possible on Apache web hosts that do not have mod_gzip or mod_deflate enabled, and it’s easier than you might think.

A great explanation of why compression helps the web deliver a better user experience is at betterexplained.com.

Two authoritative articles on the subject are Google’s Performance Best Practices documentation | Enable compression and Yahoo’s Best Practices for Speeding Up Your Web Site | Gzip Components.
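As a quick illustration of how much there is to gain, here is a small shell sketch (the sample file and its contents are invented for the demo) that compresses some repetitive CSS-like text with gzip:

```shell
# Create a deliberately repetitive sample stylesheet (invented for this demo)
for i in $(seq 1 200); do echo 'body { margin: 0; padding: 0; }'; done > sample.css

# Compress it with gzip at maximum compression, keeping the original
gzip -c --best sample.css > sample.css.gz

# Compare the byte counts; the .gz copy should be a small fraction of the original
wc -c sample.css sample.css.gz
```

Text formats like HTML, CSS, and JavaScript are highly repetitive, which is exactly the kind of content gzip handles well.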

Compressing PHP files

If your Apache server does not have mod_gzip or mod_deflate enabled (GoDaddy and JustHost shared hosting, for example), you can use PHP to compress pages on the fly. This is still preferable to sending uncompressed files to the browser, so don’t worry about the additional work the server has to do to compress the files at each request.

Option 1: PHP.INI using zlib.output_compression

The zlib extension can be used to transparently compress PHP pages on-the-fly if the browser sends an “Accept-Encoding: gzip” or “deflate” header. Compression with zlib.output_compression seems to be disabled on most hosts by default, but can be enabled with a custom php.ini file:


zlib.output_compression = On

Credit: http://php.net/manual/en/zlib.configuration.php

Check with your host for instructions on how to implement this, and whether you need a php.ini file in each directory.

Option 2: PHP using ob_gzhandler

If your host does not allow custom php.ini files, you can add the following line of code to the top of your PHP pages, above the DOCTYPE declaration or first line of output:

<?php if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start(); ?>

Credit: GoDaddy.com

For WordPress sites, this code would be added to the top of the theme’s header.php file.

According to php.net, using zlib.output_compression is preferred over ob_gzhandler().

For WordPress or other CMS sites, an advantage of zlib.output_compression over the ob_gzhandler method is that all of the .php pages served will be compressed, not just those that contain the global include (e.g. header.php).

Running both ob_gzhandler and zlib.output_compression at the same time will throw a warning, similar to:

Warning: ob_start() [ref.outcontrol]: output handler ‘ob_gzhandler’ conflicts with ‘zlib output compression’ in /home/path/public_html/ardamis.com/wp-content/themes/mytheme/header.php on line 7

Compressing CSS and JavaScript files

Because the on-the-fly methods above only work for PHP pages, you’ll need something else to compress CSS and JavaScript files. Furthermore, these files typically don’t change, so there isn’t a need to compress them at each request. A better method is to serve pre-compressed versions of these files. I’ll describe two ways to do this, and in both cases, you’ll need to add some lines to your .htaccess file to send user agents the gzipped versions if they support the encoding. This requires that Apache’s mod_rewrite be enabled (and I think it’s almost universally enabled).

Option 1: Compress locally and upload

CSS and JavaScript files can be gzipped on the workstation, then uploaded along with the uncompressed files. Use a utility like 7-Zip (quite possibly the best compression software around, and it’s free) to compress the CSS and JavaScript files using the gzip format (with extension *.gz), then upload them to your server.

For Windows users, here is a handy command to compress all the .css and .js files in the current directory and all sub directories (adjust the path to the 7-Zip executable, 7z.exe, as necessary):

for /R %i in (*.css *.js) do "C:\Program Files (x86)\7-Zip\7z.exe" a -tgzip "%i.gz" "%i" -mx9

Note that the above command is to be run from the command line. The batch file equivalent would be:

for /R %%i in (*.css *.js) do "C:\Program Files (x86)\7-Zip\7z.exe" a -tgzip "%%i.gz" "%%i" -mx9

Option 2: Compress on the server

If you have shell access, you can run a command to create a gzip copy of each CSS and JavaScript file on your site (or, if you are developing on Linux, you can run it locally):

find . -regex ".*\.\(css\|js\)$" -exec bash -c 'echo Compressing "{}" && gzip -c --best "{}" > "{}.gz"' \;

This may be a bit too technical for many people, but it is also much more convenient. It is particularly useful when you need to compress a large number of files (as in the case of a WordPress installation with multiple plugins). Remember to run it every time you update WordPress, your theme, or any plugins, so as to replace the gzipped versions of any updated CSS and JavaScript files.
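Whichever option you use, it’s worth a quick sanity check that a pre-compressed copy really does decompress back to the original before your server starts serving it. A sketch, using an invented style.css as the example:

```shell
# Invented example file standing in for a real stylesheet
echo 'body { margin: 0; }' > style.css

# Create the pre-compressed copy alongside the original
gzip -c --best style.css > style.css.gz

# zcat decompresses to stdout; cmp exits 0 only if the contents are identical
zcat style.css.gz | cmp - style.css && echo "style.css.gz verified"
```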

The .htaccess (for both options)

Add the following lines to your .htaccess file to identify the user agents that can accept the gzip encoded versions of these files.

<files *.js.gz>
  AddType "text/javascript" .gz
  AddEncoding gzip .gz
</files>
<files *.css.gz>
  AddType "text/css" .gz
  AddEncoding gzip .gz
</files>
RewriteEngine on
# Check to see if the browser can accept gzip files
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !Safari
# Make sure there's no trailing .gz on the URL
RewriteCond %{REQUEST_FILENAME} !^.+\.gz$
# Check to see if a .gz version of the file exists
RewriteCond %{REQUEST_FILENAME}.gz -f
# All conditions met, so add .gz to the URL filename (invisibly)
RewriteRule ^(.+) $1.gz [QSA,L]

Credit: opensourcetutor.com

I’m not sure it’s still necessary to exclude Safari.

For added benefit, minify the CSS and JavaScript files before gzipping them. Google’s excellent Page Speed Firefox/Firebug Add-on makes this very easy. Yahoo’s YUI Compressor is also quite good.

Verify that your content is being compressed

Use the nifty Web Page Content Compression Verification tool at http://www.whatsmyip.org/http_compression/ to confirm that your server is sending the compressed files.

Speed up page load times for returning visitors

Compression is only part of the story. In order to further speed page load times for your returning visitors, you will want to send the correct headers to leverage browser caching.
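Caching headers are a topic for another post, but as a hedged sketch (this assumes your host has mod_expires available, and the content types and lifetimes shown are illustrative, not recommendations), the .htaccess additions look something like this:

```apache
# Sketch: send far-future Expires headers for static assets
# (requires mod_expires; lifetimes here are just examples)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType text/javascript "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
```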

I’m dual booting Linux (Fedora 11 with Gnome) and Windows 7. If I set the time in Windows, then boot into Linux, the time remains correct. When I boot back into Windows, the time is off by a few hours. After some reading, it seems that Linux is using UTC time and Windows is using local time.
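The difference is easy to see from a terminal. This sketch prints the same instant as UTC and as a local zone (the zone name is just an example); the gap between the two is exactly what shows up in Windows when the hardware clock is interpreted the wrong way:

```shell
# The same instant, rendered two ways
date -u '+%H:%M %Z'                    # UTC, how Linux interprets the RTC
TZ='America/Chicago' date '+%H:%M %Z'  # a local zone, how Windows interprets it
```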

This issue can be fixed by changing either OS, but because the problem seems to be with Windows mishandling UTC time, I chose to correct it there by turning on a feature called RealTimeIsUniversal. When RealTimeIsUniversal is enabled, Windows will treat the Real-Time Clock (RTC) from the motherboard as UTC time.

Open Regedit and drill down to:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation

Create a new DWORD entry named “RealTimeIsUniversal” and set its value to 1.
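If you’d rather not edit the registry by hand, the same change can be expressed as a .reg file and imported by double-clicking it (this is the standard location for the setting; as always, back up your registry before importing anything):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation]
"RealTimeIsUniversal"=dword:00000001
```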

Shut down Windows and boot Fedora. Set the correct time in Fedora, shutdown, and boot back into Windows. The time should be accurate if you have set your local time to the correct time zone.

It seems that there are problems with Windows XP and Vista whereby Sleep/Hibernate would cause Windows to revert to local time upon waking/resuming, but apparently this has been resolved in Windows 7.

If you prefer to change Fedora, go to Date/Time Properties via System > Administration > Date & Time or by opening a terminal and entering:

system-config-date

Under the Time Zone tab, clear the checkbox next to “System clock uses UTC”.

Under the Network Time Protocol tab, select Enable Network Time Protocol.

Note that your BIOS has no idea what time zone you are in; that’s up to the OS to figure out. If you check the time in the BIOS, it will likely be a few hours off, and that’s OK.

Just so there’s no confusion, Chromium is an open source web browser project. Google Chrome is a web browser from Google, based on the Chromium project. If you are familiar with Google Chrome in Windows, the Chromium browser looks very similar, but lacks the Google branding. Right now, there is no Google Chrome for Linux.

Here’s how I installed Chromium on Fedora 11.

Download the latest Chromium and v8 files for your OS (32 or 64 bit) from http://spot.fedorapeople.org/chromium/F11/ Save them to ~/Download or whatever location is convenient for you.

At the time of this post, these two files were
“chromium-” and
“v8-1.3.8-1.20090827svn2777.fc11.x86_64.rpm”, but they are certain to change frequently.

Open up a terminal window (Applications > System Tools > Terminal). You will probably need to switch to root for the entire installation process. To do this, type:

su -

First, install two dependencies:

yum install minizip

minizip manipulates files from a .zip archive.

yum install nss-mdns

nss-mdns is a plugin for the GNU Name Service Switch (NSS) functionality of the GNU C Library (glibc) providing host name resolution via Multicast DNS (aka Zeroconf, aka Apple Rendezvous, aka Apple Bonjour), effectively allowing name resolution by common Unix/Linux programs in the ad-hoc mDNS domain .local. nss-mdns provides client functionality only, which means that you have to run a mDNS responder daemon separately from nss-mdns if you want to register the local host name via mDNS (e.g. Avahi).

Once those installs are complete, stay in the terminal window, navigate to the folder containing the Chromium and v8 downloads and then type:

rpm -ivh v8* chromium*

This will run the installer on the two downloads. If you see any other dependencies, just handle them as they come.

That’s it, you should see Chromium Web Browser under Applications > Internet.

Installing Chromium via YUM

If you want to install Chromium via YUM, you just need to add a new repository file to the /etc/yum.repos.d directory. I’m partial to gedit, so I’ll use that editor in the instructions.

Open a terminal and type:

sudo bash

This will give you root privileges for launching gedit (and other GUI apps). Launch gedit by typing:

gedit

Copy and paste the following lines into the new document:

[chromium]
name=Chromium Test Packages
baseurl=http://spot.fedorapeople.org/chromium/F11/
enabled=1
gpgcheck=0

Save the file as chromium.repo to /etc/yum.repos.d/. (If you are running Fedora 10, change the F11 to F10 in the baseurl path.)

Back in the terminal window, type:

yum update

YUM will pick up the new chromium file. You’re now ready to install Chromium with the line:

yum install chromium