
Just a few notes to myself about monitoring web sites for infections/malware and potential vulnerabilities.

Tools for detecting infections on web sites

Google Webmaster Tools

Your first stop should be here, as I’ve personally witnessed alerts show up in Webmaster Tools even when all the following tools gave the site a passing grade. If your site is registered here and Google finds weird pages on your site, an alert will appear. You can also have these messages forwarded to your email account on file by choosing the Forward option under the All Messages area of the Home page.

Google Webmaster Tools Hack Alert

Google Safe Browsing

The Google Safe Browsing report for ardamis.com: http://safebrowsing.clients.google.com/safebrowsing/diagnostic?site=ardamis.com

Norton Safe Web

https://safeweb.norton.com/

The Norton Safe Web report for ardamis.com: https://safeweb.norton.com/report/show?url=ardamis.com

Tools for analyzing a site for vulnerabilities

Sucuri Site Check

http://sitecheck.sucuri.net/scanner/

The Sucuri report for ardamis.com: http://sitecheck.sucuri.net/scanner/?scan=www.ardamis.com

Attempting to run the W3C Link Checker against //ardamis.com/ returns an error message.

Error: 406 Not Acceptable

This is what the W3C says about the 406 HTTP status code:

406 Not Acceptable
The resource identified by the request is only capable of generating response entities which have content characteristics not acceptable according to the accept headers sent in the request.

http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html

In other words, the W3C Link Checker requests the web page and tells the web server that, by the way, it can only accept a response in certain formats. The web server then regrets to inform the requester that it cannot fulfill the request, because it cannot return a response that would be acceptable to the requester, and it says so in the form of a 406 Not Acceptable HTTP status code. The W3C Link Checker then outputs this error.
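To watch this negotiation from the client side, you can send a deliberately unsatisfiable Accept header yourself. A sketch using curl against a placeholder URL (whether you actually get a 406 back depends on how strictly the server honors content negotiation):

# Request the headers while claiming to accept only an imaginary format;
# a server that strictly honors the Accept header answers
# "HTTP/1.1 406 Not Acceptable" instead of "200 OK"
curl -sI -H "Accept: application/x-nonexistent" http://www.example.com/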

Other W3C apps, like Unicorn – W3C’s Unified Validator – and the W3C HTML Validator, don’t seem to send the same HTTP headers. (But I did note that there were a few small issues preventing the home page from passing validation, which I then fixed.)

Ardamis runs on WordPress, with a custom theme originally developed years ago from the Kubrick theme and a handful of plugins (as more completely described on the colophon page). I tinker with the site from time to time, trying to speed it up or what-have-you, but no amount of tinkering seemed to resolve this problem. Over the course of a few months, I tried various changes to the site to see if there was something I could do to fix it. I had pretty much convinced myself that it was going to be an issue for my web host when, miraculously, after making some changes to the .htaccess file and my theme and disabling one of the plugins (though I can’t see how that could possibly affect the HTTP headers), the Link Checker began working.

In the results page for www.ardamis.com, it lists some of the headers used:

Settings used:

  • Accept: text/html, application/xhtml+xml;q=0.9, application/vnd.wap.xhtml+xml;q=0.6, */*;q=0.5
  • Accept-Language: en-US,en;q=0.8
  • Referer: sending
  • Sleeping 1 second between requests to each server

I’m not sure what I did to make this work, or even if it was something I did. But further troubleshooting would have involved disabling all plugins, trying a different theme, and then ruling out WordPress entirely.

A collection of web development tools for building better sites more easily.

Frameworks and scripts

HTML5 Boilerplate is the professional badass’s base HTML/CSS/JS template for a fast, robust and future-proof site.

scriptsrc.net is a collection of script tags of the latest versions of a range of JavaScript libraries.

Modernizr adds classes to the <html> element which allow you to target specific browser functionality in your stylesheet. You don’t actually need to write any JavaScript to use it.
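For example, Modernizr adds paired classes such as boxshadow/no-boxshadow to the <html> element, so a stylesheet can offer a fallback (the #panel selector below is hypothetical):

/* Browsers that support box-shadow get the shadow... */
.boxshadow #panel { box-shadow: 0 1px 4px rgba(0, 0, 0, 0.3); }

/* ...everything else gets a plain border instead */
.no-boxshadow #panel { border: 1px solid #ccc; }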

Images

placehold.it is a quick and simple image placeholder service.

Text manipulation

TextFixer is a collection of online text tools. Remove line breaks from text, alphabetize text, capitalize the first letter of sentences, remove whitespaces, and uppercase text or lowercase text.

XHTML

HTML Minifier will minify HTML (or XHTML), and any CSS or JS included in your markup.

CSS

CSS3 Generator is an awesome code generator for CSS3 snippets, and shows the minimum browser versions required to display the effects.

proCSSor is an advanced CSS prettifier with tons of formatting options.

JavaScript

Online JavaScript beautifier will reformat and reindent bookmarklets and ugly JavaScript, unpack scripts packed by Dean Edwards’ popular packer, and deobfuscate scripts processed by javascriptobfuscator.com. The source code for the latest version is always available on GitHub, and you can download the beautifier for local use (zip, tar.gz) as well.

Fonts and Typography

Fontshop.com has written A Field Guide to Typography to get you excited about fonts and typography.

Typetester is an online app for comparing different fonts for the screen. You can test up to three fonts at a time and choose the one you like. Its primary role is to make the web designer’s life easier.

A quick chart of the fonts common to all versions of Windows and the Mac equivalents, or a more extensive matrix of fonts bundled with Mac and Windows operating systems, Microsoft Office and Adobe Creative Suite.

<html>ipsum has Lorem ipsum already wrapped in HTML tags. Pre-made paragraphs, lists, etc…

More resources at: 50 Useful Design Tools For Beautiful Web Typography and 21 Typography and Font Web Apps You Can’t Live Without.

Colors

Color Scheme Designer.

Markup

Google Webmaster Tools’ Rich Snippets Testing Tool.

Use the Rich Snippets Testing Tool to check that Google can correctly parse your structured data markup and display it in search results.

I’ve written a few posts on how to speed up web sites by sending the correct headers to leverage browser caching and compressing .php, .css and .js files without mod_gzip or mod_deflate.

The intended audience for this post is developers who have already applied most or all of Google’s Web Performance Best Practices and Yahoo’s Best Practices for Speeding Up Your Web Site and are wondering how to speed up WordPress sites in particular.

I assume you’re familiar with WordPress caching and are already using a caching plugin, such as WP Super Cache, W3 Total Cache or the like.

Reduce HTTP requests

Reducing HTTP requests should be the very first step in speeding up any site. If you are using plugins, watch them carefully for inefficiencies like added CSS and JavaScript files. Combine, minify and compress these files. Some plugins allow you to turn off their bundled CSS in the plugin’s settings page. Where possible, copy the plugin’s CSS into the current theme’s style.css and turn off the extra file.
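If a plugin offers no such setting, you can usually deregister its stylesheet from your theme instead. A minimal sketch for functions.php, assuming the plugin registered its CSS under the hypothetical handle 'someplugin-style' (find the real handle in the plugin’s wp_register_style() or wp_enqueue_style() call):

// Stop a plugin's stylesheet from being printed; its rules now live in style.css
// 'someplugin-style' is a placeholder handle
function theme_remove_plugin_css() {
	wp_deregister_style('someplugin-style');
}
add_action('wp_print_styles', 'theme_remove_plugin_css', 100);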

Delete deactivated plugins

Remove any plugins you’re not using. Deactivated plugins can be deleted from the Plugins page.

Speed up the mod_rewrite code

jdMorgan from WebmasterWorld.com has written a replacement for the .htaccess rewrite rule used by WordPress. This will speed up the WordPress mod_rewrite code by a factor of more than two.

This is a total replacement for the code supplied with WP as bounded by the “Begin WP” and “End WP” comments, and fixes several performance-affecting problems. Notably, the unnecessary and potentially-problematic <IfModule> container is completely removed, and code is added and re-structured to both prevent completely-unnecessary file- and directory- exists checks and to reduce the number of necessary -exists checks to one-half the original count (due to the way mod_rewrite behaves recursively in .htaccess context).

http://www.webmasterworld.com/apache/4053973.htm

The following code is adapted from the original to add png, flv and swf to the list of static file formats:

# BEGIN WordPress
# adapted from http://www.webmasterworld.com/apache/4053973.htm
#
RewriteEngine on
#
# Unless you have set a different RewriteBase preceding this point,
# you may delete or comment-out the following RewriteBase directive
# RewriteBase /
#
# if this request is for "/" or has already been rewritten to WP
RewriteCond $1 ^(index\.php)?$ [OR]
# or if request is for image, css, or js file
RewriteCond $1 \.(gif|jpg|png|ico|css|js|flv|swf)$ [NC,OR]
# or if URL resolves to existing file
RewriteCond %{REQUEST_FILENAME} -f [OR]
# or if URL resolves to existing directory
RewriteCond %{REQUEST_FILENAME} -d
# then skip the rewrite to WP
RewriteRule ^(.*)$ - [S=1]
# else rewrite the request to WP
RewriteRule . /index.php [L]
#
# END WordPress

Only load the comment-reply.js when needed

In the default WordPress template, the comment-reply.js script is included on all single post pages, regardless of whether nested/threaded comments are enabled. A simple tweak changes the theme to include comment-reply.js on single post pages only when it makes sense to do so: when threaded comments are enabled, commenting on that post is allowed, and at least one comment already exists.

Remove the following line from your theme’s header.php:

<?php if ( is_singular() ) wp_enqueue_script( 'comment-reply' ); ?>

Add the following lines to your theme’s functions.php.

// Don't add the wp-includes/js/comment-reply.js?ver=20090102 script to single post pages unless threaded comments are enabled
// adapted from http://bigredtin.com/behind-the-websites/including-wordpress-comment-reply-js/
function theme_queue_js(){
  if (!is_admin()){
    if (is_singular() && (get_option('thread_comments') == 1) && comments_open() && have_comments())
      wp_enqueue_script('comment-reply');
  }
}
add_action('wp_print_scripts', 'theme_queue_js');

Only load the l10n.js when needed

In WordPress 3.1, an l10n.js script is loaded. It is “mostly used for scripts that send over localization data from PHP to the JS side using wp_localize_script().” Whether it’s safe to remove this file seems to be a matter of debate, but should you decide you want to remove it…

Add the following lines to your theme’s functions.php.

// Don't add the wp-includes/js/l10n.js?ver=20101110 script to non-admin pages
// adapted from http://wordpress.stackexchange.com/questions/5451/what-does-l10n-js-do-in-wordpress-3-1-and-how-do-i-remove-it
function remove_l10n_js(){
  if (!is_admin()){
    wp_deregister_script('l10n');
  }
}
add_action('wp_print_scripts', 'remove_l10n_js');

Replace unnecessary code executions and database queries

WordPress saves settings specific to your blog in the database. These settings include what language your blog is written in, whether text is read left-to-right or vice versa, the path to the template directory, etc.

The default WordPress theme contains a number of database queries in order to figure out these things and build the correct page. The default theme needs this flexibility, but your theme does not. Joost de Valk recommends replacing these database queries with static text in your theme files in his post Speed up WordPress, and clean it up too!

An easy way to do this is to browse to your web site and then view the source code. Copy the content that won’t change from page to page and paste it into your theme, overwriting the PHP with the generated HTML.

For example, my theme’s header.php file contains this line:

<html xmlns="http://www.w3.org/1999/xhtml" <?php language_attributes(); ?>>

The source code of the rendered page displays this line:

<html xmlns="http://www.w3.org/1999/xhtml" dir="ltr" lang="en-US">

On my blog, this HTML output is never going to be anything different, so why make WordPress look these settings up in the database each time a page is loaded? This line is an excellent candidate for replacement. The footer.php file contains a handful more opportunities for replacement, but go through each of your theme’s files and look for references to the template directory and other data that won’t change as long as you’re using the theme. All told, I was able to replace 12 database queries with static HTML.
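References to the template directory are a typical case. A before-and-after sketch (the domain and theme folder are placeholders):

<!-- Before: a template tag that WordPress has to resolve on every page load -->
<script type="text/javascript" src="<?php bloginfo('template_directory'); ?>/js/site.js"></script>

<!-- After: the same output, hard-coded -->
<script type="text/javascript" src="http://www.example.com/wp-content/themes/mytheme/js/site.js"></script>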

Joost also recommends checking for unnecessary or slow database queries, and installing a helpful debugging plugin, in his post on Optimizing WordPress database performance.

Clean out your MySQL database

Delete spam comments

From the Dashboard, click Comments, then click on Empty Spam.

Delete post revisions

If you don’t use post revisions, you may want to delete them from the wp_posts table. Back up your database, then run the following SQL query:

DELETE FROM wp_posts WHERE post_type = 'revision';

Before: 683 records
After: 165 records

This does not delete the latest draft of unpublished posts. It’s a good idea to optimize the table after deleting the revisions.
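If you’d rather stay in the SQL console, the optimization is a one-liner (again, back up your database first):

OPTIMIZE TABLE wp_posts;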

You can stop WordPress from saving post revisions by adding the following code to wp-config.php:

define('WP_POST_REVISIONS', false );

Optimize your MySQL database

Optimizing your MyISAM tables is comparable to defragmenting your hard drive. It’s probably been a while since you’ve done that, too.

If you’re using phpMyAdmin, browse to your WordPress database. Under the Structure tab, at the bottom of the list of tables, click on the link “Check all”. In the “With selected” menu, choose “Optimize table”. (I would have recommended just optimizing tables that have overhead, but the wp_posts table can be optimized even when it doesn’t show any overhead.)

For MyISAM tables, OPTIMIZE TABLE works as follows:

  • If the table has deleted or split rows, repair the table.
  • If the index pages are not sorted, sort them.
  • If the table’s statistics are not up to date (and the repair could not be accomplished by sorting the index), update them.

http://dev.mysql.com/doc/refman/5.1/en/optimize-table.html

Flush the Buffer Early

When users request a page, it can take anywhere from 200 to 500ms for the backend server to stitch together the HTML page. During this time, the browser is idle as it waits for the data to arrive. In PHP you have the function flush(). It allows you to send your partially ready HTML response to the browser so that the browser can start fetching components while your backend is busy with the rest of the HTML page. The benefit is mainly seen on busy backends or light frontends.

http://developer.yahoo.com/performance/rules.html#flush

Add flush() between the closing </head> tag and the opening <body> tag in header.php.

</head>
<?php flush(); // http://developer.yahoo.com/performance/rules.html#flush ?>
<body>

OK, so this isn’t technically a WordPress-specific tweak, but it’s a good idea.

As a follow-up to my post on compressing .php, .css and .js files without mod_gzip or mod_deflate, I’m documenting the changes I made to the .htaccess file on ardamis.com in order to speed up page load times for returning visitors and satisfy the Leverage browser caching recommendation of Google’s Page Speed Firefox/Firebug Add-on.

A great explanation of why browser caching helps the web deliver a better user experience is at betterexplained.com.

Two authoritative articles on the subject are Google’s Performance Best Practices | Optimize caching and Yahoo’s Best Practices for Speeding Up Your Web Site | Add an Expires or a Cache-Control Header.

I’d like to point out that in researching browser caching, I came across a lot of information that contradicted the rather clear instructions from Google:

It is important to specify one of Expires or Cache-Control max-age, and one of Last-Modified or ETag, for all cacheable resources. It is redundant to specify both Expires and Cache-Control: max-age, or to specify both Last-Modified and ETag.

I’m not sure that this recommendation is entirely correct, as the W3C states that Expires and Cache-Control max-age are used in different situations, with Cache-Control max-age overriding Expires in the event of conflicts.

If a response includes both an Expires header and a max-age directive, the max-age directive overrides the Expires header, even if the Expires header is more restrictive. This rule allows an origin server to provide, for a given response, a longer expiration time to an HTTP/1.1 (or later) cache than to an HTTP/1.0 cache.

http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html

It would seem that Cache-Control is the preferred method of controlling browser caching going forward.

HTTP 1.1 clients will honour “Cache-Control” (which is easier to use and much more flexible).
HTTP 1.0 clients will ignore “Cache-Control” but honour “Expires”. With “Expires” you get thus at least a bit control for these old clients.

http://www.peterbe.com/plog/cache-control_or_expires

In any event, Page Speed won’t protest if you do end up sending both Expires and Cache-Control max-age, or if you remove both Last-Modified and ETag, but I was able to get the best results with just setting Cache-Control max-age and removing the ETag.

Setting the headers in .htaccess

On Apache, configuring the proper headers can be done in the .htaccess file, using the Header directive. The Header directive requires the mod_headers module to be enabled.

I’m choosing to set a far future Expires header of one year on my image files, because I tweak the CSS and JavaScript pretty often and don’t want those file types to be cached as long.

Add the following code to your .htaccess file to set your Cache-Control and Expires headers, adjusting the date to be one year from today.

# Set Cache-Control and Expires headers
<filesMatch "\\.(ico|pdf|flv|jpg|jpeg|png|gif|swf|mp3|mp4)$">
Header set Cache-Control "max-age=2592000, private"
Header set Expires "Sun, 17 Jul 2011 20:00:00 GMT"
</filesMatch>
<filesMatch "\\.(css|css.gz)$">
Header set Cache-Control "max-age=604800, private"
</filesMatch>
<filesMatch "\\.(js|js.gz)$">
Header set Cache-Control "max-age=604800, private"
</filesMatch>
<filesMatch "\\.(xml|txt)$">
Header set Cache-Control "max-age=216000, private, must-revalidate"
</filesMatch>
<filesMatch "\\.(html|htm)$">
Header set Cache-Control "max-age=7200, private, must-revalidate"
</filesMatch>

Removing ETags in .htaccess

Most sources recommend simply removing ETags if they are not required.

Entity tags (ETags) are a mechanism that web servers and browsers use to determine whether the component in the browser’s cache matches the one on the origin server.

If you’re not taking advantage of the flexible validation model that ETags provide, it’s better to just remove the ETag altogether.

http://developer.yahoo.com/performance/rules.html#etags

Add the following code to your .htaccess file to remove ETag headers.

# Turn off ETags
FileETag None
Header unset ETag

Set Expires headers with ExpiresByType (optional)

If your host has the mod_expires module enabled, you can specify Expires headers by file type. GoDaddy does not have this module enabled.

# Set Expires headers
ExpiresActive On
ExpiresDefault "access plus 1 year"
ExpiresByType text/html "access plus 1 second"
ExpiresByType image/gif "access plus 2592000 seconds"
ExpiresByType image/jpeg "access plus 2592000 seconds"
ExpiresByType image/png "access plus 2592000 seconds"
ExpiresByType image/x-icon "access plus 2592000 seconds"
ExpiresByType text/css "access plus 604800 seconds"
ExpiresByType text/javascript "access plus 604800 seconds"
ExpiresByType application/x-javascript "access plus 604800 seconds"
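If you’re not sure whether your host has mod_expires, wrapping the directives in an <IfModule> container lets Apache skip them silently, rather than fail with a 500 error, when the module is missing:

<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 1 year"
</IfModule>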

Removing the Last-Modified header in .htaccess (optional)

I’m following Google’s instructions and not removing the Last-Modified header, but if you wanted to do so, you could use:

# Remove Last-Modified header
Header unset Last-Modified

Busting the cache when files change

What happens when you change files and need to force browsers to load the new files? Christian Johansen offers two methods in his post on Using a far future expires header.
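One common technique (whether or not it is exactly what Johansen describes) is to put a version token in the URL and change it whenever the file changes; using the file’s modification time as the token automates the bumping. A sketch, assuming a stylesheet at /style.css:

<link rel="stylesheet" type="text/css" href="/style.css?v=<?php echo filemtime($_SERVER['DOCUMENT_ROOT'] . '/style.css'); ?>" />

The more robust variant is to put the token in the filename itself (style.20110501.css), since some proxies won’t cache URLs that contain a query string.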

File compression is possible on Apache web hosts that do not have mod_gzip or mod_deflate enabled, and it’s easier than you might think.

A great explanation of why compression helps the web deliver a better user experience is at betterexplained.com.

Two authoritative articles on the subject are Google’s Performance Best Practices documentation | Enable compression and Yahoo’s Best Practices for Speeding Up Your Web Site | Gzip Components.

Compressing PHP files

If your Apache server does not have mod_gzip or mod_deflate enabled (Godaddy and JustHost shared hosting, for example), you can use PHP to compress pages on-the-fly. This is still preferable to sending uncompressed files to the browser, so don’t worry about the additional work the server has to do to compress the files at each request.

Option 1: PHP.INI using zlib.output_compression

The zlib extension can be used to transparently compress PHP pages on-the-fly if the browser sends an “Accept-Encoding: gzip” or “deflate” header. Compression with zlib.output_compression seems to be disabled on most hosts by default, but can be enabled with a custom php.ini file:

[PHP]

zlib.output_compression = On

Credit: http://php.net/manual/en/zlib.configuration.php

Check with your host for instructions on how to implement this, and whether you need a php.ini file in each directory.

Option 2: PHP using ob_gzhandler

If your host does not allow custom php.ini files, you can add the following line of code to the top of your PHP pages, above the DOCTYPE declaration or first line of output:

<?php if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start(); ?>

Credit: GoDaddy.com

For WordPress sites, this code would be added to the top of the theme’s header.php file.

According to php.net, using zlib.output_compression is preferred over ob_gzhandler().

For WordPress or other CMS sites, an advantage of zlib.output_compression over the ob_gzhandler method is that all of the .php pages served will be compressed, not just those that contain the global include (e.g., header.php).

Running both ob_gzhandler and zlib.output_compression at the same time will throw a warning, similar to:

Warning: ob_start() [ref.outcontrol]: output handler ‘ob_gzhandler’ conflicts with ‘zlib output compression’ in /home/path/public_html/ardamis.com/wp-content/themes/mytheme/header.php on line 7

Compressing CSS and JavaScript files

Because the on-the-fly methods above only work for PHP pages, you’ll need something else to compress CSS and JavaScript files. Furthermore, these files typically don’t change, so there isn’t a need to compress them at each request. A better method is to serve pre-compressed versions of these files. I’ll describe two ways to do this, but in both cases, you’ll need to add some lines to your .htaccess file to send user agents the gzipped versions if they support the encoding. This requires that Apache’s mod_rewrite be enabled (and I think it’s almost universally enabled).

Option 1: Compress locally and upload

CSS and JavaScript files can be gzipped on the workstation, then uploaded along with the uncompressed files. Use a utility like 7-Zip (quite possibly the best compression software around, and it’s free) to compress the CSS and JavaScript files using the gzip format (with extension *.gz), then upload them to your server.

For Windows users, here is a handy command to compress all the .css and .js files in the current directory and all sub directories (adjust the path to the 7-Zip executable, 7z.exe, as necessary):

for /R %i in (*.css *.js) do "C:\Program Files (x86)\7-Zip\7z.exe" a -tgzip "%i.gz" "%i" -mx9

Note that the above command is to be run from the command line. The batch file equivalent would be:

for /R %%i in (*.css *.js) do "C:\Program Files (x86)\7-Zip\7z.exe" a -tgzip "%%i.gz" "%%i" -mx9

Option 2: Compress on the server

If you have shell access, you can run a command to create a gzip copy of each CSS and JavaScript file on your site (or, if you are developing on Linux, you can run it locally):

find . -regex ".*\.\(css\|js\)$" -exec bash -c 'echo Compressing "{}" && gzip -c --best "{}" > "{}.gz"' \;

This may be a bit too technical for many people, but it is also much more convenient. It is particularly useful when you need to compress a large number of files (as in the case of a WordPress installation with multiple plugins). Remember to run it every time you update WordPress, your theme, or any plugins, so as to replace the gzipped versions of any updated CSS and JavaScript files.

The .htaccess (for both options)

Add the following lines to your .htaccess file to identify the user agents that can accept the gzip encoded versions of these files.

<files *.js.gz>
  AddType "text/javascript" .gz
  AddEncoding gzip .gz
</files>
<files *.css.gz>
  AddType "text/css" .gz
  AddEncoding gzip .gz
</files>
RewriteEngine on
#Check to see if browser can accept gzip files.
RewriteCond %{HTTP:accept-encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !Safari
#make sure there's no trailing .gz on the url
RewriteCond %{REQUEST_FILENAME} !^.+\.gz$
#check to see if a .gz version of the file exists.
RewriteCond %{REQUEST_FILENAME}.gz -f
#All conditions met so add .gz to URL filename (invisibly)
RewriteRule ^(.+) $1.gz [QSA,L]

Credit: opensourcetutor.com

I’m not sure it’s still necessary to exclude Safari.

For added benefit, minify the CSS and JavaScript files before gzipping them. Google’s excellent Page Speed Firefox/Firebug Add-on makes this very easy. Yahoo’s YUI Compressor is also quite good.

Verify that your content is being compressed

Use the nifty Web Page Content Compression Verification tool at http://www.whatsmyip.org/http_compression/ to confirm that your server is sending the compressed files.
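If you prefer the command line, a quick check with curl (substitute your own URL) is to request a file with an Accept-Encoding header and look for Content-Encoding: gzip in the response headers:

# -D - dumps the response headers to stdout; the body itself is discarded
curl -s -H "Accept-Encoding: gzip" -o /dev/null -D - http://www.example.com/style.css | grep -i "content-encoding"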

Speed up page load times for returning visitors

Compression is only part of the story. In order to further speed page load times for your returning visitors, you will want to send the correct headers to leverage browser caching.

I had noticed mentions of analytic information provided by Compete.com often enough that I was curious about what it could do for me.

Compete already had some information about ardamis.com, but it was stunningly wrong. For example, it was telling me that the phrase “godaddy referral program” was responsible for 20.35% of the total traffic sent to my site by search engines. Until recently, I did have a page that mentioned godaddy referral programs, but according to Google Analytics, it was barely ever visited (7 page views in the last 30 days – it was the 78th most popular page on my site, which does get a few thousand visitors a month). Even more strange, it told me that I was getting traffic for the search phrase “myspace”. I have never written anything about myspace before.

I figured that once I installed their JavaScript tracking code, the analytics information would be much more accurate. So I installed the code, confirmed it appeared at the bottom of the home page, and attempted to verify my site at //ardamis.com/, but was unable to. I had read somewhere that the free account does not support tracking subdomains, and the verification process seemed to get hung up on the use of .htaccess to redirect non-www traffic to www.ardamis.com. I was mystified that Compete apparently could not recognize this was happening via a 301 redirect header and compensate.

Sorry,

It looks like the CompeteXL code has not been correctly placed on the homepage of your site.

This could be because either the code was not copied over correctly, or because it has been placed on the wrong page.

We think your homepage is

ardamis.com

I went so far as to email my amazement to their support staff, who promptly and politely wrote me back. (Thumbs up to the guys answering the emails.)

I had made a few other changes to my site at the same time, so I ran a Page Speed check on it. Page Speed told me that I was linking to a resource at trgc.opt.fimserve.com that was throwing a 404 error. I was pretty sure I didn’t intentionally link to anything at that domain, so I Googled it. Surprisingly, there’s not much out there on trgc.opt.fimserve.com other than this and this. As it turns out, fimserve.com is part of something called the FOX Audience Network, and FAN’s parent company is News Corporation, which also owns myspace.com.

Here’s the WHOIS on fimserve.com:

Domain Name: FIMSERVE.COM
Registrar: REGISTER.COM, INC.
Whois Server: whois.register.com
Referral URL: http://www.register.com
Name Server: NS1.MYSPACE.COM
Name Server: NS2.MYSPACE.COM
Status: clientTransferProhibited
Updated Date: 17-oct-2006
Creation Date: 17-oct-2006
Expiration Date: 17-oct-2011

And here’s the WHOIS on foxaudiencenetwork.com:

Domain Name: FOXAUDIENCENETWORK.COM
Registrar: MARKMONITOR INC.
Whois Server: whois.markmonitor.com
Referral URL: http://www.markmonitor.com
Name Server: NS1.MYSPACE.COM
Name Server: NS2.MYSPACE.COM
Status: clientDeleteProhibited
Status: clientTransferProhibited
Status: clientUpdateProhibited
Updated Date: 03-may-2010
Creation Date: 03-jun-2008
Expiration Date: 03-jun-2011

I didn’t like the idea that information about my visitors was being shared with anyone but the site I had signed up for, so I started looking through the Compete FAQs and found this:

Currently, the CompeteXL code tracks ONLY self-reported Audience Profile data through a partnership with the FOX Audience Network.

The CompeteXL code DOES NOT track traffic or user engagement metrics, that information continues to be provided through our multi-sourced panel and requires NO addition of code to your site.

http://www.compete.com/help/q225

What the hell? Why am I installing a tracking code if it’s not used to track traffic?

Oh, and this was a fun discovery, too:

The FOX Audience Network (FAN) is a unit of News Corporation that supports monetization efforts across the company’s online content portfolio, as well as third-party publisher sites.

FAN leverages proprietary advertising technology to create highly-targeted advertising campaigns for a wide range of marketers, while also delivering cutting-edge tools and services to third-party publisher partners. FAN works directly with hundreds of advertisers to develop customized marketing programs that optimize both branded and performance-based strategies.

http://www.foxaudiencenetwork.com/aboutus.php

I use a very popular HOSTS file to block a huge number of servers that are known to host advertisements, tracking scripts (including Google Analytics), parasites, hijackers and unwanted adware/spyware programs. The 404 error in Page Speed was caused by the inclusion of trgc.opt.fimserve.com in this custom HOSTS file, which is what led me to discover all of this the next day. I’ve removed the tracking code, because the information I wanted was about traffic – who’s coming to my site, why, and through what means – and not user demographics.

Update 2/22/2010: It looks like changing .htaccess is no longer necessary. After you select PHP 5.x, your site will begin using version 5.2.5 without any further configuration.

The following applies to older domains. As of early 2009, newly purchased Linux hosting plans are running PHP 5.2.8, while older plans, once updated, only go up to PHP 5.2.5. I’ve had Ardamis.com hosted at GoDaddy since 2005, and quite a while ago I thought I had upgraded to PHP version 5 from 4.3.11, but tonight I happened to check with phpinfo and found I was still on version 4.

In the unheard-of ten minutes that I was on hold waiting for technical support, I figured out how to really run my pages on PHP 5.x (in this case, 5.2.5).

Log in and go to your Hosting Control Center. You must be running Hosting Configuration 2.0 to go any further, so if you haven’t touched your domain in years, do that first.

Click on Content, then Add-On Languages. Next to PHP Version, select PHP 5.x and click Continue. You’ll get a message that “Changing to PHP 5.x may make your PHP files run incorrectly.” Highly unlikely these days, but OK, you’ve been warned. Notice, too, that it says “PHP 5.x will be activated”. Click Update.

It may take a while for this change to be processed by the server, but once your Account Summary is displaying PHP Version: 5.x, it’s time for the really important part.

You see, you’ve only made PHP 5.x available at this point. Your *.php files are still running in 4.x. Go ahead and check phpinfo again.
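If you don’t have a phpinfo page handy, a one-line script is enough for the check; upload it as, say, version.php (the filename is arbitrary) and load it in a browser:

<?php
// Prints the version of PHP that actually executed this file
echo phpversion();
?>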

Now, you could simply edit .htaccess to change the extensions, like so:

AddHandler x-httpd-php5 .php
AddHandler x-httpd-php .php4

More details at http://help.godaddy.com/article/1082

But if you’re squeamish about changing .htaccess yourself, there’s another way to set 5.x to be the default handler for *.php files. All the following does, strangely enough, is add the AddHandler x-httpd-php5 .php line to the beginning of your .htaccess file.

Back in the Hosting Control Center, click on Settings, then File Extension. If the change to 5.x has been completed, you’ll see at the bottom of the available extensions list, “Extension -> .php | Runs Under -> PHP 5.x” If it’s not there, stop here and come back in an hour or so.

Click on Custom Extensions at the left. This should be empty, with a message stating “No custom extensions have been created.”

Click on Default Extensions and then click on the Edit button (it looks like a piece of paper and a pencil) to the right of .php | PHP 5.x. Click on Continue.

Click again on the Custom Extensions button on the left, and you should now see “Extension -> .php | Runs Under -> PHP 5.x”. Check your phpinfo page one more time, and it should report PHP 5.x.

It’s unfortunate we even have to do this for our older domains, but I asked the tech support guy if I could somehow get on to PHP 5.2.8, and he said nope, that the newer servers have the more recent version but the older servers are stuck back in 2007.

This is a collection of php code snippets that seem to come in handy rather often. They are assembled here more for my own organization than anything else.

String: trim and convert to lowercase

A very straightforward but useful snippet. A string is first trimmed of any leading or trailing white space, and then converted to lowercase letters. Good for normalizing user input.

<?php
$string = "Orange";
$string = strtolower(trim($string));
echo $string;
?>

String: truncate and break at word

This will attempt to shorten a string to $length characters, but will then extend the string (if necessary) so that it breaks at the end of the next whole word, and then append an ellipsis to the string. Good for shortening readable text while keeping it looking pretty.

<?php 
function truncate($string, $length) {
	if (strlen($string) > $length) {
		$pos = strpos($string, " ", $length);
		// If there is no space after $length, return the whole string rather than an empty one
		if ($pos === false) {
			return $string;
		}
		return substr($string, 0, $pos) . "...";
	}
	
	return $string;
}

	echo truncate('the quick brown fox jumped over the lazy dog', 10);
?> 

In the above example, the output will be the quick brown…, because the 10th character is the space immediately before the ‘b’ in ‘brown’; the search for the next space therefore begins inside ‘brown’, and the string is extended to the end of that word.

What season is it?

Note that this is only a very rough approximation of when a season begins and ends. This snippet would be good for rotating a seasonal background or something, but it’s not astronomically correct, and I wouldn’t use it as a calendar. Reckoning a season is rather complex.

<?php echo "It is day " . date('z') . " of the year. <br />"; ?>
<?php $theday = date('z');
	if($theday >= "79" && $theday <= "171") { 
	$season = "Spring";
	} elseif($theday >= "172" && $theday <= "264") { 
	$season = "Summer";
	} elseif($theday >= "265" && $theday <= "355") { 
	$season = "Autumn";
	} else { 
	$season = "Winter";
	}
	echo "It's " . $season . "!";
?>

Get the number of days since something happened

This function takes a date (formatted as a Unix timestamp) and calculates the number of days since that date. The floor() function shouldn’t really be necessary, but it’s a hold-over from a less accurate function that used only the hours elapsed. In that function, the results would vary depending on the time of day the function was called. In this method, the times are normalized to 12:00:00 AM.

function calc_days_ago($date){
	// The function accepts a date formatted as a Unix timestamp
	
	// First, normalize the current date down to the Unix time at 12:00:00 AM (to the second)
	$now = time() - ( (date('G')*(60*60)) + date('i')*60 + date('s') );
	// Second, normalize the given date down to the Unix time at 12:00:00 AM (to the second)
	$then = $date - ( (date('G', $date)*(60*60)) + date('i', $date)*60 + date('s', $date) );
	$diff = $now - $then;
	$days = floor($diff/(24*60*60));
	switch ($days) {
	case 0:
 		$days_ago = "today";
		break;
	case 1:
		$days_ago = $days . " day ago";
		break;
	default:
		$days_ago = $days . " days ago";
	}
	return $days_ago;
}
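A quick usage example (output depends, of course, on when it’s run):

<?php
echo calc_days_ago(time() - 3 * 24 * 60 * 60); // "3 days ago"
echo calc_days_ago(time()); // "today"
// Note: a DST change inside the interval can shift the result by a day
?>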

Get the hours and minutes remaining until something happens

This function takes a time (formatted as a Unix timestamp) and calculates the number of hours and minutes remaining until that time. If the time has already passed, the function returns “historical”. Example outputs would be “7 hours”, “6 hours and 34 minutes”, and “12 minutes”. It could probably be made even more accurate if you changed it to use 3 decimal places and then round to 2 decimal places, but this is good enough for my purposes.

function calc_time_left($date){
	// The function accepts a date formatted as a Unix timestamp

	$now = time();
	$event = $date;
	if ($event >= $now) {
		$diff = $event - $now;
		$unroundedhours = $diff/(60*60);
		// Find the hours, if any, and assemble a string
		$hours = floor($unroundedhours);
		if ($hours > 0) {
			$hourtext = ($hours == 1)? " hour" : " hours";
			$thehours = $hours . $hourtext;
		}else{
			$thehours = "";
		}
		// Find the minutes, if any, and assemble a string
		// (strpos() returns false when there is no decimal part, so test with !==)
		$theminutes = "";
		if (strpos($unroundedhours, '.') !== false) {
			$pos = strpos($unroundedhours, '.') + 1;
			$remainder = substr($unroundedhours, $pos, 2);
			$minutes = floor($remainder * .6);
			if ($minutes > 0) {
				$minutetext = ($minutes == 1)? " minute" : " minutes";
				$theminutes = $minutes . $minutetext;
			}
		}
		// Join the two strings with "and" only when both are present
		$sep = ($thehours && $theminutes)? " and " : "";
		$timeleft = $thehours . $sep . $theminutes;
	}else{
		$timeleft = "historical";
	}
	return $timeleft;
}
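For example:

<?php
// Roughly 6 hours and 34 minutes from now; the two-digit decimal trick
// loses a little precision, so this prints "6 hours and 33 minutes"
echo calc_time_left(time() + 6 * 60 * 60 + 34 * 60);
echo calc_time_left(time() - 60); // "historical"
?>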

Get the path of the containing directory

This one really comes in handy. It will give you the URL of the folder where the executing script resides, so you can reference the full path to other files in that folder, no matter where the folder may be located. It works on both Linux and Windows servers, and it adds a trailing slash to the path if one doesn’t already exist, so that root looks the same as a subfolder.

<?php 
function get_path() {
	// Get the path of the folder where the executing script resides, with the trailing slash
	
	// Determine HTTPS or HTTP
	$url = (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] == 'on') ? 'https://' : 'http://';
	$url .= $_SERVER['HTTP_HOST'] . dirname($_SERVER['PHP_SELF']);
	// Convert the trailing backslash (on Windows root) to a forward slash
	$url = str_replace('\\', '/', $url);
	// Determine whether the current location is root by looking for a trailing slash (Windows or Linux)
	if (strlen($url) != strrpos($url, '/') +1) {
		$url .= '/';
	}
	return $url;
}
?>
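And a usage example (the URL is hypothetical):

<?php
// If this script lives at http://www.example.com/gallery/index.php,
// get_path() returns "http://www.example.com/gallery/"
echo '<img src="' . get_path() . 'images/photo.jpg" alt="" />';
?>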

Centering unordered list items

I wrote this script because I wanted to center the thumbnails in the Plogger image gallery while still using an unordered list item to contain each thumbnail. The script figures out how many thumbnails exist on a page and how many will fit in the space provided, then adds sufficient left padding to each to give the appearance of them being centered. It can be easily adapted for other uses. The full explanation and code example is at //ardamis.com/2007/08/05/centering-the-thumbnails-in-plogger/.

Parse .html as .php (Apache .htaccess)

This isn’t actually a PHP script, but it’s still handy. If you need to write pages with a .html or .htm extension but still want to use PHP in those pages, adding the following line to your .htaccess file will force an Apache server to parse .html files as .php files. I have confirmed this to work with GoDaddy’s hosting (GoDaddy runs PHP as CGI).

AddHandler x-httpd-php .php .htm .html

If you are running an Apache server as part of an XAMPP installation on top of Windows, try using this instead:

AddType application/x-httpd-php .html .htm

Updated for XAMPP version 1.6.5

Some time ago, I decided to start phasing out static xhtml in favor of pages using PHP includes. To test these new pages, I used apachefriends.org’s wonderful XAMPP (which I really can’t recommend highly enough) to install Apache, MySQL, and PHP (among other things). Once I had my local server running, I put each dev site into its own folder in \htdocs\ and navigated to them by http://127.0.0.1/foldername/.

This setup was functional but far from ideal, as the index pages for these local sites weren’t in what could be considered a root directory, which led to some tiptoeing around when creating links.

Then I discovered the NameVirtualHost feature in Apache. NameVirtualHost allows the server admin to set up multiple domains/hostnames on a single Apache installation by using VirtualHost containers. In other words, you can run more than one web site on a single machine. This means that each dev site (or domain) can then consider itself to have a root directory. You will be able to access each local site as a subdomain of “localhost” by making a change to the HOSTS file. For example, I access the local dev version of this site at http://ardamis.localhost/.

This works great for all sorts of applications that rely on the site having a discernible root directory, such as WordPress.

Unfortunately, setting up NameVirtualHost can be kind of tricky. If you are having problems configuring your Apache installation to use the NameVirtualHost feature, you’re in good company. Here’s how I managed to get it working:

For XAMPP version 1.6.5

  1. Create a folder in drive:\xampp\htdocs\ for each dev site (adjust for your directory structure). For example, if I’m creating a development site for ardamis.com on my d: drive, I’d create a folder at:
    d:\xampp\htdocs\ardamis\
  2. Edit your HOSTS file (in Windows XP, the HOSTS file is located in C:\WINDOWS\system32\drivers\etc\) to add the following line, where sitename is the name of the folder you created in step 1. Don’t change or delete the existing “127.0.0.1 localhost” line.
    127.0.0.1 sitename.localhost
    

    Add a new line for each dev site folder you create.

    Continuing with the example, I’ve added the line:
    127.0.0.1 ardamis.localhost

  3. Open your drive:\xampp\apache\conf\extra\httpd-vhosts.conf file and add the following lines to the end of the file, using the appropriate letter in place of drive. Do this step only once. We’ll add code for each dev site’s folder in the next step. (Yes, keep the asterisk.)
    NameVirtualHost *:80
    <VirtualHost *:80>
        DocumentRoot "drive:/xampp/htdocs"
        ServerName localhost
    </VirtualHost>
    

    My DocumentRoot line would be:
    DocumentRoot "d:/xampp/htdocs"

  4. Immediately after that, add the following lines, changing sitename to the name of the new dev site’s folder, again using the appropriate letter in place of drive. Repeat this step for every folder you’ve created.
    <VirtualHost *:80>
        DocumentRoot "drive:/xampp/htdocs/sitename"
        ServerName sitename.localhost
    </VirtualHost>
    

    My DocumentRoot line would be:
    DocumentRoot "d:/xampp/htdocs/ardamis"
    My ServerName line would be:
    ServerName ardamis.localhost

  5. Reboot your computer to be sure it’s using the new HOSTS file (you’ll have to at least restart Apache). You should now be able to access each dev domain by way of:
    http://sitename.localhost/

For XAMPP version 1.4

If you are using an older version of XAMPP (like XAMPP version 1.4) without the httpd-vhosts.conf file, use the instructions below.

  1. Create a folder in your drive:\apachefriends\xampp\htdocs\ for each local version of your site. For example, if I’m creating a development site for ardamis.com on my f: drive, I’d create a folder at:
    f:\apachefriends\xampp\htdocs\ardamis\
  2. Open your HOSTS file (in Windows XP, the HOSTS file is located in C:\WINDOWS\system32\drivers\etc\) and add the following line, where sitename is the name of the folder you created in step 1. Repeat this step, as necessary, for each folder you create. Don’t change or delete the existing “127.0.0.1 localhost” line.
    127.0.0.1 sitename.localhost
    

    Continuing with the example, I’ve added the line:
    127.0.0.1 ardamis.localhost

  3. Open your drive:\apachefriends\xampp\apache\conf\httpd.conf file and add the following lines to the end of the file, using the appropriate letter for drive. Do this step only once. We’ll add code for each dev site’s folder in the next step. (Yes, keep the asterisk.)
    NameVirtualHost *:80
    <VirtualHost *:80>
        DocumentRoot "drive:/apachefriends/xampp/htdocs"
        ServerName localhost
    </VirtualHost>
    

    My DocumentRoot line would be:
    DocumentRoot "f:/apachefriends/xampp/htdocs"

  4. Immediately after that, add the following lines, changing sitename to the name of the new folder, using the appropriate letter for drive and repeating this step for every folder you’ve created.
    <VirtualHost *:80>
        DocumentRoot "drive:/apachefriends/xampp/htdocs/sitename"
        ServerName sitename.localhost
    </VirtualHost>
    

    My DocumentRoot line would be:
    DocumentRoot "f:/apachefriends/xampp/htdocs/ardamis"

  5. Reboot and restart Apache. Open a browser; you should now be able to access each folder by way of:
    http://sitename.localhost

I’m assuming that you could change the DocumentRoot line to point to any folder on any drive. I’ll experiment with pointing this at a folder on another drive later.

The official Apache.org documentation for VirtualHost is at http://httpd.apache.org/docs/2.2/vhosts/. You may want to read that for further details before you try to set up virtual hosts.

If you have any questions about the above instructions, the Apache NameVirtualHost function, XAMPP, or anything in between, post a comment, but I can’t promise that I’ll be able to help. I’m learning as I go along, too.