
I have to look up how to implement a 301 redirect in ASP every few months, so I’m putting this up as a personal reference.

<%@ Language="VBScript" %>
<% 
Response.Status = "301 Moved Permanently" 
Response.AddHeader "Location", "http://domain.com/page.asp" 
Response.End 
%> 

It seems that the value of Location can be a relative path and the redirect will still function.

A collection of web development tools for building better sites more easily.

Frameworks and scripts

HTML5 Boilerplate is the professional badass’s base HTML/CSS/JS template for a fast, robust and future-proof site.

scriptsrc.net is a collection of script tags of the latest versions of a range of JavaScript libraries.

Modernizr adds classes to the <html> element that allow you to target specific browser functionality in your stylesheet. You don’t actually need to write any JavaScript to use it.

Images

placehold.it is a quick and simple image placeholder service.

Text manipulation

TextFixer is a collection of online text tools. Remove line breaks from text, alphabetize text, capitalize the first letter of sentences, remove extra whitespace, and convert text to uppercase or lowercase.

XHTML

HTML Minifier will minify HTML (or XHTML), and any CSS or JS included in your markup.

CSS

CSS3 Generator is an awesome code generator for CSS3 snippets, and shows the minimum browser versions required to display the effects.

proCSSor is an advanced CSS prettifier with tons of formatting options.

JavaScript

Online JavaScript beautifier will reformat and reindent bookmarklets and ugly JavaScript, unpack scripts packed by Dean Edwards’ popular packer, and deobfuscate scripts processed by javascriptobfuscator.com. The source code for the latest version is always available on GitHub, and you can download the beautifier for local use (zip, tar.gz) as well.

Fonts and Typography

Fontshop.com has written A Field Guide to Typography to get you excited about fonts and typography.

Typetester is an online app for comparing different fonts for the screen; you can test up to three fonts at a time and choose the one you like. Its primary role is to make web designers’ lives easier.

A quick chart of the fonts common to all versions of Windows and the Mac equivalents, or a more extensive matrix of fonts bundled with Mac and Windows operating systems, Microsoft Office and Adobe Creative Suite.

<html>ipsum has Lorem ipsum already wrapped in HTML tags. Pre-made paragraphs, lists, etc…

More resources at: 50 Useful Design Tools For Beautiful Web Typography and 21 Typography and Font Web Apps You Can’t Live Without.

Colors

Color Scheme Designer.

Markup

Google Webmaster Tools’ Rich Snippets Testing Tool.

Use the Rich Snippets Testing Tool to check that Google can correctly parse your structured data markup and display it in search results.

File compression is possible on Apache web hosts that do not have mod_gzip or mod_deflate enabled, and it’s easier than you might think.

A great explanation of why compression helps the web deliver a better user experience is at betterexplained.com.

Two authoritative articles on the subject are Google’s Performance Best Practices documentation | Enable compression and Yahoo’s Best Practices for Speeding Up Your Web Site | Gzip Components.

Compressing PHP files

If your Apache server does not have mod_gzip or mod_deflate enabled (Godaddy and JustHost shared hosting, for example), you can use PHP to compress pages on-the-fly. This is still preferable to sending uncompressed files to the browser, so don’t worry about the additional work the server has to do to compress the files at each request.

Option 1: PHP.INI using zlib.output_compression

The zlib extension can be used to transparently compress PHP pages on-the-fly if the browser sends an “Accept-Encoding: gzip” or “deflate” header. Compression with zlib.output_compression seems to be disabled on most hosts by default, but can be enabled with a custom php.ini file:

[PHP]
zlib.output_compression = On

Credit: http://php.net/manual/en/zlib.configuration.php

Check with your host for instructions on how to implement this, and whether you need a php.ini file in each directory.
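To confirm that the setting took effect, a quick check from any test page will do. A minimal sketch:

<?php
// Outputs the current value of zlib.output_compression;
// an empty string or '0' means compression is disabled.
var_dump(ini_get('zlib.output_compression'));
?>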

Option 2: PHP using ob_gzhandler

If your host does not allow custom php.ini files, you can add the following line of code to the top of your PHP pages, above the DOCTYPE declaration or first line of output:

<?php if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start(); ?>

Credit: GoDaddy.com

For WordPress sites, this code would be added to the top of the theme’s header.php file.

According to php.net, using zlib.output_compression is preferred over ob_gzhandler().

For WordPress or other CMS sites, an advantage of zlib.output_compression over the ob_gzhandler method is that all of the .php pages served will be compressed, not just those that contain the global include (e.g., header.php).

Running both ob_gzhandler and zlib.output_compression at the same time will throw a warning, similar to:

Warning: ob_start() [ref.outcontrol]: output handler ‘ob_gzhandler’ conflicts with ‘zlib output compression’ in /home/path/public_html/ardamis.com/wp-content/themes/mytheme/header.php on line 7
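One way to avoid the conflict is to check whether zlib compression is already on before installing the handler. The following guard is a sketch along those lines, not code from any particular theme:

<?php
// Install ob_gzhandler only if zlib.output_compression is off and the
// browser advertises gzip support; otherwise buffer output normally.
if (!ini_get('zlib.output_compression')
    && isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}
?>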

Compressing CSS and JavaScript files

Because the on-the-fly methods above only work for PHP pages, you’ll need something else to compress CSS and JavaScript files. Furthermore, these files typically don’t change, so there isn’t a need to compress them at each request. A better method is to serve pre-compressed versions of these files. I’ll describe two ways to do this, and in both cases you’ll need to add some lines to your .htaccess file to send user agents the gzipped versions if they support the encoding. This requires that Apache’s mod_rewrite be enabled (and I think it’s almost universally enabled).

Option 1: Compress locally and upload

CSS and JavaScript files can be gzipped on the workstation, then uploaded along with the uncompressed files. Use a utility like 7-Zip (quite possibly the best compression software around, and it’s free) to compress the CSS and JavaScript files using the gzip format (with extension *.gz), then upload them to your server.

For Windows users, here is a handy command to compress all the .css and .js files in the current directory and all sub directories (adjust the path to the 7-Zip executable, 7z.exe, as necessary):

for /R %i in (*.css *.js) do "C:\Program Files (x86)\7-Zip\7z.exe" a -tgzip "%i.gz" "%i" -mx9

Note that the above command is to be run from the command line. The batch file equivalent would be:

for /R %%i in (*.css *.js) do "C:\Program Files (x86)\7-Zip\7z.exe" a -tgzip "%%i.gz" "%%i" -mx9

Option 2: Compress on the server

If you have shell access, you can run a command to create a gzip copy of each CSS and JavaScript file on your site (or, if you are developing on Linux, you can run it locally):

find . -regex ".*\.\(css\|js\)$" -exec bash -c 'echo Compressing "{}" && gzip -c --best "{}" > "{}.gz"' \;

This may be a bit too technical for many people, but it is also much more convenient. It is particularly useful when you need to compress a large number of files (as in the case of a WordPress installation with multiple plugins). Remember to run it every time you update WordPress, your theme, or any plugins, so as to replace the gzipped versions of any updated CSS and JavaScript files.

The .htaccess (for both options)

Add the following lines to your .htaccess file to identify the user agents that can accept the gzip encoded versions of these files.

<Files *.js.gz>
  AddType "text/javascript" .gz
  AddEncoding gzip .gz
</Files>
<Files *.css.gz>
  AddType "text/css" .gz
  AddEncoding gzip .gz
</Files>
RewriteEngine on
# Check to see if the browser can accept gzip files.
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !Safari
# Make sure there is no trailing .gz on the URL.
RewriteCond %{REQUEST_FILENAME} !^.+\.gz$
# Check to see if a .gz version of the file exists.
RewriteCond %{REQUEST_FILENAME}.gz -f
# All conditions met, so add .gz to the URL filename (invisibly).
RewriteRule ^(.+) $1.gz [QSA,L]

Credit: opensourcetutor.com

I’m not sure it’s still necessary to exclude Safari.

For added benefit, minify the CSS and JavaScript files before gzipping them. Google’s excellent Page Speed Firefox/Firebug Add-on makes this very easy. Yahoo’s YUI Compressor is also quite good.

Verify that your content is being compressed

Use the nifty Web Page Content Compression Verification tool at http://www.whatsmyip.org/http_compression/ to confirm that your server is sending the compressed files.
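If you’d rather check from a script, something like the following PHP sketch (it assumes the cURL extension is available; the URL is a placeholder) will show whether a response carries a Content-Encoding: gzip header:

<?php
// Request a file with gzip accepted and inspect the response headers.
$url = 'http://example.com/style.css'; // placeholder - use one of your own files
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: gzip'));
curl_setopt($ch, CURLOPT_HEADER, true);          // include headers in the output
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return instead of printing
$response = curl_exec($ch);
curl_close($ch);
echo (stripos($response, 'Content-Encoding: gzip') !== false)
    ? 'The file is being sent compressed.'
    : 'The file is NOT being sent compressed.';
?>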

Speed up page load times for returning visitors

Compression is only part of the story. In order to further speed page load times for your returning visitors, you will want to send the correct headers to leverage browser caching.
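On hosts where you can’t edit the Apache configuration, those headers can be sent from PHP itself. A minimal sketch (adjust the max-age to suit how often your content changes):

<?php
// Send caching headers before any other output so returning visitors
// can reuse their cached copies for a week.
$seconds = 7 * 24 * 60 * 60; // one week
header('Cache-Control: public, max-age=' . $seconds);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $seconds) . ' GMT');
?>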

A friend who does a lot of selling on eBay asked me to develop a web application for generating HTML templates that could be copied and pasted into his auctions. He wanted to be able to add a title, a description, and some terms and conditions, and also to be able to upload images so the auction template would display thumbnails that could be clicked to display full-sized versions in a new window or tab. The more we talked, the more it sounded like something that would be both useful for a good number of people, and potentially a source of ad revenue for us. And so was born Simple Auction Wizard, an online HTML template generator for auction websites like eBay.

I was looking for just this kind of a project because I wanted a reason to play with TinyMCE, an Open Source, platform independent, web-based JavaScript HTML WYSIWYG editor control, and SWFUpload, a small JavaScript/Flash library that does very neat things with file uploads.

The most difficult part was actually getting the templates to look good in a variety of browsers. Because the template HTML appears inside of a larger page, I couldn’t rely on Doctype switching to place IE into standards mode. This meant the templates would have to be developed so that they would look approximately the same in standards-compliant browsers like Firefox, Chrome, and Safari, as well as horrible browsers like IE6 in quirks-mode. I tried very hard to minimize the use of tables, but they eventually crept in. I was able to use CSS, for the most part, though if any more issues come up, I’m going to throw in the towel and just do inline styles.

The web app is open to the public, but realize that it’s very early in its development and I intend to continue making changes.

A little over a year ago, I wrote a post about a PHP script I had created for protecting a download using a unique URL. The post turned out to be pretty popular, and many of the comments included requests to extend the script in useful ways. So, I’ve finally gotten around to updating the script to generate multiple URLs (up to 20) at a time, to allow different files to be associated with different keys, and to allow brief notes to be attached to the download key.

I’ve also added a simple page that prints out a list of all of the keys generated, the date and time that each key was created, the filename of the download on the server that the key accesses, the number of times the key was used, and any attached note. This should make it easier to generate gobs of keys, drop them into an Excel spreadsheet, and help the files’ owner keep track of who’s getting which file, and how often.

The scripts themselves are a little more involved this time around, but the general idea is the same. A unique key is generated and combined with a URL that allows access to a single file on the server. Share the URL/key instead of the URL to the file itself to allow a visitor to download the file, but not to know the location of the file. The key will be valid for a certain length of time and number of downloads, and will stop working once the first limiting condition is met. This should prevent unauthorized downloading due to people sharing the keys.

How it works

There are six main components to this system:

  1. the MySQL database that holds each key, the key creation date and time, the maximum age of the key, the number of times the key has been used, the maximum times the key may be used, the file associated with the key, and the note attached to the key, if any
  2. a generatekey.php page that generates the keys and outputs the corresponding unique URLs
  3. a download.php page that accepts the key, checks its validity, and either initiates the download or rejects the key as invalid
  4. a report.php page that returns all of the data in the database
  5. a config.php file that contains variables such as number of downloads allowed, the maximum allowable age of the key, and the filenames of the downloads, along with the database connection information
  6. the .zip file(s) to be protected

The files

The files, along with two example downloads, are available for download as a .zip file.

Download the protecting multiple downloads PHP script

The MySQL database

Using whatever method you’re comfortable with, create a new MySQL database named “download” and add the following table:

CREATE TABLE `downloadkeys` (
  `uniqueid` varchar(12) NOT NULL default '',
  `timestamp` INT UNSIGNED,
  `lifetime` INT UNSIGNED,
  `maxdownloads` SMALLINT UNSIGNED, 
  `downloads` SMALLINT UNSIGNED default '0',
  `filename` varchar(60) NOT NULL default '',
  `note` varchar(255) NOT NULL default '',
  PRIMARY KEY (uniqueid)
);

How to use the scripts

The scripts require a little setup before they’re ready to be used, so open config.php in your text editor of choice.

Change the values for $db_host, $db_username, $db_password, and $db_name to point to your database.

Set the variable $maxdownloads equal to the maximum number of downloads (actually, the number of page loads).

Set the variable $lifetime equal to the keys’ viable duration in seconds (86400 seconds = 24 hours).

Set the variable $realfilenames to the real names of the actual download files on the server. If you have more than one file to protect, enter the names as a comma-separated list and the script will create a drop-down menu as the Filename field. If you leave the variable as empty double-quotes (""), the form will display an empty input box as the Filename field.

Set the variable $fakefilename to anything – this is what the visitor’s file will be named when the download is initiated.
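Put together, a config.php would look something like this sketch (the values are placeholders; replace them with your own):

<?php
// Database connection details.
$db_host     = 'localhost';
$db_username = 'username';
$db_password = 'password';
$db_name     = 'download';

// Limits applied to each key.
$maxdownloads = 3;      // maximum number of downloads (page loads) per key
$lifetime     = 86400;  // key lifetime in seconds (86400 = 24 hours)

// One or more real filenames on the server, comma-separated, or "".
$realfilenames = 'example-download-1.zip,example-download-2.zip';

// The filename the visitor sees when the download starts.
$fakefilename = 'your-download.zip';
?>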

I would strongly recommend renaming generatekey.php, as anyone who can view it will be able to create unlimited numbers of keys, and worse, they’ll be able to see the filenames (if you set them in config.php). I would also recommend that the directory you put these files into, and each directory on your site (/images, /css, /js, etc.), contain an index.html file. This is a simple security measure that will prevent visitors from snooping around a directory and viewing its contents (though access to the directory contents is usually prohibited by a setting on the server).

Place all the PHP scripts and your .zip file(s) into the same directory on your server.

That’s all there is to it. Whenever you want to give someone access to the download, visit the generatekey.php page and fill out the form. It will generate a key code, save it to a database, and print out a unique link that you can copy and paste into an email or whatever. The page that the unique link points to checks to see if the key code is legitimate, then checks to see if the code is less than X hours old, then checks to see if it has been used less than X times. The visitor will get a descriptive message for the first unmet condition and the script will terminate. If all three conditions are met, the download starts automatically.
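The validation amounts to something like the following sketch, written here with mysqli for illustration; the downloadable script is the authoritative version:

<?php
// Sketch of the three checks in download.php: the key exists, the key
// is young enough, and the key hasn't reached its download limit.
require 'config.php';
$mysqli = new mysqli($db_host, $db_username, $db_password, $db_name);

$stmt = $mysqli->prepare('SELECT timestamp, lifetime, maxdownloads, downloads, filename FROM downloadkeys WHERE uniqueid = ?');
$stmt->bind_param('s', $_GET['id']);
$stmt->execute();
$key = $stmt->get_result()->fetch_assoc();

if (!$key) {
    die('Sorry, that key is not valid.');
}
if (time() - $key['timestamp'] > $key['lifetime']) {
    die('Sorry, that key has expired.');
}
if ($key['downloads'] >= $key['maxdownloads']) {
    die('Sorry, that key has reached its download limit.');
}

// All three conditions met: count this download and send the file.
$stmt = $mysqli->prepare('UPDATE downloadkeys SET downloads = downloads + 1 WHERE uniqueid = ?');
$stmt->bind_param('s', $_GET['id']);
$stmt->execute();
?>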

Errors and issues

Note: The download will not initiate automatically, and the file will actually be output as text on the page, if the download.php page is changed to send any headers or output to the browser before the download headers. Be careful when making modifications or incorporating this script into another page.

Check the HTTP headers (Google for an online service that does this, or install the LiveHTTPHeaders Firefox plugin) of the download link. If the script is working correctly, you should see Content-Transfer-Encoding: binary and Content-Type: application/octet-stream in the headers. If you’re getting a page of text instead of the zip file, you’ll probably see Content-Type: text/html.

Example HTTP headers for a correctly working download

If the script is working correctly, the HTTP headers will look something like this:

HTTP/1.1 200 OK
Date: Sun, 20 Jun 2010 13:31:50 GMT
Server: Apache
Cache-Control: must-revalidate, post-check=0, pre-check=0, private
Content-Disposition: attachment; filename="bogus_download_name.zip"
Content-Transfer-Encoding: binary
Pragma: public
Content-Length: 132
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: application/octet-stream
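Headers like these are typically produced by a handful of header() calls just before the file is read out to the browser. A sketch, using the variable names from config.php:

<?php
// Force the download under the 'fake' name, then stop all other output.
header('Cache-Control: must-revalidate, post-check=0, pre-check=0, private');
header('Pragma: public');
header('Content-Disposition: attachment; filename="' . $fakefilename . '"');
header('Content-Transfer-Encoding: binary');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($realfilename));
readfile($realfilename);
exit;
?>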

I’ve finally opened a Twitter account, so you can follow me at http://twitter.com/ardamis. As a social experiment, it’s interesting to watch and discuss, but I haven’t really participated much. On the other hand, I’m very interested in the phenomenon of URL shortening. So, without too much trouble, I put together yet another URL shortening service: Minifi.de (‘minified’, in the jargon). It does pretty much the same thing as tinyurl.com, bit.ly, and is.gd: you enter a long URL and it returns a shorter one. I’m about 50% done with the API (it works, so long as the URL is valid), so if anyone knows of any clients that allow the user to specify a shortening service, I’d like to test that functionality. I have plans to make it account-based, so that you can track usage statistics and such, but that will only happen after I’m confident that the thing will survive in the wild.

So, feel free to give it a whirl, but know that it’s still in development and that I’ll probably have to wipe the database a few more times before I get it just right.

I’ve been developing a PHP/MySQL web application that will be accessed by multiple users. These users will be both viewing and editing records in the database. Obviously, any situation in which multiple users may be performing operations on the same record puts the integrity of the data at risk.

In the case of my application, there is a very real possibility (a certainty, actually) that two or more people will open the same record at the same time, make changes, and attempt to save these changes. This common concurrent execution problem is known as the “lost update”.

User A opens record “A”.
User B opens record “A”.
User A saves changes to record “A”.
User B saves changes to record “A”.

In this example, User A’s changes are lost – replaced by User B’s changes.

I started to look for a method of preventing this sort of data loss. At first, I wanted to lock the record using a sort of check-in/check-out system. User A would check out record “A” and then have exclusive write access to that record until it was checked back in as part of the saving process. There were a number of problems with this method, foremost among them that User A might decide not to make any changes and so never save the record, leaving it in a checked-out state until further administrative action was taken to unlock it.

For a while, I tried to come up with some ingenious way around this, which usually boiled down to somehow automatically unlocking the record after a period of time. But this is not a satisfactory solution. For one thing, a user may have a legitimate reason to keep the record checked out for longer periods. For another, User B shouldn’t have to wait for a time-out event to occur if User A is no longer in the record.

So what I eventually came up with is a method of checking whether a record has been changed since it was accessed by the user each time a save is initiated. My particular way of doing this involves comparing timestamps, but other ways exist.

Here’s how I’m implementing my solution to the lost update concurrency issue:

User A creates record “A” and saves it at 9:00 AM. A “last-saved timestamp” of 9:00 AM is generated and saved to the record.

User A opens record “A” at 10:00 AM. An “opened timestamp” of 10:00 AM is generated and written to a hidden (or readonly) input field on the HTML page.
User B opens record “A” at 10:00 AM. An “opened timestamp” of 10:00 AM is generated and written to a hidden (or readonly) input field on the HTML page.

At 10:30 AM, User A attempts to save the record. The “last-saved timestamp” is retrieved from the record. The “opened timestamp” of 10:00 AM is compared to the “last-saved timestamp” of 9:00 AM. Because the record has not been changed since it was opened, the record is saved. A new “last-saved timestamp” of 10:30 AM is generated and saved to the record.

At 11:00 AM, User B attempts to save the record. The “last-saved timestamp” is retrieved from the record. The “opened timestamp” of 10:00 AM is compared to the “last-saved timestamp” of 10:30 AM (User A’s timestamp). Because the record has been changed since it was opened by User B, User B is not allowed to save the record.

User B will have to re-open record “A”, consider the effect that User A’s changes may have, and then make any desired changes.

Unless I’m missing something, this ensures that the data from the earlier save will not be overwritten by the later save. To keep things consistent, I’m using PHP to generate all of my timestamps from the server clock, as JavaScript time is based on the user’s system time and is therefore wholly unreliable.
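In code, the comparison at save time boils down to something like this sketch (the table and column names are hypothetical; the opened timestamp arrives in the form’s hidden field):

<?php
// Reject the save if the record was saved again after this user opened it.
$opened = (int) $_POST['opened']; // written into the form when the record was opened

$stmt = $mysqli->prepare('SELECT lastsaved FROM records WHERE id = ?');
$stmt->bind_param('i', $_POST['id']);
$stmt->execute();
$row = $stmt->get_result()->fetch_assoc();
$lastsaved = (int) $row['lastsaved'];

if ($lastsaved > $opened) {
    die('This record has been changed since you opened it. Please re-open it and make your changes again.');
}

// Otherwise, save the changes along with a fresh last-saved timestamp.
$stmt = $mysqli->prepare('UPDATE records SET body = ?, lastsaved = ? WHERE id = ?');
$now = time();
$stmt->bind_param('sii', $_POST['body'], $now, $_POST['id']);
$stmt->execute();
?>

Note that a SELECT followed by an UPDATE still leaves a small window between the two queries. The check can be made atomic by folding it into the UPDATE itself (UPDATE records SET ... WHERE id = ? AND lastsaved <= ?) and testing the number of affected rows.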

The main drawback that I see is extra work for User B, who now has to review the record as saved by User A before deciding what changes to make. But this is going to be necessary anyway, as changes made between when User B opened the record and attempted to save it may influence User B’s update.

The strange thing is that I haven’t seen this offered as a solution on any of the pages I found while Googling for solutions to the access control, lost update, and other concurrency-related data loss problems. Lots of people acknowledge the problem and the potential for data loss, but few offer solutions at the application level, preferring instead to rely on a database engine’s ability to lock rows.