Always Get Better

Archive for the ‘Ubuntu’ Category

Setting up WordPress with nginx and FastCGI

Monday, January 30th, 2012

All web site owners should feel a burning need for speed. Studies have shown that visitors who wait more than 2 or 3 seconds for content to load are likely to leave without allowing the page to finish loading. This is particularly bad if you’re running a web site that relies on visitors to generate some kind of income – content is king, but speed keeps the king’s coffers flowing.

If your website isn’t the fastest it can be, you can take some comfort in the fact that the majority of the “top” web sites also suffer from page load times pushing up into the 10 second range (have you BEEN to Amazon lately?). But do take the time to download YSlow today and use its suggestions to start making radical improvements.

I’ve been very interested in web server performance because it is the first leg of the web page’s journey to the end user. The speed of execution at the server level can make or break the user’s experience by controlling the amount of ‘lag time’ between the web page request and visible activity in the web browser. We want our server to send page data as soon as possible so the browser can begin rendering it and downloading supporting files.

Not long ago, I described my web stack and explained why I moved away from the “safe” Apache server solution in favour of nginx. Since nginx doesn’t have a PHP module, I had to run PHP’s FastCGI server (PHP FPM) behind nginx acting as a reverse proxy. Additionally, I used memcached to store sessions rather than writing them to disk.

Here are the configuration steps I took to realize this stack:

1. Memcached Sessions
Using memcached for sessions gives me slightly better performance on my Rackspace VM because reading and writing in memory is hugely faster than reading and writing to a virtualized disk. I went into a lot more detail about this last April when I wrote about how to use memcached as a session handler in PHP.
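
For reference, the php.ini settings involved look something like this (this assumes the older memcache extension; with the newer memcached extension the handler name is “memcached” and the tcp:// prefix is dropped):

; store sessions in memcached instead of on the filesystem
session.save_handler = memcache
session.save_path = "tcp://127.0.0.1:11211"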

2. PHP FPM
The newest Ubuntu distributions have a package php5-fpm that installs PHP5 FastCGI and an init.d script for it. Once installed, you can tweak your php.ini settings to suit, depending on your system’s configuration. (Maybe we can get into this another time.)
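
The install itself is a one-liner:

sudo apt-get install php5-fpm

The process manager settings live in the FPM pool configuration (the location varies by release – on later versions it’s /etc/php5/fpm/pool.d/www.conf). The values below are an illustrative sketch rather than tuned recommendations:

; listen where nginx expects to find FastCGI, and size the worker pool
listen = 127.0.0.1:9000
pm = dynamic
pm.max_children = 10
pm.start_servers = 4
pm.min_spare_servers = 2
pm.max_spare_servers = 6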

3. Nginx
Once PHP FPM was installed, I created a site entry that passes PHP requests forward to the FastCGI server while serving other files directly. Since the majority of my static content (css, javascript, images) has already been moved to a content delivery network, nginx has very little actual work to do.


server {
    listen 80;
    server_name sitename.com www.sitename.com;

    access_log /var/log/nginx/sitename-access.log;
    error_log /var/log/nginx/sitename-error.log;

    # serve static files
    location / {
        root /www/sitename.com/html;
        index index.php index.html index.htm;

        # serve static files that exist without
        # running the other rewrite tests
        if (-f $request_filename) {
            expires 30d;
            break;
        }

        # send all non-existing file or directory requests to index.php
        if (!-e $request_filename) {
            rewrite ^(.+)$ /index.php?q=$1 last;
        }
    }

    # hand PHP requests to the FastCGI server
    location ~ \.php$ {
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME /www/sitename.com/html$fastcgi_script_name;
        include fastcgi_params;
    }
}

The fastcgi_param setting controls which script is executed, based upon the root path of the site being accessed. All of the request parameters are passed through to PHP, and once the configuration was up and running I didn’t miss Apache one little bit.
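
Before cutting traffic over, it’s worth a quick sanity check – something along these lines:

sudo nginx -t
sudo /etc/init.d/nginx reload
curl -I http://sitename.com/

nginx -t validates the configuration, and the curl -I call confirms that a request makes the round trip through nginx and the FastCGI backend.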

Improvements
My next step will be to put a Varnish server in front of nginx. Since the majority of my site traffic comes from search engine results – visitors who aren’t logged in to the site and can be served cached content – Varnish can step in and serve a fully cached version of my pages from memory far faster than FastCGI can render the WordPress code. I’ll experiment with this setup in the coming months and post my results.
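
As a rough sketch of the direction I have in mind (the port numbers and the cookie check are assumptions, not a tested setup): Varnish would take over port 80, nginx would move to, say, 8080, and a minimal default.vcl would start out something like this:

backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # logged-in WordPress users should bypass the cache
    if (req.http.Cookie ~ "wordpress_logged_in") {
        return (pass);
    }
}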

How to Upgrade Firefox using Ubuntu

Saturday, August 6th, 2011
Photo: “Mochila Firefox” by jmerelo (Creative Commons)

So I got tired of using Firefox 3.6 on my Ubuntu machine and decided to upgrade to the newest version (5.0). It’s understandable that the package maintainers responsible for Ubuntu don’t put bleeding-edge releases in the distribution, due to the possibility of introducing unstable elements into the user experience. But Firefox 4 has been out for months, and the migration to 5 is well underway.

Fortunately, it couldn’t be much easier to get the newest official release using our good friend apt.

In a terminal window, add the Mozilla team’s stable Firefox repository by issuing the following command:


sudo add-apt-repository ppa:mozillateam/firefox-stable

Next, perform an update to get the package listing, and upgrade to install the newest browser:


sudo apt-get update
sudo apt-get upgrade

That’s it – you’re done! Your shortcuts are even updated, and any bookmarks or open tabs you might have had on the go are carried forward.

I was pleasantly surprised at how easy this process was.

Finding Your Ubuntu Version

Friday, April 29th, 2011

The version number appears on the login screen, but if you’re already logged in to an Ubuntu computer and want to know which version you’re running:


> cat /etc/lsb-release
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=10.04
DISTRIB_CODENAME=lucid
DISTRIB_DESCRIPTION="Ubuntu 10.04.2 LTS"
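
Alternatively, the lsb_release utility reads the same information and formats it for you; the output should look something like:

> lsb_release -a
Distributor ID: Ubuntu
Description:    Ubuntu 10.04.2 LTS
Release:        10.04
Codename:       lucid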

Cheap File Replication: Synchronizing Web Assets with fsniper

Sunday, November 14th, 2010

A while ago I wrote about how I was using nginx to serve static files rather than letting the more memory-intensive Apache handle the load for files that don’t need its processing capabilities. The basic premise is that nginx is the web-facing daemon, serving static files directly from the file system and shipping any other request off to Apache on another port.

What if Apache is on a different server entirely? Unless you have the luxury of a NAS device, your options are:

1. Maintain a copy of the site’s assets separate from the web site
There are two problems with this approach: maintainability and synchronization. You’ll have to remember to deploy any content changes separately from the rest of the site, which is counter-intuitive and opens your process up to human error. And user-generated content stays on the Apache server, where it would be inaccessible to nginx.

2. Use a replicating network file system like GlusterFS
Network-based replication systems are advanced and provide amazing redundancy. Any changes you make to one server can be replicated to the others very quickly, so any user generated content will be available to your content servers, and you only have to deploy your web site once.

The downside is that many network file system solutions are optimized for larger (>50MB) file sizes. If you rely on your content server for small files (images, css, js), read performance may decline as your traffic numbers increase. For high-availability systems where it is critical for each server to have a full set of up-to-date files, this is probably the best solution.

3. Use an rsync-based solution
This is the method I’ve chosen to look at here. It’s important that my content server is updated as fast as possible, and I would like to know that when I perform disaster recovery or make backups of my web site the files will be reasonably up to date. If a single file takes a few seconds to appear on any of my servers, it isn’t a huge deal (I’m just running WordPress).

The Delivery Mechanism
rsync is fast and installed by default on most servers. Pair it with ssh and password-less login keys, and you have an easy solution for scriptable file replication. The only missing piece is the “trigger” – whenever the filesystem is updated, we need to run our update script to replicate to our content server.
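
Setting up the password-less login is a one-time step on the web server (the user and hostname here are placeholders):

ssh-keygen -t rsa
ssh-copy-id user@content-server

Once the key is in place, any script can rsync to the content server over ssh without being prompted for a password.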

Incrond is one possible solution: whenever a directory is updated, incrond can run our update script. The problem is that the service does not act upon file updates recursively. fsniper is our solution.

The process flow should look like this (a rough sketch of the pieces follows below):
1. When the content directory is updated (via site upload or user file upload), fsniper initiates our update script.
2. The update script connects to the content server via ssh and issues an rsync command between our content directory and the server’s content directory.
3. Hourly (or at whatever interval suits you), initiate an rsync command from the content server back to any web servers – this keeps all the nodes fairly up to date for backup and disaster recovery purposes.
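
Here is a rough sketch of steps 1 and 2. The paths, hostnames and script name are all placeholders, and the fsniper stanza is modelled on the watch/handler layout from its sample configuration:

watch {
    /www/sitename.com/html/wp-content {
        * {
            # run the sync script whenever a file changes
            handler = /usr/local/bin/sync-content.sh
        }
    }
}

The handler script itself is little more than an rsync wrapper:

#!/bin/sh
# sync-content.sh -- push the content directory to the content server
rsync -az -e ssh /www/sitename.com/html/wp-content/ \
    user@content-server:/www/sitename.com/html/wp-content/

And step 3 is a crontab entry on the content server, pushing back out to each web server:

0 * * * * rsync -az -e ssh /www/sitename.com/html/wp-content/ user@web-server:/www/sitename.com/html/wp-content/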

Life in Linux

Friday, August 6th, 2010

So I wiped my hard drive and installed Ubuntu. After struggling with the decision to switch from Windows for some time, I finally resolved to move.

So far the results have been very good. My system boots up and is ready to use in less than a minute, there is no lag loading and switching programs, and everything I need for my day-to-day programming is available much more readily than it was with the other operating system.

The most striking difference to me is the amount of disk space I now have available. With all of my software, work projects, and operating system overhead, Windows left 80GB free of my 285GB drive. With all of my projects, code libraries, files and operating system installed, Ubuntu uses just 6.7GB, leaving 97% of the drive available for my use. I am blown away by how much less clutter I have now.

I haven’t tried to do very much with Mono yet; we’ll see how it works when I try making improvements to my SiteAssistant project. I’ve been reading about Mono’s Winforms capabilities and so far am impressed by the possibilities. We’ll see how well it works with my fairly simple project; with any luck I may have found a cross-platform .NET solution with this one. Maybe the Winforms explorations will be a good topic for a future post.

Not missing Office yet, either. My Quicken financial software has been running perfectly under Wine, and all of my files appear to have made the move intact. I still own licenses to all my software, so on those rare occasions when I really need it I can install Windows in VirtualBox and fill up some of that hard drive space I’ve earned.

Installing Git on Ubuntu 10.04

Thursday, August 5th, 2010

Here’s how to install Git on Ubuntu 10.04:


sudo apt-get install git-core

(The package name is git-core, not git)

Thinking About Switching to Ubuntu

Monday, August 2nd, 2010
Photo: Ubuntu logo by Jeffpro57 (Creative Commons)

In the last number of weeks I have been seriously considering taking the plunge and wiping my Windows laptop clean in order to switch to Ubuntu as my primary machine. Although Windows 7 has gone a long way toward smoothing over the problems Vista brought, it isn’t perfect.

Windows isn’t a bad operating system, by any means. Like OS X and Ubuntu, it has its strengths and weaknesses. However, as a developer whose primary work involves web pages, I definitely see Windows as more of a barrier to efficient workflow. There are a few pieces of software that have kept me on Windows for a while but which just don’t hold me back anymore:

1. Microsoft Office
I definitely qualify as a power user for this software. Yes, OpenOffice can do most of what MS Office can, but I will sorely miss the features that “most people” don’t use. However, the majority of my work doesn’t touch Office – in fact, it’s relatively rare that I need Access, or Word, or even Excel. When I do use these programs, it tends to be in support of a client who has used them inappropriately for data storage.

I’ve long outgrown Outlook due to the amount of mail I keep; I don’t like to delete anything, because true professionals are able to refer back to projects no matter how old they are. I don’t have a replacement mail program yet, but so far the Gmail interface has been more than sufficient.

2. Visual Studio
This software is giving me pause. If I switch over, I will be giving up my ability to truly work in the .NET world, which is where I have largely been for the past decade. Most of my workflow recently has been with the open source, PHP-driven web world and I’m not sure that I’m excited about going back to a pure Microsoft environment. That said, I want to be sure I’m not closing any doors.

Mono has made great strides in bringing the .NET platform, specifically C#, over to Mac and Unix, but the more Windows-centric database and GUI interfaces don’t translate over very well. I can always run a Windows Virtual Machine for the rare instances I will need to work on that platform, but it seems a bit counter-productive to keep around an environment that I don’t use for the sake of a few days each year.

3. Quicken
My other strong reason for staying with Windows has been my love for Quicken – the Mac version just doesn’t compare to the Windows version – and my inability to manage my finances on paper after years of dependence. Since so much of my data is tied into this software, any switch will involve either years of data entry or a major hit to my forecasting abilities.

Fortunately, Wine has come to the rescue – last night I was able to get a full install of Quicken on my test Ubuntu machine with absolutely no problems! The fonts looked a little weird in the reports, but otherwise all of the functionality was there and working beautifully. It even looked like a Linux app – unbelievable!

Should I Stay or Should I Go?
So the big thing I’m weighing in my head right now is whether I can stand to give up the .NET programming I have been involved with for so long and switch to a full Linux environment. I still love my Windows environment and software, but it just doesn’t seem to make sense to keep it given my current open source focus.