Use pigz for ultra fast, parallel gzipping

gzip can be slow. Painfully slow. If you have a server with a bunch of processors, you can use pigz (pig-zee, or I guess pig-zed if you are Canadian) to utilize those processors.

Installation is easy: aptitude install pigz

Usage is also easy. For example, to tar and gzip /home/jacob using 3 processors, you could use this command: tar cf - -C / /home/jacob | pigz -p 3 > backup.tar.gz

I’m using this for some of my backup tasks and have dramatically cut down the time it takes for my backups to run. I’ve also used this a few times for bundling files up to move between servers.


Increase maximum upload size with nginx and PHP

I recently reconfigured my servers and switched from Apache HTTP to nginx. My wife wanted to upload some photos to her blog but it kept giving her an error. I checked the error log on the server and found this weird error message:

2014/12/23 16:03:18 [error] 23573#0: *4374978 client intended to send too large body: 1070520 bytes, client: 24.243.18.61, server: www.awkwardsheturtle.com, request: "POST /wp-admin/async-upload.php HTTP/1.1", host: "www.awkwardsheturtle.com", referrer: "http://www.awkwardsheturtle.com/wp-admin/media-new.php"

nginx

First I needed to increase the client maximum body size in nginx. This is done by adding the following line to /etc/nginx/nginx.conf in the http block:

client_max_body_size 8M;

Reload the nginx config and you should be good to go:

service nginx reload
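For context, the directive sits inside the http block like this (it can also go in a server or location block if you only want the higher limit for one site; nginx's default is 1 MB):

```nginx
http {
    # Raise the request-body limit from the 1 MB default to 8 MB.
    client_max_body_size 8M;
}
```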

PHP

Next, edit your php.ini (for me this is located at /etc/php5/fpm/php.ini). Find the following lines and update their values:

upload_max_filesize = 8M
post_max_size = 8M

Restart PHP to load the new configuration:

service php5-fpm restart
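If you prefer to script the change, a sed one-liner can set both values in one pass. The sketch below runs against a scratch copy under /tmp; on a real server you would point INI at /etc/php5/fpm/php.ini and run sed with sudo:

```shell
# Work on a scratch copy for the demo (the real file is /etc/php5/fpm/php.ini).
INI=/tmp/php.ini.demo
printf 'upload_max_filesize = 2M\npost_max_size = 8M\n' > "$INI"

# Rewrite both directives to 8M, whatever their current values.
sed -i -E 's/^(upload_max_filesize|post_max_size) *=.*/\1 = 8M/' "$INI"

grep -E '^(upload_max_filesize|post_max_size)' "$INI"
```

(This assumes GNU sed, which is what Ubuntu ships; on BSD/macOS, `sed -i` needs a backup suffix argument.)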

OhLife is dead, long live MyLife!

I started using OhLife, a free journal-by-email service, in October of 2011. Every morning I received an email asking me about my day and showing a snippet of my past journal entries. I just had to reply to an email in order to add an entry. I was very sad to hear, 3 years later, that they were shutting down. Fortunately, a pile of OhLife clones have popped up to take its place.

My favorite is MyLife. It is free and provides an experience similar to OhLife. It runs on Google App Engine so there is little risk of the service going down in the future. Setup takes a bit of effort, but the author has provided step-by-step instructions.

MyLife is written in Python and is open source. I’m not great at Python but the code is straightforward enough that I was able to submit a bug fix. It should be pretty easy to add additional features if you’ve always had the itch to customize OhLife to fit your exact needs.

Picking a second DNS server

A few months back, Time Warner suffered a massive DNS failure. Customers across the country (myself included) were unable to browse the web because Time Warner couldn’t tell them which IP addresses went with which domain names. It was incredibly frustrating.

Since then, I’ve always made sure my computer is configured to use two separate DNS servers from different companies: one from whatever internet service I’m connected to, and another from somebody else. But how to pick the second DNS server? Google Public DNS? OpenDNS? Or something else?

I found this neat software called DNS Benchmark (or Domain Name Speed Benchmark, or Domain Name Server Benchmark… they aren’t consistent with the naming…). It checks a pile of DNS servers to see which perform best for you, and which are likely the most reliable.

Once you’ve picked your second DNS server, you just need to update your computer to use it and one of the DNS servers provided by your router. This process varies based on your OS, so I’m not going to walk you through it (there are hundreds of articles on Google to help you with this).
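On Linux, for example, the pairing can be as simple as two nameserver lines (the 192.168.1.1 router address is an assumption for illustration; 8.8.8.8 is Google Public DNS). Note that many distributions regenerate this file automatically, so the edit may need to go through your network manager instead:

```
# /etc/resolv.conf
nameserver 192.168.1.1   # your router / ISP's resolver
nameserver 8.8.8.8       # independent fallback (Google Public DNS)
```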

Time Warner has suffered occasional DNS issues since then, but I’ve never noticed. If the first DNS server fails, your computer is smart enough to try the second.

Cloud servers, VPS servers, Ubuntu, and nginx

I’ve always used bare metal servers running Apache HTTP on CentOS. I’ve always run nearly my entire business off of a single server. Every couple years I’ll order a new one, move everything over, and call it a day.

But the world is changing. Bare metal prices are going up while cloud and VPS server prices are going down. Ubuntu has surpassed CentOS for market share. nginx is second only to Apache for market share of active sites. It was time for me to change, too.

Over the past few months I’ve been moving everything I have over to cloud and VPS servers, Ubuntu, and nginx. The result has been phenomenal!

Fake Mail Generator

I started with my Fake Mail Generator server. I had a single huge VPS at Linode running CentOS and Apache HTTP. I love Linode. Their servers are affordable, reliable, and their support is great. They also offer discounts for paying annually.

I split this single server into four separate VPS servers: a frontend web server, two mail servers, and a database server. I used Ubuntu on all of the servers and switched to nginx. It was amazing! I’m using a fraction of the RAM and CPU I was using before, even though my traffic has increased dramatically since the server move.

Even better, the four VPS servers cost me less than I was paying for the single huge server. Part of this is because Linode gives you less for your money as you order larger servers (e.g., four 2GB servers give you more CPU than one 8GB server), so splitting into multiple servers got me more for my money. But the biggest difference was nginx. It is serving millions of pageviews per month with only 1GB of RAM; there is no way Apache HTTP could do that. nginx made it possible to buy less powerful servers and get the same amount of work done.

Everything else

I’ve been hosting everything else, including the high traffic Fake Name Generator, on a single bare metal server at SoftLayer. I love SoftLayer, too. I started with The Planet many years ago, which got merged with SoftLayer, which recently got purchased by IBM. So my servers have been passed around a bit, but the quality of service has always remained high.

Unfortunately, SoftLayer’s bare metal servers have been going up in price. I tried switching to a less expensive company and had a horrible experience, so I decided to stay with SoftLayer. I had been fretting over what to do for months when I received a coupon for up to $500 off my first month of cloud servers at SoftLayer. With nothing to lose, I gave it a shot.

SoftLayer cloud servers have all-inclusive pricing. You don’t have to pay extra for bandwidth, IP addresses, DNS, etc. You get everything you need to have a fully functional, publicly accessible server for one monthly fee. I like that a lot.

I decided to move MySQL to its own cloud server, so I’ve ended up with a database server and an “everything else” server. I chose local disks (which I’ve read are RAID 10) for better performance. I was able to reduce my number of CPU cores and total RAM, again thanks to nginx. If you aren’t using nginx, you really are missing out. Not only does it perform better, but it is easier to configure and use. I’ll never willingly use Apache HTTP again.

For heavy tasks, performance is noticeably slower than on my bare metal server, but that is to be expected. My bare metal server had RAID 1 SSDs, 12GB of RAM, and 16 blazing fast Intel Xeon CPU cores. There is no way a cloud server is going to come close to matching that performance.

But I’ve learned that I don’t really need it to. My webpages still load fast, backups finish in a reasonable amount of time, and I’m saving money by paying only for the resources I actually need.

Why not Amazon Web Services?

AWS EC2 servers are a terrible option for most companies. There, I said it.

The real benefit to AWS is automation and the ability to quickly scale. If you aren’t automating and you don’t need to scale, then you don’t need AWS and you are probably throwing your money away and complicating your life by buying into the AWS ecosystem.

AWS also has some reliability issues. Yes, you can get around these by deploying bunches of servers and load balancers and whatever, but that is a huge extra cost (and complexity) that most companies don’t need.

So I don’t use AWS. I’ve been happily running everything off of a single server for over a decade with 99.9% uptime.

Use QR codes to save paper backups of your private keys

I love QR codes. They make it incredibly easy to get chunks of text from paper to computer (or phone or whatever).

One of the ways I like to use them is to store offline, paper backups of my server private keys. A private key can be thousands of case-sensitive characters long. Nobody wants to type that in by hand. By creating a QR code, I can print it off and store it in my safe in case I need it.

But QR codes are awful!

Although I love QR codes, using them isn’t always pleasant. Most marketers suck at using QR codes: they randomly place them on products without context, link them to non-mobile websites or just their company homepage, or print them ultra tiny while including massive amounts of data.

This doesn’t mean QR codes suck, it just means people use them poorly. Blame the users, not the technology.

Private key backups

The first step is to create your private key. I like long 4096-bit keys, and I tend to create them using the PuTTY Key Generator. Use whatever you want; it doesn’t really matter.

Next, you need some QR code generating software. You could do this online but then you are giving your private key to a random stranger on the internet. I use QR-Code Studio, not because it is particularly awesome but because it was the first easy-to-use, free QR code software I stumbled upon.

Paste the text of your key into the input box in QR-Code Studio. Change the width/height to something large like 8 inches (the software will likely scale this down a bit).

You can optionally add a caption. I like to add the filename of the original key above the QR code.

Export the barcode to a PNG and print it. Get your phone out and make sure you can scan it. One of the most important rules of backups is to make sure your backup actually works.

Example QR code

(Image: an example key in PuTTY’s ppk format, rendered as a QR code.)

Why not a thumb drive, Dropbox, CD, etc?

My private keys are literally the keys to my business. They get me into everything that matters. I don’t want anyone having access to my keys but me, and I absolutely must have a backup.

Your files at Dropbox are not absolutely private and secure, especially if you use third-party apps. Thumb drives and CDs have limited lifespans (as short as 1.9 years in some tests), and it is hard to know when they will fail. If my house burns down, my external hard drive isn’t going to do me any good.

So that brings us to paper. I print on quality paper using a black and white laser printer. I’ve been unable to find any authoritative source to tell me how long laser prints will last. I suspect decades is a conservative estimate, especially stored in a fireproof safe.
