Friday, August 14, 2009

Deploying a Rails Application to Dreamhost Part 2

I've created an empty Demo app in Dreamhost. I have a working Rails app locally. I need it working at Dreamhost remotely. I need a fast edit-and-deploy process. I need backups.

Assumptions, Versions:

I overload the term "Dreamhost" here to also denote the Dreamhost server that hosts all my domains and subdomains, including mikedll.com, demo.mikedll.com, and the hypothetical projectname.mikedll.com.

A MySQL deployment database has been created via Dreamhost's Web Panel, and Rails knows its connection details (host, database name, user, and password in config/database.yml).
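
A quick way to confirm those credentials work, before involving Rails at all, is to connect with the mysql client from the Dreamhost shell. The hostname, user, and database name here are hypothetical placeholders for the values from the Web Panel:

mysql -h mysql.mikedll.com -u mydbuser -p mydbname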

My demo app has been launched by Dreamhost using FastCGI, so I'll use FastCGI here (as opposed to Passenger).

Files like config/environment.rb, the log/ directory, and the vendor/ directory don't need to be synced, but most other files do. A version control system can help with this. Beyond synchronization, it will help me recover when I break a feature, which SCP and rsync won't do. I'll use Git. Hidden Reflex says Git is better than Mercurial and compatible with Trac. Git has GitHub, which makes it trendy. Git also supports tiered revisions, which Subversion doesn't, and I've been used to tiered revisions since using AccuRev.
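
Git excludes files via a .gitignore file at the project root. As a sketch, something along these lines would keep the machine-specific files out of the repository (the exact entries are illustrative, not necessarily what I used):

cat > .gitignore <<'EOF'
log/*
tmp/*
db/*.sqlite3
EOF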

Syncplicity, which I've been using for nine months, will be used for semi-automatic backups.

Local and remote software versions compare as follows:

  • Operating System: Local Ubuntu 8.04.3 LTS (Hardy Heron) vs Dreamhost some variant of Linux 2.6.24
  • Ruby: Local 1.8.6 vs Dreamhost 1.8.5
  • Rails: Local 2.0.2 vs Dreamhost 2.2.2
  • RubyGems: Local 0.9 vs Dreamhost 1.3.1
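
For reference, these versions can be read off with a few commands, run locally and again on Dreamhost over SSH:

ruby -v         # Ruby interpreter version
rails -v        # Rails version
gem -v          # RubyGems version
uname -r        # kernel version
lsb_release -d  # distribution description, on Ubuntu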

My Ubuntu package manager is behind on versions for both RubyGems and Rails. Version staleness aside, most of the community uses RubyGems for package management anyway, not a distribution's package manager.

Given that I'm on Rails 2.0.2 and Dreamhost is on 2.2.2, I anticipate version conflicts. I've encountered a similar situation before.

Develop a Plan

To avoid the version conflicts, I'll sync Rails versions. It'll be an upgrade for me.

I'll manually install RubyGems and use it to install Rails.

I will continue to use Ubuntu's package manager for non-Ruby dependencies.

After that, I'll learn Git via the Git tutorial, and then put it in charge of my app.

I can test my edit-and-deploy process by making changes and then expecting to observe them on Dreamhost.

Carrying Out The Plan

I downloaded the needed packages and gems (the RubyGems tarball and the .gem files used below) into my ~/settings/packages directory.

I put RubyGems in charge.

# Remove Ubuntu's versions
sudo aptitude remove rails
sudo aptitude remove rubygems

# Install RubyGems.
# The only surprise here is that I have to create a symlink.
tar -xzf ./rubygems-1.3.5.tgz.tar -C ~/
cd ~/rubygems-1.3.5
sudo ruby setup.rb
sudo ln -s /usr/bin/gem1.8 /usr/bin/gem

I then installed some dependencies with Ubuntu's package manager. The Ruby 1.8 development files may be required by the sqlite3-ruby gem; the SQLite3 development headers definitely are. Without them, the sqlite3-ruby gem fails to build with a "missing sqlite3.h" error.

# Following above explanations, get sqlite3-ruby dependencies
sudo aptitude install ruby1.8-dev
sudo aptitude install libsqlite3-dev

While installing gems, I explicitly specified the Dreamhost versions. RubyGems recognized and used the downloaded .gem files, which were in the current working directory.

sudo gem install -V sqlite3-ruby --version 1.2.5
sudo gem install -V rails --version 2.2.2
cd ~/myapp
rake rails:update  # update my app's files to 2.2.2

Here, I deleted deprecated calls to cache_template_extensions= and cache_template_loading= in some config/environments/*.rb files.
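
A grep like the following is one way to hunt down such deprecated settings, and if config/environment.rb pins RAILS_GEM_VERSION, that needs to say '2.2.2' as well:

grep -rn "cache_template" config/environments/
grep -n "RAILS_GEM_VERSION" config/environment.rb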

I then learned and set up Git.

# On Dreamhost, create a remote repository:
mkdir -p ~/git/projectname
cd ~/git/projectname
git init
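# Note: a repository that is only pushed to and pulled from would
# conventionally be created with `git init --bare`; a plain init worked
# here because this repository's working tree is never used.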

# On local laptop:
cd projectname # This is the folder of my existing app, not yet versioned
git init
git add .
git commit -m "First revisioned copy of my code."

# Still on local laptop, configure it as a clone of the remote repository.
# The remote repository will then be "upstream" of this local one:
git remote add origin mrmike@mikedll.com:git/projectname # Name remote repo 'origin'
git config branch.master.remote origin               # Track remote 'origin'
git config branch.master.merge master                # ... the master branch
git push                                             # Inform 'origin' of commit

# Back to Dreamhost, we create a deployed working copy:
git clone ~/git/projectname ~/projectname.mikedll.com
cd ~/projectname.mikedll.com
git pull

My code was in deployment position. I reviewed my demo app experience (the August 11th post below) to fix file permissions and configure FastCGI. I reproduce a couple of steps from that experience here:

# Added this to my .bash_profile, for running rake commands
export RAILS_ENV='production' 

# Initialize the remote mysql database. Here, rails worked flawlessly.
rake db:migrate

My application should now be deployed and under version control.

Review the Solution

I test my solution by editing files locally, committing them with Git, logging into mikedll.com and pulling the changes, and viewing the website at its public URL. I expect to see my changes.
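
Spelled out as commands, one round trip of that edit-and-deploy cycle looks roughly like this (the commit message is just an example):

# On local laptop, after editing:
git commit -a -m "Tweak homepage copy"
git push

# On Dreamhost:
cd ~/projectname.mikedll.com
git pull
killall dispatch.fcgi   # make FastCGI pick up the new code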

Then I test my backups by pulling changes into the Windows working copy of the central repository, letting Syncplicity see the changes and back them up to its servers, and viewing my backed up files through the Flash interface at the Syncplicity website. I expect to see the changes reflected in those files.

These tests passed. I had successfully deployed my web app to Dreamhost.

I could also use Git to send changes from Dreamhost to my laptop. This will be used when bugs are fixed on the remote production server.

Because git commit is local, it by definition doesn't require a password the way an svn commit would, but git push and git pull do. I set up private/public SSH keys to simplify this edit-and-deploy process further.
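
With OpenSSH, the key setup is something like the following, assuming no key pair exists locally yet:

# On local laptop: generate a key pair, then install the public
# half into the remote account's authorized_keys
ssh-keygen -t rsa
ssh-copy-id mrmike@mikedll.com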

I used this result to convert my existing static website, mikedll.com, into an identically set up Rails/Git/Dreamhost configuration.

A struggle while carrying out the plan was that, due to extremely poor internet at the time, I had to manually fetch certain gems, including actionpack, activesupport, activerecord, actionmailer, and activeresource; RubyGems kept timing out while downloading them. I omitted these from the download list above because, had my internet been better, RubyGems would've installed them for me.

A failure while carrying out the plan was that I tried to do some things with Git that may not have made sense. Originally, I had hoped to avoid a central repository, and to instead push updates from my local laptop repository to remote and backup repositories. I couldn't get this to work, though. I might have been needlessly running from a centralized setup for the sake of being different, instead of trying to get things done.

Future and Related Work

Heroku offers special Rails hosting such that your edit-and-deploy process can consist of a single git push.

Dealing with conflicts in Rails versions is a known problem, and a popular solution is to freeze Rails.
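
In Rails 2.x, freezing copies the installed Rails gems into the application's vendor/rails directory, so the app runs that copy regardless of which Rails versions the host has installed. As I understand it, it's a single rake task:

rake rails:freeze:gems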

The user experiences that alerted me to the deprecations of cache_template_extensions= and cache_template_loading= were Paul Sturgess' snippets collection and the Morph Labs website, respectively.

A Stack Overflow entry showed me how to configure an existing repository as if it had been created with the clone command. This avoids the dirty method of cloning a new local repository and deleting the existing one, which I had done at first and which felt embarrassingly weaksauce.

My local laptop environment is actually Ubuntu on VirtualBox, but Syncplicity only works on Windows. So, I created another Windows working copy off the central one just for Syncplicity. It will be my (manual) responsibility to pull updates to this copy when I want a backup that is neither on Dreamhost's hard drives nor my own.

My edit-and-deploy process is still missing some vital steps for it to scale, but it will suffice for now. Months from now as bugs begin to pile up, how will I manage them? How will I integrate tickets and milestones? With Trac? With Lighthouse?

Seemingly, Capistrano helps with Rails deployments and with system administration tasks where you have to log into many servers, but I haven't examined it in any depth yet.

Dreamhost strongly recommends launching rails apps with Passenger instead of FastCGI. I will add that as a future todo for my project.

Work with NGOs

I met Anita today, and she told me about her volunteer work with NGOs. She enjoys the work that she does. Usually, she works with children who are picked up off the street, sometimes for petty theft or for running away from home. Or sometimes they just go missing from their parents.

Frequently, the children are gone within a short time span. It could be a few months or a few days before they leave. They are often "locked up" in small rooms where they can't really go anywhere, so they don't like it.

I am used to organizations like the Boys and Girls Club, where short-term work is frowned upon. Here, I may be able to help through simple interactions. I look forward to following through on this.

Tuesday, August 11, 2009

Create a Rails Application on Dreamhost

  1. In the Dreamhost Web Panel, panel.dreamhost.com:

    1. Create demo.mikedll.com. Wait until the domain name is resolvable.

    2. In the host configuration, check "Enable FastCGI" and serve documents from /home/myusername/demo.mikedll.com/public.

  2. SSH into the web server and run:

    # Create the app
    rails demo.mikedll.com
    
    
    # Go into the public directory
    cd demo.mikedll.com/public
    
    
    # Recursively add world-executable to directories
    find . -type d -exec chmod o+x {} \;
    
    
    # Recursively add world-readable to files
    find . ! \( -type l \) ! \( -type d \) -exec chmod o+r {} \;
    
  3. In the root of the application directory, run:

    # Generate dynamic content: a demo controller with an index action
    ./script/generate controller demo index
    
    
    # Initialize the sqlite database
    rake db:migrate
    
  4. Create demo.mikedll.com/public/.htaccess with:

    # Launch dispatch.fcgi on requests for which
    # there is no static file
    AddHandler fastcgi-script fcgi
    RewriteEngine On
    RewriteBase /
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.*)$ dispatch.fcgi [QSA,L]
    
  5. Remove demo.mikedll.com/public/index.html.

  6. Done

Expect visits to demo.mikedll.com to display static content.

Expect visits to demo.mikedll.com/demo to display dynamic content.
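
One way to spot-check both expectations from a shell, rather than a browser, is with curl (the static filename here is a placeholder):

curl -I http://demo.mikedll.com/some-static-file.html   # served directly by Apache
curl http://demo.mikedll.com/demo                       # routed through dispatch.fcgi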

Pitfalls

  • This Apache error:

    [...] [error] [client ...] File does not exist: ... demo.mikedll.com/public/missing.html
    

    occurs if the .htaccess file is omitted.

  • Recent changes in the application code are not visible if an already-running instance of the application is not killed with killall dispatch.fcgi.
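
    To force a reload after deploying changes, run on the server:

    pgrep -lf dispatch.fcgi   # list any running instances
    killall dispatch.fcgi     # a fresh one is spawned on the next request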

  • This Apache error:

    [...] [error] [client ...] FastCGI: comm with (dynamic) server ".../current/public/dispatch.fcgi" aborted: (first read) idle timeout (60 sec)
    [...] [error] [client ...] FastCGI: incomplete headers (0 bytes) received from server ".../current/public/dispatch.fcgi"
    

    occurs when the application crashes while loading. For example, I get this error when my config/environment.rb has a bug in it. The browser will hang for 60 seconds before the server responds with a terse, generic Rails error.

    To debug such issues, I run ./dispatch.fcgi at the command line. No arguments are required. Usually dispatch.fcgi will crash, give an informative error, and provide faster debugging feedback.
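
    For example, on the server (Rails' FastCGI handler also logs crashes, typically to log/fastcgi.crash.log):

    cd ~/demo.mikedll.com/public
    ./dispatch.fcgi                  # crashes with an informative trace if loading fails
    tail ../log/fastcgi.crash.log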

Similar Work

More troubleshooting help is available in the Dreamhost "Rails Wiki".

Another guide on a Dreamhost/Rails/Radiant deployment features an application that has many users. It goes into more depth and draws on more experience.

My guide builds on another guide published earlier this year. That guide had some screenshots and clear directions. However, its .htaccess file doesn't serve static content, which was a problem for me. Someone else had the same problem. While solving it, I used sample .htaccess files at The Rails Playground and this Snipplr entry.

Related Technology

Passenger is a deployment mechanism that differs from FastCGI. Dreamhost supports it, but I haven't gotten it to work yet.

Capistrano has something to do with automating common server tasks, like application deployment.

Tuesday, July 7, 2009

My Lasik Experience

This article is intended to supplement existing descriptions of laser eye surgery by being specific to a region and by being recent. It provides an example experience of laser eye surgery in Southern California, occurring from May to June of 2009.

On May 7th at 4:30 in the afternoon, I called the TLC Center in Ontario, California. I asked what the cost of LASIK for my eyes would be. Dawn, a counselor, took my call and said $2,300 to $2,600 per eye. I asked how many surgeries the TLC Center did per year, and of those, how many had complications. Dawn told me that in 2008, they did 5,000 surgeries. The highest complication rate was 2-3%, which denoted those who experienced dryness in the eye (I don't know if the dryness was chronic).

By the end of the call, Dawn and I had scheduled appointments for the counseling, preoperative exam, and operation. The preoperative exam and counseling session were scheduled for May 14th, and a tentative surgery was scheduled for May 21st. At this time, I had myopia and minor astigmatism, yielding vision quality of roughly 20/200.

On May 14th, I looked up the route via Omnitrans bus, and walked twenty minutes to the nearby bus stop with $3.50 for a day pass. Two buses and three hours later, I walked into the lobby and was greeted by Virginia, one of the center's many friendly nurses who worked at the reception desk. Within ten minutes, Dawn opened the door to the waiting room, called my name, and we met face to face for the first time. We began the preoperative exam.

Dawn, Dr. Jenson, and Dr. Pirnazar took turns evaluating my eyes. They took my prescription, examined the shape of my cornea, and did some other tests that I didn't recognize. In Dr. Jenson's words, I was a "textbook case" for the surgery. It was Dr. Pirnazar who would perform the actual surgery. He was a large, strong man with a radiant, friendly voice and engaging demeanor.

A week later, I took the same system of buses to the center. This was the day of the surgery. By this point, I had spent $90 on three prescriptions: Lotemax, Zymar, and Restasis. My Blue Shield PPO plan had paid for the costs beyond $90. Dawn told me that these prescriptions are valued in the hundreds of dollars. Lotemax and Zymar are intended to help keep the eye healthy in the window of time surrounding the surgery date. Restasis is a tear stimulant that, over time, trains the eye to produce more tears than normal. It is taken for about a month and a half surrounding the surgery.

Over the next month, I would spend about $30 on over-the-counter artificial tears. These had to be preservative-free, which I think requires that they come in small, single-use containers. At the two drug stores I checked, the only artificial tears explicitly labelled as preservative-free were also single-use. I tried two brands: Bausch & Lomb and Optive. I preferred Bausch & Lomb, because their single-use containers would yield several drops instead of just a few, and I could re-use the same container over a couple of days. Anyway, these artificial tears were taken every 1-2 hours for one month after the surgery, and they serve as general-purpose lubricants to immediately relieve dry eyes.

But back to the trip on the day of the surgery. On this trip, I brought along my checkbook and driver's license, which would be used to pay $4,500 as soon as I greeted Virginia in the lobby. Through some technological means, they processed the check as soon as I handed it over, although the balance wasn't deducted from my bank account for another few days. This covered the total cost of the surgery. Dr. Pirnazar was giving a discount (supposedly due to the economy), and thus this cost was cheaper than what Dawn had quoted me over the phone.

The procedure was largely automated. First, I went through an "intralase" procedure, which roughly 40% of patients go through. It is the most modern of the available methods for creating the outer corneal flap, because it uses a laser instead of a physical blade, or microkeratome, to open the flap. The laser creates some gas bubbles under the eye's surface, which in turn force the flap open. As I reclined in a chair for this, it was easy to hold still while the laser created the flap. After doing one eye, the other was done. The laser spent about 15 seconds per eye, and I remained lying down for several minutes. I felt nothing, but at one point my eyesight went grey, which is normal at this step.

After this, I could barely see. It was a little like sudden bright light after being in the dark for a long time. Now it was time for the vision correction, which would be done by a separate laser.

The doctor let me stand and his assisting nurse guided me to another chair about 10 feet away.

I settled into position, and Dr. Pirnazar prepared one of my eyes for the LASIK laser. As I stared up into the machine, I saw what looked like a complex grid of dozens of red LEDs. A thick green light was in the center, and I was told to focus on the green light, even as my eyesight went blurry. An eyeball tracker checked my eye position about 4,000 times per second. If my eyeball had stopped looking at the green light, the machine would've shut down. I was able to hold still, however. I felt no reflex to blink, and even if I'd wanted to, my eyelids were held open. As the laser reshaped my cornea, the machine emitted a very mild burning scent. The smell is normal, and vaguely resembles a burning PCB. No actual burning takes place in my eye, though - the laser itself is cool. It was active on my eye for about 30 seconds. On some patients, this takes up to 45 seconds. Upon completion, Dr. Pirnazar returned the corneal flap into its place. Then he did the other eye.

As I stood up from the surgery, my vision was washed out, as if I'd just opened my eyes in a swimming pool. Light was mildly irritating. Before leaving the surgery room, Dr. Pirnazar and I posed for a picture. Then, I was given protective plastic eyewear (like cheap racquetball goggles).


As my roommate drove me home, I was producing so many tears that they temporarily spilled into my nose. This is normal following surgery. I took a Xanax, which they give everyone for free on the day of surgery (for sleeping), and slept with the eyewear on. It is strongly recommended that all patients go to sleep after the surgery.

30 hours and another $3.50 later, I sat in the examination chair and passed a 20/20 eye exam. My left eye was 20/20, and my right eye was 20/15. My present side effects included severe halos (glowing lights) at nighttime, mildly overpowered brightness at daytime, marked dryness upon waking, and occasional irritation, as if an object were in my eye. The halos are understood to persist for 1-3 months, as the eye continues to heal.

During the one-day-after examination, Dr. Pirnazar observed a small "fiber" lodged in my left eye, which was a little unusual. This would make my left eye more uncomfortable than my right eye during the first week of healing. To monitor progress, he asked to see me on a weekly basis for the next couple weeks. On these visits, he told me that my left eye was gradually "melting" this debris, and it was being absorbed back into my body by my eye's natural healing process.

One of the primary concerns here was that my body would aggressively react to the foreign body, concentrate white blood cells into the area, and inflame the eye. Inflammation is the main scenario that Lotemax is designed to fight, so I continued this prescription for three weeks, compared to the ordinary one week recommendation.

During the first week after surgery, my eyes were particularly vulnerable, and I needed to avoid any heavy physical exercise. The purpose of this was to avoid cringing due to physical stress, and of course to avoid being physically struck in the eye. I needed to avoid swimming, and to avoid getting water in my eyes during showers. I couldn't stare at a computer screen without frequently taking breaks and lubricating my eyes with artificial tear drops.

The second week after surgery was much more enjoyable than the first. I felt very little irritation during the day. With this new comfort, seeing things at all times became a newfound pleasure. I wasn't wearing glasses anymore, and this fact seeped into my daily self-awareness. My confidence and freedom probably increased by about 5%.

Six weeks after the surgery, my symptoms were almost entirely gone. I was able to be awake at all hours of the day without needing artificial tears - the dryness in my eyes was gone. The glowing halos around light sources at night were also generally gone. Only certain lights at very far distances exhibited a small glowing halo effect. Irritation only occurred at the end of my day.

I am very satisfied with my experience. The advantages are more or less the exact same freedoms that most people enjoy with contacts, only my vision is even slightly better than 20/20, and I don't need the contacts.

One ideal of my philosophy has been to live as simply and as mobile as possible, and this helps. Now, there is one less physical item that I need to own and worry about. I can physically move with more ease.

Cost Summary of the Surgery:

Surgery: $4,500
Prescription Drugs (after Health Insurance): $90
Artificial Tears: $40
Bus Fare for Routine Checkups: $17
Total: $4,647


Thursday, May 28, 2009

Cholesterol has improved

I've had my cholesterol measured twice: in March of 2006, and just recently, in May of 2009.

My total cholesterol went from 129 to 126 mg/dL. My HDL or "good" cholesterol went from 50 to 63 mg/dL. My LDL or "bad" cholesterol went from 65 to 52 mg/dL. I find this encouraging. I don't know what the error margin is for these tests, so the change might not be as significant as I think.

I correlated these improvements with improvements in my diet. In 2006, I was eating a lot of fast food, college food, and Denny's. Lots of sugar. In the Fall of 2008, I met with trainers Bruno and Adriana, and they pointed me toward simpler whole wheat grains, nuts, and vegetables.

Estee, my landlord, works near Los Angeles as a physical therapist. She says that greater than 3 out of 4 of her patients are taking medication for either high blood pressure or high cholesterol. I would like to avoid joining that group of people.

Sunday, May 17, 2009

How to install software despite a broken CD-ROM drive


Problem: I bought Adobe Creative Suite for $1500, and I want to install it on my laptop. However, the CD-ROM drive on my laptop is broken. What should I do?

Assumptions: The optical media are conventional CDs, not DVDs. I have working Linux and Windows boxes at work. I have the software keys.

Rejected Solution: Download install media from Adobe.

Adobe calls this "Electronic Software Delivery" (ESD), and this doesn't work, because Adobe doesn't seem to carry installation media for old versions of their software. I found a page on their website saying as much (can't find it now). This is business-shady.

Working Solution: Create ISO images of the optical media on a computer with a working CD drive. Transfer these to the laptop via network. Mount them with CD emulation.

Creating the ISO images is a matter of copying all the bits off the media, for which software exists. Copying the data is fast. Assuming a CD holds at most 700 MB, and that modern CD drives (c2004 and onward) can transfer data at a rate of 2 MB/s, we have:

700 MB * (1s / 2MB) * (1m / 60s) ≈ 5.8m

About 6 minutes transfer time per CD.

For Windows software, use LC ISO Creator. For Linux software, use dd:
sudo su
dd if=/dev/scd0 of="Desktop/Adobe Creative Suite Extras Disk 2.iso"   # image the disc
chown mrmike:mrmike "Desktop/Adobe Creative Suite Extras Disk 2.iso"  # dd ran as root; reclaim the file

I used both. My Linux box was a c2009 installation of Ubuntu.

To mount the ISO images, use Daemon Tools Lite (I heard about this application through Martin). Note that this software feels invasive and its creators seem a little business-shady.

Use Syncplicity to transfer the ISOs to the laptop. Note, Syncplicity currently doesn't support Linux, so I had to first scp some of the ISOs to the spare Windows box.
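
That intermediate hop was a plain scp; something like the following, where the Windows box runs an SSH server and the hostname and destination path are hypothetical:

scp "Adobe Creative Suite Extras Disk 2.iso" mrmike@winbox:/cygdrive/c/isos/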


To test the solution, I tried mounting and installing just Photoshop on my spare Windows box. It worked.

All I need to do now is give my laptop time to download these new ISOs. In total, I just created about 3 GB of ISO data, so that will take a while. Assuming I can transfer 3 MB/min on my home cable internet line (which seems reasonable to me), that will take about a day:
3004 MB * ( 1m / 3MB ) * ( 1h / 60m ) ≈ 17 hours.
Related Work: For data backup, I used to use a combination of rsync scripts and Dreamhost. However, Dreamhost no longer guarantees data backup (indeed, I'm not sure if they ever did). For drive emulation, I used VirtualCD about 10 years ago, though I don't see the advantage of it over Daemon Tools. I'd be interested in finding something less invasive than Daemon Tools, though. Syncplicity is a special case of online data backup, which in turn is a special case of home-user data backup, which has been an extremely saturated market, both presently and historically. I know that as of late 2008, Megan was using Mozy, for which I've seen commercials on Hulu.

Future Work: Set up a "portable USB application drive", and buy an external hard drive to support my growing data store. This could reduce upload/download times. My Syncplicity account is up to 48 GB now - at least 25% of that is installation media.