Installing Laravel on Linux Mint / Ubuntu

Recently, I went to install the Laravel PHP framework on a fresh virtual machine. Here are the steps that I went through.

First of all, I created a directory for my project and made this my current directory.

$ mkdir myproject
$ cd myproject

I decided to use Composer to install Laravel. Installing Composer requires curl, so I installed curl first.

$ sudo apt-get install curl
$ curl -sS https://getcomposer.org/installer | php

If you want to execute Composer directly, then instead of typing…

$ php composer.phar

You can do the following…

$ sudo chmod +x composer.phar
$ ./composer.phar

If you want composer to be globally accessible from any folder on your Linux environment, then use the following…

$ sudo mv composer.phar /usr/local/bin/composer
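
To confirm the global install, Composer should then run from any directory, for example:

$ cd ~
$ composer --version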

Next, I went to install Laravel. Here, I got an error saying “Mcrypt PHP extension required”. This was resolved as follows.

$ sudo apt-get install php5-mcrypt

Then, I tested mcrypt as follows.

$ php --ri mcrypt
Extension 'mcrypt' not present.

So, I needed to do more than just install mcrypt. This was resolved as follows.

$ sudo ln -s /etc/php5/conf.d/mcrypt.ini /etc/php5/mods-available
$ sudo php5enmod mcrypt
$ sudo service apache2 restart

Now, I could test as below.

$ php --ri mcrypt

mcrypt

mcrypt support => enabled
mcrypt_filter support => enabled
Version => 2.5.8
Api No => 20021217
Supported ciphers => cast-128 gost rijndael-128 twofish arcfour cast-256 loki97 rijndael-192 saferplus wake blowfish-compat des rijndael-256 serpent xtea blowfish enigma rc2 tripledes
Supported modes => cbc cfb ctr ecb ncfb nofb ofb stream

Directive => Local Value => Master Value
mcrypt.algorithms_dir => no value => no value
mcrypt.modes_dir => no value => no value

Finally, I was ready to install Laravel using Composer. In the following command, I use the folder name “back-end” to store my Laravel files.

$ php composer.phar create-project laravel/laravel back-end --prefer-dist

There is a note in the Laravel documentation that the folders within app/storage need to be writable by the web server. I achieved this as follows.

$ cd back-end/app
$ sudo chown -R www-data:www-data storage

Next, I wanted to set up an Apache virtual host to point to my Laravel public folder. As in a previous blog entry, I copied my /etc/apache2/sites-available/000-default.conf to mysite.conf.

$ cd /etc/apache2/sites-available/
$ sudo cp 000-default.conf mysite.conf

In mysite.conf, I updated the DocumentRoot directive to point to my local repository. The extract from mysite.conf is as follows. The public folder is Laravel’s public folder.

...
DocumentRoot "/home/myuser/myproject/back-end/public/"
<Directory "/home/myuser/myproject/back-end/public/">
    Options Indexes FollowSymLinks MultiViews
    # changed from None to All
    AllowOverride All
    Require all granted
</Directory>
...

Note the line “Require all granted”. This directive replaces what I used in my earlier blog post and is required by Apache 2.4. One of my previous blog entries listed the following lines instead. These are now out of date.

Order allow,deny
Allow from all
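
One related point not covered above: Laravel’s public/.htaccess relies on Apache’s rewrite module, so with AllowOverride All in place you may also need to enable mod_rewrite if it is not already active.

$ sudo a2enmod rewrite
$ sudo service apache2 restart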

Once mysite.conf was updated, I used the following commands to activate the new virtual host. I disabled the default site, enabled my virtual site and reloaded Apache.

$ sudo a2dissite 000-default && sudo a2ensite mysite
$ sudo service apache2 reload

Finally, I tested my Laravel installation by loading localhost in my web browser, which displayed Laravel’s default home page.
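
If you prefer a quick check from the terminal instead, something like the following should return an HTTP 200 response (curl is already installed from the earlier step):

$ curl -I http://localhost/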

Transferring Files to VirtualBox

To facilitate debugging some problems with encoding large video files using open source software running on Linux, I decided to run the encoding software on a virtual machine. Initially, I copied the data onto the Windows host from a memory stick.

The next step was to do the encoding. I needed to copy the source video to my VirtualBox virtual machine. My first thought was to use VirtualBox’s file sharing feature. To enable this, I had to install the VirtualBox Extensions. Then I discovered that, to use the Extensions beyond the trial period, I would have to buy a licence, as only VirtualBox itself is open source. To try it out, I opted to install the Extensions for the one-month trial period. However, I got an error when trying to install them, and a second attempt yielded the same result.

I did not have time to resolve the error, as I needed to look at the problems found with video encoding. To get on with debugging, I copied the video files to the virtual machine using a USB stick, which turned out to be slow and laborious.

Then I remembered from a previous experiment that I had configured port forwarding on my virtual machine. That experiment involved loading web pages in a browser on the Windows host that were served by Apache on the virtual machine (VM), via port forwarding. At the time, I had also configured port forwarding to allow ssh access to the VM.

A colleague pointed out that scp uses the same port as ssh (port 22). So, I should be able to use port forwarding to transfer files using scp from Windows to my VM. Please see the screenshot below.

[Screenshot: the VirtualBox port forwarding rule mapping host port 3022 to guest port 22]
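
For reference, the same rule can also be created from the command line with VBoxManage; the VM name “MyVM” and the rule name “guestssh” are placeholders here, and modifyvm only takes effect while the VM is powered off.

$ VBoxManage modifyvm "MyVM" --natpf1 "guestssh,tcp,,3022,,22"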

At this point, all I needed to do was configure WinSCP on Windows to transfer to the local host using port 3022, which was mapped to port 22 on the VM. So, on the WinSCP Login page, I set up a session as follows:

File protocol: sftp

Hostname: 127.0.0.1

Port number: 3022

This worked very well. It still took some time to transfer the large video files back and forth. The files were so large that it was better to delete them from the VM once they had been encoded and copied back to the host via scp, as the VM’s virtual hard disk was limited in size.
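
For occasional transfers, the same port-forwarded connection can also be used directly from a command line with OpenSSH’s scp; the file name, user and destination path below are just illustrative.

$ scp -P 3022 bigvideo.mp4 user@127.0.0.1:/home/user/videos/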

It would be a faster workflow to enable file sharing, as the shared folder would be available on both the host and the VM simultaneously. However, this was a useful solution given that the VirtualBox Extensions would not install on my system. In addition, if only occasional file transfers are involved, the VirtualBox Extensions can be skipped altogether, along with the associated licensing costs.

Video Encoding with Large Files

This week’s issue relates to encoding large video files. I had previously integrated open source software to encode video from a variety of formats to H.264 (using an mp4 container). The software worked fine with test data from small video files. Recently however, we have been testing with more realistic data from longer videos.

Soon into this stage of testing, we found that the encoding software which was running on our test server was failing.

We use the open source software ‘melt’ on Linux to do the video encoding. ‘Melt’ usually runs as a two-pass process.
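
As a rough sketch of what such a two-pass invocation can look like (the file names, bitrates and consumer properties here are illustrative and depend on the MLT/FFmpeg versions in use, not our exact production settings):

$ melt input.mov -consumer avformat:output.mp4 vcodec=libx264 vb=2000k an=1 pass=1
$ melt input.mov -consumer avformat:output.mp4 vcodec=libx264 vb=2000k acodec=aac ab=128k pass=2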

So I went to look at our logs and determined that the ‘melt’ first pass in particular was failing. Next, by running ‘top’ in another terminal, I could see that melt was using more and more memory and was being terminated at around 92% of total memory.

As I suspected that the Linux Out-Of-Memory killer had terminated the process, I went to check the kernel logs, as follows.

$ tail /var/log/kern.log
kernel: [32941851.928772] Out of memory: Kill process 3318 (melt) score 927 or sacrifice child
kernel: [32941851.928788] Killed process 3318 (melt) total-vm:4400676kB, anon-rss:3551936kB, file-rss:140kB

This confirmed that the OOM killer had terminated the melt process, as memory needed to be reclaimed to ensure that Linux could continue to run.

Now that I knew the source of the problem, I first tweaked the melt parameters to reduce its workload. However, the same problem occurred again, with only a slight improvement in how far the encoding got.

Following on from this, I checked our Linux box and saw that no swap space had been allocated. From here, I could see that the solution was to configure swap space in Linux to allow the melt process to complete.

So, we allocated 2GB of swap memory in addition to the existing 4GB of RAM and ran the encoding test again. This time, everything ran smoothly, with the swap memory being used to supplement RAM.
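
For reference, a swap file of that size can be created on Ubuntu along these lines (the path /swapfile is just the conventional choice; add an entry to /etc/fstab if it should survive a reboot):

$ sudo fallocate -l 2G /swapfile
$ sudo chmod 600 /swapfile
$ sudo mkswap /swapfile
$ sudo swapon /swapfile
$ free -m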