Installing Laravel on Linux Mint / Ubuntu

Recently, I went to install the Laravel PHP framework on a fresh virtual machine. Here are the steps that I went through.

First of all, I created a directory for my project and made this my current directory.

$ mkdir myproject
$ cd myproject

I decided to use Composer to install Laravel. The Composer installer is downloaded with curl, so I needed to install curl first.

$ sudo apt-get install curl
$ curl -sS https://getcomposer.org/installer | php
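
This downloads composer.phar into the current directory. As a quick sanity check, it can report its own version:

$ php composer.phar --version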

If you want to execute Composer directly, then instead of typing…

$ php composer.phar

You can do the following…

$ sudo chmod +x composer.phar
$ ./composer.phar

(The leading ./ is needed because the current directory is not normally on the PATH.)

If you want Composer to be globally accessible from any folder on your Linux environment, move it to a directory on the PATH…

$ sudo mv composer.phar /usr/local/bin/composer
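
Once moved, Composer can be run by name from any directory, for example…

$ cd ~
$ composer --version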

Next, I tried to install Laravel. Here, I got an error saying “Mcrypt PHP extension required”. This was resolved as follows.

$ sudo apt-get install php5-mcrypt

Then, I tested mcrypt as follows.

$ php --ri mcrypt
Extension 'mcrypt' not present.

So, I needed to do more than just install mcrypt: the extension was installed but not yet enabled in PHP. This was resolved as follows.

$ sudo ln -s /etc/php5/conf.d/mcrypt.ini /etc/php5/mods-available
$ sudo php5enmod mcrypt
$ sudo service apache2 restart
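
A quick way to confirm that the module is now loaded is to list PHP’s modules:

$ php -m | grep mcrypt
mcrypt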

Now, I could test as below.

$ php --ri mcrypt

mcrypt

mcrypt support => enabled
mcrypt_filter support => enabled
Version => 2.5.8
Api No => 20021217
Supported ciphers => cast-128 gost rijndael-128 twofish arcfour cast-256 loki97 rijndael-192 saferplus wake blowfish-compat des rijndael-256 serpent xtea blowfish enigma rc2 tripledes
Supported modes => cbc cfb ctr ecb ncfb nofb ofb stream

Directive => Local Value => Master Value
mcrypt.algorithms_dir => no value => no value
mcrypt.modes_dir => no value => no value

Finally, I was ready to install Laravel using Composer. In the following command, I used the folder name “back-end” to store my Laravel files.

$ php composer.phar create-project laravel/laravel back-end --prefer-dist
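
Once the install finishes, a quick confirmation that the framework is in place is to ask artisan (Laravel’s command-line tool) for its version, which prints the installed framework version:

$ php back-end/artisan --version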

The Laravel documentation notes that the folders within app/storage need write access for the web server. I achieved this as follows.

$ cd back-end/app
$ sudo chown -R www-data:www-data storage
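
The ownership change can be verified with ls, which should now show www-data as the owner and group of the storage folder:

$ ls -ld storage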

Next, I wanted to set up an Apache virtual host pointing to my Laravel public folder. As in a previous blog entry, I copied my /etc/apache2/sites-available/000-default.conf to mysite.conf.

$ cd /etc/apache2/sites-available/
$ sudo cp 000-default.conf mysite.conf

In mysite.conf, I updated the DocumentRoot directive to point to my local repository. The extract from mysite.conf is as follows; note that it points to Laravel’s public folder.

...
DocumentRoot "/home/myuser/myproject/back-end/public/"
<Directory "/home/myuser/myproject/back-end/public/">
    Options Indexes FollowSymLinks MultiViews
    # changed from None to All
    AllowOverride All
    Require all granted
</Directory>
...

Note the line “Require all granted”. This directive is required by Apache 2.4 and has been updated from my earlier blog post. One of my previous blog entries listed the following lines instead. These are now out of date.

Order allow,deny
Allow from all

Once mysite.conf was updated, I used the following commands to activate the new virtual host. I disabled the default site, enabled my virtual site and reloaded Apache.

$ sudo a2dissite 000-default && sudo a2ensite mysite
$ sudo service apache2 reload
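
A related tip: Apache can verify the configuration syntax at any time, which helps catch typos in virtual host files:

$ apache2ctl configtest
Syntax OK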

Finally, I tested my Laravel installation by loading localhost in my web browser to get Laravel’s default home page.
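
The same test can be made from the command line; a working installation returns an HTTP 200 response for the home page:

$ curl -I http://localhost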

Improving the development cycle for back-end development

My first web project was based on the LAMP stack. We used NetBeans on Windows as the code editor for writing PHP. Once the code was written, it was transferred to a remote Linux server to be served by Apache or run as PHP unit tests. So the workflow consisted of coding on Windows and then using a file transfer program such as FileZilla or WinSCP to move the PHP scripts to the Linux server, where they could be tested. If the code needed to change after testing, we followed the same cycle: update on Windows, then move to the remote Linux server for testing.

We used Subversion on Windows for source code control, so code was checked out from the repository into a Windows directory for editing.

Recently, I started a new back-end project for a different client. As this was a new project, I was able to start over with a fresh approach. It looked to me like the previous approach of developing on Windows and testing on Linux was less than optimal.

When I started the new project, I set up a LAMP stack on my own computer. This time I decided to develop with NetBeans on Linux. This allowed me to test in my local Linux environment, which was running Apache 2, MySQL and PHP.

At first, I followed a similar approach to the first project. For the new project, we use Git as the source code control system. So I cloned a copy of the back-end repository to its own folder in my home directory.

Initially, I left the Apache Document Root at its default of “/var/www” and copied my PHP code there for testing. After testing I would have to move my updated PHP scripts back to their folder in my home directory so I could check them in to the Git repository.

It soon became obvious that this was a time-consuming and error-prone process, as I had to remember which files were updated during the edit/test cycle. As we use the Laravel 4 PHP framework and follow its directory structure, it was a mental burden to remember which files had been updated so they could be moved back to the local repository for staging and committing.

The solution was to update the Apache DocumentRoot to point to my local repository. This meant there would be no need to move files back and forth, and a simple scan in Git Gui would show which files had changed and needed to be committed to the remote repository.
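
The command-line equivalent of that scan is git status, run from anywhere inside the repository:

$ git status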

So, I made a copy of my /etc/apache2/sites-available/000-default.conf, which defines the default website for Apache on Linux Mint. In the new file (e.g. mysite.conf), I updated the DocumentRoot directive to point to my local repository, disabled the default site, enabled my virtual site and restarted Apache.

The script to enable my new virtual host is below.

$ sudo a2dissite 000-default && sudo a2ensite mysite
$ sudo service apache2 restart

After I restarted Apache, I attempted to load my Laravel website. However, the Laravel public folder was not visible. This was crucial, as my PHP code depended on Laravel. Further investigation showed that the public folder had its own .htaccess file. When I disabled this file (by renaming it), the public folder became visible again in the browser.
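
For reference, disabling the file was a simple rename along these lines (the backup name is arbitrary):

$ cd ~/myproject/my_backend_repository/public
$ mv .htaccess .htaccess.bak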

This solved my immediate problem, but I still wanted my Laravel routing to work. As the problem was with .htaccess, this implied that mod_rewrite needed to be enabled. The solution lay in the Apache AllowOverride directive, which had to be changed from the default of None to FileInfo for Laravel. I added the following code to mysite.conf, restarted Apache, and all worked as intended.

<VirtualHost *:80>
...
    DocumentRoot "/home/myuser/myproject/my_backend_repository"
    <Directory "/home/myuser/myproject/my_backend_repository">
        Options Indexes FollowSymLinks MultiViews
        # changed from None to FileInfo
        AllowOverride FileInfo
        Order allow,deny
        allow from all
    </Directory>
...
</VirtualHost>
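
One related note: Laravel’s .htaccess relies on mod_rewrite, so if the routing still fails after the AllowOverride change, the module can be enabled and Apache restarted as follows…

$ sudo a2enmod rewrite
$ sudo service apache2 restart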

While there is some configuration involved, there are important advantages to this approach over the local-editing and remote-testing workflow mentioned at the start of this blog entry.

First of all, the check-in cycle is much simpler, as the code is never moved out of the local repository for testing. It is easy to know which files have changed, so the correct updates are always checked in.

Secondly, as testing is done through the local Apache server, there is no need to use file transfer to move the code from the local machine to the server. This improves workflow.

Next, development can proceed even without an Internet connection to the remote server. This can be handy if there are any networking problems.

Finally, it is also nice that the response time from the local server is always immediate.