Versioning in Git using Tags

Recently, I looked at tagging different versions of my project. First, I reviewed the Git manual [1] and found the instructions for creating an annotated tag.

git tag -a v0.2 -m 'development version'

Once you have a tag, you can review your list of tags using

git tag
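
To inspect the details recorded for a particular tag, including the tagger and the message, git show works well (using my tag name from above):

git show v0.2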

Of course, at this point, the tag exists only in the local repo. So, to push this tag to the remote server, I used the following, where v0.2 is my tag.

git push origin v0.2
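
By default, git push does not transfer tags, so each tag has to be pushed explicitly as above. Alternatively, all local tags can be pushed in one go:

git push origin --tags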

Now, when a team member clones or fetches from the repo, they will get the tag as well. Then, if they need the code from a tagged version, they can check out the tag as follows [2].

git checkout tags/v0.2
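
Note that checking out a tag directly leaves the repository in a "detached HEAD" state. If a team member wants to make changes starting from the tagged code, they can create a branch at the tag instead (the branch name fix-v0.2 here is just an example):

git checkout -b fix-v0.2 tags/v0.2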


References

1. Git Manual, 2012, accessed 21 April 2014, <http://git-scm.com/book/en/Git-Basics-Tagging>.

2. Stack Overflow, 2012, accessed 21 April 2014, <http://stackoverflow.com/questions/791959/how-to-use-git-to-download-a-particular-tag>.


Getting the Duration of a Video with PHP

I wanted to calculate the duration of a video in seconds as an integer variable. For this, I needed software outside of PHP, so I decided to use the open source video encoding tool avconv, running on Linux Mint / Ubuntu.

If you pass a video to avconv, it prints metadata about the video, including its duration, e.g.

$ avconv -i myvideo.mp4
avconv version 0.8.10-6:0.8.10-0ubuntu0.13.10.1, Copyright (c) 2000-2013 the Libav developers
  built on Feb  6 2014 20:53:28 with gcc 4.8.1
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'myvideo.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: mp42mp41
    creation_time   : 2013-11-23 13:44:21
  Duration: 00:00:03.76, start: 0.000000, bitrate: 393 kb/s

A search on the Ubuntu forums turned up an easy way to parse the above output using shell scripting [1].

avconv -i myvideo.mp4 2>&1 | grep 'Duration' | awk '{print $2}' | sed s/,//

This extracts the timestamp that follows the text “Duration”. The 2>&1 is important, as avconv sends its output to standard error rather than standard output. I coded this in PHP as below.

$cmd = "avconv -i '$video' 2>&1 | grep 'Duration' | awk '{print $2}' | sed s/,//";
exec($cmd, $result, $error);
$duration = $result[0];
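
Since exec() reports the command’s exit status in its third argument, it is worth a small guard before using the result; a minimal sketch using the variables above:

if ($error !== 0 || !isset($result[0])) {
    // avconv failed or no Duration line was found
    exit("Could not read the video duration\n");
}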

Once I had the above data (e.g. “00:00:03.76”) in a string, I needed to convert it to an integer value. Further research turned up the following snippet of PHP code [2].

list($hours,$mins,$secs) = explode(':',$duration);
$seconds = mktime($hours,$mins,$secs) - mktime(0,0,0);

The first mktime() returns a timestamp for today at the given hours, minutes and seconds, so we subtract the timestamp for today at midnight. The difference gives us the number of seconds in our video as an integer value.
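
Incidentally, the same conversion can be done with plain arithmetic, which avoids tying the calculation to the system clock; a minimal sketch, assuming $duration holds the string as above:

// Convert "HH:MM:SS.ff" to whole seconds without mktime()
list($hours, $mins, $secs) = explode(':', $duration);
$seconds = (int) ($hours * 3600 + $mins * 60 + $secs);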

References

1. Raguet Roman, 2012, accessed 21 April 2014, <http://www.askubuntu.com/questions/224237/how-to-check-how-long-a-video-mp4-is-using-the-shell>.
2. Stack Overflow, 2012, accessed 21 April 2014, <http://www.stackoverflow.com/questions/4605117/how-to-convert-hhmmss-string-to-seconds-with-php>.


Installing Laravel on Linux Mint / Ubuntu

Recently, I went to install the Laravel PHP framework on a fresh virtual machine. Here are the steps that I went through.

First of all, I created a directory for my project and made this my current directory.

$ mkdir myproject
$ cd myproject

I decided to use Composer to install Laravel. The Composer installer is downloaded with curl, so I installed curl first.

$ sudo apt-get install curl
$ curl -sS https://getcomposer.org/installer | php

If you want to execute Composer directly, then instead of typing…

$ php composer.phar

you can make the archive executable and run it on its own…

$ sudo chmod +x composer.phar
$ ./composer.phar

If you want Composer to be globally accessible from any folder on your Linux environment, then move it onto your PATH (this usually requires sudo)…

$ sudo mv composer.phar /usr/local/bin/composer
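
A quick way to confirm that the global install worked is to ask Composer for its version from any directory:

$ composer --version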

Next, I went to install Laravel. Here, I got an error saying “Mcrypt PHP extension required”. This was resolved as follows.

$ sudo apt-get install php5-mcrypt

Then, I tested mcrypt as follows.

$ php --ri mcrypt
Extension 'mcrypt' not present.

So, I needed to do more than just install mcrypt. This was resolved as follows.

$ sudo ln -s /etc/php5/conf.d/mcrypt.ini /etc/php5/mods-available
$ sudo php5enmod mcrypt
$ sudo service apache2 restart

Now, I could test as below.

$ php --ri mcrypt

mcrypt

mcrypt support => enabled
mcrypt_filter support => enabled
Version => 2.5.8
Api No => 20021217
Supported ciphers => cast-128 gost rijndael-128 twofish arcfour cast-256 loki97 rijndael-192 saferplus wake blowfish-compat des rijndael-256 serpent xtea blowfish enigma rc2 tripledes
Supported modes => cbc cfb ctr ecb ncfb nofb ofb stream

Directive => Local Value => Master Value
mcrypt.algorithms_dir => no value => no value
mcrypt.modes_dir => no value => no value

Finally, I was ready to install Laravel using Composer. In the following command, I use the folder name “back-end” to store my Laravel files.

$ php composer.phar create-project laravel/laravel back-end --prefer-dist

The Laravel documentation notes that the folders within app/storage need to be writable by the web server. I achieved this as follows.

$ cd back-end/app
$ sudo chown -R www-data:www-data storage
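
A quick listing confirms that the ownership change took effect:

$ ls -ld storage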

Next, I wanted to set up Apache virtual hosting to point to my Laravel public folder. As in a previous blog entry, I copied my /etc/apache2/sites-available/000-default.conf to mysite.conf.

$ cd /etc/apache2/sites-available/
$ sudo cp 000-default.conf mysite.conf

In mysite.conf, I updated the DocumentRoot directive to point to my local repository. The extract from mysite.conf is as follows; the public folder here is Laravel’s public folder.

...
DocumentRoot "/home/myuser/myproject/back-end/public/"
<Directory "/home/myuser/myproject/back-end/public/">
    Options Indexes FollowSymLinks MultiViews
    # changed from None to All
    AllowOverride All
    Require all granted
</Directory>
...

Note the line “Require all granted”. This directive is required by Apache 2.4 and has been updated from my earlier blog post. One of my previous blog entries listed the following lines instead; these are now out of date.

Order allow,deny
Allow from all

Once mysite.conf was updated, I used the following commands to activate the new virtual host. I disabled the default site, enabled my virtual site and reloaded Apache.

$ sudo a2dissite 000-default && sudo a2ensite mysite
$ sudo service apache2 reload

Finally, I tested my Laravel installation by loading localhost in my web browser to get Laravel’s default home page.
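
As a quick sanity check from the terminal, you can also inspect the response headers with curl; an HTTP 200 response indicates that the virtual host is serving Laravel’s public folder:

$ curl -I http://localhost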

Video Encoding with Large Files

This week’s issue relates to encoding large video files. I had previously integrated open source software to encode video from a variety of formats to H.264 (using an mp4 container). The software worked fine with test data from small video files. Recently however, we have been testing with more realistic data from longer videos.

Soon into this stage of testing, we found that the encoding software which was running on our test server was failing.

We use the open source software ‘melt’ on Linux to do the video encoding. ‘Melt’ usually runs as a two-pass process.

So I went to look at our logs and determined that the ‘melt’ first pass in particular was failing. Next, by running ‘top’ in another terminal, I could see that melt was using more and more memory and was being terminated at around 92% of total memory.

As I suspected that the Linux Out-Of-Memory killer had terminated the process, I went to check the kernel logs, as follows.

$ tail /var/log/kern.log
kernel: [32941851.928772] Out of memory: Kill process 3318 (melt) score 927 or sacrifice child
kernel: [32941851.928788] Killed process 3318 (melt) total-vm:4400676kB, anon-rss:3551936kB, file-rss:140kB

This confirmed that the OOM-killer had terminated the melt process, as memory needed to be reclaimed to ensure that Linux could continue to run.

Now that I knew the source of the problem, I first tweaked the melt parameters to reduce its workload. However, the same problem occurred again, with only a slight improvement in how far the encoding got.

Following on from this, I checked our Linux box and saw that no swap space had been allocated. From here, I could see that the solution was to configure swap memory in Linux to allow the melt process to complete.

So, we allocated 2GB of swap memory in addition to the existing 4GB of RAM and ran the encoding test again. This time, everything ran smoothly, with the swap memory being used to supplement RAM.
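
For reference, allocating a swap file on Ubuntu can be done along these lines; a minimal sketch, with /swapfile being just the conventional location:

$ sudo fallocate -l 2G /swapfile
$ sudo chmod 600 /swapfile
$ sudo mkswap /swapfile
$ sudo swapon /swapfile
$ free -m

The final free -m confirms that the 2GB of swap is active. To have the swap file persist across reboots, a line such as “/swapfile none swap sw 0 0” also needs to be added to /etc/fstab.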

Using a virtual machine as a development environment

For my last project, I decided to use Linux as my development environment. My laptop runs Windows and I wanted to try using Linux without setting up a dual-boot system. I tried VMware Player and installed Linux Mint with the MATE interface. I already use Linux Mint with Cinnamon, so this was also a way to try a different variation.

I found the virtual machine to have good performance, apart from the user interface, which showed some lag when scrolling, especially in MySQL Workbench, which depends on a graphical representation of the schema.

Overall, it was good to have a complete development environment which is separate from that of my main work. I configured the Apache environment to work with my repository. All of these changes are encapsulated in one folder on the host machine, which I zipped into an archive file. This makes it easier to make configuration changes as you can always roll back.

Recently, I looked at Oracle VirtualBox, which is open-source software. It looks to be quite a good program, and I plan on using it for a personal project going forward. I may move some of the work of my main project from Windows onto a Linux virtual machine, time permitting.

Improving the development cycle for back-end development

My first web project was based on the LAMP stack. We used NetBeans on Windows as the code editor for writing PHP. Once the code was written, it was transferred to a remote Linux server to be served by Apache or to be run as PHP unit tests. So, the workflow consisted of coding on Windows, and then using a file transfer program such as FileZilla or WinSCP to move the PHP script to the Linux server, where it could be tested. If the original code needed to be changed after testing, we would follow the same cycle of updating the code on Windows and then moving it to the remote Linux server for testing.

We used Subversion on Windows for source code control. So, code was checked out from the repository into a Windows directory for editing.

Recently, I started a new back-end project for a different client. As this was a new project, I was able to start over with a fresh approach. It looked to me like the previous approach of developing on Windows and testing on Linux was less than optimal.

When I started the new project, I set up a LAMP stack on my computer. This time I decided to develop with NetBeans on Linux. This allowed me to test on my local Linux environment, which was running Apache 2, MySQL and PHP.

At first, I followed a similar approach to the first project. For the new project, we use Git as the source code control system. So I cloned a copy of the back-end repository to its own folder in my home directory.

Initially, I left the Apache Document Root at its default of “/var/www” and copied my PHP code there for testing. After testing I would have to move my updated PHP scripts back to their folder in my home directory so I could check them in to the Git repository.

It soon became obvious that this was a time-consuming and error-prone process, as I had to remember which files were updated during the edit/test cycle. As we use the Laravel 4 PHP framework and follow its directory structure, it was a mental burden to remember which files had been updated so that they could be moved back to the local repository for staging and committing.

The solution was to update the Apache DocumentRoot to point to my local repository. This would mean that there would be no need to move files back and forth, and a simple scan in Git GUI would indicate which files had changed and needed to be committed to the remote repository.

So, I made a copy of my /etc/apache2/sites-available/000-default.conf, which defines the default website for Apache on Linux Mint. In the new file (e.g. mysite.conf), I updated the DocumentRoot directive to point to my local repository, disabled the default site, enabled my virtual site and restarted Apache.

The script to enable my new virtual host is below.

sudo a2dissite 000-default && sudo a2ensite mysite
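
After enabling the new site, Apache needs to pick up the change, so I restarted it.

sudo service apache2 restart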

After I restarted Apache, I attempted to load my Laravel website. However, the Laravel public folder was not visible. This was crucial, as my PHP code depended on Laravel. Further investigation showed that the public folder had its own .htaccess file. When I disabled this file (by renaming it), the public folder became visible again in the browser.
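
For reference, the .htaccess file that ships in Laravel 4’s public folder looks roughly like the following (quoted from memory, so treat it as indicative rather than exact); it rewrites any request that does not match a real file or directory to index.php, Laravel’s front controller.

<IfModule mod_rewrite.c>
    Options -MultiViews
    RewriteEngine On

    # Redirect trailing slashes
    RewriteRule ^(.*)/$ /$1 [L,R=301]

    # Send everything else to the front controller
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]
</IfModule>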

Renaming the file solved my immediate problem, but I still wanted my Laravel routing to work. As the problem was with .htaccess, this implied that mod_rewrite needed to be enabled. Therefore, the solution lay in the Apache AllowOverride directive, which had to be changed from the default of None to FileInfo for Laravel. I added the following code to mysite.conf, restarted Apache, and all worked as intended.

<VirtualHost *:80>
…
    DocumentRoot "/home/myuser/myproject/my_backend_repository"
    <Directory "/home/myuser/myproject/my_backend_repository">
        Options Indexes FollowSymLinks MultiViews
        # changed from None to FileInfo
        AllowOverride FileInfo
        Order allow,deny
        allow from all
    </Directory>
…
</VirtualHost>
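
Separately, if mod_rewrite itself is not yet active on the machine, it can be enabled with a2enmod before restarting Apache.

sudo a2enmod rewrite
sudo service apache2 restart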

While there is some configuration involved, there are important advantages to this setup over the original approach of local editing and remote testing mentioned at the start of this blog entry.

First of all, the check-in cycle is much simpler, as the code is never moved out of the local repository for testing. It is easy to know which files have changed, so the correct updates are always checked in.

Secondly, as testing is done through the local Apache server, there is no need to use file transfer to move the code from the local machine to the server. This improves workflow.

Next, development can proceed even without an Internet connection to the remote server. This can be handy if there are any networking problems.

Finally, it is also nice that the response time from the local server is always immediate.