Getting the Duration of a Video with PHP

I wanted to calculate the duration of a video in seconds as an integer variable. For this I needed software outside of PHP, so I decided to use the open source video encoding tool, avconv, running on Linux Mint / Ubuntu.

If you pass a video to avconv, it returns meta-data about the video, including its duration, e.g.

$ avconv -i myvideo.mp4
avconv version 0.8.10-6:0.8.10-0ubuntu0.13.10.1, Copyright (c) 2000-2013 the Libav developers
  built on Feb  6 2014 20:53:28 with gcc 4.8.1
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'myvideo.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: mp42mp41
    creation_time   : 2013-11-23 13:44:21
  Duration: 00:00:03.76, start: 0.000000, bitrate: 393 kb/s

A search on the Ubuntu forums returned an easy way to parse the above output using Linux scripting [1].

avconv -i myvideo.mp4 2>&1 | grep 'Duration' | awk '{print $2}' | sed s/,//

This extracts the timestamp that follows the text “Duration”. The 2>&1 is important, as avconv sends its output to standard error rather than standard output. I coded this in PHP as below.

$cmd = "avconv -i '$video' 2>&1 | grep 'Duration' | awk '{print $2}' | sed s/,//";
exec($cmd, $result, $error);
$duration = $result[0];
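If the video path can contain spaces or quotes, it is worth escaping it before building the command. A small variation using PHP’s escapeshellarg(), otherwise identical to the pipeline above:

// escapeshellarg() quotes the path and escapes any embedded quotes
$cmd = "avconv -i " . escapeshellarg($video)
     . " 2>&1 | grep 'Duration' | awk '{print \$2}' | sed s/,//";
exec($cmd, $result, $error);
$duration = $result[0]; // e.g. "00:00:03.76"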

Once I had the above data (e.g. “00:00:03.76”) in a string, I needed to convert it to an integer value. Further research returned the following snippet of PHP code [2].

list($hours,$mins,$secs) = explode(':',$duration);
$seconds = mktime($hours,$mins,$secs) - mktime(0,0,0);

The first mktime() returns a timestamp for today at the given hour, minute and second, so we subtract the timestamp for today at midnight. The difference gives us the number of seconds in our video as an integer value.
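The same value can also be computed directly from the three parts, without going through mktime(). A minimal sketch of that alternative, using the same variables as above:

// Assumes $duration holds a string such as "00:00:03.76"
list($hours, $mins, $secs) = explode(':', $duration);

// Hours and minutes to seconds; the (int) cast truncates the fractional seconds
$seconds = ((int)$hours * 3600) + ((int)$mins * 60) + (int)$secs;

echo $seconds; // prints 3 for "00:00:03.76"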

 
References

[1] Raguet Roman, 2012, Ask Ubuntu, accessed 21 April 2014, <http://www.askubuntu.com/questions/224237/how-to-check-how-long-a-video-mp4-is-using-the-shell>.
[2] Stack Overflow, 2012, accessed 21 April 2014, <http://www.stackoverflow.com/questions/4605117/how-to-convert-hhmmss-string-to-seconds-with-php>.

 


Splitting a file path in PHP

During the week, I was faced with the problem of dividing a given string containing a file path into the file name, extension and the path to the file’s directory. For example, “/home/myuser/myfile.ext” has to be split into “/home/myuser/”, “myfile” and “ext”.

My first instinct was to use PHP’s explode() function to split the string on the forward slash. This would give me the file name and extension in the last element of the returned array and each directory of the path in the preceding elements. Of course, I would then have to rebuild the directory path from these elements before returning the result, re-inserting the forward slashes along the way.

This did not strike me as an elegant way to proceed. So, on further reflection, I started my solution with basename(), which returns the file name and extension for a given file path. From there, I used explode() to split the base file name into its file name and extension. Note that I did not store the “.” between the file name and extension, as the specification did not require it.

Now I needed the directory path to the file. Of course, my input string already had this information; I just had to remove the file name and extension from the end. So, I used substr() to take the substring from the start of the source path, less the length of the base file name (with its extension).

This struck me as a more succinct solution that is also more intuitive to understand. I have included some sample code below.

class PathSplitter {

    function __construct() {
        $source = "/home/myuser/myfile.ext";
        echo "Source path: {$source}\n\n";

        $splitPath = $this->splitPath($source);

        echo "Split Path:\n";
        var_dump($splitPath);
    }

    private function splitPath($source) {

        // Get the file name and extension, i.e. the basename
        $baseName = basename($source);

        // Break down the basename into file name and extension
        $parts = explode(".", $baseName);

        $name = $parts[0];
        $extension = $parts[1];

        // The path is the full path name less the basename
        $path = substr($source, 0, -strlen($baseName));

        $splitPath = array(
            "path" => $path,
            "name" => $name,
            "extension" => $extension
        );
        return $splitPath;
    }

}
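For comparison, PHP’s built-in pathinfo() function returns the same pieces in a single call, so the method above could also be reduced to a thin wrapper around it. A minimal sketch (note that pathinfo() drops the trailing slash from the directory, whereas my specification kept it):

$parts = pathinfo("/home/myuser/myfile.ext");

echo $parts['dirname'];   // "/home/myuser"
echo $parts['filename'];  // "myfile"
echo $parts['extension']; // "ext"

It also handles file names containing more than one dot, which the explode() approach above does not.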

Including Autoloader and Laravel

This week, the challenge was to integrate the front and back ends. The front-end code was deployed to our test server. There was a predefined folder structure on the test server into which I had to copy my back-end PHP code. I kept the Laravel folder structure and simply copied my code across into the new structure.

Once we tested the app, the first problem was that the Laravel autoloader “include” was not working. This was traced to the difference between the directory structure on the test server and the one that Laravel expected.

My initial thought was to change my local folder structure to match the test server. Then I saw that the production folder structure was going to differ slightly from the test environment. So, it made sense to configure the autoloader “include” statements depending on the server environment – local, staging or production.

Laravel has support for configuring different environments, and provides an App::environment() method to return the current environment.

 $environment = App::environment();

So, you can test your environment as follows.

 if (App::environment('staging')){
        ....
 }

However, I could not use the Laravel framework in this case, because the autoloader had to be included in index.php, before the $app variable was set up. In fact, the location of start.php (where $app is set up) would be different on each server.

The solution was to test the PHP superglobal, $_SERVER.

 if( $_SERVER['SERVER_NAME'] == 'localhost') {
     // If the environment is local...
     require __DIR__.'/../bootstrap/autoload.php';
     $app = require_once __DIR__.'/../bootstrap/start.php';
 }
 elseif( $_SERVER['SERVER_NAME'] == 'staging.zzzzz.com') {
     // The environment is staging...
     require __DIR__.'/../staging_server/bootstrap/autoload.php';
     $app = require_once __DIR__.'/../staging_server/bootstrap/start.php';
 }
 elseif ( $_SERVER['SERVER_NAME'] == 'www.zzzzz.com') {
     // The environment is production...
     require __DIR__.'/../production_server/bootstrap/autoload.php';
     $app = require_once __DIR__.'/../production_server/bootstrap/start.php';
 }

 /*
 |--------------------------------------------------------------------------
 | Run The Application
 |--------------------------------------------------------------------------
 */
 $app->run();

So, in this case, I could not use Laravel’s configuration framework, as Laravel had not yet started. It was time to use plain old PHP.
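One way to keep index.php tidy as more environments are added is to map host names to bootstrap directories in an array. A sketch of that refactoring, using the same (placeholder) host and directory names as above:

 // Map each host name to the directory holding its bootstrap folder
 $bootstrapDirs = array(
     'localhost'         => __DIR__.'/..',
     'staging.zzzzz.com' => __DIR__.'/../staging_server',
     'www.zzzzz.com'     => __DIR__.'/../production_server',
 );

 $host = $_SERVER['SERVER_NAME'];
 if (!isset($bootstrapDirs[$host])) {
     exit("Unknown server name: {$host}");
 }

 require $bootstrapDirs[$host].'/bootstrap/autoload.php';
 $app = require_once $bootstrapDirs[$host].'/bootstrap/start.php';

The rest of index.php, including $app->run(), stays the same.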

Improving the development cycle for back-end development

My first web project was based on the LAMP stack. We used NetBeans on Windows as the code editor for writing PHP. Once the code was written, it was transferred to a remote Linux server to be served by Apache or to be run as PHP unit tests. So, the workflow consisted of coding on Windows, and then using a file transfer program such as FileZilla or WinSCP to move the PHP scripts to the Linux server, where they could be tested. If the original code needed to be changed after testing, we would follow the same cycle of updating the code on Windows and then moving it to the remote Linux server for testing.

We used Subversion on Windows for source code control, so code was checked out from the repository into a Windows directory for editing.

Recently, I started a new back-end project for a different client. As this was a new project, I was able to start over with a fresh approach. It looked to me like the previous approach of developing on Windows and testing on Linux was less than optimal.

When I started the new project, I set up a LAMP stack on my computer. This time I decided to develop with NetBeans on Linux. This allowed me to test in my local Linux environment, which was running Apache 2, MySQL and PHP.

At first, I followed a similar approach to the first project. For the new project, we use Git as the source code control system. So I cloned a copy of the back-end repository to its own folder in my home directory.

Initially, I left the Apache Document Root at its default of “/var/www” and copied my PHP code there for testing. After testing I would have to move my updated PHP scripts back to their folder in my home directory so I could check them in to the Git repository.

It soon became obvious that this was a time-consuming and error-prone process, as I had to remember which files were updated during the edit/test cycle. As we use the Laravel 4 PHP framework and follow its directory structure, it was a mental burden to remember which files had been updated so they could be moved back to the local repository for staging and committing.

The solution was to update the Apache Document Root to point to my local repository. This would mean that there would be no need to move files back and forth, and a simple scan in Git Gui would indicate what files had changed and would need to be committed to the remote repository.

So, I made a copy of my /etc/apache2/sites-available/000-default.conf, which defines the default website for Apache on Linux Mint. In the new file (e.g. mysite.conf), I updated the DocumentRoot directive to point to my local repository, disabled the default site, enabled my virtual site and restarted Apache.

The command to disable the default site and enable my new virtual host is below.

sudo a2dissite 000-default && sudo a2ensite mysite

After I restarted Apache, I attempted to load my Laravel website. However, the Laravel public folder was not visible. This was crucial, as my PHP code depended on Laravel. Further investigation showed that the public folder had its own .htaccess file. When I disabled this file (by renaming it), the public folder became visible in the browser.

This solved my immediate problem, but I still wanted my Laravel routing to work. As the problem was with .htaccess, this implied that mod_rewrite needed to be enabled. Therefore the solution lay in the Apache AllowOverride directive. This had to be changed from the default of None to FileInfo for Laravel. I added the following code to mysite.conf, restarted Apache, and all worked as intended.

<VirtualHost *:80>
…
…
    DocumentRoot "/home/myuser/myproject/my_backend_repository"
    <Directory "/home/myuser/myproject/my_backend_repository">
        Options Indexes FollowSymLinks MultiViews
        # changed from None to FileInfo
        AllowOverride FileInfo
        Order allow,deny
        Allow from all
    </Directory>
…
…
</VirtualHost>

While there is some configuration involved, there are important advantages to this approach over my original approach of local editing and remote testing mentioned at the start of this blog entry.

First of all, the check-in cycle is much simpler, as the code is never moved out of the local repository for testing. It is easy to know which files have changed, so the correct updates are always checked in.

Secondly, as testing is done through the local Apache server, there is no need to use file transfer to move the code from the local machine to the server. This improves workflow.

Next, development can proceed even without an Internet connection to the remote server. This can be handy if there are any networking problems.

Finally, it is also nice that the response time from the local server is always immediate.