
Git Submodule

I have not used git submodules in production; this article is for information only. Still, sometimes they can be a really good option.

After working with git for more than a year, I like it a lot, and I want to share some of my experience with submodules here.

Submodules are a great way to attach external repositories to a path inside your project. As an example, let's create a lib directory and put some submodules into its subdirectories. Here is how submodules are added to a repository.

Adding a Submodule to your repository

git submodule add https://github.com/headjs/headjs.git lib/js/headjs
git submodule add https://github.com/marcuswestin/store.js.git lib/js/storejs
git submodule add https://github.com/cebe/smarty.git lib/template-engine/smarty
git submodule add https://github.com/fabpot/Twig.git lib/template-engine/twig

We have added the submodules to our project; you can now see the added libraries in your status output:

git status

All your submodules are recorded in the “.gitmodules” file, which git creates for you. You can view its content with the following command:

cat .gitmodules

For me, the .gitmodules file content is the following:

[submodule "lib/js/storejs"]
    path = lib/js/storejs
    url = git://github.com/marcuswestin/store.js.git
[submodule "lib/js/headjs"]
    path = lib/js/headjs
    url = git://github.com/headjs/headjs.git
[submodule "lib/template-engine/twig"]
    path = lib/template-engine/twig
    url = https://github.com/fabpot/Twig.git
[submodule "lib/template-engine/smarty"]
    path = lib/template-engine/smarty
    url = https://github.com/cebe/smarty.git
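
git submodule add already stages both the .gitmodules file and the new submodule entries, so all that is left is to commit them like any other change (the commit message below is just an example):

git commit -m "Add js and template-engine submodules"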

Removing Submodule

If you want, you can also remove submodules from your project. The following commands remove one of your submodules; repeat them for each submodule you want to drop:

git submodule deinit lib/js/headjs
git rm lib/js/headjs

Updating Submodule

While you are working on your project, your submodules will receive updates; for example, the Twig template engine may release a new version. The following command updates the submodule working trees to the commits recorded in your project:

git submodule update
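
Note that git submodule update only checks out the commits your project already records. If you want a submodule to move to the latest commit on its upstream branch, newer git versions (1.8.2+) also support the --remote flag, for example:

git submodule update --remote lib/template-engine/twig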

Cloning a Project with Submodules

To clone a project together with its submodules, you can use git's --recursive parameter. You can check the following command:

$ git clone --recursive git@github.com:hkulekci/git-submodule-test.git
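
If you have already cloned the project without --recursive, you can fetch and check out the submodules afterwards with:

$ git submodule update --init --recursive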

That's all for now.


PHP Large Size Array

In PHP, we use arrays all the time, in every project. In this article, I will look at some operations on large arrays. For example, we can use the in_array function to check whether a value exists in an array.

$some_values = array(1, 2, 3, 4, 5);
var_dump(in_array(1, $some_values));
// bool(true)

The result is clear: the $some_values array contains “1” as a value. Sometimes, however, we have to compare every element of an array against the others. For example, suppose we have a list of numbers in an array and want to keep only the unique elements.

First Example:

$some_values = array(1, 2, 3, 3, 4, 5);
$temporary_values = array();
for ($i = 0; $i < count($some_values); $i++) {
    $found = false;
    for ($j = 0; $j < count($temporary_values); $j++) {
        if ($temporary_values[$j] == $some_values[$i]) {
            $found = true;
            break;
        }
    }
    if (!$found) {
        // Checking or processing some other things
        $temporary_values[] = $some_values[$i];
    }
}
print_r($temporary_values); // all elements of the array are now unique

Second Example:

$some_values = array(1, 2, 3, 4, 5);
$temporary_values = array();
for ($i = 0; $i < count($some_values); $i++) {
    if (!in_array($some_values[$i], $temporary_values)) {
        // Checking or processing some other things
        $temporary_values[] = $some_values[$i];
    }
}
print_r($temporary_values); // all elements of the array are now unique

Both examples have the same running time: O(n²) in the worst case. You can check the in_array implementation to understand why they are the same: in_array uses the internal php_search_array function directly, and that function loops over the array.

For a small array, the in_array function saves us time. But what if we have a large array, say one with 10,000+ elements? In that case, PHP returns an error like this:

Fatal error: Maximum execution time of 30 seconds exceeded in .... on line 4

This happens because our application's execution time exceeds the limit in the PHP configuration. There are two solutions: the first is to raise max_execution_time in your php.ini file, and the second is to change your application.
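
For the first option, you can either edit php.ini or raise the limit from the script itself; a minimal sketch (the value 300 is just an example):

ini_set('max_execution_time', 300); // same as max_execution_time = 300 in php.ini
// or: set_time_limit(300);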

I will try to explain how we can change our application for this specific problem. First, we will get rid of the inner loop.

$some_values = array("a1", "a2", "a3", "a4", "a5", ..., "a999999", "a1000000");
$temporary_array = array();
for ($i = 0; $i < count($some_values); $i++) {
    $temporary_array[$some_values[$i]] = $i;
}
$temporary_values = array();
foreach ($temporary_array as $key => $value) {
    $temporary_values[] = $key;
}
print_r($temporary_values); // all elements of the array are now unique
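
Why is this faster? PHP array keys are stored in a hash table, so looking up a key is effectively constant time, while in_array has to scan the whole array. Here is a minimal sketch of the same idea written with isset (the variable names are mine, not from the code above):

$some_values = array(1, 2, 3, 3, 4, 5);
$seen = array();              // used as a hash set: value => true
$temporary_values = array();
foreach ($some_values as $value) {
    if (!isset($seen[$value])) {      // O(1) key lookup instead of a linear scan
        $seen[$value] = true;
        $temporary_values[] = $value; // keeps only the first occurrence
    }
}
print_r($temporary_values); // unique elements, in their original order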

To shorten our code, we can use some standard PHP functions. First, we use array_flip to create the temporary array:

$some_values = array("a1", "a2", "a3", "a4", "a5", ..., "a999999", "a1000000");
$temporary_array = array_flip($some_values);
$temporary_values = array();
foreach ($temporary_array as $key => $value) {
    $temporary_values[] = $key;
}
print_r($temporary_values); // all elements of the array are now unique

Next, we use array_keys to get the keys of $temporary_array:

$some_values = array("a1", "a2", "a3", "a4", "a5", ..., "a999999", "a1000000");
$temporary_array = array_flip($some_values);
$temporary_values = array_keys($temporary_array);
print_r($temporary_values); // all elements of the array are now unique

Finally, we remove the unused intermediate variable from our code:

$some_values = array("a1", "a2", "a3", "a4", "a5", ..., "a999999", "a1000000");
$temporary_values = array_keys(array_flip($some_values));
print_r($temporary_values);
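
For completeness, PHP also has a built-in array_unique function that removes duplicate values directly; which approach is faster depends on your PHP version and your data, so it is worth benchmarking on your own arrays. A small sketch:

$some_values = array("a1", "a2", "a2", "a3", "a3");
$unique_values = array_values(array_unique($some_values)); // array_values reindexes the result
print_r($unique_values);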

Thank you

Cote.js

Last weekend (2014-09-27) I attended the JSIst event, and it was very impressive for me. There were lots of interesting subjects and I met lots of new people. You can find the presentations on the event website. I was especially interested in one presentation, “Scaling Node.js Applications with Redis, RabbitMQ and cote.js”, given by Armagan Amcalar.

I am new to Node.js, but I have tried to use sockets and cache systems in my projects, and these are very entertaining subjects for me. I used the zmq library, the ØMQ binding for Node.js, to try to handle MySQL “too many connections” errors: I created a tunnel so that many requesters connect to a single responder. But I think ZeroMQ is not very easy to use. I spent a lot of time installing it on an Ubuntu server, which is normally the easiest platform to install anything on, since apt-get is a complete solution for most things. I finally got it installed on my server, but my development machine is OS X and I could not install it there, so I had to set up Vagrant just to get a development environment matching my Ubuntu server. All of that was only to start the project; I did not want to go any further. It was exhausting.

In his JSIst presentation, Armagan Amcalar talked about how we can scale Node.js applications with cote.js, RabbitMQ and Redis. All three are good tools for scaling and for many other things. I decided to try cote.js in my project because it is zero-configuration. In the following days, I will try to build a solution for my old MySQL “too many connections” error; there is an example in the cote.js repository that is almost the same as my case. To see how it works, I created the basic example below.

Prerequisites:

$ npm install cote

That’s all.

And then I started using it. Here is my little sample.

There are several communication patterns in cote.js; I chose the request/response pattern for my example. Creating a responder is simple: you just give it a name, and you can set a respondsTo parameter either to split your answers across many responders or to merge them into a single one. I used one responder with many requesters, so that all requests are handled from a single source.

// responder.js

var Responder = require('cote').Responder;
var responder_counter = 0;

var randomResponder = new Responder({
    name: 'counterRep', // you can use this parameter to separate your responders
    respondsTo: ['counterRequest']
});

randomResponder.on('counterRequest', function(req, cb) {
    var answer = responder_counter++;
    // here you could fetch data from a database and send it to the requester
    console.log('request', req.val, 'answering with', answer);
    cb(answer);
});

Below is one of my requesters. In the requester, I get data from the responder, process it, and send it somewhere, etc. My other requester is the same in this example; I changed only the time interval value, which is 100ms in the requester2.js file.

// requester1.js

var Requester = require('cote').Requester;

var Request = new Requester({
    name: 'counterReq',
    requests: ['counterReq']
});

Request.on('ready', function() {
    var counter = 0;
    setInterval(function() {
        var req = {
            type: 'counterRequest',
            val: counter++
        };

        Request.send(req, function(res) {
            // process the data and do something with it
            console.log('request', req, 'answer', res);
        });
    }, 200);
});
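
To try it out, start the responder and the requesters in separate terminal windows (the file names are the ones used above):

$ node responder.js
$ node requester1.js
$ node requester2.js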

That's it. I did not have to configure much: no installation hassle, no IPs or ports to set up. I just used it. That's cool. And it prints cute, colorful logs to the terminal. Here is my screenshot.

Cote.js Example Screenshot

I added some comments to my basic example. In the following days, I will add more code there and try to build a more complex structure.

You can find cote.js on https://github.com/dashersw/cote.