Markov chain sentence generator in Erlang

Long ago I'd implemented a Python IRC bot (the first post on this blog) that used Markov chains to speak semi-intelligible drivel based on a corpus (its brain/soul) filled with the other drivel that gets talked about in chatrooms.

Here is the logic behind using Markov chains to generate text that sounds intelligible :

– Feed the program a big input text (from a text file or from a chatroom), and have it parse the text into a 'bag' (a dictionary that maps each key to a list of values).

– A key is some number of words, and that number is the chain length. E.g.: {'the', 'dream', 'of'} is a key of length 3; {'what', 'can'} is a key of length 2. The values for a key are the words that follow it in the input body of text.

e.g: “the dreams of men never die.” yields  { {‘the’, ‘dreams’, ‘of’} -> ‘men’, {‘dreams’, ‘of’, ‘men’} -> ‘never’ ……}

as the bag, for a chain length of 3 words. Let’s assume we have a chain length of 4.

1. Now pick at random any 4 consecutive words from the input text. This is our initial state S0. If S0 = {W0,W1,W2,W3}, then our output string is W0 W1 W2 W3 right now.

2. Based on current state S0, and the value of this key in the bag ( say V), we move to a next state S1. i.e: If S0 = {W0,W1,W2,W3}, and Bag[S0] = V, then S1 = {W1,W2,W3,V}. Our output string is W0 W1 W2 W3 V. Our state becomes S1 now.

3. Repeat step 2 with S0 = S1 (the output state of step 2) until the output string has the desired length or property.
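The steps above can be sketched in a few lines of Python (a rough sketch of the idea, not the Erlang code in this post; the names `build_bag` and `generate` are mine):

```python
import random
from collections import defaultdict

ARITY = 3  # chain length, like the ARITY constant mentioned below

def build_bag(text, arity=ARITY):
    """Parse the corpus into a 'bag': key = tuple of `arity` consecutive
    words, value = list of the words seen following that key."""
    words = text.split()
    bag = defaultdict(list)
    for i in range(len(words) - arity):
        key = tuple(words[i:i + arity])
        bag[key].append(words[i + arity])
    return bag

def generate(bag, length=20):
    """Pick a random key as the initial state S0, then repeatedly append
    one of the words that followed the current state in the corpus."""
    state = random.choice(list(bag))
    out = list(state)
    while len(out) < length:
        followers = bag.get(state)
        if not followers:          # dead end: this state never continued
            break
        word = random.choice(followers)
        out.append(word)
        state = tuple(out[-len(state):])  # shift the window: S1 = {W1..Wn, V}
    return " ".join(out)
```

Feeding it the toy sentence from above yields exactly the bag described: `('the', 'dreams', 'of')` maps to `['men']`, and so on.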

Here is a small implementation in Erlang.

– Save the above file as markov.erl

– Compile with erlc markov.erl

– Run with erl -noshell -s markov start "/path/to/corpus/file" -s init stop

Here is another version (a distributed version, dishing out chores to small processes). Be warned though – the distributed version will easily stall your dual-core laptop if you feed it a monstrosity of 10K words, since it spawns one process per word.

Running either of the above versions will print a random line that sounds intelligible. The program uses Markov chains of length 3; you can change this via the ARITY constant defined at the top.

Running it on a text file (about hackers and painters) as its corpus, with a chain length of 3, yielded the following output:

Posted in Uncategorized | 3 Comments

A genetic algorithm example in Erlang.

I've never cared much about genetic algorithms. They sounded way too far-fetched and impractical, so I had absolutely no interest in, or knowledge of, them. That was, until today, and this awesome article explaining genetic algorithms. There's nothing like a well-written article, and this was surely one of that kind. At the very least, it inspired me to write my own version of the idea, albeit without the fancy graphs, etc. Since reading the article, I've already come across a few places where I think such an idea could easily be put to work :).

The idea behind this program is to begin with a base population of X random people and try to breed/mutate them into Y (where Y is a target word specified on the command line), through selection and random mutation. Basically, this means we select the part of the population fit to breed, and randomly mutate the progeny formed by the mating. Varying the parameters (like the policy with which we select the population fit to mate, or the probability with which mutation occurs) leads to interesting results!
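The select-and-mutate loop can be sketched in Python (a toy version of the idea, not the Erlang program in this post; all the names here are mine):

```python
import random
import string

LETTERS = string.ascii_lowercase

def fitness(candidate, target):
    """Number of positions where the candidate matches the target word."""
    return sum(a == b for a, b in zip(candidate, target))

def mutate(candidate, rate=0.1):
    """Randomly replace each letter with probability `rate`."""
    return "".join(random.choice(LETTERS) if random.random() < rate else c
                   for c in candidate)

def crossover(mum, dad):
    """Splice the two parents at a random cut point."""
    cut = random.randrange(len(mum))
    return mum[:cut] + dad[cut:]

def evolve(target, pop_size=100, rate=0.1):
    """Breed a population of random words until the target appears."""
    pop = ["".join(random.choice(LETTERS) for _ in target)
           for _ in range(pop_size)]
    while True:
        pop.sort(key=lambda c: fitness(c, target), reverse=True)
        if pop[0] == target:
            return pop[0]
        fit = pop[:pop_size // 2]   # selection policy: keep the fittest half
        pop = fit + [mutate(crossover(random.choice(fit), random.choice(fit)),
                            rate)
                     for _ in range(pop_size - len(fit))]
```

The selection policy (top half), the mutation rate, and the crossover scheme are exactly the knobs mentioned above – change any of them and watch convergence speed up or stall.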

So here it is in Erlang, in all its glory.

1. Download and save as genetic.erl

2. Compile as erlc genetic.erl

3. Run from the command line as erl -noshell -s genetic test dinosaur -s init stop (if you want to evolve into a 'dinosaur').

The output will display a series of cross-breedings, followed by random mutations, until a 'dinosaur' is formed (from random 8-lettered words).

To reiterate – please read the post linked to, it’s awesome.



Posted in Uncategorized | 3 Comments

Erlang websocket server ( websocket protocol 76 )

*Note* – As of Oct 24, 2011, this version of the websocket server will only work properly with Google Chrome <= 13.x. The new and (hopefully stable) last-call version of the websocket draft has been released, and soon someone will implement the new handshake, which changes a bit (the headers change a little, and so do the framing/encoding bits).

In this post earlier today, I'd written a simple websocket server implementing the 76th websocket protocol IETF draft in Python. Erlang felt bad and whined about it, so I redid it in Erlang. :)

It’s mostly based on Joe Armstrong’s original implementation of draft 75 of the  same, but written from scratch to implement the 76th draft, which differs from the 75th, as mentioned here.



Posted in Uncategorized | 1 Comment

Python websocket server ( websocket protocol 76 )

*Note* – As of Oct 24, 2011, this version of the websocket server will only work properly with Google Chrome <= 13.x. The new and (hopefully stable) last-call version of the websocket draft has been released, and soon someone will implement the new handshake, which changes a bit (the headers change a little, and so do the framing/encoding bits).

A simple Python server that handshakes with an HTML5-enabled browser connecting to it using websockets. It also includes basic message framing. pywebsocket was way too much overkill for what I needed, and there were no other Python implementations of revision 76 (most implement 75, which is slightly different), so I decided to implement a quick handshake myself. The server's end of the handshake has 13 steps. If you think that's too many, you should know that the client's side has 43 steps!
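The heart of those 13 steps is computing the challenge response from the two Sec-WebSocket-Key headers and the 8-byte request body. A sketch in Python of just that part (this is my own restatement of the draft-76 algorithm, checked against the worked example in the draft, not this server's actual code):

```python
import re
import struct
from hashlib import md5

def challenge_response(key1, key2, key3):
    """Compute the 16-byte reply for the draft-76 (hixie-76) handshake.

    For each textual key: join its digits into one big number, then
    divide by the count of space characters in the key."""
    def key_number(key):
        digits = int("".join(re.findall(r"\d", key)))
        spaces = key.count(" ")
        return digits // spaces
    # two big-endian 32-bit integers followed by the 8 raw bytes of key3
    challenge = struct.pack(">II", key_number(key1), key_number(key2)) + key3
    return md5(challenge).digest()
```

The server writes those 16 raw bytes back after the handshake headers, and the browser checks them before opening the socket.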

TODO: the disconnect handshake (its absence doesn't interrupt functionality, though).

Websockets mark the death of workarounds like Ajax, Orbited/Comet, and polling, since a browser can now simply open a socket and connect to any application with an open socket that is willing to handshake with it according to the W3C's recommendations.

Enjoy !

Posted in Uncategorized | 6 Comments

A very basic Erlang crawler

1. Save the code below as spyder.erl

2. Compile with erlc spyder.erl

3. Run as: erl -noshell -s spyder start -s init stop

This is a very crude crawler – it will crawl all the links on a page, and then crawl further. It doesn't protect you from black holes, and will crawl away without concern for robots.txt. Just something to brush up my rusty (probably visible in the code?) Erlang. But what the hell – it works! Feel free to improve on it :)
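If you'd rather sketch the same thing in Python, the link-extraction half of a crawler is only a few lines with the standard library (a hypothetical sketch of mine, not a translation of spyder.erl):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # resolve relative links like "/about" against the page URL
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

Fetch a page, run `extract_links` on it, push the results onto a queue of URLs to visit, and you have the same crude breadth-first crawl – still with no robots.txt manners, mind you.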

Posted in Uncategorized | 1 Comment

[OwlKun] Integrating OAuth with Twitter’s API in python

I believe everyone's heard that Twitter will be doing away with basic authentication come June '10. They're switching to OAuth. It's a mechanism via which a consumer (C) can access the resources of a site (S) on behalf of a user (U), without a username or password being provided to C by U. (C can be either an application or a website; U is a user registered on S.)

In our case, C is our application ( which here is OwlKun). U is you – your twitter account. S is Twitter.

Now, every Python Twitter API/library will eventually have to switch to OAuth, and many have already done so, but there is absolutely no comprehensive documentation on how to go about integrating your app with Twitter in any given language, apart from the existing source code.
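To give a flavour of what the OAuth-specific part of such code has to do, here's a sketch of the HMAC-SHA1 request signing that OAuth 1.0 requires (a simplified, hypothetical version of mine – it ignores duplicate parameters and other corner cases, and is not OwlKun's actual code):

```python
import base64
import hmac
from hashlib import sha1
from urllib.parse import quote

def oauth_signature(method, url, params, consumer_secret, token_secret=""):
    """Simplified HMAC-SHA1 signature per OAuth 1.0 (RFC 5849, sec. 3.4)."""
    enc = lambda s: quote(str(s), safe="")  # percent-encode everything reserved
    # normalize the parameters: sort by name, join as k=v pairs with '&'
    normalized = "&".join("%s=%s" % (enc(k), enc(v))
                          for k, v in sorted(params.items()))
    # signature base string: METHOD & encoded-URL & encoded-params
    base_string = "&".join([method.upper(), enc(url), enc(normalized)])
    # signing key: consumer secret & token secret (token secret may be empty)
    key = "&".join([enc(consumer_secret), enc(token_secret)])
    digest = hmac.new(key.encode(), base_string.encode(), sha1).digest()
    return base64.b64encode(digest).decode()
```

The resulting signature goes into the `oauth_signature` parameter of the Authorization header, alongside the nonce, timestamp, consumer key, and token.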

Lines 75-104 contain the OAuth-specific material; the rest is integration stuff so that you can actually run these commands from inside vim. :)

Here’s the link to OwlKun’s source :

Etymology: OwlKun = Owl (stays awake at night – it was night time when I wrote this) + Kun = Friend (since I have no friends since it's a nice and friendly application that's well documented).



Posted in Python, Twitter, Uncategorized | 1 Comment

Setting your currently playing Rhythmbox song as your Pidgin status message.

This script will set your Pidgin status message to the song that's currently being played by Rhythmbox.

It's a simple process – save the script to a file and run it. I.e.:

1. Save it to a file.

2. Run it with python.

3. Stare in awe :)




Posted in Dbus, Linux, Metal, Music, Python, rhythmbox, Uncategorized | 1 Comment

Why Love ?

This is an attempt to answer one of the questions that baffles me, and expectedly a few others – why love? That is, why do we fall in love? The kind of love I'm talking about is love at first sight. Those who think such love doesn't exist are merely lucky or unlucky enough not to have experienced it. However, it does exist. But why? First, a little about evolution.

Like it or not, we have all evolved – from stuff that came from some star. In the words of Carl Sagan, "We're star stuff, harvesting star light." There's absolutely nothing special or divine about us, except that we might be able to prolong our survival through our developed brains.

Has it ever occurred to you why we find certain things repulsive or scary, while certain other things attractive? We're instinctively disgusted by the smell of rotting flesh, scared by the sight of a snake, but charmed by flowers. Why? The answer lies in evolution.

Throughout evolutionary stages, our ancestors roamed the African plains, hunting and gathering. In the course of their lives, they encountered various stuff. Like rotting flesh. To the first human tribe, rotting flesh probably did not smell 'bad'. It probably didn't smell at all. It was like all flesh, except that it had harmful bacteria that made the person who ate it sick. Unaware, they ate it. And fell sick. Some died. The ones that survived made a note of this in their brains. Over generations, this piece of information about rotting flesh being bad got converted into a series of chemical reactions in the brain that produced a 'smell' – essentially a flow of hormones that made the smeller feel nauseated. Thus, the smeller became averse to rotting flesh, through the 'bad' smell. Chances are that rotting flesh does not smell bad to a dung beetle, since dung beetles thrive amid hydrogen sulfide.

Now how does all this add up to love at first sight ?

When you think about it, a person's face is merely a collection of contours that might or might not appeal to you the way it does to another. Why this difference in taste? If your ancestors (not just your parents, but the tens or hundreds or thousands of people above you in your hierarchy) found a great degree of success in producing kids with a female having a given kind of face, their brains stored this information. Passed on over generations, if each of the people along that path found great success producing healthy offspring with the same kind of face, the feeling that the bearer of that kind of face will help you produce better offspring than the bearer of any other kind of face intensified greatly.

This is where the intensity of love that people talk about comes in. How can love at first sight be so intense if you don't even know the person? Modern social dynamics requires that people get to know each other over a period of time, and only then could your love be called intense. Most people would rubbish love at first sight as mere lechery or weakness of character, without even looking into what causes it. So now, we have two kinds of love –

1. The cultivated love, ( the one that grows with companionship, trust, etc. )

2. Love at first sight.

Let’s dissect both of them with the perspective that we’ve held up until now – evolution.

1. The cultivated love between Male M and Female F

Well, throughout the generations, no particular kind of trait stood out, and hence no particular face stood out greatly. They all averaged out, and good still remained good, but no kind of face was electric. However, throughout the generations, one thing was common – the chances of bearing healthy young would increase if the couple formed a bond of trust and affection with one another, preferably for life. Hence the 'Cultivated Love'.

2. Love at first sight between Male M and Female F.

From an evolutionary perspective, this kind of love would occur when the males slept around with females that had a lot of similarity in their facial/physical features.

For example, if the first-generation male picked a female that had narrower eyes, a pug nose, large shoulders, a small chin, etc., and if he was able to produce healthy offspring with her – this kind of face would get persisted into his memory. But this attraction was still not written in stone. The next generation (his offspring) found a female that had similar features, and was once again able to produce healthy offspring with her. The genetic persistence into his limbic brain got a little stronger. Some 50 generations of similar females later, the rule of attraction was that much closer to being written in stone. How? The limbic portion of the brain of the 51st-generation kid was imprinted with brain code telling it – "This kind of face/physique is right for you. If you copulate with her, your kids will be strong. Go for it!" When the kid actually sees a girl with a similar face or physique, his brain tells him – "Hey, that girl is so pretty! Go for it!" How? Once again, through a rush of hormones that give the kid a high. The kid, of course, interprets this as love, and goes for it. The female agrees, and they have healthy offspring. Now, this feeling is intensified over generations. We're probably into the 10000s of generations, maybe more, since humans developed the brain. Imagine the intensity of this love at first sight, if most of our ancestors reproduced with similar girls.

This is why love at first sight is not balderdash. And it is probably more intense than cultivated love, at least at first.

Now, what happens when the female you have fallen in love with at the first sight, refuses your advances ? Is it doom ? Does it mean your genes will end with you ?

Hardly. Even ignoring the wrong assumption that there's only a single female with the approximate facial/physical characteristics that your brain has been told are good for you, there's the option of using that which makes us human – the frontal cortex. I.e.: cultivated love, through conditioning. Without the frontal cortex and our control executive, we would have been left at the mercy of our ancestors' experiences. However, with the inclusion of this newly evolved brain, we get to cope with constantly changing surroundings. It's like the inclusion of a RAM chip and a hard drive over a ROM chip – the latter is read-only, while the former are configurable.

While making a choice, the executive area of the brain is consulted first. The limbic portion usually comes into play during emergencies, or when the executive area is too swamped or resting (like when we're daydreaming), or when we're intoxicated (alcohol, lack of sleep). So, for the most part, you can condition your executive brain to trigger the release of the same hormones that get triggered when love at first sight occurs. This means that cultivated love can be just as much fun as love at first sight, since the same hormones get released with both.

However, since your limbic portion still has some other, possibly conflicting, information about the person you're supposed to be with, you will have to lead a life of slight subconscious conflict (which you'll probably never realize anyway). All it means is that your offspring will not feel as attracted to the kind of girl you fell in love at first sight with, since you were not able to produce offspring with that girl. If this continues, the rule written in stone might get erased, and 1000 generations later, your descendant might find another kind of girl attractive.

Conclusion  (and some fundae :P ):

There is nothing divine or pure about either us or love. No Cupid, nothing. The ability to feel loved is in your own hands. And the best way is to love everyone, like the sun – the object all gods are based on. The sun gives warmth to one and all without distinguishing between people, and without asking for anything in return. If we are able to achieve such a level of giving love, it must mean we have love. For, in the words of John Marks Templeton, "How can you give love if you don't have love?"

So, if you’re feeling piqued because some girl dumped you, don’t ! Mainly because you’ve got other options, and secondly, because you’ve got company !! :D


Posted in Death, Evolution, IITG, Love, random, sex | Leave a comment

Cognos | The cognos #prompt()# macro

Here’s another bitch. The Cognos #prompt()# macro.

In case you’re wondering about giving a default value to the prompt, here’s how you do it :

#prompt('prompt_name', 'prompt_type', 'default_value')#

However, there is a catch. The prompt won't accept just anything as its default value – for strings, the third argument can only be a column name. For numbers, however, it accepts numeric constants. For example:

#prompt('id', 'INTEGER', '-1')# is correct,

#prompt('name', 'STRING', 'HELLO')# is wrong. It will have to be:

#prompt('name', 'STRING', column_name)#

Weird. I'm not very sure about this, but after a lot of trial and error – there being no documentation available on this particular topic – I think this is correct.

Please let me know whether this is correct or wrong, if you, like me, are fed up with the lack of documentation too.

Posted in Uncategorized | 20 Comments

PERL | DBI | Batch Upload/Insert – Row wise vs Column Wise binding.

Today's topic is Bitch uploading. Oops – Batch uploading. Using Perl DBI. Into SQL Server. It's hard to believe that anything Perl is poorly documented. Well, at least non-intuitively documented. How else am I supposed to find out how to bind row-wise instead of column-wise to a prepared statement? You would think that `perldoc DBI` would help, but it has but one nondescript line about row-wise binding.

Anyway, since I am documenting anything that people might find useful, here's what I have found about row-wise binding in DBI, as opposed to column-wise binding (which is easier, well – better documented). I'll show you examples of both.

To bind column-wise, use this :

$sth = $dbh->prepare("insert into database..table values (?, ?, ?)");
my @array1 = ( 1, 2, 3 ); # values of column 1 across all the rows
my @array2 = ( 1, 2, 3 ); # values of column 2 across all the rows
my @array3 = ( 1, 2, 3 ); # values of column 3 across all the rows

$sth->execute_array({}, \@array1, \@array2, \@array3); # {} is on purpose, not a typo. Read the perldoc for more.


To bind row-wise, use this :

$sth = $dbh->prepare("insert into database..table values (?, ?, ?)");
$sth->execute_array({ArrayTupleFetch => sub { return_a_reference_to_one_row_with_three_columns() }});

What the second statement does is call the subroutine repeatedly, over and over (by itself, without you specifying a loop), till it returns undef.


$sth->execute_array({ArrayTupleFetch => sub { return shift @array_of_row_references }} );

So, when called repeatedly, the code will keep shifting the array (thereby returning its first element) till it runs out, at which point shift returns undef.
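For comparison, Python's DB-API makes the row-wise style the default: executemany() takes an iterable of row tuples and loops over it for you, much like DBI driving the ArrayTupleFetch sub until it's exhausted. A quick sqlite3 sketch of mine (nothing to do with DBI itself):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (a integer, b integer, c integer)")

# Row-wise: hand the driver an iterable of row tuples; it loops for you,
# much like DBI calling the ArrayTupleFetch sub until it returns undef.
rows = [(1, 1, 1), (2, 2, 2), (3, 3, 3)]
conn.executemany("insert into t values (?, ?, ?)", rows)

# Column-wise data (one list per column) just needs a zip() to become rows.
col1, col2, col3 = [4, 5], [4, 5], [4, 5]
conn.executemany("insert into t values (?, ?, ?)", zip(col1, col2, col3))

count = conn.execute("select count(*) from t").fetchone()[0]  # 5 rows in
```

The zip() trick is the whole row-wise/column-wise distinction in one line: the database only ever sees rows, so column-wise binding is just the driver doing the transpose for you.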

Have fun with bitc..err..batch uploading !


All this info is courtesy of the author/architect of the Perl DBI, Tim Bunce (or something to that effect – apologies to him; a great guy, ubiquitous with his help).

Posted in Uncategorized | 3 Comments

Adding Robotic/Vocoder effect to your song using Audacity.

Have you heard songs where it appears as if a robot is singing a line instead of a person ? Want that effect for your songs ? Now you can. Yes we can !

You will need Audacity (available for Windows/Linux/Mac – I use the Linux one; it's pretty much the same as the others) and the vocoder plugin (which will automatically be there if you install the latest Audacity version). Here's what to do:

1. Load up your song.

2. If your song is stereo (has two tracks, left and right), split the track into left and right by clicking on the small arrow in the control box to the left of the track.

Once the track has been split into left and right tracks, select the entire right track and replace it with a sound that has a high amplitude, like a synth lead or a rave lead – or, for experimenting, just with noise. To generate white noise, click Generate -> Noise -> White.

3. Then recombine the left and right tracks to form a stereo track. Do this by again clicking on the small arrow to the left of the left track, and selecting 'Make Stereo Track'.

4. Now, in Effects -> Plugins, select Vocoder. You might want to play with the settings a bit, but a higher number of vocoder bands means a higher-quality robotic sound, while a lower value means more noise.

Apply the effect and enjoy a slightly robotic voice! To get a more robotic voice, you want a carrier sound with more energy in it. If I had to exemplify, it would be something like a lot of synth organs playing – like a cathedral full of strings (cello/violins/hurdy-gurdy), but synchronized. Luckily, I had such a song (The Undertaker's WWE entrance theme played on a synth-organ kind of instrument), so my effect sounds realistic. With white noise it won't sound very robot-like, so you need a proper sound that is in tune, unlike noise, as your carrier (more on carriers below).

Basically, how it works is this: to vocode, the plugin needs two signals – one is the carrier signal (the synth lead sound or the noise, in the right track), and the other is the main sound that we want to vocode (your song, in the left track). The vocoder does the rest.

It was good fun to play around with, so I thought I'd let everyone know – there's a tonne of cool plugins for Audacity and it's fun to play around with them, so by all means experiment!!

Next time, I’m trying the Pitch Snap effect ( commonly known as the T-Pain effect ? )

BTW, if you have Windows and FL Studio (Fruity Loops), you might want to visit , an awesome site for people interested in composing/mixing/merging music. Some really nice music stuff out there!

Posted in Uncategorized | 19 Comments