Kindle Universal Dashboard

Kindle is super awesome!

Because of its e-ink display. I wanted a display to present data that has the least power consumption, isn't painful to the eyes & obviously doesn't emit blue light.
E-ink displays fit my requirement perfectly – but acquiring a bare panel that can be driven by a Raspberry Pi or Arduino is hard, & the cost-to-size ratio is much higher.

On googling I found some display modules at around $70–80 – even for smaller displays – which pushed me to get a Kindle Touch (8th generation) at approximately $74 instead.

Kindle comes with an experimental browser, but it is a narrowed-down version of WebKit. It is pretty useful if you want to display static content or just make a control panel; it can easily render normal websites that use JS/CSS & works pretty well. But support for HTML5 is almost absent – you can't use WebSockets to push data, so long polling/fast polling is the only solution so far.

Moreover, there was another problem I had to fix – the Kindle has a fixed timeout that sends it to sleep mode – for mine it was 10 minutes. After digging I found suggestions like ~ds or similar, but for me nothing worked.

We can only hope that support for removing or changing the timeout will be added in future releases. I think older Kindles support it.

If you can't change the timeout or you want a few other features, I suggest you jailbreak. Follow the steps mentioned here: http://www.mobileread.com/forums/showthread.php?t=275877 – don't skip ahead; it works for the Kindle Touch 8th generation. Tried, tested, working! For the KT3 you will need to install MRPI along with KUAL. Once done, your Kindle is officially out of warranty 😀. After that you need to install USBNet – it will allow you to SSH into your Kindle. All this lets you execute "lipc-set-prop com.lab126.powerd preventScreenSaver 1" on the Kindle. It simply disables the screensaver. 🙂

Once your Kindle's screen no longer locks, you can simply go & run a simple nodejs script to push data.

Note: the Kindle browser doesn't support WebSocket & none of socket.io's transport methods except "polling".

Below is the nodejs server code:

Below is the code of client.html:
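Likewise, a sketch of the script client.html would embed. The request function is injectable because the Kindle browser only gives you XMLHttpRequest, & the /poll endpoint name and the `content` element id are assumptions:

```javascript
// startPolling keeps requesting until stopped. `request` is injectable:
// in client.html it wraps an XMLHttpRequest; elsewhere it can be a stub.
function startPolling(request, onData, interval) {
  var stopped = false;
  var timer = null;
  function loop() {
    if (stopped) return;
    request(function (err, data) {
      if (!err) onData(data);
      if (!stopped) timer = setTimeout(loop, interval); // schedule next poll
    });
  }
  loop();
  return { stop: function () { stopped = true; clearTimeout(timer); } };
}

// Browser wiring – runs only where XMLHttpRequest exists (i.e. on the Kindle).
if (typeof XMLHttpRequest !== 'undefined') {
  startPolling(function (cb) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/poll', true);
    xhr.onload = function () { cb(null, xhr.responseText); };
    xhr.onerror = function () { cb(new Error('poll failed')); };
    xhr.send();
  }, function (data) {
    document.getElementById('content').innerHTML = data; // hypothetical element id
  }, 2000);
}
```

The `interval` argument is where a server-adjustable polling period would plug in.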

Voila – it works!

Below is a video of it working, in case you want to see a demo before getting your hands dirty 🙂

Warning & update: this method might consume more power than expected, as the experimental browser has a loading indicator which continuously refreshes a section of the screen. To overcome this problem I will be polling the server at long intervals – adjustable by the server.

Note for nerds: since this method uses the browser, it's more flexible – but if you are possessive about power consumption & screen space, you can use Java to develop a Kindlet application. Lightweight pub/sub protocols like MQTT should help you on the way.


Designing a wall holder:
You can google for covers, design your own, or use some double-sided foam tape. Since I had access to a 3D printer I got two of http://www.thingiverse.com/thing:832273 printed & hung it on the wall – it also helped me read a few books apart from using it only as a display. SWAP!

Use it as :

  • Scoreboard
  • Notification
  • Weather system
  • Wallpaper slideshow
  • News/RSS feeds display
  • Home automation control
  • Anything
  • Book reader 🙂

At the end, even if you place it behind your monitor it won't hurt your eyes or push new data at you & spoil the code castle you were building.

Offline Wikipedia with Elasticsearch and MediaWiki

Wikipedia is awesome! It's open, it's free – yea – & it's huge in size: millions of articles. But as a developer, how do you exploit the free knowledge?

I started digging the internet just to find ways to exploit my fresh 9+ GB XML gzipped archive, which seemed of no use since even a simple text editor can't open it. (Just out of excitement about what's inside, how it's structured – schema!)

Luckily people have already imported it. Elasticsearch is fast, reliable & good at searching, so https://github.com/andrewvc/wikiparse was a saver.

  • Installed Elasticsearch
  • Ran the command to import the dump

It took almost 48 hours on an i5 with 8 GB RAM – my mistake was using the same hard disk for the dump storage & the database. Time might vary.

The data was imported but still of no use! Why? It's in text/wiki markup, so a parser is needed.

After searching, the only solution I found was using the MediaWiki API, which is in PHP. Lots of things were missing, since it's built for MediaWiki itself, not for parsing plain wikitext. (Though I didn't spend much time learning the internal API.)

I quickly downloaded MediaWiki, ran nginx with PHP, installed it & used api.php.
It was good to see my own offline API too, but still many things were missing or confusing & the API has a structure that's hard to modify. So I created a parse.php.

So all the steps were:

  • Download the Wikipedia XML dump
  • Install Elasticsearch
  • Import the dump using wikiparse
  • Install nginx + PHP & set up MediaWiki
  • Query Elasticsearch for articles & render the wikitext through api.php/parse.php
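As a sketch of how the last two pieces glue together from node (the index name, hosts & field names here are assumptions, not wikiparse's actual mapping):

```javascript
// Body for Elasticsearch's _search endpoint (match on the title field).
function buildSearchQuery(term) {
  return { query: { match: { title: term } }, size: 1 };
}

// MediaWiki's api.php can render raw wikitext via action=parse.
function buildParseUrl(host, wikitext) {
  return host + '/api.php?action=parse&format=json&contentmodel=wikitext' +
    '&text=' + encodeURIComponent(wikitext);
}

// Usage against a local stack (not executed here):
//   POST http://localhost:9200/en-wikipedia/_search  with buildSearchQuery('Kindle')
//   GET  buildParseUrl('http://localhost', hit._source.text)
```

The heavy lifting (searching, rendering) stays in Elasticsearch & PHP; the node side just builds the two requests.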

NodeJs – Improving performance using native binding

It's a known fact that javascript is really slow at synchronous operations like multiplication, division etc. – since nodejs is based on JS, Node inherits the curse.
Check http://benchmarksgame.alioth.debian.org/u64/compare.php?lang=v8&lang2=gpp to cross-check the fact. Nodejs is fast in some aspects. I love it 😛

No matter how much you improve your algorithm, C++ will outperform JS on a sophisticated workload – but you can push through by integrating native code.
Nodejs is based on V8 (in most ports), & V8 is written in C++. You must already be aware that you can use C++ in nodejs – but it seems complex.

But it is not – it is easy to build a native extension & compile it in seconds. https://medium.com/@devlucky/how-to-get-a-performance-boost-using-node-js-native-addons-fd3a24719c85#.3uzqa9r4w does well in explaining the basics, but if you want to dive deep, check the official documentation.

If you want a really explanatory presentation, check https://ivanvergiliev.github.io/node-cpp/#opening.
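As a sketch of how such an addon is consumed from the JS side – the addon path & function name are hypothetical, & the try/catch fallback keeps the script runnable even without a compiled build:

```javascript
// Pure-JS implementation of the hot function, used as a fallback.
function sumToJs(n) {
  let acc = 0;
  for (let i = 1; i <= n; i++) acc += i;
  return acc;
}

let native;
try {
  // Compiled addon, if a build exists (path is the node-gyp default layout).
  native = require('./build/Release/fastmath');
} catch (e) {
  native = { sumTo: sumToJs }; // no build – degrade gracefully
}

console.log(native.sumTo(100)); // 5050 either way
```

The fallback pattern also makes benchmarking the two implementations against each other trivial.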

Email Spoofing – Why it's dead!

There was a time when mail spoofing was an art, a thing to impress people, a way to phish someone.
With increasing intelligence in spam filters it became harder – you need a good IP reputation to deliver mail to the inbox.
But now it has become almost impossible to spoof an address like [email protected]. Why? Have computers turned intelligent? No.

The problem of spam protection isn't new to the market. So people came up with DNS-based solutions which allow a sender to list the IP addresses authorized to send mail.
"Sender Policy Framework (SPF) for Authorizing Use of Domains in E-Mail" – you can read the RFC at https://www.ietf.org/rfc/rfc4408.txt (if you want to dig).

The standard was good – no, not just good, it was the best! It blocked all the ways to prank people, but spoofed mails were still being delivered, because network administrators weren't smart enough to list all their servers. So as a workaround, big providers ran algorithms on top to make sure genuine mails that fail SPF are not delivered to spam.

This is all good – but for hardcore phishers it only became a little harder; people do check mails regularly & getting into a network is just a matter of distributing malware.
An attacker can perform a MITM & alter the content of a mail while it's being delivered.

There wasn't any integrity check.

The solution was DKIM – DomainKeys Identified Mail (DKIM) Signatures. It allows mail servers to sign the message & certain header fields using defined hashing algorithms, with verification via a public/private key pair. The public key is published as a DNS record, while the private key is kept private.

Acquiring the private key is hard – the hardest thing. You need to rotate keys to make sure no one cracks them: if you keep the key size at 2048 bits it will make mail delivery slower, while a 512-bit key is easy to crack with present computing power.

DKIM provides a way to authorize only certain applications to send mail, but there was still no way to get reports on how effective the measure is, how many mails are being spoofed & what to do with spoofed mails.

Mails were still being delivered even after DKIM failure.

People came up with the DMARC standard – again published as a DNS TXT record – it helps in getting reports & also blocking mails. Check the RFC at https://datatracker.ietf.org/doc/rfc7489/
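For illustration, the three records for a hypothetical example.com zone might look like this (the DKIM public key is truncated & all values are placeholders, not a recommendation):

```
example.com.                        TXT  "v=spf1 ip4:203.0.113.5 include:_spf.example.net -all"
selector1._domainkey.example.com.   TXT  "v=DKIM1; k=rsa; p=MIGfMA0G...IDAQAB"
_dmarc.example.com.                 TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

The SPF `-all` rejects anything not listed, the DMARC `p=` policy tells receivers what to do with failing mail & `rua=` is where aggregate reports land.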

Certainly, as every security system comes with an overhead, these standards make mail processing resource-intensive. There are many ways to reduce the processing cost while keeping security up to date.

There were many spamming attacks originating on behalf of our site; post-implementation of DMARC (using DMARC Plus), they reduced by almost 80% after a few months.

One thing to note – if you make a single mistake in any of these DNS records you can lose all your mails – so it's better to take advice from someone who knows the standards well & can help you deploy. Make sure you go slow…

Website Optimization – Cache Cache Cache !

You must have heard about caches on the web (caches are everywhere in computer science); most of the time you find them really buggy when changes aren't reflected as soon as you make them.
For sites with small traffic these things just feel buggy – but caching cuts a major share of server traffic when you have a million hits, or even a thousand.

From server to browser, what can we cache? & why cache at all?

  1. Enabling code caching – if you are running a nodejs server, things are in your favour: the process keeps running & reuses existing variables; still, make sure you don't fetch/store too much data & implement caching in your code. If you are using PHP or a similar scripting language, let me tell you things are really slow – on each request a worker is spawned (with Apache a PHP thread; with nginx a new process along with a socket), the PHP code is compiled to opcode & then executed. That's a lot – you can use an opcache or APC to optimize PHP scripts. Other languages have their own alternatives.
  2. Caching static content – since static content doesn't change every other minute or day – most files are kept as-is for years! – you should tell your server that this content rarely changes. Check the nginx cache config & Apache cache config.
  3. Setting cache expiry headers – this one is definitely under your control, whether you are on shared hosting or running your own server. You should send a cache expiry header with all static content. Basically, the expiry header tells the browser to keep the file for the next n days – though it is not strict; browsers send a conditional request to check whether the file has changed.
  4. Offline Storage/WebSQL/Offline Application – yes, you read it right – you can use offline storage to cache insensitive data in the user's browser; this reduces server load & data transferred – you can even cache JS & CSS.
  5. CDN – content delivery networks can also help a lot with caching – since libraries like jQuery, Bootstrap etc. are so common today, if you use a CDN your page might not even need to download the JS & CSS, because the file may already be in the browser cache from when some other website requested it. You should thank the other guy – someday the other guy will thank you.

Website Optimization – Minifying Output using PHP

Minification is a technique in which we remove all unnecessary whitespace, which includes tabs, spaces, newlines, carriage returns etc. This is done just to reduce data transfer.
Since not everyone is serving a million hits a second, minifying HTML doesn't help much; enabling GZIP compression is a better technique instead.

Apart from this, minifying & combining CSS & JS helps in reducing the number of requests – this reduces the number of HTTP connections created to serve a client, hence reducing the load on the server.

These advantages are only a small part of the web optimization process, but minification is still adopted widely – why? – show off! It makes code unreadable & looks cool!
Though server-level optimizations (like GZIP, cache proxies for static content, Apache vs nginx, CDN, server location, number of DNS lookups, serving content based on user device etc.) work better than just compressing code & uploading.
As an example – there is a latency of 400+ ms from India to NY, while 100+ ms from India to Singapore – with 10 requests per page, using the Singapore server saves ~3 seconds!

I think I am diverging from the main 0bj3ct!v3 of this article. Coming back to minification – everyone wants their code to look cool. So recently I've been working on a college fest site & had the idea to minify the code. Did a google search & finally came up with the following code!

Just including the below code makes the HTML a one-liner & cool.
This is just an application of the output buffer; you can do really cool things by handling the output buffer.

You can read more about ob_start in the PHP docs – it's really interesting.

Note: this code has methods to minify CSS & JS – they are just for reference. I suggest using Grunt with clean-css & UglifyJS, or either alone. Also, you should not use this technique on sites with heavy traffic – it will increase load on the server & slow down responses.

Maintaining JS Libs – Changing the bower path

If you are a web developer, it becomes hard to manage all your JS libraries if you aren't using a package manager.
A few folks created bower to make the job easier. (As far as functionality is concerned, it works the same as npm, if you have used it.)

The installation steps below are just for linux folks – windows people, DIY.

First you need to install npm: sudo apt-get install npm
Then you can install bower with: sudo npm install -g bower

Once you have installed bower, you can install a web package just by executing the following command in your project directory –

bower install jquery

You can even use your own git repository as the source, or any other URL:

bower install https://github.com/jquery/jquery.git#2.0.3

Above command will install jquery for your project.

Once you have installed it successfully, things will be kept in
bower_components/jquery
& you can refer to it as
bower_components/jquery/dist/jquery.js. Simple. That's it?

No.

If you are like me – possessive about your directory structure – you want everything inside something like assets/lib/jquery/dist/jquery.js, or simply inside an assets/lib directory instead of too many files/directories in the root.

You can create a file named .bowerrc in the root of the project & put the following content inside:
{
"directory" : "assets/lib"
}

Since the filename begins with a dot, it will be hidden in file browsers – clean.

Again – do you find these long asset paths annoying? – Grunt to the rescue – check http://bower.io/docs/tools/#grunt

Optimal way of scheduling long jobs – nodejs/js

If you are using nodejs for backend stuff & need to schedule some work, you will probably hate scheduling because of two factors:

  1. You can't set a long timeout (setTimeout caps out at 2^31 - 1 ms, about 24.8 days)
  2. Timers are lost when the process restarts

You would probably favor cronjobs for scheduling – or maybe polling the database at a regular interval. But loading a nodejs script every 5 minutes is a costly job – developers understand – so you can't follow the PHP-style short-interval scheduling.

The first problem can be solved by setting up a recursive call to a function with a timeout; there are many solutions on the web.
But there lies a problem – once a recursive call replaces the timer, the handle you stored is stale & you can't cancel the timeout any more. To solve the recursive-call problem I wrote a few lines of code.

The above code behaves the same as setTimeout if you don't need timer-object updates; otherwise you can register for them.

The second problem – timers are lost when the process restarts – shouldn't be hard: you can create a function to execute atomic jobs.
Note atomic – atomic because if your setTimeout callback depends on variable state, you will need to load those variables from the database, which makes the job harder. A better way is to schedule something like "send email to [email protected]" instead of "send a message to currently logged-in users".

The solution to the second problem really depends on your use case – but if you analyze your scheduled jobs closely, you will find that they are really atomic; you made them non-atomic!

If you really need jobs that are never lost from memory, a better way is to run a stable node server & use IPC – but that would be practically hard to maintain.

Javascript DOM-less events

Using jQuery or nodejs makes you addicted to the pub/sub (publish/subscribe) model.
Nodejs has native support for event emitters, but plain browser javascript doesn't. Events can only be bound to a DOM node, which doesn't make sense for pure data & also adds a performance cost.

To solve the problem I wrote a custom class which implements the on and emit methods. Yes, some methods like once are missing.
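The class itself isn't reproduced in this post; a minimal version with just on & emit (no once, as noted) could look like this:

```javascript
// Maps event name -> array of callbacks; no DOM node involved.
class Emitter {
  constructor() {
    this.handlers = {};
  }
  on(event, cb) {
    (this.handlers[event] = this.handlers[event] || []).push(cb);
    return this; // allow chaining
  }
  emit(event) {
    const args = Array.prototype.slice.call(arguments, 1);
    (this.handlers[event] || []).forEach(cb => cb.apply(null, args));
    return this;
  }
}

// Usage:
const bus = new Emitter();
bus.on('data', x => console.log('got', x));
bus.emit('data', 42);
```

Emitting an event nobody listens to is a no-op, which keeps callers decoupled from listeners.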