Kindle Universal Dashboard

Kindle is super awesome!

Why? Because of its e-ink display. I wanted a display to present data that consumes the least power, isn't painful to the eyes &, obviously, doesn't emit blue light.
E-ink displays fit my requirement perfectly – but acquiring a standalone panel that can be driven by a Raspberry Pi or Arduino is hard, and the cost for the size is much higher.

On googling I found some display modules priced around $70–80 – for even smaller displays – which pushed me to get a Kindle Touch (8th generation) for approximately $74 instead.

Kindle comes with an experimental browser, but it is a stripped-down version of WebKit. It is pretty useful if you want to display static content or just build a control panel – it can easily render normal websites that use JS/CSS & works pretty well. But HTML5 support is almost absent – so you can't use WebSockets to push data; long polling/fast polling is the only solution so far.

Moreover, there was another problem I had to fix – the Kindle has a fixed timeout that sends it into sleep mode – on mine it was 10 minutes. After some digging I found you can use `~ds` or similar commands, but for me nothing worked.

We can only hope that support for removing or changing the timeout period will be added in future releases. I think older Kindles support it.

If you can't change the timeout, or you want a few other features, I suggest you jailbreak. Follow the steps mentioned here: http://www.mobileread.com/forums/showthread.php?t=275877 – don't skip ahead, it works for the Kindle Touch 8th generation. Tried, tested, working! For the KT3 you will need to install MRPI along with KUAL. Once done, your Kindle is officially out of warranty 😀 . After that you need to install USBNet – it will allow you to SSH into your Kindle. All this will let you execute `lipc-set-prop com.lab126.powerd preventScreenSaver 1` on the Kindle, which simply disables the screensaver. 🙂

Once you have a Kindle whose screen doesn't lock, you can simply write & run a simple Node.js script to push data.

Note: the Kindle doesn't support WebSocket, and none of socket.io's transport methods work except "polling".

Below is the Node.js server code:

var http = require('http'),
	fs = require('fs');

// Serve client.html (the dashboard page) for every request
var app = http.createServer(function (request, response) {
	fs.readFile("client.html", 'utf-8', function (error, data) {
		if (error) {
			response.writeHead(500);
			response.end();
			return;
		}
		response.writeHead(200, {'Content-Type': 'text/html'});
		response.write(data);
		response.end();
	});
}).listen(1337);

// Attach socket.io to the HTTP server.
// The Kindle browser can't do WebSockets, so allow long polling only.
var io = require('socket.io')(app, {
	transports: ['polling']
});

io.sockets.on('connection', function (socket) {
	// Relay every incoming message to all connected clients (the Kindle included)
	socket.on('message_to_server', function (data) {
		io.sockets.emit('message_to_client', { message: data['message'] });
	});
});

Below is the code for client.html:

<!DOCTYPE html>
<html>
	<head>
		<script src="/socket.io/socket.io.js"></script>
		<script type="text/javascript">
			var socketio = io.connect();

			// Prepend every message pushed by the server to the chat log
			socketio.on("message_to_client", function(data) {
				document.getElementById("chatlog").innerHTML = "<hr/>" + data['message'] + document.getElementById("chatlog").innerHTML;
			});

			// Send the contents of the input box to the server
			function sendMessage() {
				var msg = document.getElementById("message_input").value;
				socketio.emit("message_to_server", { message : msg });
			}
		</script>
	</head>
	<body>
		<input type="text" id="message_input"/>
		<button onclick="sendMessage()">send</button>
		<div id="chatlog"></div>
	</body>
</html>

Voilà – it works!

Below is a video of it working, in case you want to see a demo before getting your hands dirty 🙂

Warning & update: this method might consume more power than expected, as the experimental browser shows a loading indicator that continuously refreshes a section of the screen. To overcome this problem I will be polling the server at a longer interval, which will be adjustable from the server.
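As a rough illustration of that plan – a minimal sketch of such a polling loop (the `/latest` endpoint and the `nextPollMs` field are placeholders of mine, not part of the code above), where the server decides how long the Kindle waits before the next poll:

// Hypothetical polling loop for the Kindle page: the server's JSON response
// carries the next interval, so the server can slow polling down to save power.
function poll() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/latest', true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            var next = 60000; // default: poll again in 60s
            if (xhr.status === 200) {
                var data = JSON.parse(xhr.responseText);
                document.getElementById('chatlog').innerHTML = data.message;
                if (data.nextPollMs) next = data.nextPollMs;
            }
            setTimeout(poll, next);
        }
    };
    xhr.send();
}
poll();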

Note for nerds: since this method uses the browser, it's more flexible – but if you are particular about power consumption & screen space, you can use Java to develop a Kindlet application. Lightweight pub/sub protocols like MQTT should help you along the way.


Designing a wall holder:
You can google for covers, design your own, or use some double-sided foam tape. Since I had access to a 3D printer, I got two of http://www.thingiverse.com/thing:832273 printed & hung it on the wall – it also helped me read a few books apart from using it only as a display. Swap!

Use it as:

  • Scoreboard
  • Notification
  • Weather system
  • Wallpaper slideshow
  • News/RSS feeds display
  • Home automation control
  • Anything
  • Book reader 🙂

In the end, even if you place it behind your monitor, it won't hurt your eyes or shove new data at you & spoil the code castle you were building.

Custom Nginx with PageSpeed

Building nginx from source, customizing it & keeping it auto-updated across servers is a pain. We use the PageSpeed & custom header modules across all our servers; PageSpeed auto-minifies resources & improves page performance without much hassle.

Below is the script we use:

Continue reading

Offline Wikipedia with Elasticsearch and MediaWiki

Wikipedia is awesome! It's open, it's free – yeah – & it's huge, with millions of articles. But as a developer, how do you exploit all that free knowledge?

I started digging around the internet just to find ways to exploit my fresh 9+ GB gzipped XML archive, which seemed useless since even a simple text editor couldn't open it. (Just out of excitement: what's inside, how is it structured, what's the schema!)

Luckily, people have already built importers. Elasticsearch is fast, reliable & good at searching, so https://github.com/andrewvc/wikiparse was a saver.

  • Installed Elasticsearch
  • Ran the import command

It took almost 48 hours on an i5 with 8 GB of RAM – my mistake was using the same hard disk for the dump and for the database storage. Your time may vary.

The data was imported, but it was still of no use! Why? It's in wikitext format – a parser is needed.

After searching, the only solution I found was the MediaWiki API, which is in PHP. A lot was missing, as it's built for MediaWiki itself rather than for parsing arbitrary wikitext. (Though I didn't spend much time learning the internal API.)

I quickly downloaded MediaWiki, ran nginx with PHP, installed it & used api.php.
It was good to see my own offline API, but many things were still missing or confusing, and the API's structure is hard to modify. So I created a parse.php:

<?php
header('Access-Control-Allow-Origin: *');
header('Access-Control-Allow-Methods: GET, POST');
header('Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept, Key');

// DoEditSectionLink hook: blank out the "[edit]" section links so they
// don't end up in the rendered HTML.
function fixLink($skin, $nt, $section, $tooltip, &$result) {
	$result = '';
	return true;
}

require(dirname(__FILE__) . '/includes/WebStart.php');

$wgHooks['DoEditSectionLink'][] = 'fixLink';

// Parse the wikitext POSTed as 'text' and return the rendered HTML as JSON.
$output = $wgParser->parse(
    isset($_POST['text']) ? $_POST['text'] : '',
    Title::newFromText('Some page title'),
    new ParserOptions());

echo json_encode(array('html' => $output->getText()));
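For reference, this is roughly how a page could consume it – a minimal sketch, assuming parse.php sits in the MediaWiki web root and the wikitext comes from your Elasticsearch hit:

// Hypothetical client: POST raw wikitext to parse.php and render the returned HTML.
function renderWikitext(wikitext, targetId) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/mediawiki/parse.php', true);
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var data = JSON.parse(xhr.responseText);
            document.getElementById(targetId).innerHTML = data.html;
        }
    };
    xhr.send('text=' + encodeURIComponent(wikitext));
}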

So, all the steps were: download the Wikipedia XML dump, import it into Elasticsearch with wikiparse, set up MediaWiki behind nginx + PHP, and expose parse.php to turn wikitext into HTML.

NodeJs – Improving performance using native binding

It is a known fact that JavaScript is really slow at CPU-bound synchronous operations like multiplications, divisions, etc. – and since Node.js is based on JS, Node inherits the curse.
Check this comparison to cross-check the fact: http://benchmarksgame.alioth.debian.org/u64/compare.php?lang=v8&lang2=gpp. Node.js is still fast in some respects. I love it 😛

It doesn't matter how much you improve your algorithm – C++ will outperform JS if you are running a sophisticated algorithm, but you can push past that limit with native bindings.
Node.js (most ports) is based on V8, and V8 is written in C++. You are probably already aware that you can use C++ in Node.js – but it seems complex.

It is not – it is easy to build a native extension & compile it in seconds. This article does well in explaining the basics: https://medium.com/@devlucky/how-to-get-a-performance-boost-using-node-js-native-addons-fd3a24719c85#.3uzqa9r4w – but if you want to dive deep, check the official documentation.

If you want a really explanatory presentation, check https://ivanvergiliev.github.io/node-cpp/#opening.
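To give a feel for the JS side once an addon is compiled (the module path and the exported `fib` function below are made up for illustration – the articles above cover the C++ source and the node-gyp build):

// After `node-gyp configure build`, the compiled addon loads like any other module.
var native = require('./build/Release/addon');

// Plain JS implementation to compare against the (assumed) native one.
function fibJs(n) {
    return n < 2 ? n : fibJs(n - 1) + fibJs(n - 2);
}

console.time('js');
console.log(fibJs(35));
console.timeEnd('js');

console.time('native');
console.log(native.fib(35)); // assumed to be exported by the addon
console.timeEnd('native');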

Email Spoofing – Why it's dead!

There was a time when mail spoofing was an art, a way to impress people, and a way to phish someone.
With increasing intelligence in spam filters it became harder; you need a good IP reputation to deliver mail to the inbox.
But now it has become almost impossible to spoof an address like [email protected]. Why? Have computers turned intelligent? No.

The problem of spam protection isn't new to the market. So people came up with DNS-based solutions that allow a sender to list the IP addresses authorized to send mail on its behalf.
"Sender Policy Framework (SPF) for Authorizing Use of Domains in E-Mail" – you can read the RFC at https://www.ietf.org/rfc/rfc4408.txt (if you want to dig).
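An SPF policy is just a TXT record on the sending domain, so you can look one up yourself – a quick sketch using Node's built-in dns module (example.com is only a placeholder domain):

var dns = require('dns');

// An SPF record looks something like:
//   "v=spf1 ip4:192.0.2.0/24 include:_spf.example.com -all"
// i.e. the list of hosts allowed to send mail for the domain.
dns.resolveTxt('example.com', function (err, records) {
    if (err) return console.error(err);
    records.forEach(function (chunks) {
        var txt = chunks.join('');
        if (txt.indexOf('v=spf1') === 0) console.log('SPF:', txt);
    });
});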

The standard was good – not just good, it was great! It blocked all the easy ways to prank people, but spoofed mails were still being delivered, because network administrators weren't careful enough to list all their servers. So, as a workaround, big providers ran their own algorithms on top to make sure genuine mails that failed SPF didn't land in spam.

This is all good – but for hardcore phishers it only became a little harder. People do check their mail regularly & getting inside a network is just a matter of distributing malware.
An attacker can then perform a MITM attack & alter the content of a mail while it's being delivered.

There wasn't any integrity check.

The solution was DKIM – DomainKeys Identified Mail (DKIM) signatures. It allows mail servers to sign the message body & selected header fields using defined hashing algorithms, with verification done via a public/private key pair. The public key is published as a DNS record, while the private key is kept private.

Acquiring the private key is hard – it's the hardest part of the attack. You need to rotate keys to make sure no one cracks them – if you keep the key size at 2048 bits it makes mail delivery slower, while at 512 bits it's easy to crack with present-day computing.

DKIM provides a way to authorize only certain applications to send mail, but there was still no way to get reports on how effective the measure is, how many mails are being spoofed & what should be done with spoofed mails.

Mails were still being delivered even after a DKIM failure.

Then people came up with the DMARC standard – again published as a DNS TXT record – which helps in getting reports & also in blocking mails. Check the RFC at https://datatracker.ietf.org/doc/rfc7489/
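The DMARC policy lives in a TXT record on the `_dmarc` subdomain – a similar sketch (again, example.com is a placeholder; `p=` is the policy, `rua=` is where aggregate reports are sent):

var dns = require('dns');

// A DMARC record looks something like:
//   "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
dns.resolveTxt('_dmarc.example.com', function (err, records) {
    if (err) return console.error(err);
    records.forEach(function (chunks) {
        console.log('DMARC:', chunks.join(''));
    });
});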

Certainly, as with every security system, these standards come with an overhead and make mail processing resource-intensive. There are many ways to reduce the processing cost while keeping security up to date.

There were many spamming attacks originating on behalf of our site; after implementing DMARC using DMARC Plus, they reduced by almost 80% within a few months.

One thing to note – if you make a single mistake in any of these DNS records you can lose all your mail – so it's better to take advice from someone who knows the standards well & can help you deploy them. Make sure you go slow…

Website Optimization – Cache Cache Cache !

You must have heard about caching on the web (caches are everywhere in computer science); most of the time you find it really buggy when changes aren't reflected as soon as you make them.
For sites with small traffic these things just seem buggy – but they save a major chunk of server traffic when you have a million hits, or even a thousand.

From server to browser, what can we cache, & why cache at all?

  1. Enabling code caching – if you are running a Node.js server, things are in your favour because the process keeps running & keeps its variables in memory; still, make sure you don't fetch/store too much data – implement caching in your code. If you are using PHP or a similar scripting language, let me tell you, things are really slow – on each request a PHP worker is spawned (Apache) or a new connection is handled (nginx + PHP-FPM), the PHP code is compiled to opcode & then executed. That's a lot – you can use opcode caches such as OPcache or APC to optimize PHP scripts. Other languages have their own alternatives.
  2. Caching static content – since static content doesn't change every other minute or day – most files are kept as-is for years! – you should tell your server that this content rarely changes. Check the nginx cache config & the Apache cache config.
  3. Setting cache expiry headers – this one is definitely under your control, whether you are on shared hosting or running your own server. You should send cache expiry headers with all static content. Basically, the expiry header tells the browser to keep the file for the next n days – though it isn't strict; browsers still send conditional requests to see whether the file has changed (see the sketch after this list).
  4. Offline Storage/WebSQL/Offline Application – yes, you read that right – you can use offline storage to cache insensitive data in the user's browser; this reduces the load on the server & the data transferred – you can even cache JS & CSS.
  5. CDN – content delivery networks can also help a lot with caching – since libraries like jQuery, Bootstrap, etc. are so common today, if you use a CDN your page might not even need to download the JS & CSS, because the file may already be in the browser cache from some other website that requested it. You should thank that other guy – someday another guy will thank you.
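To make points 2 & 3 concrete, here is a minimal Node.js sketch (the file name, port & one-year max-age are arbitrary choices of mine) that serves a static asset with far-future cache headers:

var http = require('http'),
    fs = require('fs');

var ONE_YEAR = 365 * 24 * 60 * 60; // in seconds

http.createServer(function (req, res) {
    // Serve one static asset with long-lived cache headers so the browser
    // (or a CDN in front of us) can keep it without re-downloading.
    fs.readFile('assets/app.js', function (err, data) {
        if (err) { res.writeHead(404); return res.end(); }
        res.writeHead(200, {
            'Content-Type': 'application/javascript',
            'Cache-Control': 'public, max-age=' + ONE_YEAR,
            'Expires': new Date(Date.now() + ONE_YEAR * 1000).toUTCString()
        });
        res.end(data);
    });
}).listen(8080);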

Website Optimization – Minifying Output using PHP

Minification is a technique in which we remove all unnecessary whitespace – tabs, spaces, newlines, carriage returns, etc. This is done purely to reduce the amount of data transferred.
Since not everyone is serving a million hits a second, minifying HTML doesn't help much; enabling GZIP compression is a better technique.

Apart from this, minifying (& combining) CSS & JS helps in reducing the number of requests – this reduces the number of HTTP connections created to serve a client, hence reducing load on the server.

These advantages are only a small part of the web optimization process, yet minification is adopted widely – why? Show off! It makes the code unreadable and looks cool!
Though server-level optimizations (GZIP, cache proxies for static content, Apache vs nginx, CDN, server location, number of DNS lookups, serving content based on the user's device, etc.) work better than just compressing code & uploading it.
As an example – there is 400+ ms of latency from India to NY, while it is 100+ ms from India to Singapore – with 10 requests per page, using the Singapore server saves roughly (400 − 100) ms × 10 ≈ 3 seconds (assuming the requests aren't made in parallel)!

I think I am diverging from the main 0bj3ct!v3 of this article. Coming back to minification: everyone wants their code to look cool. So recently I was working on a college fest site & had the idea to minify its output. A Google search later, I came up with the following code!

Just including the code below makes the HTML a one-liner & looks cool.
This is just an application of the output buffer; you can do really cool things by handling the output buffer.

You can read more about ob_start in the PHP manual – it's really interesting.

Note: this code has methods to minify CSS & JS too – they are just for reference. I suggest using Grunt with clean-css & UglifyJS instead. Also, you should not use this technique on sites with heavy traffic – it will increase the load on the server & slow down response times.

<?php

//buffer & process all the output using minify_html.
ob_start("minify_html");

function minify_html($input) {
    if(trim($input) === "") return $input;
    // Remove extra white-space(s) between HTML attribute(s)
    $input = preg_replace_callback('#<([^\/\s<>!]+)(?:\s+([^<>]*?)\s*|\s*)(\/?)>#s', function($matches) {
        return '<' . $matches[1] . preg_replace('#([^\s=]+)(\=([\'"]?)(.*?)\3)?(\s+|$)#s', ' $1$2', $matches[2]) . $matches[3] . '>';
    }, str_replace("\r", "", $input));
    
    // Minify inline CSS declaration(s)    
    if(strpos($input, ' style=') !== false) {
        $input = preg_replace_callback('#<([^<]+?)\s+style=([\'"])(.*?)\2(?=[\/\s>])#s', function($matches) {
            return '<' . $matches[1] . ' style=' . $matches[2] . minify_css($matches[3]) . $matches[2];
        }, $input);
    }
    return preg_replace(
        array(
            // t = text
            // o = tag open
            // c = tag close
            // Keep important white-space(s) after self-closing HTML tag(s)
            '#<(img|input)(>| .*?>)#s',
            // Remove a line break and two or more white-space(s) between tag(s)
            '#(<!--.*?-->)|(>)(?:\n*|\s{2,})(<)|^\s*|\s*$#s',
            '#(<!--.*?-->)|(?<!\>)\s+(<\/.*?>)|(<[^\/]*?>)\s+(?!\<)#s', // t+c || o+t
            '#(<!--.*?-->)|(<[^\/]*?>)\s+(<[^\/]*?>)|(<\/.*?>)\s+(<\/.*?>)#s', // o+o || c+c
            '#(<!--.*?-->)|(<\/.*?>)\s+(\s)(?!\<)|(?<!\>)\s+(\s)(<[^\/]*?\/?>)|(<[^\/]*?\/?>)\s+(\s)(?!\<)#s', // c+t || t+o || o+t -- separated by long white-space(s)
            '#(<!--.*?-->)|(<[^\/]*?>)\s+(<\/.*?>)#s', // empty tag
            '#<(img|input)(>| .*?>)<\/\1>#s', // reset previous fix
            '#(&nbsp;)&nbsp;(?![<\s])#', // clean up ...
            '#(?<=\>)(&nbsp;)(?=\<)#', // --ibid
            // Remove HTML comment(s) except IE comment(s)
            '#\s*<!--(?!\[if\s).*?-->\s*|(?<!\>)\n+(?=\<[^!])#s'
        ),
        array(
            '<$1$2</$1>',
            '$1$2$3',
            '$1$2$3',
            '$1$2$3$4$5',
            '$1$2$3$4$5$6$7',
            '$1$2$3',
            '<$1$2',
            '$1 ',
            '$1',
            ""
        ),
    $input);
}
// CSS Minifier => http://ideone.com/Q5USEF + improvement(s)
function minify_css($input) {
    if(trim($input) === "") return $input;
    return preg_replace(
        array(
            // Remove comment(s)
            '#("(?:[^"\\\]++|\\\.)*+"|\'(?:[^\'\\\\]++|\\\.)*+\')|\/\*(?!\!)(?>.*?\*\/)|^\s*|\s*$#s',
            // Remove unused white-space(s)
            '#("(?:[^"\\\]++|\\\.)*+"|\'(?:[^\'\\\\]++|\\\.)*+\'|\/\*(?>.*?\*\/))|\s*+;\s*+(})\s*+|\s*+([*$~^|]?+=|[{};,>~+]|\s*+-(?![0-9\.])|!important\b)\s*+|([[(:])\s++|\s++([])])|\s++(:)\s*+(?!(?>[^{}"\']++|"(?:[^"\\\]++|\\\.)*+"|\'(?:[^\'\\\\]++|\\\.)*+\')*+{)|^\s++|\s++\z|(\s)\s+#si',
            // Replace `0(cm|em|ex|in|mm|pc|pt|px|vh|vw|%)` with `0`
            '#(?<=[\s:])(0)(cm|em|ex|in|mm|pc|pt|px|vh|vw|%)#si',
            // Replace `:0 0 0 0` with `:0`
            '#:(0\s+0|0\s+0\s+0\s+0)(?=[;\}]|\!important)#i',
            // Replace `background-position:0` with `background-position:0 0`
            '#(background-position):0(?=[;\}])#si',
            // Replace `0.6` with `.6`, but only when preceded by `:`, `,`, `-` or a white-space
            '#(?<=[\s:,\-])0+\.(\d+)#s',
            // Minify string value
            '#(\/\*(?>.*?\*\/))|(?<!content\:)([\'"])([a-z_][a-z0-9\-_]*?)\2(?=[\s\{\}\];,])#si',
            '#(\/\*(?>.*?\*\/))|(\burl\()([\'"])([^\s]+?)\3(\))#si',
            // Minify HEX color code
            '#(?<=[\s:,\-]\#)([a-f0-6]+)\1([a-f0-6]+)\2([a-f0-6]+)\3#i',
            // Replace `(border|outline):none` with `(border|outline):0`
            '#(?<=[\{;])(border|outline):none(?=[;\}\!])#',
            // Remove empty selector(s)
            '#(\/\*(?>.*?\*\/))|(^|[\{\}])(?:[^\s\{\}]+)\{\}#s'
        ),
        array(
            '$1',
            '$1$2$3$4$5$6$7',
            '$1',
            ':0',
            '$1:0 0',
            '.$1',
            '$1$3',
            '$1$2$4$5',
            '$1$2$3',
            '$1:0',
            '$1$2'
        ),
    $input);
}
// JavaScript Minifier
function minify_js($input) {
    if(trim($input) === "") return $input;
    return preg_replace(
        array(
            // Remove comment(s)
            '#\s*("(?:[^"\\\]++|\\\.)*+"|\'(?:[^\'\\\\]++|\\\.)*+\')\s*|\s*\/\*(?!\!|@cc_on)(?>[\s\S]*?\*\/)\s*|\s*(?<![\:\=])\/\/.*(?=[\n\r]|$)|^\s*|\s*$#',
            // Remove white-space(s) outside the string and regex
            '#("(?:[^"\\\]++|\\\.)*+"|\'(?:[^\'\\\\]++|\\\.)*+\'|\/\*(?>.*?\*\/)|\/(?!\/)[^\n\r]*?\/(?=[\s.,;]|[gimuy]|$))|\s*([!%&*\(\)\-=+\[\]\{\}|;:,.<>?\/])\s*#s',
            // Remove the last semicolon
            '#;+\}#',
            // Minify object attribute(s) except JSON attribute(s). From `{'foo':'bar'}` to `{foo:'bar'}`
            '#([\{,])([\'])(\d+|[a-z_][a-z0-9_]*)\2(?=\:)#i',
            // --ibid. From `foo['bar']` to `foo.bar`
            '#([a-z0-9_\)\]])\[([\'"])([a-z_][a-z0-9_]*)\2\]#i'
        ),
        array(
            '$1',
            '$1$2',
            '}',
            '$1$3',
            '$1.$3'
        ),
    $input);
}

Associated friends, companies, and partnerships.

This is a small list of people, products & companies I am currently associated with:

  1. hunto.ai
  2. tikaj.com
  3. quickboarding.com : No-code/Low Code Solution for quick-onboarding
  4. blinkstore.in : Create your own merch store
  5. UPADI Vet : A cattle equipment store
  6. Passwrd.in : A simple password generator
  7. thejetboy.com : Awesome blog on Aerospace

Maintaining JS Libs – Changing bower path

If you are a web developer, it becomes hard to manage all your JS libraries if you aren't using a package manager.
A few folks created Bower to make the job easier. (As far as functionality is concerned, it works much like npm, if you have used that.)

The installation steps are just for Linux folks – Windows people, DIY.

First you need to install npm: `sudo apt-get install npm`
Then you can install Bower with `sudo npm install -g bower`

Once you have installed Bower, you can install a web package just by executing the following command in your project directory –

`bower install jquery`

You can even use your own Git repository, or any other URL, as the source:

`bower install https://github.com/jquery/jquery.git#2.0.3`

The above command will install jQuery for your project.

Once it's installed successfully, everything is kept in
`bower_components/jquery`
and you can reference it via
`bower_components/jquery/dist/jquery.js`. Simple. That's it?

No.

If you are like me – possessive about your directory structure – you may want everything inside something like `assets/lib/jquery/dist/jquery.js`, or simply want it inside an `assets/lib` directory instead of having too many files/directories in the project root.

You can create a file named `.bowerrc` in the root of the project and put the following content inside:
`{
  "directory": "assets/lib"
}
`

Since the filename begins with `.`, it will be hidden in file browsers. Clean.

Again – do you find these long asset paths annoying? `Grunt` to the rescue – check http://bower.io/docs/tools/#grunt

Optimal way of scheduling long jobs – Node.js/JS

If you are using Node.js for some backend work and need to schedule jobs, you will probably hate scheduling because of two factors:

  1. You can't set a long timeout
  2. Timers are lost when the process restarts

You would probably favour cron jobs for scheduling – or maybe polling the database at regular intervals. Loading a Node.js script every 5 minutes is a costly job – developers understand this – so you can't follow the PHP/short-interval style of scheduling.

The first problem can be solved by setting up a recursive call to a function with a timeout – there are many solutions on the web.
But there lies a problem – after a recursive call you can't cancel the timeout any more if you need to, because the timer ID changes. To solve that recursive-call problem, I wrote a few lines of code:

// setTimeout can only handle delays up to 2^31 - 1 ms (~24.8 days).
// For longer delays, chain timeouts and report each new timer ID through
// the optional updateTimerID callback so the caller can still clearTimeout() it.
function setLongTimeout(callback, timeout_ms, updateTimerID) {
    var MAX_DELAY = 2147483647; // max 32-bit signed int
    var timer;
    if (timeout_ms > MAX_DELAY) {
        // Wait for the maximum delay, then schedule the remainder.
        timer = setTimeout(function () {
            setLongTimeout(callback, timeout_ms - MAX_DELAY, updateTimerID);
        }, MAX_DELAY);
    } else {
        timer = setTimeout(callback, timeout_ms);
    }
    if (typeof updateTimerID === 'function')
        updateTimerID(timer);
    return timer;
}

The above code behaves the same as `setTimeout` if you don't need timer ID updates; otherwise you can register a callback for them.
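Usage is the same as `setTimeout`, plus the optional callback that keeps handing you the current timer ID so the job can still be cancelled (the 30-day delay below is just an example):

var currentTimer = null;

// Schedule a job ~30 days out; remember the latest timer ID.
setLongTimeout(function () {
    console.log('monthly job fired');
}, 30 * 24 * 60 * 60 * 1000, function (id) {
    currentTimer = id;
});

// Later, if the job is no longer needed:
// clearTimeout(currentTimer);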

Second problem – timers are removed when the process restarts. That shouldn't be hard – you can create a function that executes atomic jobs.
Note 'atomic' – atomic because if your setTimeout code depends on variable state, you will need to load those variables from the database, which makes the job harder. A better approach is to schedule something like 'send email to [email protected]' instead of 'send message to currently logged-in users'.

The solution to the second problem really depends on your use case – but if you analyze your scheduled jobs closely, you will find that they are really atomic; you made them non-atomic!
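One way to survive restarts – a minimal sketch, assuming the jobs are stored as simple { runAt, type, payload } entries in a jobs.json file (a database table works the same way): on boot, reload every pending job and hand it back to setLongTimeout.

var fs = require('fs');

// Hypothetical handlers, one per atomic job type.
var handlers = {
    sendEmail: function (payload) { console.log('emailing', payload.to); }
};

function reschedulePendingJobs() {
    var jobs = JSON.parse(fs.readFileSync('jobs.json', 'utf-8'));
    jobs.forEach(function (job) {
        // Jobs whose time passed while the process was down run immediately.
        var delay = Math.max(job.runAt - Date.now(), 0);
        setLongTimeout(function () {
            handlers[job.type](job.payload);
        }, delay);
    });
}

reschedulePendingJobs();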

If you really need your jobs to never be lost from memory, a better way is to run a stable Node server & use IPC – but that would be practically hard to maintain.