Nishant Kothary on Perfection

Thoughtful musings by Nishant Kothary on perfection in web development and design, but I most enjoyed this description of Russell’s Paradox, from Fermat’s Enigma: The Epic Quest to Solve the World’s Greatest Mathematical Problem:

Russell’s paradox is often explained using the tale of the meticulous librarian. One day, while wandering between the shelves, the librarian discovers a collection of catalogues. There are separate catalogues for novels, reference, poetry, and so on. The librarian notices that some of the catalogues list themselves, while others do not.

In order to simplify the system the librarian makes two more catalogues, one of which lists all the catalogues which do list themselves and, more interestingly, one which lists all the catalogues which do not list themselves. Upon completing the task the librarian has a problem: should the catalogue which lists all the catalogues which do not list themselves, be listed in itself? If it is listed, then by definition, it should not be listed. However, if it is not listed, then by definition it should be listed. The librarian is in a no-win situation.
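
(In set terms, the librarian’s troublesome catalogue is Russell’s set R = { x : x ∉ x }. Asking whether R ∈ R gives a contradiction either way: if R ∈ R then, by its own definition, R ∉ R, and vice versa.)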

Check out Nishant’s work on Microsoft’s Build 2013 website.

And also his copyright statement.


Good Advice on AngularJS

Angular.js has helped me understand what people mean when they describe a tool or framework as “opinionated”. Unlike Backbone, which, after you learn the importance of setters and getters, pretty much gets out of your way, Angular seems to have very specific expectations for how you should do things. While working on our current project, my team spent the first couple of days marveling at how straightforward Angular seemed, and then another couple of days trying to understand why we couldn’t do anything even a little more complex in a way that Angular liked.

More than any other piece of advice, Josh Miller’s Stack Overflow response to a question about thinking the ‘Angular way’ helped me out:

Don’t even use jQuery. Don’t even include it. It will hold you back. And when you come to a problem that you think you know how to solve in jQuery already, before you reach for the $, try to think about how to do it within the confines of AngularJS. If you don’t know, ask! 19 times out of 20, the best way to do it doesn’t need jQuery and to try to solve it with jQuery results in more work for you.

I had indeed been trying to do DOM manipulation with jQuery, and it was only after I took Josh’s advice to go completely without it that I started to make real progress with the problems I was trying to solve.
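
For example, the kind of thing I had been doing imperatively with jQuery (find an element, toggle it) mostly disappears once the state lives on the scope and the view reacts to it. A minimal sketch, with made-up module and controller names:

// Instead of reaching for jQuery to toggle a panel, e.g. $('#details').toggle(),
// the state goes on the scope and the markup declares how to react to it.
angular.module('app', []).controller('PanelCtrl', function ($scope) {
  $scope.showDetails = false;
  $scope.toggleDetails = function () {
    $scope.showDetails = !$scope.showDetails;
  };
});

// And in the template:
//   <div ng-controller="PanelCtrl">
//     <button ng-click="toggleDetails()">Details</button>
//     <div ng-show="showDetails">…</div>
//   </div>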

By the end of the week, I found myself ripping out large chunks of HTML that I’d thought were too different to template, and making up for the differences with controllers and some carefully applied scoping.

I’m still a little vague on what constitutes just cause for moving something into a directive (can’t the same things be accomplished with partials and controllers?), but Angular seems to be sinking in.
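
My working rule of thumb so far (just a sketch of my current understanding, not settled doctrine) is that anything which has to touch the DOM directly belongs in a directive, so controllers never hold element references. For example:

// Hypothetical example: a tiny attribute directive that focuses its element.
// The direct DOM access lives in the link function, not in a controller.
angular.module('app').directive('autoFocus', function () {
  return {
    restrict: 'A',
    link: function (scope, element) {
      element[0].focus();
    }
  };
});

// Usage: <input type="text" auto-focus>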

Underscore Contributor

I made my first open source contribution today, which is pretty exciting (for me). On Sunday night, I was working my way through Underscore’s unit tests, which I’m porting to Bound.js to use for testing Underscore compatibility. I noticed that one of the unit tests for lastIndexOf() was actually calling indexOf(), most likely because someone had just copied and pasted the unit tests for indexOf() and then forgotten to change one method call.
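
The shape of the bug was roughly this (paraphrased for illustration, not the actual Underscore test code):

// Paraphrased illustration of the copy-and-paste bug, not the real test:
test('lastIndexOf', function () {
  var numbers = [1, 0, 1, 0, 0, 1, 0, 0, 0];
  equal(_.indexOf(numbers, 1), 0, 'still calling indexOf');   // copied from the indexOf tests
  // what it should have been exercising:
  equal(_.lastIndexOf(numbers, 1), 5, 'lastIndexOf finds the last match');
});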

Easy fix, so I forked Underscore, cut a branch and submitted a pull request. This morning Brad Dunbar approved and merged it, and now I’m an Underscore contributor.

N Queens

While the topics and technologies are coming fast and furious at Hack Reactor, one assignment has held my attention more than any other in the last couple of weeks. Once I’m through the eleven-hour day, I find myself spending my copious free time revising my strategies for the ‘N Queens’ problem (in which some number n of queens must be placed on an n by n chessboard so that they do not threaten each other).

It has been a very long time since I wrote much recursive code, so my first pass at the problem was pretty naive, very expensive, and mostly focused on remembering how to structure a base case and a recursive call. I realized I could represent the board as an array of n elements (where each element’s index represents a piece’s column and its value represents the row), rather than as a matrix:

[image: n-queens-board]

var sampleSolution = [1,3,5,0,2,4];


Therefore, a brute-force approach to generating solutions only requires all permutations of the sequence 0 to n-1. My approach, however, was to generate every possible board in which each column held one queen, then iterate over each candidate, convert it to a two-dimensional array representing the board, and test that for queens conflicts. So, yeah. Slow.
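
For reference, generating those permutations recursively looks something like this (a sketch, not the code I actually wrote); the convert-to-a-matrix-and-check step isn’t shown:

// Sketch: build every permutation of the rows 0..n-1. Each permutation is one
// candidate board in the one-dimensional form above.
function allBoards(n) {
  var rows = [];
  for (var i = 0; i < n; i++) rows.push(i);

  function permute(remaining, chosen, results) {
    if (remaining.length === 0) {
      results.push(chosen);                          // one complete candidate board
      return results;
    }
    for (var i = 0; i < remaining.length; i++) {
      var rest = remaining.slice(0, i).concat(remaining.slice(i + 1));
      permute(rest, chosen.concat(remaining[i]), results);
    }
    return results;
  }

  return permute(rows, [], []);                      // n! candidates for an n-board
}

allBoards(3).length;   // 6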

My first rewrite was almost top-to-bottom. I started by rewriting my validity checking functions to run on the one-dimensional representation of the board, removing the step of converting each permutation to a matrix board when I checked it. This way, I could check boards as they were generated, and only push valid boards into the solutions array. That made a huge difference.
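
On the one-dimensional board the check is small (again, a sketch rather than my exact code): two queens conflict if they share a row, or if the distance between their columns equals the distance between their rows.

// Conflict check that works directly on the one-dimensional board:
// indexes are columns, so two queens can never share a column by construction.
function hasConflict(board) {
  for (var colA = 0; colA < board.length; colA++) {
    for (var colB = colA + 1; colB < board.length; colB++) {
      if (board[colA] === board[colB]) return true;               // same row
      if (Math.abs(colA - colB) === Math.abs(board[colA] - board[colB])) {
        return true;                                              // same diagonal
      }
    }
  }
  return false;
}

hasConflict([1, 3, 5, 0, 2, 4]);   // false: the sample solution above holds up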

Next, I started looking at ways that I could prune possible board spaces. Since a board permutation is generated column by column, I could run a validity check on the board every time a new element was added to the sequence. If a partial sequence had queens conflicts, then it was safe to ignore any permutations that began with this partial sequence. This change dramatically increased the performance for large boards.
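
Put together, the pruned generator looks roughly like this (a sketch of the approach, not my actual code):

// Place one queen per column, and abandon a branch as soon as the newest queen
// conflicts with any queen already on the partial board.
function solveNQueens(n) {
  var solutions = [];

  function safeToAdd(board, row) {
    var col = board.length;                          // the column being filled
    for (var c = 0; c < col; c++) {
      if (board[c] === row) return false;                               // row clash
      if (Math.abs(col - c) === Math.abs(row - board[c])) return false; // diagonal
    }
    return true;
  }

  function placeQueen(board) {
    if (board.length === n) {
      solutions.push(board.slice());
      return;
    }
    for (var row = 0; row < n; row++) {
      if (safeToAdd(board, row)) {                   // prune: never extend a bad prefix
        board.push(row);
        placeQueen(board);
        board.pop();
      }
    }
  }

  placeQueen([]);
  return solutions;
}

solveNQueens(6).length;   // 4 solutions on a 6 by 6 board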

What next?

The immediately obvious improvement lay in the way I was testing partial sequences. For each board fragment (e.g. [1], [1,3], [1,3,5], etc.), I would test every element (column) for queens conflicts. To reduce the number of checks, I implemented a breadcrumb hash to track conflicts: I gave each diagonal on the board a number, and recorded in a simple array every diagonal that was occupied. This way, each new element added to the candidate board could be checked against the hash, rather than re-checking conflicts for every element placed so far.
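
Concretely, the idea is something like this (a sketch, not my exact implementation): number the '/' diagonals row + col and the '\' diagonals row - col + (n - 1), and mark rows and diagonals as occupied while the board is built up, so checking a new queen is a few array lookups instead of a scan of the whole prefix.

// Breadcrumb-style tracking: constant-time conflict checks against small arrays
// of occupied rows and diagonals, updated as queens are placed and removed.
function solveNQueensTracked(n) {
  var solutions = [];
  var usedRows = [];     // usedRows[row] is true while a queen occupies that row
  var usedDiag1 = [];    // '/' diagonals, numbered row + col
  var usedDiag2 = [];    // '\' diagonals, numbered row - col + (n - 1)

  function placeQueen(board) {
    if (board.length === n) {
      solutions.push(board.slice());
      return;
    }
    var col = board.length;
    for (var row = 0; row < n; row++) {
      var d1 = row + col;
      var d2 = row - col + (n - 1);
      if (usedRows[row] || usedDiag1[d1] || usedDiag2[d2]) continue;  // conflict
      usedRows[row] = usedDiag1[d1] = usedDiag2[d2] = true;           // breadcrumbs
      board.push(row);
      placeQueen(board);
      board.pop();                                                    // backtrack
      usedRows[row] = usedDiag1[d1] = usedDiag2[d2] = false;
    }
  }

  placeQueen([]);
  return solutions;
}

solveNQueensTracked(8).length;   // 92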

Oddly, this didn’t give me the speed improvement I was expecting. I suspect that’s a factor of my implementation, which is what I’m analyzing right now.

If I can nail that down, I’m going to investigate two more possible optimizations:

  1. I think I can check for diagonal conflicts using bitwise operations, which should be faster than my current checking method (there’s a rough sketch of the standard bitmask trick after this list). I know that using buffers in Node.js might provide even faster bitwise operations, but I’m trying to keep things in the browser for now, so we’ll see how that goes…
  2. Concurrency: I’m also wondering if I can divide the space of potential boards to be generated so that they can be worked on by separate workers. Not too sure about that yet…
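
For the bitwise idea, the direction I plan to explore is the standard bitmask formulation (a sketch of the technique, not working code of mine), here just counting solutions: one integer tracks occupied rows, and two more track the diagonals, shifted as the search advances a column.

// Standard bitmask counting sketch: each bit is a row; the diagonal attack masks
// are shifted by one position as the search moves to the next column.
function countSolutions(n) {
  var all = (1 << n) - 1;              // n low bits set: every row in play
  var count = 0;

  function solve(rows, leftDiags, rightDiags) {
    if (rows === all) {                // a queen in every row means a full board
      count++;
      return;
    }
    var open = ~(rows | leftDiags | rightDiags) & all;
    while (open) {
      var bit = open & -open;          // lowest free row in this column
      open -= bit;
      solve(rows | bit,
            ((leftDiags | bit) << 1) & all,
            (rightDiags | bit) >> 1);
    }
  }

  solve(0, 0, 0);
  return count;
}

countSolutions(6);   // 4
countSolutions(8);   // 92

Since this stays in plain 32-bit integer operations, it should run fine in the browser without needing Node buffers.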