Thursday 22 December 2011

YOW 2011 - loose thoughts

The talks I attended at this year's conference focused on Agile, functional programming and low-level performance. It turned out to be quite an interesting mix.

Mary Poppendieck talked about ways of supporting continuous design. One piece of advice was to create an environment that facilitates fair fights. To achieve this you need to hire good, diverse people. Their diversity lets them solve complex problems because they can approach them from multiple angles. Once you have a team like that, leave them alone and simply keep supporting them.

I didn’t learn much new at the Agile Architecture workshop delivered by Rebecca Wirfs-Brock, but it was good to hear that I’m not some kind of weirdo who demands the impossible :).

Simon Peyton Jones delivered two amazing sessions about functional programming, full of passion. I’ve done a bit of this type of programming before, but I was surprised by Haskell’s type system and its ability to enforce the absence of side effects at compile time. On top of that, on my way back to Sydney I had a good chat with Tony Morris, who told me about a project where he used functional programming to create a composable data access layer. Composability and strict control over side effects are enough to push Haskell to the top of my list of languages to play with :).
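
To give a feel for what that compile-time guarantee looks like, here is a tiny sketch of my own (not code from the talk): a function typed Int -> Int simply cannot do IO, while anything side-effecting is tagged with IO in its type, and the compiler keeps the two worlds apart.

    -- A pure function: its Int -> Int type guarantees it cannot perform IO,
    -- touch mutable state, or have any other side effect.
    double :: Int -> Int
    double x = x * 2

    -- A side-effecting action is marked with IO in its type. Calling it from
    -- inside a pure function is a type error, caught at compile time.
    greet :: String -> IO ()
    greet name = putStrLn ("Hello, " ++ name)

    main :: IO ()
    main = do
      greet "YOW"
      print (double 21)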

Inspired by Coderetreat Brisbane, organized by Mark Ryall, I’ve decided to use Conway’s Game of Life as the problem I will try to solve every time I learn a new language. It worked for CoffeeScript and I hope it will work for Haskell.
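
Here is a rough outline of the kind of Haskell solution I have in mind, modelling the board as a set of live cells (just a sketch, nothing polished):

    import qualified Data.Set as Set
    import Data.Set (Set)

    type Cell = (Int, Int)

    -- The eight neighbours of a cell.
    neighbours :: Cell -> [Cell]
    neighbours (x, y) =
      [ (x + dx, y + dy) | dx <- [-1, 0, 1], dy <- [-1, 0, 1], (dx, dy) /= (0, 0) ]

    -- One generation: a cell is alive in the next step if it has exactly three
    -- live neighbours, or if it is alive now and has two live neighbours.
    step :: Set Cell -> Set Cell
    step live = Set.filter alive candidates
      where
        candidates = Set.union live (Set.fromList (concatMap neighbours (Set.toList live)))
        liveNeighbours c = length (filter (`Set.member` live) (neighbours c))
        alive c = liveNeighbours c == 3
               || (Set.member c live && liveNeighbours c == 2)

    -- A blinker oscillates between a horizontal and a vertical bar.
    main :: IO ()
    main = print (step (Set.fromList [(0, 1), (1, 1), (2, 1)]))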

The end of the conference was filled with performance tips from the .NET (Joel Pobar and Patrick Cooney) and Java (Martin Thompson) lands. Both sessions emphasized that computation is cheap and memory access is slow. Simply keeping all required data in the L1/L2 CPU cache can cut execution time in half. Obviously this was demonstrated with a micro-benchmark, but it is still something worth keeping in mind.
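
The benchmarks in the talks were .NET and Java, but the idea can be sketched in Haskell as well (this assumes the vector package and is only a rough illustration, not the speakers’ code): summing a contiguous unboxed vector streams through the cache, while summing the same numbers stored as a linked list means chasing pointers all over the heap.

    import qualified Data.Vector.Unboxed as V
    import System.CPUTime (getCPUTime)

    -- Rough timing helper: reports CPU time in milliseconds.
    time :: String -> IO a -> IO a
    time label action = do
      start  <- getCPUTime
      result <- action
      end    <- getCPUTime
      putStrLn (label ++ ": " ++ show ((end - start) `div` 1000000000) ++ " ms")
      return result

    n :: Int
    n = 1000000

    main :: IO ()
    main = do
      let vec = V.enumFromN (1 :: Int) n  -- contiguous block of unboxed Ints
          xs  = [1 .. n] :: [Int]         -- heap-allocated cons cells
      -- Force both structures up front so only the traversal is timed.
      _ <- return $! V.length vec
      _ <- return $! length xs
      _ <- time "unboxed vector (cache friendly)" (return $! V.sum vec)
      _ <- time "linked list (pointer chasing)  " (return $! sum xs)
      return ()

Compiled with ghc -O2 the vector version should come out well ahead, though the exact numbers obviously depend on the machine and the flags.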

Functional languages rely on immutable data structures to isolate processing and in this way control side effects. Unfortunately, this means that a lot of memory gets pushed around, which has a significant influence on overall performance. Again, it’s all about trade-offs :).
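
A small illustration of what that means in practice, using Data.Map from the containers library: an “update” never mutates the old structure, it allocates a new version (sharing what it can), so the old one stays valid but the garbage collector ends up with more work.

    import qualified Data.Map as Map

    main :: IO ()
    main = do
      let prices  = Map.fromList [("apple", 1), ("pear", 2)] :: Map.Map String Int
          -- insert never mutates: it allocates fresh nodes along the path to the
          -- key and returns a new map sharing the untouched parts with the old one.
          prices' = Map.insert "apple" 3 prices
      print (Map.lookup "apple" prices)   -- Just 1: the original is unchanged
      print (Map.lookup "apple" prices')  -- Just 3: only the new version sees the update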

Martin talked about “mechanical sympathy”, which boils down to having at least a basic understanding of the hardware your software runs on. Without it, your software not only fails to take full advantage of the underlying hardware but often works against it, which has a severe impact on overall performance. It’s one of those pieces of advice that most of us won’t use on a daily basis, as most of our infrastructure is in the Cloud, but it’s good to keep in mind.

We’ve been told multiple times that the “free lunch is over” and CPUs are not getting any faster. Martin showed that this is not quite correct. It’s true that we are not getting more MHz; we are actually getting fewer MHz than we used to, but CPUs keep getting faster because every year engineers design better chips. The message Martin is trying to send is that we should prove or disprove assumptions based on evidence, not beliefs.

All in all it was another good conference and I will try to attend it next year.