This blog contains reflections and thoughts on my work as a software engineer

Saturday, December 29, 2007

The Pragmatic Programmer - book review

And Christmas came... Somehow it always comes as a small surprise, even though the 24th of December is a well-defined, hardcoded constant in every calendar available here in Denmark.

I have spent the hours from approximately 10pm to 11pm for the last week or so reading a book by Andrew Hunt and David Thomas. The book is named The Pragmatic Programmer, and in short it hails a pragmatic approach to developing software and offers a walkthrough of the various concepts and tools which can help individuals and teams adopt this approach.

I had no idea what the book was about - it was handed to me by one of my colleagues just before we left the office for Christmas leave. I had the choice between this book and another labelled "Pragmatic source control"... I know myself well enough to know that if I read a book which excels in detailed descriptions of setting up servers and programs, I will give up on it after 2 or 3 chapters and move on to something more interesting. Writing about the YAGNI principle on my blog, for instance ;o)

The choice was easy to make - I went for The Pragmatic Programmer. Having read it, I can tell you that it delivers much of what is promised in the foreword. I did feel that the book and its foreword form something of a paradox: the foreword describes the content as being close to a silver bullet for developing software the "right" way, while the authors themselves make quite a big point of clarifying that there is no "right" way of doing things and that no piece of software ever written is even close to perfect...

However - at the end of the day these are only my personal views, and they should not in any way blur the fact that this is a great book for anyone working with software development. It aims primarily at the people actually coding the software, but there are a lot of views and golden tips hidden within which deserve consideration from more parties in a software project than just the coders. An example is the story on how requirements gathering should be approached: software requirements are not something you gather - they are something hidden which you dig for. I love this way of describing complex social behaviour in simple terms...

Another great metaphor is that developing software is like founding and maintaining a garden. Both environments are highly organic, and both depend heavily on external factors. Neither is "done" when you have finished working on it - both can decay, and you have to pay attention to maintenance to preserve their state. Both require tools and educated staff, and the right tools shorten the maintenance time required. I will remember this metaphor forever, because it is something everyone owning a house with even the smallest garden can relate to. This metaphor and others like it are the strongest part of the book, because they enable the software engineers on a project to communicate needs and requirements to non-technical staff.

A few other issues worth mentioning: there is a comprehensive list of tools, and the book tries to embrace both the single developer and whole software projects within the pragmatic paradigm. I feel that the part of the book about "Pragmatic Teams" and similar chapters violates the Don't Repeat Yourself (DRY) principle also described in the book. If you follow the principles and tips you have already read, you can deduce most of the issues described in the Pragmatic Teams chapter yourself. An example: quality is described in the first chapters using the Broken Windows metaphor - the conclusion being that you should always fight beginning software rot. Then in the final chapters about Pragmatic Teams you run into statements such as "Quality is a team issue", which is true - but if you have already read that software decay is something to fight proactively through review and refactoring, you have already made it a team issue. There are a few of these DRY violations, but again: if you do not intend to read the book from one end to the other like I did - because you are, say, a Project Manager or Product Owner - the chapter itself has great value.

Final thoughts: read this book if you are in any way involved in developing and delivering software. It can provide you with tools and thoughts which you can use to ease communication. Furthermore, it digs into basic tools and patterns for how to approach various concepts of software engineering. I enjoyed 95% of the book, and compared to the vast majority of the books I read nowadays, 95% is well above Good Enough for me ;o)


Links: The Pragmatic Programmer website

Friday, December 21, 2007

Remotely log off an RDP session

If you have ever been met with the "Terminal server has exceeded the maximum number of connections" error message while attempting to log on to a machine using RDP, you can remotely shut down one of the existing connections using PsTools... How cool is that? :o)

http://codebetter.com/blogs/gregyoung/archive/2007/12/20/rdp-stuck.aspx

Thursday, December 20, 2007

Constant requiring explicit parsing == evil

I'm working on a project which includes the use of a service-enabled image uploader and image viewer. The service defines a load of constants and properties in web.config, which is fair enough.

However, when porting the assemblies to our local machines (we run all services, source and subsystems locally to be able to work truly disconnected) we discovered a strange error while attempting to upload a test image to the service. Due to a lack of logging within the image service itself, the cause wasn't apparent at first, but a quick review with the programmer in charge of the image service revealed an issue with the configuration file.

One of the constants in the configuration defined the maximum height and width of an image being uploaded. If an image exceeded the defined maximums, the upload would be rejected. The constant was defined like this:

1024;768

What made the image service burp on our local machines turned out to be our regional settings, which caused the configuration code to try to interpret 1024;768 as a single integer.

With our retrospective caps well in place, what could or should we have done better to prevent such a problem from occurring?

- Never define constants which require explicit parsing to become valid
- If you decide to anyway, provide detailed logging before and after the parsing occurs (see the sketch below)
- Take into account within your software that machine regional settings are - well, machine dependent ;o)
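
To make the first two bullets concrete, here is a minimal sketch of locale-proof parsing. It's Java rather than the actual service code, and the constant name and the width;height layout are my own assumptions based on the example above:

public class ImageLimits {
    // Hypothetical constant mirroring the config value above: "1024;768"
    private static final String MAX_SIZE = "1024;768";

    public static void main(String[] args) {
        // Log the raw value before parsing (second bullet above).
        System.out.println("Parsing image size constant: '" + MAX_SIZE + "'");

        // Split on the separator ourselves instead of letting a
        // culture-aware number parser guess what ';' means.
        String[] parts = MAX_SIZE.split(";");
        if (parts.length != 2) {
            throw new IllegalArgumentException(
                "Expected 'width;height', got: " + MAX_SIZE);
        }

        // Integer.parseInt is locale-independent, unlike NumberFormat,
        // so it behaves the same regardless of regional settings.
        int maxWidth = Integer.parseInt(parts[0].trim());
        int maxHeight = Integer.parseInt(parts[1].trim());

        // Log the result after parsing (second bullet again).
        System.out.println("Max upload size: " + maxWidth + "x" + maxHeight);
    }
}

The point is simply that both the separator and the number format are handled explicitly, so the code behaves identically no matter what the machine's regional settings say.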

Tuesday, December 18, 2007

Die Hard 4.0

I have just finished watching the movie Die Hard 4.0. I'm currently sick, and it seemed like a nice movie to watch while trying to take my mind off the battle of Immune System vs. Flu Virus currently raging inside my body.

So we rented the movie - and I must admit that while John McClane fought his way through a bunch of evil terrorists, and the entire USA was shut down by their mastermind, I couldn't help thinking "yeah, right...". The plot is basically that a former employee of the NSA (or whatever) got fired, found himself pretty mad about it, and decided to hack into all the critical infrastructure systems in the USA and shut them down in order to get at people's financial information and live as a rich man for the rest of his life.

I like a good movie - I really do. I like a good action movie too - and this wasn't a bad one. But over the last half year I have undergone a change in the way I perceive my profession, so I couldn't help thinking that the plot was ridiculous to the point of being purely naïve. Let's face it:

The evil terrorists ran everything from something resembling a shut-down factory in the suburbs. A few, but dedicated, men were able to hack into loads of various systems and control everything from gas pipes to phone lines, from surveillance cameras in elevators to defence communication systems. What would be the preconditions for such an act?


  • All systems of interest to our evil terrorists would have to be interconnected and online. If they were not online, nothing like this could ever take place.

  • All systems would have to be based on a similar framework and GUI. Otherwise nobody would ever be able to find their way into all the various subsystems and trigger the needed events. How the hell would anybody even want to try to make core frameworks and guidelines for all areas of public IT and expect the various departments to use them? All the same, it appeared that areas such as banking, the energy sector, transportation departments etc. all had state-of-the-art infrastructure, and I doubt that will ever happen... There's a reason why COBOL is still used in the banking world, and it is not for the UI package. I doubt public IT could ever be built on the same shared platform the way both the good and the bad guys used it in the movie.

  • All systems would have to have built-in support for shutting down public, critical infrastructure basically by pushing a few buttons. I don't fancy thinking of myself as a wise man - but I don't think it would be in the interest of the US of A to create such a system. I didn't mention YAGNI here, did I?

  • You would be able to control systems like traffic lights online - something which I believe is possible today, but if I were a decent system architect I would never allow somebody from the outside to modify the system in such a way that all lights in a crossing could be turned green at once. The simplest of algorithms could prove that command invalid - even if it came from a user logged in with SuperGod of SuperGod administrator rights. I would build in checks for evil commands and simply reject them on-site if somebody tried to tamper with the lights in a crossing (a small sketch of what I mean follows below) - I sincerely hope I'm not the first one to think this up.
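
For what it's worth, here is a minimal sketch of the kind of on-site check I mean. Everything in it - the language (Java), the enum, the rule - is my own invention for illustration, not anything from the movie or a real controller:

import java.util.EnumSet;
import java.util.Set;

public class CrossingController {
    enum Direction { NORTH_SOUTH, EAST_WEST }

    // Directions currently showing green.
    private final Set<Direction> green = EnumSet.noneOf(Direction.class);

    // The invariant lives inside the controller itself, so the check runs
    // on-site no matter who sent the command or what rights they hold.
    public boolean requestGreen(Direction dir) {
        for (Direction other : green) {
            if (other != dir) {
                System.out.println("Rejected " + dir + ": " + other + " is already green");
                return false;
            }
        }
        green.add(dir);
        System.out.println("Granted green for " + dir);
        return true;
    }

    public void setRed(Direction dir) {
        green.remove(dir);
    }

    public static void main(String[] args) {
        CrossingController crossing = new CrossingController();
        crossing.requestGreen(Direction.NORTH_SOUTH); // granted
        crossing.requestGreen(Direction.EAST_WEST);   // rejected - conflicting green
        crossing.setRed(Direction.NORTH_SOUTH);
        crossing.requestGreen(Direction.EAST_WEST);   // granted now
    }
}

A handful of lines of checking, and no "SuperGod" login can turn both directions green at once.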



There were loads of misleading details which actually made me lose focus on the fact that I had rented an action movie and shouldn't expect anything but action entertainment. However - being a software engineer, I can't help thinking "Come on guys - this one is simply too thick..." when the producers start showing off technical nonsense to the audience. Maybe it is just me - actually, I wouldn't advise anyone to bet against that assumption - but I think movies like these tend to pretend they know far more than they do about the things you are capable of when sitting in front of a computer.

Wednesday, December 12, 2007

How to make your hard drive suck less hard...

Here we go - this is what I would tell you to do to increase hard drive performance:

Don't use the Recycle Bin
Set the amount of hard drive space allocated to the Recycle Bin to zero. For my own part, I must say it is very seldom (if ever) that I accidentally delete something which is urgently needed. If you have your source control system set up correctly and don't work on anything which isn't properly backed up every night - why should your hard drive be cluttered with files you don't want, don't need and wouldn't miss if they were gone?

Get yourself a RAM-drive
For a few bucks you can get a license for XtraTools 2007. With this fabulous product you get a number of extremely useful tools for various computer maintenance tasks bundled in a single pack - and one of these tools is RAMDrive, which lets you set up a dedicated drive on your machine that lives only in memory. If you have, say, 200 MB of memory which you never use anyway, you can gain a lot of performance by creating a RAM drive and pointing your environment variables %TMP% and %TEMP% to it. The beauty of this setup is simple - because the files live only in memory, they are gone once you reboot your computer. Sweeeeeet........ You should also point your Temporary Internet Files to this drive for the same reason. Unless you are surfing on an oldschool 56 Kbps modem connection, you should never feel any delays when browsing. (A quick way to verify the redirection took is sketched below.)
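
If you want to verify that the redirection actually took, a few lines of code will do. Here is a minimal sketch in Java - assuming you have a JVM handy, and relying on the fact that on Windows the JVM derives java.io.tmpdir from the same temp settings:

import java.io.File;
import java.io.IOException;

public class TempCheck {
    public static void main(String[] args) throws IOException {
        // After pointing %TMP%/%TEMP% at the RAM drive, this should
        // print a path on that drive.
        System.out.println("Temp dir: " + System.getProperty("java.io.tmpdir"));

        // Anything created here now lives in memory and vanishes on reboot.
        File scratch = File.createTempFile("ramdrive-check", ".tmp");
        System.out.println("Created: " + scratch.getAbsolutePath());
    }
}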

Defragment your hard drive often
You should use the Sysinternals Power Defragmenter (a console app wrapped in a nice little GUI) to defragment the files on your hard drive. It takes a while to run and consumes quite a lot of CPU while it does, but you get a true boost in performance once all the DLLs in C:\Windows have been defragmented properly. No shit, Sherlock - I gained some 20+ seconds in reboot time once my entire C drive was defragmented. Run it once or twice a week on the files you use often - the C drive, your source repository, your Office files etc.