Monday, August 31, 2009

[Misc] Two Pragprog Books Reviewed

Book Review: The Passionate Programmer and Pragmatic Thinking & Learning

Recently I have been getting more and more attracted to the books from "The Pragmatic Programmers / Bookshelf" (link). So I am sharing my thoughts in a review of two books for you. Here I review:
  1. Chad Fowler, "The Passionate Programmer", 2009
  2. Andy Hunt, "Pragmatic Thinking and Learning", 2008
So let's start:

1) Chad Fowler is well known in the Ruby and Rails scene. He shares his views in 53 short chapters, each carrying a message that is stated and then explained. These messages look a little like XP practices, and sometimes they really read like XP rules:
  • "29. Learn how to fail" (testing)
  • "18. Automate Yourself into a Job" (daily integration builds)
  • "28. Eight hour burn" (no overtime)
  • ...
But indeed his stories are nice to read, and they go far beyond XP rules. They tie personal experience together with passion and a possible new perspective for you. His main point is to step out of the daily routine, step back, get better, and build up new goals for yourself.

Chad is quite strong in selling his points, and most of them are really fun to read (for example "20. Mind Reader" or "45. You've already lost your job"). So this book is not for experts who have already found their mission doing independent consulting work for an Apache product of which they are a top committer. It's a book for the employee who wants to get motivated and possibly build up new perspectives in his career. And of course for beginners, who might be reading an XP book at the same time. Chad includes nice actions for each point, so that each one can be validated for yourself.

What distracted me a little is the analogy to other jobs. Many writers today mention that they have played (jazz) music in a band, and that the challenges in a jazz band are much the same as those a software developer faces. Chad elaborates a lot on this topic. Even Andy Hunt (see the next review) draws this analogy, and many other books (e.g. Presentation Zen by Garr Reynolds) cannot leave this point - e.g. be the worst guy in your team - untouched.

Nevertheless it's a fun read if you want to break out and the book should also be recommended in software engineering / programming lectures.

2) Andy Hunt has also written a remarkable book, combining cognitive science with software development. And indeed this is neither a neuroscience book nor a software development book. It is a wonderful walk through topics like:
  • Journey from Novice to Expert
  • This is your Brain
  • Get in your right mind
  • Debug your mind (how your mind works)
  • Learn deliberately
  • Gain Experience
  • Manage Focus
  • Beyond Expertise
This book could also have been named "Your Brain - The Missing Manual for Software Developers". It's a wonderful guide to understanding your brain and how to improve. The book is full of nice graphics, anecdotes, and actions for the reader to do. Throughout the book Andy collects tips, which are grouped together at the end in a nice reference card.

If you read this book you will find some topics that are not new to you, such as mind maps or wikis. But Andy puts them in context, gives a lot of advice, and touches on many points that will be new to you. For example, the intense description of the L-mode and R-mode helps a lot in reflecting on and using these modes in daily life. And there are a lot of other great new topics you can experience (such as morning pages, SQ3R, etc.).

There were really just a few pages that were uninteresting to me, such as his elaboration on "expect the unexpected" or his way of categorizing generations.

Nevertheless it's a very practical book, covering a wide range of topics from drawing (there are some drawing exercises for you inside) up to yoga techniques. And everything can be applied to your daily life, your job, or your software development. So this book is a clear buy recommendation, and even better, a good present for your hacking friend or partner.

Thursday, August 20, 2009

[Arch] UML Tools for Mac OS X

Following up on a question I received via Twitter, and the fact that a significant part of the developer community is using Macs, I thought this might be a good opportunity to discuss some "UML options" for the Mac. Now, this article is not meant as a definitive answer; I would hope for some follow-ups by readers in the comments.

OK, let's start: first there is the heavyweight stuff, most notably Visual Paradigm. A warning: this is a fat tool. However, among the fat tools it is the one I liked the most. I am not using it any more, but it is generally rather easy to use and very feature-rich. However, it is a pretty expensive commercial tool. Yes, they have a "community edition", because it is cool to have a community edition these days. But this one was (when I used it last year) rather a joke. See it as a test preview.

There are other commercial tools as well, e.g. Omondo. I don't know much about this one, though. Anyone?

On the other end of the spectrum are tools like UMLet (or Violet), which are also Java-based and work more or less well on the Mac, too. These tools are very basic, and one should not expect much. They are definitely not suited for "real" projects or commercial applications, but can be a nice option, e.g., for educational purposes. Sometimes one just needs to create some simple UML diagrams for a presentation, paper, or book; for such purposes these tools might be useful. Plus, both are Open Source tools.

Probably the best free (but not Open Source) UML tool, and the one I would recommend, is BOUML; it is surely worth a try. The main issue I have with nearly all free/OS UML tools is that they are often driven by a single person or just a very few developers. Hence the future of any particular tool is always a little unclear. To make things worse, there is no widely accepted open file format for UML diagrams that would allow switching tools easily. Hence selecting a UML tool is always a sort of lock-in situation.

Also worth considering is ArgoUML, which is also an Open Source tool and maybe the oldest one around. It has some issues, as all OS tools do, but apparently has a functioning community.

Finally, there are some more or less general-purpose drawing programs that can be used (with some limitations) for technical diagrams like EER or UML models as well, such as OmniGraffle or ConceptDraw; and OpenOffice Draw can also be used for general-purpose vector-oriented diagrams.

Would be happy about comments, experiences and further suggestions!

Thursday, July 02, 2009

[Tech] Monitor your WS calls

If you develop applications that consume web services from other applications or integration platforms, debugging can often be very frustrating. If you don't use the right debugging tools, you don't see the generated SOAP messages that are exchanged between the parties.

A very useful tool is the Open Source SOAP monitoring tool from predic8. The tool does the same job as the TCP monitor from Axis, but provides a more user-friendly UI and more settings and features:
  • Monitoring of SOAP and HTTP messages
  • Rule based SOAP routing
  • XML formatting and syntax highlighting for SOAP messages
  • Interception and modification of messages
  • HTTP chunking
  • HTTP 1.1
  • Loading and saving of configurations
  • Rich graphical User Interface
  • Resending of messages
The monitor acts as a proxy. Therefore your client application must send its SOAP/HTTP messages to the proxy monitor, which forwards them to the real endpoint. A Quick Start Guide is also available.
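To route traffic through the monitor, the client's endpoint URL is simply rewritten to point at the proxy's host and port (the monitor then forwards each request to the real service). Here is a minimal sketch of that rewriting in Java; the endpoint, host, and port values are hypothetical:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class SoapProxyConfig {

    // Rewrite a service endpoint so that SOAP/HTTP traffic passes through a
    // local monitoring proxy. The proxy host and port here are hypothetical;
    // the monitor itself forwards each request to the original endpoint.
    public static String viaMonitor(String realEndpoint, String proxyHost, int proxyPort) {
        try {
            URI real = new URI(realEndpoint);
            // Keep scheme, path, and query; swap host and port for the monitor's.
            URI proxied = new URI(real.getScheme(), null, proxyHost, proxyPort,
                    real.getPath(), real.getQuery(), null);
            return proxied.toString();
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException("bad endpoint: " + realEndpoint, e);
        }
    }

    public static void main(String[] args) {
        String original = "http://services.example.com/orders/soap";
        System.out.println(viaMonitor(original, "localhost", 2000));
        // prints http://localhost:2000/orders/soap
    }
}
```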

[Pub] Mule Tutorial

In the current issue of the Java Magazin I published a tutorial on developing loosely coupled systems with Mule. The tutorial illustrates the usage of an Enterprise Service Bus in an airport domain, where different airport systems communicate with each other over the ESB. In the example I use a set of important Enterprise Integration Patterns and show how these patterns are implemented in Mule. Some of the patterns I used are:
  • Event Driven Consumer
  • Content Based Router
  • Filter
  • Transformation
  • Message Splitter
The transports and connectors I used from Mule are:
  • JMS (Active MQ as message broker)
  • Quartz Transport
  • File Transport
  • XMPP transport for instant messaging
The source code of the tutorial can be downloaded here.
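Mule declares such routing in its XML configuration, but the content-based router pattern itself is easy to sketch in plain Java. The channel names and the "type" field below are illustrative, not taken from the tutorial:

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class ContentBasedRouter {

    // Each route maps a predicate over the message content to an outbound channel name.
    private final List<Map.Entry<Predicate<Map<String, String>>, String>> routes = new ArrayList<>();
    private final String defaultChannel;

    public ContentBasedRouter(String defaultChannel) {
        this.defaultChannel = defaultChannel;
    }

    public ContentBasedRouter when(Predicate<Map<String, String>> condition, String channel) {
        routes.add(new AbstractMap.SimpleEntry<>(condition, channel));
        return this;
    }

    // First matching rule wins; unmatched messages go to the default channel.
    public String route(Map<String, String> message) {
        for (Map.Entry<Predicate<Map<String, String>>, String> r : routes) {
            if (r.getKey().test(message)) {
                return r.getValue();
            }
        }
        return defaultChannel;
    }

    public static void main(String[] args) {
        ContentBasedRouter router = new ContentBasedRouter("jms://dead-letter")
                .when(m -> "arrival".equals(m.get("type")), "jms://arrivals")
                .when(m -> "departure".equals(m.get("type")), "jms://departures");
        System.out.println(router.route(Map.of("type", "arrival", "flight", "OS101")));
        // prints jms://arrivals
    }
}
```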

Have Fun!

Monday, June 29, 2009

[Misc] Hot deployment with Mule 3 M1

Some interesting news from the Open Source ESB Mule: the first milestone of the third version of Mule is out and comes with a major new feature: hot deployment.

What is the meaning of hot deployment?

Hot deployment is the process of deploying or redeploying service components without having to restart your application container. This is very useful in a production environment where multiple applications are connected over the enterprise service bus, as components can be updated without impacting the users of those applications.
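The core idea can be illustrated with a tiny sketch (just the concept, not Mule's API): if callers resolve services through a registry by name, an implementation can be swapped at runtime without restarting anything:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class ServiceRegistry {

    // Minimal illustration of the idea behind hot deployment: callers look
    // services up by name, so a component can be replaced at runtime without
    // restarting the container. This is not Mule's API, just the concept.
    private final Map<String, Function<String, String>> services = new ConcurrentHashMap<>();

    public void deploy(String name, Function<String, String> impl) {
        services.put(name, impl); // deploy or redeploy -- callers are unaffected
    }

    public String call(String name, String payload) {
        return services.get(name).apply(payload);
    }

    public static void main(String[] args) {
        ServiceRegistry esb = new ServiceRegistry();
        esb.deploy("echo", p -> "v1:" + p);
        System.out.println(esb.call("echo", "hello")); // prints v1:hello
        esb.deploy("echo", p -> "v2:" + p);            // hot redeploy
        System.out.println(esb.call("echo", "hello")); // prints v2:hello
    }
}
```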

Check out the example on the mule homepage.

Thursday, June 18, 2009

[Misc] Resilient Services & Software Engineering

I recently read the interesting paper by Brad Allenby and Jonathan Fink, "Toward Inherently Secure and Resilient Societies", published in Science, August 2005, Vol. 309, and, surprisingly enough, free to download. This paper is apparently "inspired" by the attack on the World Trade Center, but discusses the resilience of important systems our modern societies depend on in a more general way. The authors' definition of resilience is:
"Resiliency is defined as the capability of a system to maintain its functions and structure in the face of internal and external change and to degrade gracefully when it must."
They further state that:
"[...] the critical infrastructure for many firms is shifting to a substantial degree from their physical assets, such as manufacturing facilities, to knowledge systems and networks and the underlying information and communications technology systems and infrastructure.

[...] the increased reliance on ICT systems and the Internet implied by this process can actually produce vulnerabilities, unless greater emphasis is placed on protecting information infrastructures, especially from deliberate physical or software attack to which they might be most vulnerable given their current structure."
The authors apparently have more physical infrastructure in mind (like physical network backbones and the like); however, I am a little more worried about the pace at which certain types of pretty fragile IT services are becoming a foundation for our communication and even our business models.

I wrote in a recent blog post about my thoughts on Twitter, which has become even more important considering the latest political issues in Iran and the use of this communication infrastructure in the conflict. Twitter is (as we know from the past) not only a rather fragile system; it is also proprietary and has no fallback solution in place in case of failure.

But Twitter is not the only example: many of the new "social networks" are proprietary and grow very fast, and one wonders how stable the underlying software, hardware, and data-management strategies are. Resilience is apparently not a consideration in a fast-changing and highly competitive market. At least not until now.

But not only market forces are troubling these days; there are also political activities that can affect large numbers of systems. Consider the new "Green Dam" initiative, where Chinese authorities demand that each Windows PC have a piece of filtering software pre-installed that is supposed to keep "pornography" away from children. This is of course the next level of Internet censorship, but that is not my point here. My point is that this software will probably be installed on millions of computers and poses a significant threat to the security of the Internet in case of security holes.

Analyses of the Green Dam system have already revealed a number of serious issues. For example, Technology Review writes about potential zombie networks, and Wolchok et al. describe a series of vulnerabilities. And this is not the only attempt in that direction. Germany, for example, is discussing "official" computer worms to be installed by the authorities on suspects' computers to analyse their activities. France and Germany want to implement Internet censorship using blocking lists of websites. The lists of blocked websites are not to be revealed, and it is questionable who controls the infrastructure. Similar issues can be raised here.

I believe that software engineering, too, should start dealing with the resilience of ICT services and describe best practices and test strategies that help engineers develop resilient systems, but also allow assessing the risks involved in deployed systems. I am afraid we are more and more building important systems on top of very fragile infrastructure, and this poses significant risks for our future society. This infrastructure might be fragile on many levels:
  • Usage of proprietary protocols and software that makes migration or graceful degradation very difficult
  • Deployment of proprietary systems to a large number of computers that cannot be properly assessed in terms of security vulnerabilities or other potential misuses, instead of providing the option to deploy systems from different vendors for a specific purpose
  • Single points of failure: many of the new startups operate only very few datacenters, probably even on one single location
  • Interdependence of services (e.g. one service uses one or multiple potentially fragile services)
  • Systems that can easily be influenced by pressure groups (e.g. centralised infrastructure vs. p2p systems) e.g. to implement censorship
  • Weak architecture (e.g. systems are not scaling)
  • Missing fallback scenarios and graceful degradation.
Comments?

Saturday, June 13, 2009

[Misc] Technical Debt

Recently I stumbled upon a smart blog entry about 'technical debt' (link).

The idea is quite nice: imagine everyone had a 'perfect' software system in mind to be built. Well, in fact we live in the 'real world', where a 100% perfect project is always a goal but never the current status. But of course we all strive for 100%, just as we strive for 100% test coverage.

But the fact is that some companies / developers build better code and some build slightly worse code. Now imagine if we could measure this 'badness'. Of course a 100% accurate and objective measurement is not possible. But Sonar from Codehaus tries to go that way.

The technical debt is shown:
  • in $ (!!! ouch this hurts)
  • in a spider figure
  • in the form of numbers you can drill down
What they measure includes at least:
  • Code coverage
  • Complexity
  • Code duplication
  • Violations
  • Comments
There might be more measurements integrated soon. And you will surely agree that code comments should have a different weight than code complexity. Should they?! What I suggested in my comment is that it would be great if this measurable debt became a standard for all projects.
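To make the weighting question concrete, here is a hypothetical sketch of how such metrics could be folded into a dollar figure. The weights and the hourly rate are invented for illustration; Sonar's actual debt model is more elaborate:

```java
public class TechnicalDebt {

    // Assumed remediation costs in developer-hours per unit of each metric
    // gap, and an assumed hourly rate. All numbers are invented for
    // illustration; Sonar's real debt model is more elaborate.
    static final double HOURLY_RATE = 60.0; // $ per developer-hour

    public static double debtInDollars(int uncoveredLines, int duplicatedBlocks,
                                       int violations, int uncommentedApis) {
        double hours = uncoveredLines   * 0.05   // write a covering test
                     + duplicatedBlocks * 2.0    // refactor a duplicated block
                     + violations       * 0.1    // fix a rule violation
                     + uncommentedApis  * 0.2;   // document a public API
        return hours * HOURLY_RATE;
    }

    public static void main(String[] args) {
        // 1200 uncovered lines, 15 duplicated blocks, 300 violations, 80 undocumented APIs
        System.out.printf("technical debt: $%.2f%n", debtInDollars(1200, 15, 300, 80));
    }
}
```

Changing a single weight shifts the bottom line noticeably, which is exactly why the relative weights (comments vs. complexity) are worth arguing about.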

Software development companies could use a low debt as a marketing instrument. And they would likely sell more! Buyers would check the technical debt of the software they buy, as a standard procedure. If the debt is low, the product might be a good and changeable investment that can grow.

If the debt is high, the vendor has a problem. Vendors might think they can lock buyers in because buyers don't check the technical debt. But I am sure times will change, and tools like this will be standard in IDEs in 5 to 10 years, even for checking projects in multiple languages.

So for me it's time to confront the boss with the hard dollars he has to pay back, sooner or later; the later, the more expensive. Let's fight for technical debt / good metrics analysis as a common procedure!

Stefan Edlich

Tuesday, June 09, 2009

[Tech] Cloud Computing

As I think I already mentioned here, I believe that Cloud Computing (and Software as a Service, but that is a slightly different topic) is a true game changer in our understanding of software infrastructure and development/deployment. Currently things are still quite rough around the edges, but I believe that in three to five years the default option for application deployment will be one cloud service or another. Putting iron into the cellar or storage room will then be what, in my opinion, it should be: mostly a stupid idea ;-)

In the current stream of IT Conversations, George Reese talks about practical aspects of and experiences with current cloud services like Amazon S3, SimpleDB, virtualisation... Recommended!

Tuesday, May 26, 2009

[Tech] Kent Beck's JUnit Max

JUnit is a testing framework well known to every Java developer (corresponding ports to other languages exist). Kent Beck and Erich Gamma were the core developers of JUnit, which was published around 2000 as an Open Source framework. It is fair to say that JUnit and its ports have had a huge influence on quality assurance and can be found in nearly every modern software project.

Now Kent Beck has announced a new project, JUnit Max. The core concept is "continuous testing": JUnit Max is an Eclipse plugin that starts unit-test execution every time a class is saved and orders the test execution according to the classes being worked on and the tests that have failed recently.

In my opinion this seems to be an interesting and logical next step for unit-testing frameworks. JUnit Max, however, is not Open Source and follows (in my opinion) a rather strange license model (more can be found on the website). I wonder whether the additional benefit justifies the license fees and this particular model, and, more importantly, how long it will take until this functionality is provided by an Open Source solution...
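The test-ordering idea itself (run recently failed tests first) can be sketched independently of JUnit Max; the data structures below are illustrative and not JUnit Max's internals:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TestPrioritizer {

    // Orders test names so that recently failed tests run first -- the
    // scheduling idea behind "continuous testing". A logical clock records
    // when each test last failed.
    private final Map<String, Long> lastFailure = new HashMap<>();
    private long clock = 0;

    public void recordResult(String test, boolean passed) {
        clock++;
        if (!passed) {
            lastFailure.put(test, clock);
        }
    }

    public List<String> prioritize(Collection<String> tests) {
        List<String> ordered = new ArrayList<>(tests);
        // Most recent failure first; never-failed tests keep their original
        // order at the end (the sort is stable).
        ordered.sort(Comparator.comparingLong(
                (String t) -> lastFailure.getOrDefault(t, Long.MIN_VALUE)).reversed());
        return ordered;
    }

    public static void main(String[] args) {
        TestPrioritizer p = new TestPrioritizer();
        p.recordResult("testParser", false);
        p.recordResult("testRouter", false);
        System.out.println(p.prioritize(List.of("testParser", "testIo", "testRouter")));
        // testRouter failed most recently, so it runs first
    }
}
```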

Tuesday, April 21, 2009

[Tech] What about Maven 3

At the last Maven Meetup, Jason van Zyl talked about the future of Maven and important milestones for Maven 3, including:
  • Support for incremental builds
  • Changes to the Plugin API
  • Better multi-language support
  • and more
The video and slides of the presentation are available here.