Impressions from SD East

Yesterday, I visited the Software Development conference SD Expo in Boston. While I only had a free floor pass, the organizers sent out an email that allowed me to attend some classes. If you are interested in what I learned there, read on…


Exhibition Floor
Noteworthy exhibitors were Perforce (which I use at work for configuration management), Cutter (I once purchased their J2EE application server report), Parasoft (they make software testing tools, the best known one being JTest), Quantum Books (an excellent tech book store 100 yards from where I work; I am their best customer), and MySQL (that's the database driving this site).

Project Jxta: The Next Generation of Networking – Garry Seidman
I was lucky enough to run into Garry Seidman, who was scheduled to speak about Jxta on Wednesday morning; this way I got a personalized presentation. Despite the name, Jxta has nothing to do with Java. It is a new network protocol that can run on top of everything from TCP/IP to Bluetooth to 802.11b (WiFi). Great promises were made about it in the context of self-organizing hardware clusters, but that is still many years away.

Mapping Objects to Relational Databases – Scott Ambler
At almost every place I have worked throughout my career, I used object-oriented programming languages (mostly Java) to talk to relational databases. Scott discussed possible object-to-table mappings and their individual trade-offs. He strongly advocated tools that do the dirty work: persistence-layer tools, caches, etc. Technologies out there are EJB (he seemed fond of CMP 2), Data Objects, and some others; I have to do some research. His biggest pet peeve was that the data people (DBAs) and the developers don't understand each other and don't communicate well. Another pet peeve was coupling: we have far too much of it in most enterprise systems, in the form of hand-knitted SQL. Mapping tools can do a lot of good here (a small sketch of the idea follows below).

Another interesting topic was "data refactoring". He urged people to change legacy schemas where possible, rather than try to work around them. His reasoning: a bad schema never goes away, it only gets worse. Along the same lines, he encouraged people not to spend too much time on the data model up front and then stick to it rigidly; that amounts to working with a legacy schema where there is really no need to. Be flexible and consider changing the data model. However, he also stressed the importance of unit tests in this context: without them, this kind of data refactoring is next to impossible.
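To make the coupling argument concrete, here is a minimal sketch of my own (not from Scott's talk), assuming a hypothetical Customer class mapped to a CUSTOMER table via plain JDBC. All SQL is concentrated in one mapper class, so business code never hand-knits its own queries and a schema change during data refactoring touches a single place.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Hypothetical domain object; the fields are illustrative only.
class Customer {
    long id;
    String name;
    Customer(long id, String name) { this.id = id; this.name = name; }
}

// All knowledge of the CUSTOMER table lives here. Callers work with
// Customer objects, so refactoring the schema only changes this class.
class CustomerMapper {
    private final Connection conn;

    CustomerMapper(Connection conn) { this.conn = conn; }

    Customer findById(long id) throws SQLException {
        try (PreparedStatement stmt =
                 conn.prepareStatement("SELECT ID, NAME FROM CUSTOMER WHERE ID = ?")) {
            stmt.setLong(1, id);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? new Customer(rs.getLong("ID"), rs.getString("NAME")) : null;
            }
        }
    }

    void save(Customer c) throws SQLException {
        try (PreparedStatement stmt =
                 conn.prepareStatement("UPDATE CUSTOMER SET NAME = ? WHERE ID = ?")) {
            stmt.setString(1, c.name);
            stmt.setLong(2, c.id);
            stmt.executeUpdate();
        }
    }
}
```

A full persistence-layer tool or a CMP container automates exactly this kind of class, and adds caching and lazy loading on top.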

The Future of Open End-to-End Software Systems – James Gosling
This was mainly a fun event. James Gosling is one of the original Java architects, and he also wrote Gosling Emacs, one of the early Emacs implementations. It was more of a visionary talk, and he was great to listen to.

The Pragmatic Project Manager – Johanna Rothman
I wasn’t sure what to expect from this session, but it was terrific. It was a birds-of-a-feather session – which means that is was a moderated open discussion. Topics ranged from requirements gathering (“lock marketing people and project managers and engineers in a room, until they have requirements acceptable to everyone! This can take days, and requires senior management buy-in”) and time estimation (“we calculate with a 4-day week”), to the nitty-gritty, like discussing actual methodologies (“critical path” approach, where all the buffer time is put at the end of the project, worked very well for a number of people). I was impressed by an example Johanna brought up from 1990. The architect enforced the “Test-First” design, where first the unit tests were written against a “sceleton” architecture, before any code was produced. This was a 100,000 line C project. They delivered on time, and the customer didn’t find any bugs for the first five years! (after that Johanna lost track of that project.) Johanna also stressed the importance of conciously picking a lifecycle model for the project (waterfall, spiral, incremental, agil, rup, code+fix). Her point was that except code+fix, there is no “right” or “wrong” approach, only better and worse. But without conciously picking, things are more likely to go wrong. Another point she stressed was the value of refactoring. She successfully performed the following with five different teams (at different companies): Towards the last third of the project, she forced the teams for a whole week to stop implementing new features. Instead, they were supposed to refactor the code in teams. Monday they were sceptical, Wednesday they were restless, Thursday they hated her for not letting them work on new stuff. However, by Friday they loved her. In all cases, the architecture got significantly simplified, the code base shrank by 10% or more, because finally the engineers understood the problem completely – at the time they started coding, the didn’t. For somebody who has never done this, this may sound like a waste of time. But at the end of the week, none of the engineers thought that, because with the improved architecture, new features were easier to implement than before. In addition, refactoring cut down testing time significantly – cleaner architecture means less exceptional cases, which means significantly less bugs.