A short description of what we do with respect to professional software development at HDM

Professional large scale development

Trends in software development

The End of Software Engineering has come, more...

At least if you believe Tom DeMarco's second thoughts on his earlier statements on measurement and control in software projects. An ex-control freak gone soft over the years, or a necessary correction?

Social Software and Social GUIs, more...

A few comments on two articles by Clay Shirky and Joel Spolsky on social software and the kind of GUIs needed to support group behavior. Simple usability is just no longer enough. I have extended the piece with some ideas on very large multi-touch devices and how they change the GUI yet again.

Requirements engineering as ill posed problem specification, more...

The current way to specify requirements creates "inverse problem specifications" - a rather costly and slow way to pose a problem. It works backward from a given solution and needs to find the correct input parameters by tedious approximation. A nice theoretical argument for agile project management where business and IT together create requirements.
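The argument can be condensed into a small formula. This is my own sketch of the point, not taken from the article: let f stand for the (unknown) mapping from requirements r to a running system s. The usual forward path builds s = f(r); writing requirements backward from a given solution is the inverse problem.

```latex
\text{forward:}\quad r \;\longmapsto\; s = f(r)\\[4pt]
\text{inverse:}\quad \text{given a target solution } s^{*}, \text{ find } r \text{ such that } f(r) \approx s^{*}\\[4pt]
\text{approximation:}\quad r_{n+1} = r_{n} + \Delta_{n} \quad\text{until}\quad \lVert f(r_{n}) - s^{*} \rVert < \varepsilon
```

"Ill-posed" here means r is not unique and small changes in the target can demand large changes in r, which is exactly why the iterative approximation is tedious and costly.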

Beautiful architecture, more...

Short notes on the new book from O'Reilly. Good articles on various kinds of architectures.

Math for books on logic and knowledge representation, more...

To read books on symbolic processing, logic and knowledge representation, some basic math on first-order logic, sets, graphs, relations etc. is needed. An easy read by John F. Sowa.

NETT colloquium on new trends in information technology, more...

Yesterday I attended NETT at the University of Freiburg. A short report and comments on the tracks on communication technology and economics/technology. Keywords: network coding, cross-layer architecture, cloud computing and compliance.

Design vs. Programming Language - a proper antagonism?

The recently published second edition of the excellent book "Software Architektur: Grundlagen-Konzepte", written amongst others by members and friends of the Computer Science and Media faculty at HDM, is an opportunity to discuss the relation between design and programming language. While architecture and design are key, underestimating either the dangers or the power of a good programming language can really cut down on your productivity. A short discussion of some common misconceptions around architecture and programming languages.

Infoq.com - the new portal site, more...

Take a look at an amazing new portal site for IT-interested people. Excellent articles and videos from one of the makers of theserverside.com.

API is UI or "why API matters", more...

Few programmers are aware that API design really is user interface design, and few know the basic rules of API design, such as minimal interfaces. Here is some information extracted from an excellent article in Queue (the ACM magazine).
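The minimal-interface rule can be sketched in a few lines. All names here are made up for illustration; the point is one orthogonal operation instead of a pile of overlapping convenience methods that all have to be documented, tested and supported forever.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal API: one orthogonal save() operation. Convenience variants
// (saveText, saveAppending, ...) would live at the call site or in a
// helper class, where they can change without breaking the API.
interface FileStore {
    void save(String name, byte[] data, boolean append);
    byte[] load(String name);
}

// Tiny in-memory implementation, just to make the sketch runnable.
class MemoryStore implements FileStore {
    private final Map<String, byte[]> files = new HashMap<>();

    public void save(String name, byte[] data, boolean append) {
        byte[] old = append ? files.get(name) : null;
        if (old == null) {
            files.put(name, data.clone());
        } else {
            byte[] merged = new byte[old.length + data.length];
            System.arraycopy(old, 0, merged, 0, old.length);
            System.arraycopy(data, 0, merged, old.length, data.length);
            files.put(name, merged);
        }
    }

    public byte[] load(String name) {
        return files.get(name);
    }
}

public class ApiDemo {
    public static void main(String[] args) {
        FileStore store = new MemoryStore();
        store.save("a.txt", "Hel".getBytes(), false);
        store.save("a.txt", "lo".getBytes(), true);
        assert new String(store.load("a.txt")).equals("Hello");
    }
}
```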

IBM's new Unified Method Framework methodology, more...

The Computer Science and Media faculty at HDM is one of the few in the world that has IBM's blessing to teach this special methodology. It has its roots in the Global Services Method and the Rational Unified Process. In its fifth installment, Bernard Clark, Senior IT Architect and Managing Consultant at IBM GBS and University Ambassador for HDM, will cover new channels and media in the financial industry. Governance, service orientation etc. will be big topics of the workshop as well. Participants will learn the continuous refinement of visions into tractable models and methods.


Let me know if you want to attend the workshop. It is usually held during the summer term at HdM on three Fridays.

Morphware and Configware - a new computing paradigm, more...

This is a discussion of a very interesting paper by Reiner Hartenstein, TU Kaiserslautern, on the success of FPGAs and the problems of programming configurable hardware. He describes the benefits of configuration (an improvement over the von Neumann architecture), and by now we software people understand its problems as well (;-). At least in software there is a trend back from configuration towards more flexible programming languages.

I found the article in the book "nature based computing", which I had ordered for distributed systems in the winter term. It contains quite a number of nice papers, e.g. on hardware architecture, statistical methods and swarm computing.

Thou shalt not write parsers by hand, more...

Only a short reminder that parser generator toolkits exist (like ANTLR) and that they should be used for reasons of quality and maintenance. Also a mention of the second edition of Wirth's book on compiler construction, which is just unbelievably well written - in case you need to stock up on compiler technology and are too shy for the 1000+ pages of the dragon book. Go get Wirth's book! At 24 Euro it is a bargain!
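To give a flavor of what a generator buys you, here is a tiny hypothetical ANTLR-3-style grammar for simple sums (names and rules invented for illustration). The toolkit generates the lexer and parser from this, so a change to the language stays a one-line grammar edit instead of surgery on hand-written parsing code:

```antlr
// Hypothetical grammar, illustrative only.
grammar Sum;

expr : term ('+' term)* ;        // e.g. 1 + 2 + 3
term : INT ;

INT  : '0'..'9'+ ;
WS   : (' '|'\t')+ { skip(); } ; // ignore whitespace
```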

The big picture: Generative Computing

We see a lot of evidence that large-scale development nowadays is based on generative computing approaches. Enterprise JavaBeans, for example, cannot be used productively without a massive amount of tooling to support developers. In many projects a combination of XML meta-data and frameworks is used to create extensible software. Frame processors are used to generate template processors (Struts Tiles etc.). Model-driven architecture is already a buzzword and begins to spread outside of the OMG as well. And last but not least, aspect-oriented programming is getting into the focus of many developers (AspectJ, AspectWerkz).
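AspectJ and AspectWerkz weave cross-cutting behavior into classes at compile or load time. A minimal flavor of that idea can be sketched with plain JDK dynamic proxies, which generate an interception class at runtime. This is only an illustration of interception, not how those tools actually work, and all names below are invented:

```java
import java.lang.reflect.Proxy;

// Hypothetical service interface, for illustration only.
interface Greeter {
    String greet(String name);
}

public class LoggingProxyDemo {
    // Wraps any implementation of an interface with a logging "aspect":
    // the generated proxy runs our handler around every method call.
    @SuppressWarnings("unchecked")
    static <T> T withLogging(T target, Class<T> iface) {
        return (T) Proxy.newProxyInstance(
            iface.getClassLoader(),
            new Class<?>[] { iface },
            (proxy, method, args) -> {
                System.out.println("before " + method.getName());
                Object result = method.invoke(target, args);
                System.out.println("after  " + method.getName());
                return result;
            });
    }

    public static void main(String[] args) {
        Greeter plain = name -> "Hello, " + name;
        Greeter logged = withLogging(plain, Greeter.class);
        // The call is logged around the real implementation.
        String result = logged.greet("HDM");
        assert result.equals("Hello, HDM");
    }
}
```

The business code (the Greeter) stays free of logging concerns; the cross-cutting behavior is generated around it, which is the core promise of the aspect-oriented and generative approaches above.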

Flexibility is a big problem for standard software. Many products suffer from feature bloat and still lack vital features for some customers. Creating flexible frameworks that cover special domains and allow specialized applications to be built is top-notch developer know-how. The lecture will also cover the concepts of domain analysis and product line engineering, because both are intimately tied to generative approaches.

But before we tackle these things it is good to start with the basics: A professional large scale development process. Not high-end generative computing yet but still necessary to achieve the higher goals later.


Professional to me means automated as much as possible, and convenient even for large teams. Professional also means that the whole lifecycle of a project is covered. This ties the development environment back to the software architecture: how do we split development? How do we package source code and deliverables? How do we secure our artifacts?
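As a concrete (purely hypothetical) example of "automated as much as possible", a minimal Ant build file covering compile, packaging and cleanup - every project and directory name here is made up:

```xml
<project name="demo" default="dist">
  <property name="src"   value="src"/>
  <property name="build" value="build"/>

  <!-- compile all sources into build/classes -->
  <target name="compile">
    <mkdir dir="${build}/classes"/>
    <javac srcdir="${src}" destdir="${build}/classes"/>
  </target>

  <!-- package the compiled classes as a deliverable jar -->
  <target name="dist" depends="compile">
    <jar destfile="${build}/demo.jar" basedir="${build}/classes"/>
  </target>

  <!-- remove all generated artifacts -->
  <target name="clean">
    <delete dir="${build}"/>
  </target>
</project>
```

Once such a script exists, every team member and the integration machine build the same deliverable the same way, which is the whole point of covering the lifecycle with tooling.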

But there is more to large-scale development than just tooling. How do we structure our development process? When do we create heartbeats? Do we use rolling baselines or fixed baselines?

And last but not least: do we use some form of extreme programming/Scrum, or do we follow a conventional top-down method like the Rational Unified Process (yes, I call it top-down no matter how much Rational tries to make it look "Xtreme")?

And I would like to pass on 16+ years of experience in large software projects - from Unix kernels and embedded controls to frameworks and web portals.


Not so sure about it yet. I'd like to involve students with specific experience (Eclipse, CVS, Ant etc.). I'd also like to run it as close as possible to real development, which means we need some project to work on. Team work as always, with each team focusing on one tool while participating in and using the other tools.

An old idea: Distributed System Development Environment

This talks about the requirements of a large-scale distributed development environment and how an XML information bus could tie the different meta-data of tools and runtime environments together. The goal is to have much more impact control and traceability than we have now. Ironically, while our object systems become more and more distributed, the tooling behind them stays local and file-based.