Wednesday, February 20, 2013


Over the course of time an inevitable conversation arises, usually on a periodic basis, about standardizing on particular development methodologies and frameworks. Just so I'm clear, I am going to refer to everything a developer uses to accomplish a specific task or project as a tool, from the IDE to the management methodology to the frameworks. I'm also speaking generally of companies that do internal development but don't create software for retail sale, i.e., they don't make any profit directly off of development.

I'm going to start with an observation. Everyone I can recall having this conversation with, without exception, has agreed that one of the issues with development, especially at large companies, is the corporate tendency to impose a standardized framework on developers for each problem. Everyone sees this all the time, whatever department they're working in.

Just about everyone can tell a story of having something mandated from above that they had to live with. If you were lucky, you got a voice in the decision-making process, but regardless of whether your favorite shiny object came out on top, you had to live with the result.

It starts with statements like these: We need to settle on a single database and make sure everything uses it. We need to standardize on SVN/Git/Mercurial/TFS and move everything into that repository so we can reap the benefits. We need to pick a single IDE and everyone sticks with it. And on and on.

In my personal experience, eventually everyone jumps on board the push for standardization, if for no other reason than that they hate the product that seems to be winning. This despite the fact that these same people lament the tendency, because they truly believe that the right tool for the right job is the correct way to code!
In other words, in almost all cases, your colleagues, peers, and managers all truly believe that the task at hand should dictate the tools you use to accomplish it as efficiently and effectively as possible.

Now, there are very valid reasons for standardizing policies, processes, and whatnot. I adore standards, truth be told. And if I am the business and my bottom line is profit, there are even more reasons: I can leverage everything from economies of scale to product support by purchasing a single product and sticking with it. That is certainly a very large concern and an effective factor that encourages businesses to push strongly for standardization.

Even developers in the trenches tend to push for standardized frameworks and tools, because most have a favorite, and learning new ones can feel like a painful struggle that wastes their valuable time; after all, they could have accomplished the goal in a fraction of the time with the tools they're comfortable using. Few people willingly stretch far outside their comfort zone, fewer still will do it under the stress of deadlines, and, just a guess here, even fewer managers will want to defend developers for doing so, because it means their productivity is diminishing while they fight a learning curve.

Now let's step back into the theoretical world of developing software solutions for a minute. As a business, my goal is to provide a software solution that generates the greatest possible satisfaction and benefit for the consumer, so that I can generate as much desire for my product, and hence profit, as possible. As a developer, my goal is to create a solution that meets the customer's needs, is easy to use, and generally exceeds their expectations if at all possible. There are other goals involved for everyone, to be sure, but they aren't my point, so I'm being a little vague here.

If corporate standards dictate that I must use a particular technology tool to accomplish my task, and my tasks are widely varied, then sooner or later (usually sooner) I'm going to have a task where I cannot deliver the best software I can, because I won't be able to use the appropriate tool(s).

Is a SharePoint installation really the best solution for a given task? It makes perfect sense to leverage the corporate SharePoint installation if the task lends itself to that environment, but what if it doesn't?
Do I really need a full-blown MS SQL Server or Oracle installation for every task that requires a DB back end? What if it needs to be used by some guy in the field who has no internet access?
Is NHibernate really the best tool for creating a data access layer when the model is trivially simple? Couldn't I just use Entity Framework and, if the application grows, swap in a more appropriately complex tool like NHibernate later, when it's actually needed?
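The "start simple, swap later" idea works in any stack if the storage choice hides behind a thin interface. The post's context is .NET (EF/NHibernate), but here's a minimal, hypothetical Python sketch of the same pattern using the stdlib's embedded SQLite, which also covers the offline-field-worker case, since it needs no server or internet connection. All names here are illustrative, not from the post.

```python
import sqlite3

class SqliteWidgetStore:
    """Trivially simple model: one table, basic CRUD.

    The rest of the application talks only to add()/get(), so this class
    could later be replaced by one backed by a full ORM and a server DB
    without touching the callers.
    """

    def __init__(self, path=":memory:"):
        # ":memory:" for tests; pass a file path for the offline field laptop.
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS widgets (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def add(self, name):
        cur = self.conn.execute("INSERT INTO widgets (name) VALUES (?)", (name,))
        self.conn.commit()
        return cur.lastrowid

    def get(self, widget_id):
        row = self.conn.execute(
            "SELECT name FROM widgets WHERE id = ?", (widget_id,)
        ).fetchone()
        return row[0] if row else None

store = SqliteWidgetStore()
wid = store.add("flux capacitor")
print(store.get(wid))  # -> flux capacitor
```

The point isn't the code itself; it's that the interface, not the corporate standard, is what makes the later swap cheap.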

What happened to that lamentation we all shared, that the best tool for the task should be used rather than some standard that was decided upon?

Back to the NHibernate vs. EF debate (since that's what my coworkers were discussing earlier today): why do we have to choose between NHibernate and EF at all?

In my opinion, there should be room to choose the most appropriate tool. Entity Framework is to Lincoln Logs as NHibernate is to Legos. No one says children should abandon Lincoln Logs for Legos because Legos are superior. They both have their place. They are both the right tool, provided you're working on the right job.

The biggest argument is that it's inefficient to use multiple tools that perform the same function, and I would agree with that up to a point. If the tools do the exact same job the exact same way, then maybe you have an argument for standardization, assuming standardization actually generates real benefit in the long run.

A common example is the IDE. A standardized IDE is something I've seen pushed at a large number of businesses. My question is: why? What do we gain? Typically the answer is that IDE-specific issues will arise. If Bud checks in work he wrote in VS2012 and Bob checks it out and opens the files in, say, Eclipse, the project files VS uses will make no sense to Eclipse and Bob can't compile the project. Or there are compatibility issues between versions of IDE XYZ, so we all need to be using not only the same IDE but the same version.

My answer is the same: why? It takes some effort to ensure that everyone is using the same build process, but if everyone truly is, why can't Bob compile Bud's project? Other than convenience for offline work, why are the project files even in the repository? Chef, Maven, TFS, etc., don't care about your project files (or at least they shouldn't, IMO). They exist to provide an independent and impartial build environment, open to whoever is allowed to request a build.
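One low-effort way to enforce this is simply to keep IDE-generated files out of version control, so each developer regenerates them locally in whatever IDE they like while the build server works from the same bare sources. A hypothetical `.gitignore` fragment along those lines (the patterns are common examples, not a complete or authoritative list):

```
# IDE-specific files -- each developer regenerates these locally
.vs/
*.suo
*.user
.idea/
*.iml
.project
.classpath
.settings/
```

With these out of the repository, the Bud-in-VS2012 / Bob-in-Eclipse scenario stops being a compile problem at all.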

Another example: most of our developers are familiar with Ruby; we don't want to learn C# or Scala; the paradigms are too different; it will require lots of extra effort to train them; it will be more expensive and time consuming to build the software if we do it that way.

The above answers are correct, but most people ignore what would be gained. Exposure to new technologies, techniques, and frameworks, the things that are the meat and potatoes of your daily work, is one of the primary ways we grow as developers! Good developers get better, bad developers get good, and everyone's selection of appropriate tools grows. This is never a bad thing!
There are valid reasons this approach won't work on every project, obviously, but the benefits of learning new things in a group environment are well documented. Here is the first thing I came across when I Googled 'learning in a group environment': Cooperative and Collaborative Learning

If it works for kids in a classroom, why wouldn't it work for adults? Isn't that one of the lessons that agile programming and its methodologies tell us over and over: that group collaborative learning is good? Surely you can find at least one project a year small enough to benefit from this approach. The more often you do it, the easier it becomes to do it again, maybe on larger projects. And each time, the set of tools the development teams are familiar with grows.

Just my two cents.
