Gap Analysis - Identifying your organization's true best practices

We often start our CMMI, measurement, estimating, or other process improvement engagements with clients with a "gap analysis."  Probably all readers of this blog have a pretty good idea what that means, so it is an easy entry point. But I don't like the term "gap analysis." Why?  Because it carries the implication that it is an exercise in finding out everything our client is doing wrong.  I prefer the term "opportunity analysis." In the April 2010 edition of SoftwareTech magazine, David Herron offers a very good perspective on this issue.

My own simple perspective is that every software development organization does some things well.  It must, or it would not survive.  Hence, every software development organization knows how to do some things well.  Finding the things the client does well is just as important as finding the things it does not do well, because the pockets of excellence provide patterns for the improvement of the weak areas. Further, identifying the pockets of excellence is important because any changes must be carefully designed to ensure that the excellence is not compromised.  This is a classic mistake that we see too often: process zealots are allowed to bring everything the company does to the same "average" level.  This usually represents significant improvement in key problem areas, but it can sometimes stifle the areas that were previously excellent and that were making the company "special" to its customers.

Finally, for those of you who have time to read David's article, I would also recommend the article by Capers Jones in the same edition, "Software Quality and Software Economics."  Capers always delivers thought-provoking ideas and plenty of data points.

Written by Michael D. Harris at 11:45

Impact of offshore outsourcing on your employees

As often happens, I was looking for something else when I came across a 2008 article in IEEE Computer by Mary Lacity and Joseph Rottman (don't worry - the subject matter is covered in their 2008 book, "Offshore Outsourcing of IT Work").  Having just set one of our clients off on the implementation of some outsourcing options to supplement their in-house resources, the list of 20 major effects of offshore outsourcing reported by project managers caught my eye.  I strongly recommend that you think of this list as a set of risks that need to be mitigated in any outsourcing implementation.  With apologies to the authors, I have sorted their list according to my highest priorities.  The in-house project managers reported that they:

  • needed a mentor the first time they managed a project with offshore resources
  • had to motivate the supplier to share bad news
  • had to make offshore suppliers feel welcome and comfortable
  • needed to thoroughly verify the offshore supplier's work estimates, which tended to be optimistic
  • had to provide greater detail in requirements definitions
  • had to do more knowledge transfer up front
  • were forced to shortcut the knowledge transfer process because of deadlines set by senior IT leaders
  • had to ensure that knowledge transfer was successful by testing the supplier employees' knowledge
  • had to set more frequent milestones
  • needed more frequent and more detailed status reports
  • required more frequent working meetings to prevent client-caused bottlenecks
  • needed to accompany offshore suppliers to all client-facing meetings
  • experienced higher transaction costs which threatened their ability to deliver projects on budget
  • experienced project delays which threatened their ability to deliver projects on time
  • had to guarantee that the supplier followed pre-agreed knowledge renewal practices
  • had to ensure that the supplier transferred knowledge about new applications or technologies to the client
  • had to learn about new applications or technologies independent of suppliers to ensure that the suppliers' information and bids were valid
  • had to integrate the suppliers' CMM/CMMI processes into their own project management processes
  • had to ensure that the supplier's employees were fully trained as promised by the suppliers
  • had to fill many of the roles that the PMO should have performed
Written by Michael D. Harris at 14:11

CMMI as a high-value framework for other improvement initiatives

We have been using CMMI as a framework for our consulting initiatives for many years.  In recent years, it has become clearer that you can get a lot of value out of this approach without going all the way to a full accreditation. In a recent (Jan/Feb 2010) article in CrossTalk, Jeffrey Dutton captured the essence of using CMMI in this way.  He puts forward three driving principles:

  1. Focus on Business Issues and Performance Goals
  2. Involved Leadership and Process Ownership by Process "Doers"
  3. Improvements should be made at the Speed of Business

Jeffrey goes on to explain that the CMMI framework, together with the three principles, can readily accommodate different approaches such as CMMI itself, Lean, Six Sigma, and ITIL. We are seeing more and more interest in this "multi-model" approach as a way to get more value more quickly out of process improvement initiatives.

Written by Michael D. Harris at 15:50

CMMI v1.3

Mike Phillips and Sandy Shrum of the SEI have published a good article about the new CMMI v1.3, so rather than rehash it here, we have added the paper to our website.  It is recommended reading!

Written by Michael D. Harris at 21:00

CMMI Processes - Science or Art?

Putting the facts up front: we at DCG are SEI Partners and advocates for CMMI.  Perhaps that is one of the reasons our ears are finely tuned for criticisms of CMMI.  Many criticisms are baseless and a consequence of misunderstanding, wilful or otherwise.  Some are real and have a firm basis in experience on the ground. One such criticism is that CMMI compliance does not necessarily equate with good software in all cases.  This is the consultant's way of saying, "They may be CMMI Level x, but their code is awful."  Is it fair to suggest that, sometimes, efficiency does not necessarily lead to effectiveness?  "All the indicators are green, but there's something wrong."  I have heard clients complain about software that is "compliant with requirements but never innovative."

Here at DCG, we seek to mitigate such issues by deploying a portfolio of software development best practices through a Value Visualization Framework.  However, I saw an interesting article by Joseph M. Hall and M. Eric Johnson in the March 2009 issue of the Harvard Business Review, "When Should a Process Be Art, Not Science?"  Hall and Johnson argue that the movement to standardize processes has gone overboard and that some processes require an artist's judgment and should be managed accordingly.  They mention software development as an area in which this can be true and, although they don't mention it, their argument resonates very strongly with the Agile Manifesto.

So how do we manage art? Hall and Johnson propose a three-step process:

  1. Identify what should and shouldn't be art
  2. Develop an infrastructure to support art
  3. Periodically reevaluate the division between art and science

Again, this resonates with our consulting experience in agile development shops.  Too often, the success of agile implementations is constrained to only one or two successful teams because only step 2 is implemented.

Written by Michael D. Harris at 16:17

