Establish an IT Benchmarking Program

IT quality and productivity improvement initiatives must be guided by data. The process begins with establishing a baseline of quality and productivity performance, which can then be used to track improvement over time and to better understand how IT is functioning.
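To make that concrete, here is a minimal sketch, in Python, of how such a baseline might be computed. The project data, measures (function points as the size measure) and field names are hypothetical illustrations, not DCG's actual model:

```python
from statistics import mean

# Hypothetical completed projects: size in function points (FP),
# effort in staff-months, and defects found in early production use.
projects = [
    {"name": "billing",  "fp": 450, "effort_sm": 30, "defects": 18},
    {"name": "payments", "fp": 820, "effort_sm": 64, "defects": 41},
    {"name": "portal",   "fp": 300, "effort_sm": 25, "defects": 9},
]

# Productivity baseline: function points delivered per staff-month.
productivity = [p["fp"] / p["effort_sm"] for p in projects]

# Quality baseline: defects per function point (lower is better).
defect_density = [p["defects"] / p["fp"] for p in projects]

print(f"Baseline productivity: {mean(productivity):.1f} FP/staff-month")
print(f"Baseline quality: {mean(defect_density):.3f} defects/FP")
```

Each new project can then be measured the same way and compared against these numbers, turning "are we improving?" into a question the data can answer.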

A global service company that provides innovative payment and expense management solutions for consumers and businesses wanted to develop its own IT benchmarking program in order to identify opportunities to improve its development practices, so it reached out to David Consulting Group.

Benchmarking

We've put together a case study explaining how we implemented the benchmarking program and the benefits realized so far. Learn more about the benchmarking solutions that we offer here.

Read the Case Study

Written by Default at 05:20

The Estimation Center of Excellence: A Case Study

A global financial solutions provider was struggling with its limited capability to accurately predict and successfully manage its software delivery schedule. Based on DCG’s reputation in the field of software metrics, the company reached out for help.

DCG proposed the creation of an internal sizing and estimating Center of Excellence (CoE) following its Build, Operate, Transfer (BOT) approach. This approach starts with the creation of the CoE under the operation of DCG, with ownership transferred to the company over time.

To learn more about this engagement and the BOT approach, read our case study: DCG Establishes Estimation Center of Excellence for Global Financial Solutions Provider, Resulting in Increased Project Oversight and Improved Vendor Management.

The engagement is running successfully, and the company has reported an improved ability to evaluate third-party vendor bids based on software development effort.

Questions? Leave them in the comments - we're happy to answer them!

Read it Now!

Written by Default at 05:00

Benchmarking – A Slap in the Face or an Informed Discussion?

Talking Down to the Client

I’ve recently been talking to some clients about their experience of software development benchmarks, either as customer or supplier, and overwhelmingly they’re telling me that it’s not often a good experience.

“Oh, I’m fed up with the whole process,” said one outsourcer. “Clients engage the blue chip benchmark suppliers, and, after a tussle and a lot of ill-will, a report lands on a desk. The results are presented as a simple answer, which is ‘the right answer,’ and we are left to scrap with our client over the results. The method of comparison and the dataset used – size and characteristics – aren’t made clear, and the result is unsatisfactory. In the worst cases, it can be contract threatening.”

I have some sympathy. Poorly created benchmarks can be misleading and in the worst cases can lead to court action. I encountered a situation once where a major outsourcer was producing a complex financial regulatory system, and the client decided to benchmark the programme. The results indicated that the software development was vastly inefficient and much too costly.

There was a dispute between client and outsourcer, which nearly went to court, until someone on the client side asked the question, “How many data points are we looking at, and what sort of applications were included in the sample?”  There was a metaphorical shuffling of feet and a sheepish reply. Basically there was one data point in the same industry and it referred to a CRM system. I don’t know who was more embarrassed, the client or the benchmarker.
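The lesson of that story can be captured in a few lines. Here is a hedged sketch, with hypothetical field names and an arbitrary threshold, of the check that should precede any comparison:

```python
MIN_SAMPLE = 8  # illustrative threshold; choose one appropriate to your dataset

def comparable_sample(benchmark_data, industry, app_type):
    """Filter a benchmark dataset down to genuinely comparable projects,
    and refuse to draw conclusions from too small a sample."""
    sample = [p for p in benchmark_data
              if p["industry"] == industry and p["app_type"] == app_type]
    if len(sample) < MIN_SAMPLE:
        raise ValueError(
            f"Only {len(sample)} comparable data point(s); "
            "a benchmark built on this sample would be misleading.")
    return sample
```

Had anyone run that check, the CRM-versus-regulatory-system comparison would have failed loudly long before it reached a lawyer.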

Trying to commoditise the outputs may seem sensible, but the resultant model can be a gross simplification, and that is “slap in the face” benchmarking. Essentially, it’s a combative, adversarial process and it doesn’t work.

Collaborative Benchmarking

A more rational approach to benchmarking is a three-way discussion among the client, the outsourcer and the benchmarker. Developing and enhancing applications is a skilled, multi-dimensional task. Benchmarks should reflect the things that matter to the client’s business; the benchmarker needs to be open and honest about the data used and must offer a range of answers to facilitate discussion.

We hear about time, cost and quality often, and when we benchmark, all three have to be taken into account; I would add agility and flexibility. Benchmark reports should balance software development business drivers against what’s being delivered. In a waterfall or similar process, where time is at a premium, either costs or defects tend to go up. Where quality is the driver, unit costs may rise because higher-skilled staff are used, but effort productivity may improve. If cost is key, speed of delivery and quality may be less than optimal.
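To make the trade-off concrete, here is a toy sketch, with entirely hypothetical scores and weights rather than any real benchmarking model, of how the same delivery can look very different depending on which driver the client weights most heavily:

```python
# Hypothetical normalized scores (0 to 1, higher is better) for one
# supplier's delivery, measured across the four dimensions.
scores = {"time": 0.9, "cost": 0.6, "quality": 0.4, "agility": 0.7}

def weighted_benchmark(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[d] * weights[d] for d in scores)

# A time-driven client and a quality-driven client rate the very same
# delivery differently; there is no single "right answer."
time_driven    = {"time": 0.5, "cost": 0.2, "quality": 0.2, "agility": 0.1}
quality_driven = {"time": 0.1, "cost": 0.2, "quality": 0.5, "agility": 0.2}
print(weighted_benchmark(scores, time_driven))     # ~0.72
print(weighted_benchmark(scores, quality_driven))  # ~0.55
```

The point is not the arithmetic; it is that the weights belong to the client’s business, and a benchmarker who hides them is handing you a number, not an answer.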

Agile should be a game changer, and here client agility and flexibility become as important as the supplier’s. Suppliers may be well versed in delivering working software of high quality in a short time, but if the client doesn’t understand the business goals and can’t adequately groom and prioritise the product backlog, then all the benefits can be lost. But that’s a story for another time.

Go in With Your Eyes Open

The result of a benchmark should be to identify any process and cost inefficiencies in both client and supplier processes against an informed backdrop.

When you ask for a benchmark:

  • be sure what you’re asking for;
  • demand clarity and transparency from the benchmarker;
  • be prepared for rational discussion with the benchmarker and supplier;
  • collaborate and understand - there is no right answer;
  • be prepared to look inward – client processes can be as much the cause of sub-optimal performance as those of the supplier;
  • don’t trust the man (or woman) with the simple answer to the complex question.

Unless, of course, you want a slap in the face.

Alan Cameron
Managing Director, DCG-SMS

Written by Alan Cameron at 05:00

Check Your Software’s Cholesterol Level!

I don’t believe it! I have high cholesterol? But I exercise every day, take my vitamins and eat low-fat and low-sugar foods! How is this possible? But the results don’t lie. One simple blood test can tell someone a lot about what they thought was true, introducing a new reality.

Many of our customers have the same reaction when we first expose them to application development metrics (ADM), introducing them to a new reality. The reaction is usually something close to, “But, I use Agile methods! I bought my engineers the latest and greatest software tools! I even invested in sending my staff to training to become black belts or something like that! This is outrageous!” 

In this case, the simple blood test was a forensic audit of the code. Just like the blood test above, ADM analytics give organizations a good dose of reality about the process they are ACTUALLY following, not the one they profess to follow. The only artifact in sync with your true process is its result – the code. ADM analytics offer reality and, for many, an excellent opportunity to eliminate waste, capitalize on what is working, and return savings and benefits to the organization’s business objectives.

Don’t misread this post; I believe in process, and not investing in it would be a big mistake. However, I also believe that not checking how you are executing on that process through some simple “blood tests” – ADM analytics – would be an equally big mistake. Building a tangible product using manufacturing processes invented by Henry Ford a century ago is very different from building bits and bytes in cyberspace. One difference is that, as an executive, I can walk down to the factory line and pick up the tangible product and feel it, measure it, use it and eat it (if you work for Krispy Kreme – that would be fun).

Software doesn’t always offer these advantages. Instead, we rely on practices and beliefs such as “best efforts” and on tools and testing that we hope will identify potential issues. The reality is that management, customers, competition and market demands change our actual implementation of process, mostly in negative ways. Once we accept this fact, we can accept that the processes and tactics we believe are there to identify and mitigate such risks have been significantly diluted. What’s my point?

There are many competing demands on organizations, but the one that usually wins is “go faster.” That decision tends to force organizations to shortcut process, which introduces unknown risks (i.e., technical debt) that manifest in the code. Just as you can’t fit 10 pounds of sugar in a five-pound bag, you can’t fit five months of software into a four-month cycle and still hit all your targets. By giving yourself a “blood test” – an ADM audit – you gain transparency into the process your organization actually followed in the name of speed, and the opportunity to correct mistakes, learn, adjust and keep moving forward!
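To give a flavour of what such a “blood test” can look like, here is a toy sketch that uses Python’s standard ast module as a stand-in for a real ADM toolchain. The thresholds and heuristics are illustrative assumptions, not any vendor’s actual analytics:

```python
import ast

# Crude indicators of shortcut-taking: very long functions and heavy
# branching often accumulate where schedule pressure won out.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def code_health_report(source: str, max_len: int = 50, max_branches: int = 10):
    """Flag functions whose size or branching exceeds the chosen limits."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            length = node.end_lineno - node.lineno + 1
            branches = sum(isinstance(n, BRANCH_NODES) for n in ast.walk(node))
            if length > max_len or branches > max_branches:
                findings.append((node.name, length, branches))
    return findings

sample = "def f(x):\n    if x:\n        return 1\n    return 0\n"
print(code_health_report(sample, max_len=2, max_branches=0))  # [('f', 4, 1)]
```

A real ADM audit measures far more than this, of course, but even a crude test like this one reads the only artifact that never lies: the code itself.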

Next month I’m going to discuss what it might be like to work for Krispy Kreme and to be able to eat your work in process!


Rob Cross
ProServices, Vice President

Written by Rob Cross at 06:44

Software Quality Management – Tools Management Versus Data Management … That’s the Question!

I’m going to reveal a very personal secret to all of the folks following this blog; promise not to tell anyone, okay? I’m really a dyed-in-the-wool sales guy! Whoa! I’m glad I got that off my chest.

Call me biased (that’s all you’re allowed to call me), but salespeople are one of our greatest assets for understanding the pulse of the marketplace: they’re on the front lines punching at the 500-pound marshmallow every day, understanding our customers’ pain and selling solutions. I have been involved in selling software quality solutions across industries for 14+ years, and in that time on the front lines I’ve noticed some things that you might find interesting.

My very first blog a couple of months ago (“An Important Question to Ask after a Very Public Software Disaster”) posed a question that I only partially answered regarding data management versus tools management.  My current company, PSC, offers independent software security and quality inspections. I mention this because our main competitors are my customers, who perceive that they do the same thing that we do. 

One observation over the years is that most executives believe that just buying software tools for their engineers is good enough to identify, manage and mitigate software risks. I call this strategy tools management. It entails keeping the engineers happy: letting them play with the latest new widget, letting them download free ones at will or, in the worst cases, letting them spend hundreds of thousands of dollars on a big software solution with big promises of ROI. The good news is that these folks understand the value of automation; the bad news is that they have not realized its potential value because they don’t have a data management strategy.

I know what you’re thinking: “What are you talking about … what is a data management strategy?” In the majority of accounts we have done business with, those who have automated tools are unable to answer the following:

  • Are engineers using the tools?
  • Do engineers understand how to assess the data output and correlate it to risks that are important to the corporation to identify, measure, manage and mitigate?
  • What are the results of this data management process, and how can they be provided to the various stakeholders within the organization?

Finally, and this is the killer question, “Does anyone guarantee the integrity of the resulting data so the business can make critical tactical and strategic decisions based upon the results?”     
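As a purely hypothetical sketch of what answering those questions might look like in code, here is one way to aggregate raw tool findings into a stakeholder-ready summary and fingerprint it so its integrity can be vouched for. The field names and structure are illustrative assumptions, not any particular product’s format:

```python
import hashlib
import json
from collections import Counter

def risk_summary(findings):
    """Roll raw tool output (hypothetical dicts with 'severity' and
    'category' keys) up into a stakeholder-ready summary, then add a
    SHA-256 fingerprint so consumers can verify it hasn't been altered."""
    summary = {
        "total": len(findings),
        "by_severity": dict(Counter(f["severity"] for f in findings)),
        "by_category": dict(Counter(f["category"] for f in findings)),
    }
    blob = json.dumps(summary, sort_keys=True).encode()
    summary["sha256"] = hashlib.sha256(blob).hexdigest()
    return summary

findings = [
    {"severity": "high", "category": "security"},
    {"severity": "low",  "category": "style"},
    {"severity": "high", "category": "security"},
]
print(risk_summary(findings))
```

The tooling here is trivial; the discipline of producing, checking and standing behind the data is what most organizations are missing.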

I know this sounds costly to implement internally, and you’re right: quality does not come cheap. But neither does ignoring this fundamental data management process, which results in software glitches in the field and damage to your company’s brand and reputation. Take Chrysler, for example: three major recalls in the past three months due to software errors in its vehicles. Ouch!

The easy part is picking a tool; the hard part is correctly integrating it into your software development lifecycle, thinking through a sound data management strategy to turn risks into opportunities, and being able to answer the above questions with confidence.

I will be sharing more thoughts on tools versus data management because there are more dimensions to this issue. It’s a lively area of discussion within the industry because, believe it or not, software risk analytics, or intelligence, is an up-and-coming growth area with some exciting technologies coming to market.

Now that I have revealed my little secret and you understand my perspective from the front lines, I have to stick to my knitting as a sales professional and put in a plug for my company. Our customers have documented through case studies that we provided data management and analytic services at one-third the cost of their internal resources, with a historical 9.5x ROI. Now, c’mon … you know that’s impressive!

I would love to talk to you about the details of these studies and the services we can provide through David Consulting Group, so feel free to leave a comment below!  For the rest of you, I will be discussing other dimensions of this topic in future posts, so you’ll have to be patient until next month. Until then, remember, it’s not the tools, it’s the data! 


Rob Cross
ProServices, Vice President

Written by Rob Cross at 05:00

"It's frustrating that there are so many failed software projects when I know from personal experience that it's possible to do so much better - and we can help." 
- Mike Harris, DCG Owner

Subscribe to Our Newsletter
Join over 30,000 other subscribers. Subscribe to our newsletter today!