Estimation Issues – Part 1: Diseconomies of Scale

We often perform program area analyses, working with clients to understand issues and trace them to the root causes that negatively impact program performance. One issue we have seen on several programs is inadequate estimation of the size of the effort, which drives up cost, stretches the schedule, and undermines an organization’s ability to deliver on time.

Organizations sometimes use models of software size, usually expressed in lines of code (LOC), as a basis for predicting cost and schedule. Many factors can cause the actual size to deviate significantly from the estimate; here are some common issues:

  • Underestimation of infrastructure
  • Lack of understanding of requirements
  • Overly optimistic interpretation of the original requirements and of the resources needed to develop the system
  • Unexpected impact of legacy integration
  • Overestimation of the value of commercial off-the-shelf (COTS) software

In this first of a two-part series on estimation issues, we reflect on diseconomies of scale, which lead to inaccurate estimates of the size of the software development effort. In software, the larger the system becomes, the greater the cost of each unit of code. If software exhibited economies of scale, a 100,000-LOC system would cost less than 10 times as much as a 10,000-LOC system, but the opposite is almost always the case. Barry Boehm, in 2000, published historical data that reflects the diseconomies of scale in software-intensive systems development. For programs between 10,000 and 100,000 LOC, the typical growth is roughly linear; for programs over 100,000 LOC, however, growth accelerates and can become exponential.
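To make the diseconomy concrete, here is a minimal sketch using a COCOMO-style power-law effort model, effort = A × size^B with B > 1. The coefficients below are illustrative assumptions, not Boehm's calibrated values, but the shape of the result is the same: cost per line rises as the system grows.

```python
# Illustrative sketch of diseconomies of scale using a COCOMO-style
# power-law effort model: effort = A * (KLOC ** B), with B > 1 so that
# effort grows faster than size. A and B are placeholder values.

A = 2.94   # assumed productivity constant (person-months per KLOC^B)
B = 1.10   # assumed scale exponent; B > 1 means diseconomies of scale

def effort_person_months(kloc: float) -> float:
    """Estimated effort for a system of `kloc` thousand lines of code."""
    return A * (kloc ** B)

small = effort_person_months(10)    # 10 KLOC system
large = effort_person_months(100)   # 100 KLOC system

# With B > 1, the 100 KLOC system costs more than 10x the 10 KLOC system.
print(f"10 KLOC:  {small:.1f} person-months")
print(f"100 KLOC: {large:.1f} person-months")
print(f"Cost ratio: {large / small:.1f}x for a 10x size increase")
```

With B greater than 1, the ratio comes out above 10x, which is exactly the diseconomy the historical data shows.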

On a recent program, we used Boehm’s graph to compare the predicted size with the actual size of a program that was already greater than 100,000 LOC at the start of the effort. Boehm’s typical growth curve for programs over 100,000 LOC accurately predicted that the actual size of this program would be about 80% greater than the size the team had originally estimated.
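As a simple illustration of how that kind of historical growth can be folded into planning, here is a hypothetical sketch that inflates a team's predicted size by an assumed growth factor before it is used for budgeting. The 80% figure mirrors the program described above, and the sample estimate is made up; neither is a general calibration.

```python
# Hypothetical sketch: sanity-checking a team's size estimate against
# a historical growth factor. The 0.80 factor and the 150,000 LOC
# estimate are illustrative only.

def adjusted_size(predicted_loc: int, growth_factor: float = 0.80) -> int:
    """Inflate a predicted size by an assumed historical growth factor."""
    return round(predicted_loc * (1 + growth_factor))

team_estimate_loc = 150_000                  # hypothetical team prediction
planning_size = adjusted_size(team_estimate_loc)

print(f"Team estimate:        {team_estimate_loc:,} LOC")
print(f"Size used for budget: {planning_size:,} LOC")
```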

We encourage you to consult historical evidence such as the diseconomies of scale before committing to a schedule or budget. Part 2 of this series will discuss ways beyond lines of code to quantify the size of a development effort.

Let us help you with a program area analysis. Our program area analysis method helps us work with you to dive deeply into program issues: we talk with team members about the details of the system and the issues they face, and we often identify gaps in engineering practices. Ideally, we help teams understand those issues, resolve them, and bridge the gaps on current and future programs.

Please feel free to contact me to discuss program area analyses or estimation approaches.


Responses to Estimation Issues – Part 1: Diseconomies of Scale

  1. Thank you – I do believe it is important to provide useful information.

    There is a related idea about optimal architecting: as the size of the code gets larger, you need to spend more time architecting the system. See “Current and Future Challenges for Software Cost Estimation and Data Collection,” Boehm, March 8, 2010.
