How Agile Development Teams Can Resolve Agile Measurement Challenges With Function Point Analysis

IT Benchmarking

Agile development empowers teams with many benefits but also presents challenges around managing and measuring its effectiveness. The way to resolve these is Function Point Analysis.

From business impediment to business enabler, IT development has come a long way since Agile became the favored practice. Now empowered with speed and responsiveness, organizations have left the days of slow, cumbersome, inflexible, and unresponsive practices behind. Instead they’re able to support business needs and align with changing business environments better than ever before.

It’s easy to understand why Agile is experiencing a strong increase in adoption; as companies become more nimble to embrace the pressures of digital transformation, IT development can respond aggressively to evolving competitors and exploit markets more easily. But these benefits are rivaled by frustrations on the management side of Agile teams. The nature of Agile means that IT has lost visibility and scope control while the business has lost predictability. Agile might make teams fast and responsive, but businesses don’t know when projects will be delivered, and quality of delivery is often poor.

This is due to story points. Story points are a relative, subjective effort measure that lets a team estimate how much work an item requires compared to a reference story with a fixed number of points. Story points can work as an assessment method within a team. But how are these points used? In an Agile Scrum environment, productivity is often associated with delivered story points, expressed as velocity. The problem is that story points are not standardized, so productivity based on story points means nothing outside the team itself. Even within a team, story point deflation is always lurking.
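To make that scale problem concrete, here is a minimal sketch (with hypothetical numbers) of why velocity, being the sum of team-relative story points, cannot be compared across teams:

```python
def velocity(completed_story_points: list[int]) -> int:
    """Velocity for one sprint: the sum of story points completed."""
    return sum(completed_story_points)

# Two teams complete the very same backlog items, but each calibrates
# points against its own reference story, so the totals differ.
team_a = velocity([3, 5, 8])    # reference story worth 1 point
team_b = velocity([8, 13, 21])  # same items on a larger point scale

print(team_a, team_b)  # 16 42 -- identical output, incomparable "productivity"
```

Because the unit itself differs per team, neither number says anything about which team actually delivered more functionality.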

Is it even possible to objectively measure productivity? This blog will show that using a ratio scale is the way to do so, as proven by IDC Metri’s years of helping clients turn around this common challenge. A ‘unit of measurement’ establishes management information, bringing answers to long-sought-after questions: which teams are performing well, which are not, and when will which functionality be ready, at what cost?

If you want to use productivity to compare teams, departments, organizations, suppliers, or the market, a standard measure of output is a necessity. Even when the data only covers trends within your own teams, this insight creates a unified and common view.

For years IDC Metri has been offering function points to give clients this factual view. Function point analysis was developed in the 1970s to determine the productivity of development teams when doing so by counting lines of code proved impossible. Because function point analysis is independent of the technical implementation (programming language, architecture, etc.) and the development method (Waterfall, Agile, etc.), it is still relevant today and fits the solution that Agile teams and management need to resolve the challenges that story points create. In short, function points are the de facto standard for expressing the amount of functionality in a standardized size unit.

Several manual standards are available and one international ISO standard is available for automated function point analysis: ‘Automated Function Points (AFP)’. IDC Metri prefers to use automated measuring of functional size but also employs certified analysts who can manually measure when automated measuring is not possible for whatever reason.  

To measure the size of the output of a team, it is also important to not only look at the added functionality but also at the changed and removed functionality. IDC Metri uses automated measuring of ‘Enhancement Function Points (EFP)’ to measure how much functionality has been added, changed and/or removed during a sprint, release or project. This gives the ‘Project Size’ in EFP, a standardized method to measure the output of a sprint or release. 
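The idea behind an EFP project size can be sketched as follows; the field names and numbers are illustrative, not the formal NESMA/ISO counting rules:

```python
from dataclasses import dataclass

@dataclass
class SprintMeasurement:
    added_fp: float    # new functionality, in function points
    changed_fp: float  # modified functionality
    deleted_fp: float  # removed functionality

    @property
    def project_size(self) -> float:
        # All three kinds of work count as delivered output.
        return self.added_fp + self.changed_fp + self.deleted_fp

sprint = SprintMeasurement(added_fp=24, changed_fp=10, deleted_fp=3)
print(sprint.project_size)  # 37.0 function points for this sprint
```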

While Agile is hard to measure and manage for full value, IDC Metri’s proven approach of using function points transforms a team-driven, fast-moving, rapid-iteration process that evaluates progress with qualitative measures into something that can be quantified and predicted.

Looking to learn more about Objective Measurement vs. Story Points? Download the IDC Metri eBook How to Measure Business Value for Agile Teams.

Contact Us

Interested in learning more about IDC Metri?

How to Measure Business Value for Agile Teams

Webinar | On-Demand

IT teams that have adopted Agile are benefitting from faster delivery cycles and more flexible systems. Despite the improved capabilities that Agile brings enterprises, IDC Metri, the leader in helping organizations realize the full value of their IT functions, has found that most companies would see a significant jump in performance if they applied quantitative assessment techniques to manage Agile efforts.

Watch the webinar to learn more about these processes and how easy it is to implement them while receiving strategic guidance around:

  • How to gain predictability, control and visibility into high-profile Agile projects
  • How monitoring and assessing Agile teams can control budgets, increase delivery speed and quality and ensure a minimum viable product
  • How building quantitative performance metrics into supplier contracts delivers improved cost, quality and performance from third-party development partners

When scaling and delivering Agile practices is vital to business success, you don’t want to miss IDC Metri’s insights on how to better impact your bottom line by realizing the full value of Agile development.

Agile Value: How Can You Manage What You Can’t Measure

IT Benchmarking

Agile development promises faster, more responsive development. This aligns better with the transformation of organizations as they face heightened, more competitive environments. Driven by market and technology changes, organizations are restructuring themselves and their products and services to be more agile and opportunistic in the face of market change. Agile should be suited to delivering this responsiveness when building and supplying technology capabilities to transforming organizations.

But, frequently, it isn’t.  

By its nature, Agile can and should be a major enabler supporting these changes, but many organizations find it difficult to manage and extract this value. This is due to the challenges in measuring productivity, quality, performance, and forecasting delivery. It’s hard to manage what can’t easily be measured.  

Why is Agile hard to measure and harvest value from?  

Waterfall and other goal- or milestone-focused development methodologies are structured with clear definitions of project phases (requirements gathering, sequential development, codified dev-test-QA-production flows) and milestones. Agile is more fluid. Agile measures productivity with a qualitative measure, Story Points, which makes cross-team productivity comparisons difficult. Agile values individuals and interactions getting it done over process. It drives to create working code (moving quickly) while back-seating documentation. By involving the customer closely in the development process, it is more responsive and adaptable, at the risk of an increasing backlog and expanding scope and requirements.

While Agile is well suited for delivering capabilities in a modern, competitive landscape, getting that value is hard, but not impossible. Typically, organizations struggle in three areas of Agile value: 

  1. Predictable delivery of capabilities (reliable productivity) 
  2. Quality 
  3. Cost to performance, including with service providers 

IDC Metri’s Agile Value Management product addresses these management challenges by assessing Agile development efforts across team and product performance categories. Key team factors assessed are productivity, cost efficiency, delivery speed, and quality. For product quality, we evaluate robustness, efficiency, security, changeability, transferability, and technical debt. Further, it allows for benchmarking team performance against other teams within an organization and against market peers. These assessments filter up into management dashboards, which help identify trends, and engineering dashboards that drill into specific recommendations and remediation.

Predictable delivery of capabilities 

With the Agile framework being structured around sprints (typically two-week cycles of refactoring, back-log attack, development, and just-in-time requirements gathering), Story Points for goals, and velocity (Story Point clearing) for progress, it’s hard for organizations to translate these measures to more traditional measures of progress. A lot of motion and momentum is demonstrated, but how this leads to predictable delivery of capabilities is elusive. To address this, IDC Metri uses a proven methodology for assessing progress and ensuring predictability—automated and enhanced function point analysis (FPA).   

The IDC Metri Agile Value Management (AVM) solution assesses a development team’s progress using both enhanced and automated function point analysis. FPA delivers a concrete assessment of size delivered (value) and enables comparison of productivity across teams and benchmarking against industry peers. AVM provides management with progress measurement dashboards for productivity and delivery speed. To measure and assess these, IDC Metri uses the functional output a team has delivered in a certain timeframe, leveraging the NESMA standard of functional size added + changed + deleted. For automatic measurement of functional size, IDC Metri measures according to the ISO 19515 standards of Automated Function Points (AFP) and Enhancement Function Points (EFP). This data is presented in a fashion that gives managers transparency into progress against goals and the ability to understand and predict capability delivery.
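Once output is expressed in function points, the productivity and delivery-speed figures on such dashboards reduce to simple ratios. A hedged sketch with illustrative numbers (not IDC Metri's formulas):

```python
def productivity(fp_delivered: float, hours_spent: float) -> float:
    """Function points delivered per hour of team effort."""
    return fp_delivered / hours_spent

def delivery_speed(fp_delivered: float, calendar_days: float) -> float:
    """Function points delivered per calendar day."""
    return fp_delivered / calendar_days

# A sprint delivering 42 FP (added + changed + deleted) in 400 team-hours
# over a two-week (14-day) cycle:
print(productivity(42, 400))   # 0.105 FP per hour
print(delivery_speed(42, 14))  # 3.0 FP per day
```

Because the numerator is a standardized unit, these ratios can be compared across teams and benchmarked against peers, which velocity cannot.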


Quality

Ensuring predictable or efficient development only matters if the product being produced is of the quality (stability, security, efficiency, etc.) necessary to meet the business goals. For this reason, it is important to balance performance measures of the team with quality measures of the code. We don’t want measures and goals for performance to have the unintended consequence of driving down quality.

AVM provides source code analysis. This analysis provides ongoing assessment of trends in team quality over time and highlights key areas of deficit. With this analysis, an Engineering dashboard is created showing the (critical) violations found, why they are violations, where they are found, and how to solve them. The most critical ones are put on an action plan. This data is also presented in an easily digestible fashion for managers responsible for ensuring product quality.

The Engineering dashboards clearly identify poor code and critical violations (CVEs), allowing the development team to address quality issues better and more rapidly. When teams adopt the guidance from the Engineering dashboard, overall development practices improve. Quality and performance improve thanks to lower testing effort resulting from enhanced coding practices. Improving practices and identifying better practices also reduces team stress and enables recently onboarded team members to become productive more rapidly.

Cost to performance 

Sourced Agile development projects are typically time and materials (T&M), which shifts budget risk from the sourcing vendor to the buyer. Previously, development projects were typically fixed price, with risk (especially financial) weighted towards the sourcing vendor. Similarly, even with internal projects, budgeting and cost were more predictable, due to the structure and predictable nature of methodologies like Waterfall.

AVM, by putting measurable, traceable and consistent metrics around development, helps make cost management and cost efficiency easier and transparent. Also, by providing benchmarking within an organization and against peers, a client has the context to understand the competitive meaning of these assessments (i.e., is my team underperforming in my industry in the cost/performance ratio for development?). Further, by assessing sourcing vendor current performance versus cost, goals can be set and measured consistently, over time, for assessment. AVM, with its combination of market benchmarking for services and concrete performance metrics, benchmarks the sourcing vendor performance against market peers. This enables buyers to determine whether the service capabilities they procured are delivered competitively to other vendors in the market. Furthermore, it gives leverage to the buyer in ensuring that a T&M development contract is performing at a minimum to market peers, i.e., that the buyer is not over-paying for the quality and productivity of the development they receive.  
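As a sketch of the underlying arithmetic (figures are hypothetical, not a client benchmark), cost/performance benchmarking boils down to comparing a team's cost per function point against a market reference:

```python
def cost_per_fp(total_cost: float, fp_delivered: float) -> float:
    """Cost efficiency: money spent per function point delivered."""
    return total_cost / fp_delivered

def gap_vs_market(team_value: float, market_value: float) -> float:
    """Relative gap; positive means more expensive than the market."""
    return (team_value - market_value) / market_value

team_rate = cost_per_fp(120_000, 100)  # 1200.0 per FP
market_rate = 1_000.0                  # assumed market benchmark per FP

print(gap_vs_market(team_rate, market_rate))  # 0.2 -> paying 20% above market
```

The same comparison gives a buyer a concrete, contractable target for a T&M supplier: the gap versus the market peer rate.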

Supplier Improvement Actions

A client example illustrates this. The client company nearshored application development and maintenance. They were concerned they were paying more than the value they received. IDC Metri performed an AVM assessment demonstrating gaps in value based on the hours (cost) put into the sprints. Productivity was 30% lower and cost 22% higher than market average. Maintenance cost four times the market average. This assessment culminated in supplier improvement actions to comply with performance and product health metrics (with ongoing verification by IDC Metri).  

To rephrase an earlier observation: if you can’t easily measure something, you can’t easily manage it. AVM allows organizations to clearly understand how their Agile development teams (staff, sourced, or hybrid) perform and deliver value. It cleanly addresses three key organizational struggles around Agile: predictability, quality, and cost. It makes Agile development easy to measure and assess, which in turn enables easier and more effective management of Agile.

Want a deeper example of an organization that overcame challenges in quantifying Agile value? Read “A Management Primer: How Agile Development Teams Can Deliver Value”

A Management Primer: How Agile Development Teams Deliver Value

Organizations overcome challenges in quantifying Agile value by embracing a solution that assesses, benchmarks and course-corrects Agile development teams. How is this done? Through agile value management.

Agile development is challenging to manage and measure, especially when compared with traditional development models like Waterfall.  

Management is often uncertain when functionality will be delivered and at what cost, whether it delivers required quality, and with what inherent risks. Due to its inherent nature of using Story Points to assess progress, Agile value is challenging to quantify. 

If something’s difficult to measure, it’s hard to evaluate, manage and ensure value delivery. Through an example, this blog sets out to show that it’s possible to counter these challenges and restore balance between the relationship of Agile teams and management. 

Imagine this common scenario: A major client-facing application that digitally transforms the company is a year into development, with delivery postponed every quarter. Sprints are chewing through the backlog, but it keeps refilling with refactoring, bugs, and newly discovered requirements. Turnover on the team is creating challenges in productivity and quality, and onboarding new team members takes too long, which impacts overall productivity.

Management keeps receiving progress planning updates that show a smooth, Agile development machine with sprints and stories paced to deliver, but then the budget keeps increasing and the delivery date shifts to the right. Management is struggling to reconcile the reports versus actual cost, productivity, quality, and deadline progress. They’ve replaced the project manager once, but are frustrated by conflicting messages and are distrustful of quality, timely progress.  

The development team is confused, and team members are leaving the project (and the company) due to management pressure to correct a project that the team feels is moving through the Agile process effectively. They feel that “management” doesn’t understand how Agile works and its value proposition; coming from Waterfall project management, they believe, management simply doesn’t get Agile.

The development team also feels hampered by staff turnover, resulting in lost productivity, institutional memory and decreased quality. If management got off their backs and let them work, they could deliver a successful product.  

Behind management’s interest in seeing the product done and delivered is that they see it as key to their digital transformation, and a competitive and defensive necessity versus their market peers. If Agile is so much better than Waterfall, why are costs, duration and quality spiraling out of original forecasts? At least with Waterfall you knew what the milestones were and could prioritize deliverables. With Agile, it’s all about backlog and Story Points.  

Agile value management brings worth to teams and boosts productivity

Management eventually lost faith in the development process to deliver the application in the competitive timeframe needed, with a minimum viable product and a semblance of cost control. They felt they had lost their grip on a project that had gotten out of hand with no end in sight. Both groups needed a concrete, measurable process that allows all parties to objectively and consistently assess team performance, product health, and quality. They needed a dependable way to visualize progress (productivity, cost, delivery speed, quality, security, maintainability).

The company previously subscribed to research from IDC and heard about the IDC Metri Agile Value Management (AVM) solution that assesses, benchmarks and course-corrects Agile development teams. AVM is a systematic, quantitative, comprehensive, data-driven, and repeatable solution, based on ISO standards, and has demonstrated quantifiable value to clients.  

The IDC Metri AVM solutions team began the process of breaking down the product development process and product.  

Measures of value creation occurred across multiple sprints. Enhanced function point analysis, based on ISO standards, was used to assess progress and efficiency rather than Story Points. Source code analysis assessed code quality and security to reduce refactoring and the number of CVEs. This analysis not only identified the gaps but provided specific, tactical recommendations on how to address them. Further, the benchmarking team gauged performance to understand how competitive the team was versus market peers.

After six weeks of work, management and team received dashboards and analysis that clearly identified gaps and remediation.  

The engineering dashboards clearly identified poor code and CVEs, allowing the development team to address quality issues better and more rapidly. As they adopted the guidance from the engineering dashboard, the development team found their practices improving. Quality and performance improved due to lower testing effort attributed to enhanced coding practices. Improving practices and identifying better practices also reduced team stress and enabled newly onboarded team members to become productive more rapidly.

The benchmarking let management identify areas in which to invest and realize team improvement. The management dashboard gave managers the opportunity to check key project factors and to notice and correct negative performance trends.

Based on the provided forecasting assessment, management realized the development team wasn’t ready for the required business delivery milestones, even if it performed at the level of its peers. So management made the decision to augment the team with third-party staff to ensure success.

Both the development team and management needed the same thing – a common, agreed-upon definition of “reality”. They started off with two different frames of reference. For the development team, the frame was backlog, sprints, velocity, and Story Points. For management, it was cost, time, quality, and milestones. The two groups were speaking different languages and couldn’t reconcile. By bringing in AVM, management and the development team had one frame of reference everyone could agree on. They had a complete analysis that brought together technical and business needs.

The team had one language from which they could understand development health, address gaps and deliver agile value. 

Interested in more content around delivering agile value? Join us on the Agile Value: How Can You Manage What You Can’t Measure webinar on January 12.

A good estimate is expensive


IT projects are expensive and have a tendency to fail or to become more expensive. That is a persistent image, and according to long-standing research, it is true in about 40% of cases. A growing number of board members are demanding proper substantiation of the investment in new IT. That sounds like good news. Yet too often we hear from board members that, in their experience, a substantiated estimate is very expensive and the number of failed initiatives does not decrease. What is causing this?

For a complex project a lot of experts are required

Many IT projects are realized within an existing application portfolio, so a lot of different expertise is required to shape a good solution. Quite often, the mistake is to rely on these subject matter experts to draw up the project estimate. Drawing up a total estimate for an IT project with subject matter experts requires a lead time of at least six weeks. This approach also requires a good overview of the project to determine whether all parts have been estimated without overlap.

Metri’s research shows that this approach leads to an estimate that is 30% too optimistic on average in both cost and lead time. Subject matter expertise is no guarantee for a good estimate. Drawing up a good estimate is an expertise in itself. This requires sufficient knowledge to understand the architecture of a complex IT project, but above all knowledge of how this translates into an estimate. A good estimate is more than the sum of a number of separate activities. By using the right expertise, a higher quality estimate can be made in less lead time.
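One simple reading of that 30% finding, sketched below with illustrative numbers only: if expert estimates come in roughly 30% below actuals on average, an expectation for the actual outcome is the estimate divided by (1 - bias).

```python
def bias_corrected(estimate: float, optimism_bias: float = 0.30) -> float:
    """Scale up an estimate assumed to run `optimism_bias` below actuals."""
    return estimate / (1.0 - optimism_bias)

# An expert estimate of 700,000 then implies an expected actual near 1,000,000.
print(round(bias_corrected(700_000)))  # 1000000
```

This is only a crude average correction; the point of dedicated estimation expertise is to remove the bias at the source rather than inflate the result afterwards.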

Every project is unique

The most often heard reason to rely on subject matter experts is that every project is unique and therefore requires subject matter knowledge to estimate. Uniqueness is what distinguishes a project from a recurring activity. But unique projects still obey laws of their own. As a result, the estimate of a project is more than the sum of a number of separate activities. These activities must be planned, coordinated, and controlled. Partial products must be integrated, tested, and accepted. Estimation research has been conducted for over four decades, and a number of patterns have emerged that still apply to the way projects are executed today.

External expertise is very expensive

Not every organization that executes IT projects has in-house estimation expertise. This is understandable, because today every organization uses IT facilities and almost every organization executes one or more IT projects. If you want a proper estimate, you need to hire external expertise. That requires actual out-of-pocket spending. Sometimes as much as 1% of the total project budget. That seems like a lot of money to some organizations. But is it really? What is the real cost of having subject matter experts spend six weeks putting an estimate together? This cost is often not directly visible, because the subject matter experts are working on the project anyway. If you make this cost transparent, an (external) estimation expert is not that expensive.

A bad estimate is more expensive

Moreover, an estimate put together by subject matter experts carries a hidden risk: a low-quality estimate means additional costs for the project. It may sound strange, but a bad estimate simply costs money.

When an estimate is too optimistic, costs will increase exponentially. For example, activities may have to be executed in parallel that would be better executed sequentially. Or people have to be added to the project to meet deadlines. That is relatively expensive and gives little return on investment.

But a too-pessimistic estimate also costs money, although that usually remains hidden. When an estimate is too generous, the extra time or money is usually spent, for example on building extra features that are not really necessary. The project could have been executed for less money or in less time with the same value for your organization.

Interested in what a good estimate can bring to your organization? Please feel free to contact us. An exploratory meeting is completely free of charge.

More information

Interested in learning more about IDC Metri? Let's schedule an appointment for an introductory meeting.