Think You’re Achieving Full Cloud Cost Savings With Tools? Think Again.

IT Benchmarking

How two companies partnered with a service for true end-to-end process support that delivers full cloud cost savings

According to IDC research, 48% of enterprises plan to keep spending steadily on cloud, making cloud costs a focus for IT leaders. And the majority of organizations believe they’re overspending on cloud.

The first blog in this two-part series described how tools can create opportunities to reduce costs. Many organizations know they need to improve their cloud spending habits, but the process of getting there often seems exorbitant, so they end up disregarding the changes needed to turn their cloud spending around. This blog aims to show that the time and resources involved in execution shouldn’t deter companies from making the necessary changes.

From insights to process, these two companies found that hiring a partner to guide them through the work of transforming their cloud costs, in ways tailored to their needs, made all the difference: it ensured they not only followed through on a plan of action but also achieved a successful outcome.

An international telecommunications company has migrated its entire infrastructure to the public cloud (AWS and Azure) and works through a broker for AWS and Microsoft that performs contract management and basic security services. For both providers, a system integrator (SI) has been contracted to provide managed services (IM and TAM) on top of the cloud providers.

During the migration, cloud costs rose above the budgets that had been set based on the SIs’ advice. The SIs focused on project deadlines rather than on optimizing and saving on what was already running in the cloud. The telecommunications company turned to IDC Metri for independent advice on cloud cost savings.

IDC Metri helped improve tooling and define processes and ways of working so that this telecommunications company can analyze and manage cloud costs itself. IT leaders can learn from this experience that recommendations from tools, including those from cloud providers, aren’t always realistic. They tend to be opportunistic, for example suggesting that all instances be reserved for three years and that this will save over 50% of costs for those instances. That assumes your IT landscape will remain unchanged throughout that period, which is simply not true.
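A back-of-the-envelope sketch shows why such a recommendation overreaches. All prices and the survival rate below are hypothetical illustrations, not real cloud rates: the point is only that a committed reservation keeps costing money even after a workload is retired, while on-demand spending stops.

```python
# Sketch: why "reserve all instances for three years" is opportunistic.
# All figures are hypothetical, not real cloud pricing.

hours_3y = 3 * 365 * 24
od_rate, ri_rate = 0.10, 0.05   # hypothetical $/hour: on-demand vs reserved
survival = 0.6                  # fraction of instances still needed, unchanged, after 3 years

headline_savings = 1 - ri_rate / od_rate           # the 50% the tool promises
on_demand_spend = od_rate * hours_3y * survival    # on demand, spending stops when workloads retire
reserved_spend = ri_rate * hours_3y                # a reservation is committed regardless

effective_savings = 1 - reserved_spend / on_demand_spend
print(f"headline {headline_savings:.0%}, effective {effective_savings:.0%}")
# headline 50%, effective 17%
```

Under these assumptions the headline 50% shrinks to about 17%, and if fewer than half of today’s instances survive unchanged, the reservation loses money outright.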

PostNL was one of the first listed companies in the Netherlands to go ‘all in’ on the public cloud, starting in 2012. Today, PostNL is in its second stage, transforming all of its bespoke applications from IaaS to PaaS solutions such as BI/analytics platforms, container platforms, and serverless computing. Compared with IaaS, PaaS price models are based more on usage than on capacity. Saving costs on usage-priced services means optimizing the software rather than the underlying infrastructure.

Unlike the anonymous international telecommunications company in our example, PostNL has no SIs in between itself and the cloud providers offering managed services. The application teams, mostly DevOps-based, manage the cloud infrastructure themselves. Here too, cloud costs showed an upward trend, and PostNL asked IDC Metri to help bend that curve.

IDC Metri made recommendations, though these were far less supported by tools, since tools focus on IaaS rather than PaaS. With the ten teams accounting for the most costs, alignment was reached on savings, which led to savings of about 8%. Worth knowing here is that large-scale optimizations, such as applying savings plans, had already been carried out by PostNL itself. The savings IDC Metri helped achieve were found more in architecture and licenses.

In conclusion, tools that generate recommendations are only the starting point for achieving savings. First of all, the recommendations need to be taken with a grain of salt, since they tend to be rather opportunistic. Furthermore, a list of recommendations is one thing; actually achieving savings, overcoming indifference or even resistance along the way, is another. IDC Metri supports the full process, from analyzing costs through setting up processes to actually achieving savings.

Can’t wait until the next blog is published to learn more about cutting cloud costs? Contact us to schedule a conversation. 

How Agile Development Teams Can Resolve Agile Measurement Challenges With Function Point Analysis

IT Benchmarking

Agile development empowers teams with many benefits but also presents challenges around managing and measuring its effectiveness. The way to resolve these is Function Point Analysis.

From business impediment to business enabler, IT development has come a long way since Agile became the favored practice. Now empowered with speed and responsiveness, organizations have left the days of slow, cumbersome, inflexible, and unresponsive practices behind. They can support business needs and align with changing business environments better than ever before.

It’s easy to understand why Agile is experiencing a strong increase in adoption: as companies become more nimble in the face of digital transformation pressures, IT development can respond aggressively to evolving competitors and exploit markets more easily. But these benefits are rivaled by frustrations on the management side of Agile teams. The nature of Agile means that IT has lost visibility and scope control while the business has lost predictability. Agile may make teams fast and responsive, but businesses don’t know when projects will be delivered, and the quality of delivery is often poor.

This is due to story points. A story point is a relative and subjective effort measure that allows teams to estimate how much work an item requires compared to a reference story with a fixed number of points. Story points can be used as an assessment method within a team. But how do these points come about? In an Agile Scrum environment, productivity is often associated with delivered story points, often expressed as velocity. The problem is that story points are not standardized, and productivity based on story points means nothing outside the team itself. Even within a team, story point deflation is always lurking.
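A tiny sketch makes the non-comparability concrete. The teams, backlog, and point values below are hypothetical: two teams deliver exactly the same work, but because each calibrates against its own reference story, their velocities differ wildly.

```python
# Each delivered item's effort, as a multiple of one 'unit' of real work.
backlog = [1, 2, 2, 4]

def velocity(backlog, reference_points):
    """Story points = relative effort x the points of the team's reference story."""
    return sum(item * reference_points for item in backlog)

# Identical delivered output, scored against different reference stories:
team_a = velocity(backlog, reference_points=2)
team_b = velocity(backlog, reference_points=5)
print(team_a, team_b)  # 18 45 -- 'productivity' differs 2.5x for the same work
```

Comparing 18 against 45 says nothing about which team produced more; it only reflects each team’s private calibration.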

Is it even possible to objectively measure productivity? This blog will show that using a ratio scale is the way to do so, as proven by IDC Metri’s years of helping clients turn this common challenge around. Management information can be established through a ‘unit of measurement’, answering long-sought-after questions such as: which teams are performing well, which are not, and when will which functionality be ready, at what cost?

If you want to use productivity to compare teams, departments, organizations, and/or suppliers, or to benchmark against the market, a standard measure of output is a necessity. Even when the data only concerns trends within your own teams, this insight creates a unified and common view.

For years, IDC Metri has offered function points to give clients this factual view. Function point analysis was developed in the 1970s to determine the productivity of development teams when counting lines of code could not do the job. Because function point analysis is independent of the technical implementation (programming language, architecture, etc.) and the development method (waterfall, Agile, etc.), it remains relevant today and fits the solution that Agile teams and management need to resolve the challenges story points create. In short, function points are the de facto standard for expressing the amount of functionality in a standardized size unit.

Several manual standards are available, and one international ISO standard exists for automated function point analysis: ‘Automated Function Points (AFP)’. IDC Metri prefers automated measurement of functional size but also employs certified analysts who can measure manually when automated measurement is not possible for whatever reason.

To measure the size of a team’s output, it is important to look not only at added functionality but also at changed and removed functionality. IDC Metri uses automated measurement of ‘Enhancement Function Points (EFP)’ to determine how much functionality has been added, changed, and/or removed during a sprint, release, or project. This gives the ‘Project Size’ in EFP, a standardized way to measure the output of a sprint or release.
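As a minimal sketch of the idea, project size sums the three kinds of work. The counts are hypothetical, and real EFP counting follows the published standards, which may weight the categories differently; this only illustrates why all three contribute to output.

```python
from dataclasses import dataclass

@dataclass
class SprintDelta:
    added_fp: float    # new functionality, in function points
    changed_fp: float  # modified functionality
    removed_fp: float  # deleted functionality

    @property
    def project_size(self) -> float:
        # Added, changed, and removed work all count toward the team's output.
        return self.added_fp + self.changed_fp + self.removed_fp

sprint = SprintDelta(added_fp=34, changed_fp=12, removed_fp=5)
print(sprint.project_size)  # 51
```

Unlike story points, the same sprint measured this way yields the same size no matter which team, tool, or technology produced it.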

While Agile is hard to measure and manage for full value, IDC Metri’s proven approach of using function points transforms a team-driven, fast-moving, rapid-iteration process that evaluates progress on qualitative measures into something that can be quantified and predicted.

Looking to learn more about Objective Measurement vs. Story Points? Download the IDC Metri eBook How to Measure Business Value for Agile Teams.


IT’s New Year’s Resolution For Cloud Cost Savings

This IDC Metri blog is the first of a two-part series that brings IT to a reckoning about the shortcomings that often cause teams to give up on the efforts needed to cut wasteful cloud spending.

The start of the new year brings many people closer to realizing ways they can improve; perhaps it’s eating better, or fitting in more time with family and friends. There might also be professional resolutions, such as meeting more regularly with your boss or connecting with colleagues outside your department. For IT, cutting back on wasted cloud spending is often high on the list but tends to eventually fall through the cracks, with no resolution to this pattern.

According to Forbes, executives estimate that 30% of their cloud spending is wasted, yet at the same time enterprises intend to spend even more on cloud services. Clearly, wasteful cloud spending is a recognized yet growing problem that for many continues to go unresolved. As this blog will show, where IT leaders fall short is not in identifying areas of spending that can be improved, but in implementing a plan of action for cost savings and maintaining it.

To elaborate on cloud costs: many tools are available from cloud providers and third parties that provide reports and dashboards, and even recommendations about which instances can be removed, reduced, or enlarged (rightsizing). Tools that provide intelligence can also determine how to use discount options (reserved instances, savings plans, reserved capacity, etc.), how to handle licenses smartly, and which application architecture changes can save costs. And instances can be disabled when not in use.

In summary, these resources provide insight, but knowledge of your spending is only as useful as what you do with it. How you act will determine how effective you are at plugging the holes in your spending.

Because of the effort that’s needed, it’s common for IT to plug their holes with patches. Take, for example, disabling instances outside working hours. In theory this is an excellent saving, but instances are part of applications, which in turn are part of chains. It may well be that data exchange takes place in a chain outside working hours. Test teams approaching a deadline may also need their environment outside the pre-planned working hours. And if environments are used in the management chain, they must remain available after hours in case of an emergency. Overall, saving is easier said than done, mainly because it takes work to get there.
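Those exceptions have to be made explicit before any after-hours shutdown is safe. A minimal sketch of such an eligibility check, assuming hypothetical tag names and office hours; in practice the tags would come from your cloud provider’s tagging and scheduling facilities:

```python
def may_stop(tags: dict, hour: int) -> bool:
    """Decide whether an instance may be stopped at the given hour (0-23)."""
    if 7 <= hour < 19:
        return False   # office hours: leave everything running
    if tags.get("chain-batch") == "true":
        return False   # nightly data exchange elsewhere in the chain
    if tags.get("deadline-override") == "true":
        return False   # test team working toward a deadline
    if tags.get("emergency-standby") == "true":
        return False   # management chain: must be available after hours
    return True

print(may_stop({"chain-batch": "true"}, hour=2))  # False
print(may_stop({}, hour=2))                       # True
```

The code itself is trivial; the work the blog describes is in discovering, agreeing on, and maintaining those exception tags with the teams involved.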

Rightsizing is also more difficult than it seems. Users and administrators are often hesitant about removing capacity: users see their performance decrease, and administrators see a risk of more failures because there is less overcapacity to absorb issues. In the latter case, you must carefully analyze where these issues come from; a mediocre application can benefit from more capacity, but that is not a long-term solution. Remember, if the roof leaks, you can replace the bucket that collects the water with a larger tub, but that too will become full at some point. You’ll eventually need to repair the roof.
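One way to address both concerns is to downsize only when the observed peak load still leaves a safety margin on the smaller size. A sketch with hypothetical thresholds, not a definitive rightsizing rule:

```python
def can_downsize(peak_utilization: float, current_vcpus: int,
                 target_vcpus: int, headroom: float = 0.30) -> bool:
    """peak_utilization: highest CPU fraction (0..1) observed on the current size."""
    peak_load = peak_utilization * current_vcpus   # absolute peak demand in vCPUs
    needed = peak_load * (1 + headroom)            # keep spare capacity for spikes
    return needed <= target_vcpus

print(can_downsize(0.35, current_vcpus=8, target_vcpus=4))  # True  (2.8 * 1.3 = 3.64)
print(can_downsize(0.70, current_vcpus=8, target_vcpus=4))  # False (5.6 * 1.3 = 7.28)
```

Making the headroom explicit gives users and administrators something concrete to agree on, instead of arguing from gut feeling about performance and failure risk.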

Ultimately, you’ll have to move towards an entirely new approach in which you not only have insight into the costs, but also involve users and administrators, so that you can make the right decisions about saving on your cloud costs. This isn’t as daunting or unattainable as it sounds. In our next blog we’ll reveal how some IDC Metri Cloud Economics clients have transformed their cloud spending, so you can see how to get there too.  

Can’t wait until the next blog is published to learn more about cutting cloud costs?


Financial savings for beginning CIOs

Businesses need to understand the quality of the code in their applications if they are to avoid unnecessary costs and glitches.

Whether a multibillion-pound financial firm or an online training company, software is the lifeblood of an organisation. If core applications don’t work, business stops.

The move to digital business models puts pressure on software development teams to create and adapt software more quickly and at a lower cost, which in turn increases the risk of things going wrong. But while board-level directors might have blank faces when IT approaches them about the importance of understanding software code quality, there is hope for CIOs seeking extra resources to improve software quality. The ability to use metrics to back up anecdotal evidence will help CIOs win the argument.

At an event hosted by Cast, a company that helps its customers understand software quality, IT leaders from companies including a US financial services giant, Sony Pictures Entertainment, ING, Sita, and Dutch training and education organisation NCOI, described the benefits they have reaped by gaining visibility of the software code that underpins their business.

A recent Cast survey of 500 developers across four countries suggested that a third of developers are not held accountable for poor code quality. This is because businesses do not have information about it at their disposal.

It found that more than a third (37%) of developers are not graded on code quality. In France, this figure goes up to 45%, compared with 39% for Germany and the UK, and 27% for the US. Due to this lack of software intelligence across internal teams, businesses are paying for code that is below the required quality, and suppliers are not incentivised to improve.

IT downtime can be costly

The stakes are high. Toine van Eeden, CIO at NCOI, said a large proportion of customers find the company’s courses online, so if systems are down, the company’s business is, in effect, offline. A couple of hundred contact centre workers will simply be keeping seats warm as people looking for the right educational course go elsewhere. “Every hour we are offline means €100,000 in lost sales,” he said.

Understanding software quality through a software intelligence program is the answer for Cast customers. By running their code through Cast’s application intelligence platform, they can get metrics about software quality.

NCOI, a 20-year-old organisation that offers online and campus-based training and education, started using the platform about a year ago. CIO van Eeden, who is not an IT professional by training, sits on the company’s board.

Despite the company’s relative youth, it has acquired companies that date back to the 19th century, hence it has had to integrate a host of software applications, including legacy systems, from the acquisitions. “When we acquire companies, we do not keep the legacy systems, but paint them in the colour of our company,” said van Eeden.

NCOI’s big applications, which take up the most development time, are its enterprise resource planning (ERP) system and a portal that supports students and 6,000 teachers. The student and teacher portal is currently being built, while the ERP system is being replaced, so there is a lot of software development work being undertaken. All of NCOI’s development is carried out by a supplier in a nearshore location in Romania.

Streamlining software development

When van Eeden joined the company, he quickly realised that its software development processes were inefficient. “When I entered the company two years ago, we had 80 people in Romania. We only turn over €256m, so this number seemed ludicrous. But the company was making so many changes to software that it needed them. After looking deeper, it became clear that this was not being managed,” he said.

“We used to just answer the suppliers’ questions and they would build it, and we would then ask them to test it because we were too busy. Then we would often get things back and find it was not what we asked for.” But over the past 12 months, using Cast’s platform to help it understand the software it has, NCOI has been changing this. “We make sure we are more involved, to ensure we don’t provide developers with loads of irrelevant information,” said van Eeden.

“We decided only to ask for something if we really knew what we wanted, otherwise you get a lot of bouncing backwards and forwards between the business and developer. These developers cost a lot of money, so we now only ask questions if we know they can deliver.” He said many companies think the suppliers manage everything, so they end up in a situation where nobody manages it and there is no incentive for the supplier to do the best it possibly can.

In the past, if NCOI wanted to do something new it would just request more developers. The people in control at the time had lost control, said van Eeden, so the company started using metrics to see how bad it was. “I am not really a tech guy, but I like to have facts so I asked around for help with the problem.” Working with a benchmarking company, van Eeden was put in contact with Cast, which now provides NCOI with a platform-based software intelligence service. “We hand them the code and they put it in a machine that analyses everything and tells us the quality in terms of productivity, security and other things, and then a benchmarking firm takes it and compares it to the market.”

It is not just about cutting costs, but improving development productivity and code quality. In the past year, NCOI has fed code into the Cast system four times, but is moving to a contract to enable it to do so monthly to keep up with more regular software updates. “This is so we can refresh our portal every month,” said van Eeden. Ironically, since using the Cast system NCOI has been using more developers because it is doing more development. “For our core ERP application, we have doubled software development productivity,” said van Eeden. “My output doubled, and the quality in the sense of downtime and the number of bugs also improved dramatically.”

Van Eeden said he knows there have been no software outages since the company has been using the software intelligence platform, whereas previously it “didn’t even look at the robustness of systems”.

How to justify the cost of software intelligence

On the face of it, the NCOI story is a good use case. But convincing the non-technical board of a financial giant to change the firm’s software development processes is not an easy sell for CIOs.

For example, the financial services sector is heavily reliant on systems, with any downtime or security breach potentially highly damaging. But that isn’t always enough to entice the board to invest in software to measure the quality of code, according to one CIO at a large US financial services firm. He explained how his team succeeded in justifying a software intelligence investment and some of the benefits it has generated.

“If I go back to 2014 [before we measured software quality] and calculate the cost of the function points then compared to now, we have saved the company about $200m in an annual investment cycle, and that has gone straight back into capacity to build better systems and being agile as a company,” he said.

“I am a firm believer in better and faster, with lower risk, higher quality and faster output. You only get that if you invest a lot in engineering,” he said.

The company is using Cast in all of its processes, and is using metrics from the platform to justify its software development budget demands. “This is an ongoing journey, but we had a specific set of targets to meet in the business case. We were able to use the Cast numbers to convince the CFO [chief finance officer], CEO and board to prove that we had improved,” he said.

The company used Cast as part of a transformation of IT at the company, which was part of a bigger project. “As part of the big project, there was a CFO team looking at this to see if we were doing what we were supposed to [in software development]. They were looking at hundreds of metrics, so when we said software was better quality, better time to market, with more functionality and all those kind of things, they could see it was true. The close monitoring we had from the finance side generated confidence that we were moving in the right direction.”

He said when they went on to share details with the CEO and the board, they presented a more overall picture, with some anecdotal stories to help the board understand. The anecdotal evidence got them in the door; the enterprise view with the metrics came later, with the CFO team confirming the team’s findings.

As an example of anecdotal evidence, the CIO said he reminded board members that resetting the mortgage rates in a big application, which had previously taken nine months and cost $1m, was more recently done in one week for $10,000. He said taking examples like that to the board helps them understand the value.

METRI Market Analysis

https://metrigroup.com/wp-content/uploads/2014/11/MarketUpdate_website191120141.pdf