NelsonHall: Cyber Resiliency Services blog feed https://research.nelson-hall.com//sourcing-expertise/it-services/cyber-resiliency-services/?avpage-views=blog Insightful Analysis to Drive Your Cyber Strategy. NelsonHall's Cyber Resiliency Program is a dedicated service for organizations evaluating, or actively engaged in, the outsourcing of all or part of their IT security activities. <![CDATA[Accenture’s Zoran Tackles Digital Identity Failings]]>

 

NelsonHall recently visited Accenture at its Cyber Fusion Center in Washington D.C. to discuss innovations in its cyber resiliency offerings and the recent launch of its new digital identity tool, Zoran.

Failings of existing role-based access control (RBAC)

Typical identity and access management (IAM) systems control users’ access to data based on their role, i.e. their position, competency, authority and responsibility within the enterprise. It is standard best practice to keep access to systems and information to a minimum, segmenting access so that no single user, not even a C-level user, has carte blanche to traverse the organization's operations. Not only does this reduce the risk posed by a single compromised user, it also reduces the potential insider threat posed by that user.

While these IAM solutions can match user provisioning requests to a directory of employee job titles to automate much of this process, there can be a breakdown in the setup of these RBAC IAM tools, with roles defined too widely as a catch-all, which in turn reduces access segmentation. For example, if a member of your team works in the R&D department developing widget A, should they receive access to data related to widget B?

Likewise, another issue with these solutions is privilege creep, where an employee who has moved between roles or responsibilities retains permission sets from previous positions. These and many other issues leave RBAC systems ineffective, because they are implemented as a static picture of the organization’s employees at a single point in time. In addition, recertification is a time-consuming and wasteful exercise.
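To make privilege creep concrete, here is a minimal sketch in Python (the role and permission names are invented for illustration, not taken from any specific IAM product): when role assignments are purely additive, permissions accumulate across role moves unless revocation is made explicit.

```python
# Minimal RBAC sketch illustrating privilege creep.
# All role and permission names are hypothetical.

ROLE_PERMISSIONS = {
    "rd_widget_a": {"widget_a_specs", "widget_a_test_data"},
    "rd_widget_b": {"widget_b_specs", "widget_b_test_data"},
}

class User:
    def __init__(self, name):
        self.name = name
        self.permissions = set()

    def assign_role(self, role):
        # Additive assignment: nothing is ever revoked, so permissions
        # accumulate as the employee moves between roles.
        self.permissions |= ROLE_PERMISSIONS[role]

    def change_role(self, old_role, new_role):
        # The safer pattern: revoke the old role's grants first.
        self.permissions -= ROLE_PERMISSIONS[old_role]
        self.permissions |= ROLE_PERMISSIONS[new_role]

alice = User("alice")
alice.assign_role("rd_widget_a")
alice.assign_role("rd_widget_b")               # role move handled additively
print("widget_a_specs" in alice.permissions)   # True: privilege creep
```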

Enter Zoran

Accenture developed Zoran in The Dock in Dublin, a multidisciplinary research and incubation hub. It brought in five companies to discuss the problem of identity management, two of which stayed on for the full development cycle, providing data that Accenture used to build Zoran.

Zoran analyzes user access and privileges across the organization and performs data analytics to look for patterns in access, entitlements, and assignments. The trends found by this patent-pending analytics algorithm are used to generate confidence scores that determine whether users should retain those privileges. These confidence scores can then drive automatic operations such as recertification, for example when a user’s details change after a specified period of time.
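Accenture has not published the details of Zoran’s patent-pending algorithm, so the following is only a schematic sketch of the idea: score an entitlement by how common it is within the user’s peer group, auto-recertify above a threshold, and queue the rest for analyst review. The threshold, field names, and scoring rule below are all assumptions.

```python
# Illustrative only: not Zoran's actual algorithm. An entitlement held by
# nearly everyone in a user's peer group earns a high confidence score.

AUTO_RECERTIFY_THRESHOLD = 0.9  # hypothetical cut-off

def confidence(entitlement, peer_group):
    """Fraction of the peer group that also holds this entitlement."""
    if not peer_group:
        return 0.0
    holders = sum(1 for p in peer_group if entitlement in p["entitlements"])
    return holders / len(peer_group)

def triage(user, peer_group):
    """Map each entitlement to an automatic or manual decision."""
    return {
        e: ("auto-recertify"
            if confidence(e, peer_group) >= AUTO_RECERTIFY_THRESHOLD
            else "analyst-review")
        for e in user["entitlements"]
    }

team = [{"entitlements": {"crm_read", "billing_read"}} for _ in range(9)]
print(triage({"entitlements": {"crm_read", "prod_db_admin"}}, team))
# crm_read -> auto-recertify; prod_db_admin -> analyst-review
```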

Zoran is not using machine learning to continuously improve confidence scores – i.e. if, for a group of users, an entitlement is always recertified, the confidence scoring algorithm is not updated to increase the confidence score. Accenture’s reason for this is that it runs the risk of being self-perpetuating, with digital identity analysts being more likely to recertify users because the confidence score has risen.

Currently, Zoran does not store which security analyst approved which certification for which user, although Accenture is in the process of adding this feature.

Will Zoran be the silver bullet for IAM?

IAM tools have been relatively slow to develop from simple automation to an ML/AI state, and this is certainly a step in the right direction. However, there will have to be some reskilling and change management around the recertification process.

While Zoran aims to reduce the uncertainty in recertifying permissions for a user, there remains a limited risk of ‘false positive’ confidence scores that automatically recertify a user who should not be recertified, or of a security analyst certifying a user in something akin to a box-ticking exercise out of trust in the confidence score provided.

Accenture also needs to better integrate Zoran with its other technologies; for example, its work with Ripjar’s Labyrinth security data analytics platform could yield some interesting results.

NelsonHall believes tools such as Zoran, combined with more traditional IAM solutions, are likely to be the current trajectory of the IAM market, with ML further segmenting groups/roles and providing increased trust in recertification processes.

]]>
<![CDATA[Quick Takeaways from IBM European Analyst & Advisory Exchange (vlog)]]>

 

Mike Smart reports directly from IBM’s European Analyst & Advisory Exchange 2017 with some quick takeaways regarding IBM’s transition from systems integrator to services integrator, and its business resiliency services.

]]>
<![CDATA[How IT Services Vendors Can Help Organizations Meet GDPR]]>

 

In this, the second of two articles on GDPR, I look at how IT services vendors can help companies meet GDPR compliance in several areas. You can read the first article, ‘The Impact & Benefits of GDPR for Organizations’, here.

Application services

Application services can help organizations ensure that new and legacy applications meet the GDPR articles pertaining to applications: namely Article 25, which requires ‘data protection by design and by default’.

In short, application services providers should be delivering:

  • Security by design in the early stages of the SDLC
  • Gap analyses of what personal data is required, and how it is collected, processed, and handled
  • A level of security appropriate to the risk, with:
    • Encryption and/or pseudonymization of data (see the sketch after this list)
    • The ability to restore personal data in case of a breach or technical issue
    • Regular security testing of practices and solutions to ensure confidentiality, integrity, availability, and resilience
    • Data minimization efforts, using the prior gap analysis so that only the required data is collected, stored, and accessed (for example, does the organization really need to know users’ ages to provide a non-age-restricted service?)
  • Enforcement of the principle of least privilege for internal users, so that they may only access required data (for example, in a telecoms provider, a customer service agent providing technical assistance need not know clients’ full payment details and history).
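As a concrete illustration of the pseudonymization item above, here is a minimal sketch using a keyed hash (HMAC-SHA256). Under GDPR, pseudonymized data can be re-linked to an individual only via additional information, here the secret key, which must be held separately and protected; the field names below are hypothetical.

```python
# Minimal pseudonymization sketch: replace a direct identifier with a
# stable keyed pseudonym. The key must be stored separately (e.g. in a
# vault or HSM); anyone without it cannot re-identify the record.

import hashlib
import hmac

SECRET_KEY = b"store-this-key-separately"  # placeholder, not a real key

def pseudonymize(value):
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "plan": "basic"}
safe_record = {**record, "email": pseudonymize(record["email"])}

# The same input always maps to the same pseudonym, so pseudonymized
# records can still be joined for analytics without exposing the raw
# identifier.
print(safe_record["email"][:16], "...")
```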

The difficulty arises with articles of the GDPR that require organizations to provide data portability and the right to be forgotten. For data portability (i.e. the right of the user to take their data from one vendor to another), the regulation encourages data controllers to develop interoperable formats that enable portability. However, in legacy systems this data may be structured in a way that makes portability difficult.

Also, GDPR’s ‘right to be forgotten’ allows users to have their data deleted without a trace, but this has the potential to disrupt how organizations back up data, due to technological limitations and existing regulations. There are concerns that the right to be forgotten is not achievable while meeting existing regulations that require organizations to hold data for an extended period of time; for example, under MiFID II, financial institutions must record all conversations related to a financial deal for five years. In such cases GDPR’s right to be forgotten does not apply: it is superseded by the other legal requirement. Organizations in this position will need to consider carefully which data is required and which data can be safely erased.

Organizations that use data backup services also have to ensure that their backups meet GDPR requirements. Data restored from backups must be free of data that the user has requested to be erased. However, in some implementations it is technically impossible to delete individual records from backups without destroying the entire backup.
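One widely discussed mitigation, not prescribed by the GDPR text itself, is crypto-shredding: encrypt each data subject’s records under a per-user key, and honor an erasure request by destroying the key, which renders the copies inside immutable backups unreadable. A minimal sketch follows, with hypothetical names, using the third-party cryptography package.

```python
# Crypto-shredding sketch: deleting a per-user key makes that user's
# data in every backup unrecoverable without touching the backups.
# Requires the 'cryptography' package (pip install cryptography).

from cryptography.fernet import Fernet

key_store = {}          # in practice: an HSM or key management service

def store_user_data(user_id, plaintext):
    key = key_store.setdefault(user_id, Fernet.generate_key())
    return Fernet(key).encrypt(plaintext)   # this ciphertext goes to backups

def forget_user(user_id):
    key_store.pop(user_id, None)            # erasure request: destroy the key

blob = store_user_data("user42", b"date of birth: 1980-01-01")
forget_user("user42")
# The backup still holds 'blob', but without the key it cannot be decrypted.
```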

Cybersecurity

Cybersecurity vendors can help organizations meet GDPR articles that impose more stringent data security. Most of the cybersecurity services providers’ frameworks divide the act of becoming compliant into five standard operations:

  • Assessment – the vendor conducts privacy, governance, process, and data security assessments and gap analyses to identify personal data and how it is processed within the organization, and constructs roadmaps to make the organization GDPR compliant
  • Design – the vendor designs an implementation plan of the standards, controls, policies, and architecture to support the roadmap
  • Transformation – the embedding of tools, technologies, and processes
  • Operation – the execution of business processes and the management of data subject rights
  • Conform – monitoring and auditing the organization's compliance with GDPR.

Cybersecurity vendors’ incident response (IR) services will be well placed to handle cybersecurity breaches that require notification to the in-country supervisory authority. The change to incident response protocols once GDPR is enforced is the requirement to notify the authority within 72 hours. Currently, typical IR SLAs can provide off-site services within one hour, and onsite support within ~24 hours. Where no existing agreement is in place, remediation vendors are less able to commit to the 72-hour deadline and less able to guide their clients in contacting authorities. As GDPR comes into force, we can expect the number of organizations taking out IR services retainers to grow.

Other vendor initiatives

An organization need not choose a single vendor to complete all these operations. Indeed, in a number of cases vendors are being approached after the organization has assessed its current level of compliance independently, or with the help of another vendor. Managing GDPR tools and auditing compliance is expected to be rolled into existing managed security services GRC operations.

Other service providers are working to ensure that their services are GDPR compliant. Initiatives to become compliant include:

  • Cloud services providers that were previously exempt from the 1995 directive are now regulated and have been working to meet the May 2018 GDPR deadline. As most of the GDPR requirements on cloud providers are covered by ISO 27001, meeting 27001 standards will certainly help the provider demonstrate that it is working towards ‘appropriate technical and organizational measures’, as specified by GDPR
  • SaaS vendors have been mapping incoming and outgoing data flows, and how data is processed and stored, and demonstrating that they can meet users’ requirements for the right to erasure, data portability, etc.
  • ADM vendors have been performing application design services as part of an SDLC as a matter of principle for years, and will not require drastic changes beyond possibly expanding the use of pseudonymization
  • Application security vendors have been performing vulnerability and compliance testing as a core service, and have added provisions to perform GDPR gap analysis.

DPO services

A service that NelsonHall expects to grow fast is Data Protection Officer (DPO) outsourcing. The DPO role (required for data controllers and processors alike) can either be internal or outsourced (provided that the DPO can perform their duties in an independent manner and not cause a conflict of interest).

Of the vendors we have spoken to about GDPR services over the past year, none had a defined DPO outsourcing service in place, and only one (LTI) has been working towards a defined service. LTI is currently training DPOs and investigating exactly how the service should be offered. NelsonHall expects to see a number of distinct DPO offerings emerge from IT services and law firms very soon.

Not long now…

With the enforcement of GDPR less than 200 days away, and vendors’ services solidifying, organizations would do well to start evaluating the services now emerging to help them work towards compliance.

]]>
<![CDATA[The Impact & Benefits of GDPR for Organizations]]> In this, the first of two articles on GDPR, I look at how the regulation is set to impact companies, and at the benefits of compliance beyond simply avoiding penalties.

 

 

The EU's General Data Protection Regulation (GDPR) was adopted in April 2016 and comes into force on 25 May 2018. The unified and enforceable laws contained in the regulation replace the outdated rules (which could be interpreted differently by each member state) of the 1995 EU Data Protection Directive.

The regulation is of critical importance to organizations because of the steep fines that can be levied for failing to meet the requirements – up to €20m or 4% of global annual turnover for the preceding financial year (whichever is greater) for serious breaches, and €10m or 2% of turnover in less serious cases such as procedural failures.
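For illustration, the fine ceilings translate into a simple calculation, the greater of the flat amount and the percentage of global annual turnover; the sketch below just restates the thresholds above.

```python
# Worked example of the GDPR maximum-fine rule described above.

def max_gdpr_fine(turnover_eur, serious=True):
    """Greater of the flat cap and the turnover-based cap."""
    flat, pct = (20e6, 0.04) if serious else (10e6, 0.02)
    return max(flat, pct * turnover_eur)

# A firm with EUR 1bn global annual turnover:
print(max_gdpr_fine(1_000_000_000))         # 40,000,000.0 (serious breach)
print(max_gdpr_fine(1_000_000_000, False))  # 20,000,000.0 (procedural failure)
```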

It is worth noting, however, that these are maximum levels that can be imposed by the supervisory bodies within countries, and in reality fines may be much lower. The U.K. information commissioner, Elizabeth Denham, who will be leading the enforcement of GDPR in the U.K., has stated that early talk of fines at such high levels amounts to scaremongering, and that ‘issuing fines has always been, and will continue to be, a last resort’. As a proof point, in the last financial year the U.K. ICO conducted 17k investigations, of which just 16 resulted in fines.

Additionally, authorities may be even less able to handle the volume of GDPR-related cases after enforcement begins in May 2018, due to staffing levels. The U.K. ICO is comparatively well resourced, with 500 personnel, and plans to add 200 new positions over the next two years to help cope with the increasing caseload; other member states have lower headcount levels.

Hence, the indications are that strict enforcement may not happen from the outset when the regulation comes into force, and that organizations shown to be working towards meeting the regulation may be given some leeway. Nevertheless, organizations should be looking to start on the road to compliance as soon as possible.

The GDPR exercise should not be seen as one of solely checking boxes to avoid being fined, as there are a number of benefits to organizations in being compliant:

  • GDPR can be seen as a chance to review the company’s data handling processes, restructuring them not only to meet compliance, but also to identify potential efficiency gains or new business opportunities/revenue streams
  • Increasing the level of security of user data through encryption or pseudonymization will build trust with users, as breaches in the organization's cybersecurity are less likely to impact them
  • By reviewing IT processes, organizations can identify and eliminate ‘shadow IT’ and build proper processes that are known to the organization
  • It is a chance to improve IT systems and processes behind the scenes, e.g. through the implementation of customer identity and access management (CIAM) and backup systems.

 

In the second blog on GDPR, I will look at how IT services vendors can help companies meet GDPR compliance.

]]>
<![CDATA[Atos’ Use of Machine Learning for the Prescriptive SOC]]>

 

When NelsonHall spoke to Atos earlier in the year about its managed security services, there was a clear push to move clients away from reactive security to a predictive and prescriptive security environment: not only monitoring the end-to-end security of a client, but also performing analytics on how threats would affect the business and its customers. Atos’ “Security at Heart” event two weeks ago provided more information on this.

I recently blogged about IBM’s progress in applying Watson to cybersecurity; Watson ingests additional sources such as security blogs into its data lake and applies machine learning to speed up threat detection and investigation. At face value, the prescriptive SOC offering from Atos isn’t very different, in that it starts with a similar goal: use a wider set of security data sources and apply machine learning to better support clients.

With Atos’ prescriptive security approach, it has increased the amount of security data in the data lake that it analyzes. This information can come from threat intelligence feeds, contextual identity information, audit trails, full packet and DNS capture, social media, and information from the deep and dark web.

Atos highlights its ability to leverage the analytics and big data capabilities of its bullion high-end x86 servers to apply prescriptive analytics to the data in its data lake, then use the output, through McAfee’s DXL data exchange layer and threat defense lifecycle, to automate responses to security events.
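Atos has not published the internals of this pipeline, and the sketch below does not use the real OpenDXL API; it is only a schematic of the pattern described: score each event against weighted indicators of compromise, and publish an automated response to a message fabric when the score crosses a threshold. All names, weights, and topics are invented.

```python
# Schematic sketch of score-then-automate, not Atos' implementation.

IOC_WEIGHTS = {                       # hypothetical indicator weights
    "known_bad_ip": 0.6,
    "dns_tunneling_pattern": 0.3,
    "darkweb_credential_match": 0.5,
}
RESPONSE_THRESHOLD = 0.7

def score_event(event):
    """Sum the weights of recognized indicators attached to the event."""
    return sum(IOC_WEIGHTS.get(i, 0.0) for i in event.get("indicators", []))

def publish_response(topic, action):
    # Stand-in for a publish onto a DXL-style message fabric.
    print(f"publish {action} -> {topic}")

def handle(event):
    if score_event(event) >= RESPONSE_THRESHOLD:
        publish_response("/response/quarantine",
                         {"host": event["host"], "action": "isolate"})

handle({"host": "10.0.0.12",
        "indicators": ["known_bad_ip", "darkweb_credential_match"]})
# -> publish {'host': '10.0.0.12', 'action': 'isolate'} -> /response/quarantine
```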

Using this capability, Atos can reduce the number of manual actions that analysts are required to perform from 19 to 3. The benefits are clear: cyber analysts have more time to focus on applying their knowledge to secure the client, and both the speed and completeness of the service increase. Atos claims its prescriptive SOC analyzes 210 indicators of compromise compared to 6 in the previous service, reducing the time to respond to a threat from 24 hours to under seven minutes, and the time to protect against a threat from 4.2 hours to around one minute.

Atos has been beta testing its prescriptive managed security offering with several clients, mainly in the financial services sector.

Another highlight of the event was Atos’ quantum computing capabilities, with the release of its Quantum Learning Machine (QLM) quantum computing emulator. These investments in quantum computing in effect future-proof some of its cybersecurity capabilities.

The general consensus is that at-scale use of quantum computing by enterprises is still around a decade away. When it arrives, quantum computing will add a powerful weapon to threat actors’ arsenals: the ability to break current encryption methods. Atos' current investment in quantum computing, and specifically its quantum computing emulator, will help organizations develop and test today the quantum applications and algorithms of tomorrow.

]]>
<![CDATA[IBM and the Road to Cognitive Security]]> Every day, the sheer amount and complexity of cybersecurity information that security analysts are required to sift through increases. Analysts arrive at the start of each day, catch up on recent attacks, research cybersecurity news, and are thrown into analyzing security incidents: going down the rabbit hole of reviewing data flows, finding outliers, investigating IPs, searching through both internal and external structured and unstructured data sources, etc. All this work compounds the skills shortage that the cybersecurity services industry faces and the fatigue that security analysts endure.

Enter IBM’s Cognitive Security.

IBM has been training its cognitive analytics solution, Watson, to understand the ‘language’ of cybersecurity.

Watson ingests data from a massive range of material: structured data from the likes of its X-Force Exchange and partner feeds every five minutes; unstructured data such as security blogs crawled every hour; and millions of documents pulled in from across the web every 1-3 days. The data is then filtered using machine learning capabilities to remove unnecessary information, after which Watson extracts the relevant information and annotates it for security analysts.
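A simplified sketch of that tiered cadence follows (IBM's actual pipeline is far more sophisticated; the source names, fetch stub, and keyword filter here are placeholders): each tier is polled on its own schedule, and documents pass a relevance filter before annotation.

```python
# Tiered-ingestion sketch with a trivial stand-in for the ML filter.

import time
from dataclasses import dataclass

@dataclass
class SourceTier:
    name: str
    interval_s: int   # polling cadence for this tier
    last_run: float = 0.0

TIERS = [
    SourceTier("structured_feeds", 5 * 60),      # every 5 minutes
    SourceTier("security_blogs", 60 * 60),       # hourly
    SourceTier("web_documents", 2 * 86400),      # every 1-3 days
]

def fetch(source):
    return [f"{source}: report on new malware campaign"]  # placeholder

def is_relevant(doc):
    # Stand-in for ML filtering that removes unnecessary information.
    return any(k in doc.lower() for k in ("malware", "vulnerability", "exploit"))

def annotate(doc):
    print("annotated for analysts:", doc)  # extraction/annotation step

def poll_all():
    now = time.time()
    for tier in TIERS:
        if now - tier.last_run >= tier.interval_s:
            tier.last_run = now
            for doc in fetch(tier.name):
                if is_relevant(doc):
                    annotate(doc)

poll_all()  # on the first run every tier is due, so all three sources ingest
```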

Building up its knowledge corpus, Watson is able to automate the analysts’ work of searching for data, linking into QRadar. Security analysts using QRadar are provided with a set of insights on a threat, and by pressing a single button can view a knowledge graph that details the relationships between devices, threats, files, IPs, etc., and then dive into more detail.

 

Watson’s knowledge graph and threat report

 

IBM has stated that the use of this cognitive power can speed up the analysis of a security threat from one hour to less than one minute, surfacing more insights than the analyst would ever be able to find manually.

Security analysts can rate the analytics performed by Watson, and with IBM increasing the amount of information ingested into the knowledge corpus, the quality of the insights provided is improving fast. Initial feedback on the Watson for Cyber Security beta program, rolled out to 40 clients in December 2016, wasn’t completely positive, but with bi-weekly calls the quality of Watson’s results increased rapidly.

By shifting analysts’ focus away from searching through multiple data sources, these cognitive solutions are reducing the time spent on L1 and L2 activities: not only is there a shortage of cyber analysts, but the deployment of Watson for Cyber Security also makes the work less of a grind.

The difference in analyst time spent using traditional research and with Watson

 

Where else is IBM taking cognitive security?

By teaching Watson the ‘language’ of security, IBM has built two solutions that help its cyber analysts and clients interact.

The first is Security Services Virtual Analyst, launched in October 2016, which acts as a chatbot in the client’s MSS portal. The chatbot answers common questions from clients up to 50 times faster than waiting for a security analyst.

The second, Project Havyn, allows security analysts to talk directly to QRadar to perform actions more efficiently.

IBM has recently linked QRadar Advisor with Watson to Resilient Systems, the incident response solutions vendor it acquired in March 2016. With this, the Watson QRadar app can send threat information directly to Resilient’s QRadar app, giving Resilient the best information upon which to act and stop the attack. In future, it is quite possible that vendors will look to implement a cognitive solution that can both analyze threats and immediately perform remediation actions such as patching vulnerabilities.

With the use of IoT about to accelerate, and the increasing complexity and scale of cyberattacks already apparent, the importance of the use of these cognitive technologies in cybersecurity should not be underestimated.

]]>
<![CDATA[WannaCry and the Need for IT Spend on Cyber]]> Last Friday morning, the largest ransomware cyberattack to date infected an unprecedented number of machines across organizations worldwide. The ransomware, named WannaCry, demanded $300 in Bitcoin to be paid within three days, after which the ransom would double; if no payment was made within seven days, the data would be deleted forever.

 

WannaCry ransom message

 

Each time this ransomware infected a new computer, it tried to connect to a domain; if it could not reach the domain, WannaCry continued to spread. To propagate, WannaCry utilized a tool known as EternalBlue to identify and exploit file-sharing protocols on infected systems. EternalBlue is a hacking tool developed by the NSA, then stolen by a group called Shadow Brokers and dumped online in April.
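For illustration, here is a simplified reconstruction of the kill-switch decision logic only (the propagation code is deliberately omitted, and the domain below is a placeholder, not the real one). Because the malware kept spreading only when the domain was unreachable, registering the domain flipped this check for every new infection:

```python
# Kill-switch decision logic only; no propagation code.

import urllib.request

KILL_SWITCH_URL = "http://killswitch-placeholder.example/"  # not the real domain

def should_continue_spreading():
    try:
        urllib.request.urlopen(KILL_SWITCH_URL, timeout=5)
        return False  # domain reachable: kill switch registered, go dormant
    except OSError:
        return True   # domain unreachable: pre-registration behavior

if should_continue_spreading():
    pass  # in the real malware, EternalBlue-based propagation started here
```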

Microsoft had produced and released a security patch for the vulnerability (bulletin MS17-010) for its current operating systems in March, before the attack. Unfortunately for some, operating systems that had been EoL’d before the attack, namely Windows XP, did not initially receive a security patch, as Microsoft usually charges to provide custom support agreements for old versions of Windows.

Organizations hit that had in part remained on XP include Telefónica, the NHS, FedEx, Renault, and police and petrol stations in China.

Standard advice for dealing with ransomware is to:

  • Isolate the system
  • Power off
  • Secure any data backups
  • Contact law enforcement
  • Change account passwords.

The FBI had previously released a controversial statement saying it often advises people to pay the ransom, though it does state that paying is no guarantee of getting data back.

The trouble with any advice to pay WannaCry lies in its mechanism for releasing infected systems. WannaCry has no process to uniquely identify which infected machines have paid the ransom, so the likelihood of any infected machine being released by the attacker is low. Nevertheless, the hackers’ bitcoin wallets have received more than 230 payments totaling more than $65k.

So what can be done about WannaCry and other similar ransomware?

After two days, Microsoft released a patch for Windows XP to fix the vulnerability.

Before this, the attack had been slowed by a security researcher who analyzed WannaCry’s code and spotted the kill-switch domain it had been attempting to connect to. By registering this domain (for less than $11), the researcher ensured that newly infected systems received the kill-switch response and did not go on to spread the ransomware. Since then, however, a second version of WannaCry has been released without this kill switch.

Organizations should be looking towards their IT service providers to mitigate the threat.

In the immediate timeframe, clients should look to their service providers to download and apply all applicable OS security patches and antivirus updates, and to assess what data their DR systems can restore.

Moving forwards, organizations should be looking with their IT service providers at:

  • Performing a cybersecurity vulnerability analysis to assess the current state of affairs, discover the organization's crown jewels, and close vulnerabilities
  • Developing business continuity plans to ensure even if/when a cyberattack occurs, the organization knows how to react and reduce the impact
  • Developing cybersecurity training programs to reduce the chance staff will download malware/ransomware.

As part of a wider conversation, if an enterprise has business-critical infrastructure that remains on outdated operating systems, it should be looking at how these systems can be secured. They could be upgraded to more current OSs or, if legacy processes or applications prevent this move, protected by other methods such as air-gapping the infrastructure or even paying for Microsoft’s extended support agreement.

In the case of the NHS, at end 2016 an incredible 90% of NHS Trusts were still using Windows XP in some capacity – yet last year the U.K. Government Digital Service decided not to extend a £5.5m one-year support deal that would have included security patches. We imagine there are some red faces at GDS: the decision not to extend this support deal has had a huge impact on parts of the NHS, in some areas causing delays to the delivery of life-saving services. There are clearly lessons to be learned in both the public and private sectors about managing old estates.

]]>
<![CDATA[Tech Mahindra’s Application Security Business Expands Offering, Targets Multi-Year Deals]]> In an earlier blog, we described how Tech Mahindra had expanded its performance engineering testing to embrace the Internet of Things (IoT), and here we take a look at how the company is handling another area of non-functional testing: software security testing.

Tech Mahindra provides security services through its Cyber Security practice, a horizontal line of business. The practice has a headcount of 650 personnel, has 85 active clients, and has several service offerings/sub-practices:

  • Consulting and GRC
  • Identity and Access Management
  • Security Operations and Monitoring (through security operations centers in Pune and Delhi)
  • Application Security.

Most contracts are small (up to $10m), with clients mostly in the U.S. and U.K., across sectors. Tech Mahindra has a larger client base in telecoms, a reflection of the company’s background in communication service provision, and it is expanding into BFSI and manufacturing.

Application Security is a significant activity for the Cyber Security practice, accounting for 25% of revenues, and with a headcount of ~120. The Application Security sub-practice is responsible for:

  • Addressing attacks that target security gaps in applications
  • Creating transactions to check data, access and privilege-based security issues
  • Addressing non-compliance to regulatory and security standards.

Application Security has several activities across the software development lifecycle, including dynamic application security testing (DAST), ethical hacking, static application security testing (SAST), and security design review. Of these, ethical hacking (e.g. manual and automated penetration testing) remains one of the services most in demand, along with related project-based activities (including code reviews, threat modeling, and application design review), plus training and ‘shift-left’ consulting.
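As an illustration of what the SAST side of this work catches, here is a classic defect pattern and its fix (a generic example, not Tech Mahindra code), using Python’s built-in sqlite3 module for brevity:

```python
# The kind of finding a static analyzer flags: SQL built by string
# concatenation is injectable; a parameterized query is not.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Flagged by SAST: attacker-controlled input concatenated into SQL.
    return conn.execute(
        "SELECT * FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats input as a literal value.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # [('alice', 'admin')] - injected
print(find_user_safe("' OR '1'='1"))    # [] - no match
```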

Penetration testing is in demand as an effective way of security testing, and also for compliance reasons; e.g. as part of quarterly audits, or by data center operations for certification purposes. Most of the applications tested are web-based applications, web sites and mobile apps.

One of the challenges faced by the Application Security sub-practice is expanding project-based testing into multi-year contracts, with TCV currently up to $10m over five years. The sub-practice argues that, contrary to functional testing activities, it does not provide a pass/fail service; rather, it continuously looks for vulnerabilities, not knowing where the next attack will come from, including finding vulnerabilities in previously tested code. For this reason, Tech Mahindra has created its Application Security Bureau offering for multi-year contracts, where delivery of application security is provided by Tech Mahindra, but with governance remaining in the hands of the client organization.

In spite of this, client demand remains very much project-based, largely constrained by budget availability. As a result, Application Security has expanded its service offerings and pricing model to accommodate clients with limited budgets. These offerings are:

  • Security Test Factory (where the client buys services from a service catalog, and where teams are provided on a flexi model)
  • Security Liaison (where one of Tech Mahindra’s consultants acts as the interface between the businesses and IT, to drive understanding and coordination among stakeholders).

Security Test Factory is a very successful offering and captures 80% of spending among Application Security’s multi-year contracts.

Tech Mahindra’s Application Security sub-practice remains optimistic about the potential for multi-year contracts, with security having become a top priority for client organizations. Lack of skills is another driver, as is lack of knowledge of the relevant hacking tools. Application Security points out that clients tend to use traditional enterprise software products for ethical hacking, yet its research shows that attacks are carried out by hackers using software found on the darknet, or open source software, and countering them requires skills that most clients do not have internally.

What is the future like for Tech Mahindra’s Application Security? Digital is obviously on the agenda. The sub-practice has launched two digital flavors of its Security Test Factory: DevOps and IoT. With short development lifecycles, DevOps requires further investment in security testing automation and industrialization. Meanwhile, IoT mostly requires security assessments at the user interface level for connected devices, and remotely for sensors.

By Mike Smart and Dominique Raviart

]]>
<![CDATA[Rio 2016: How Atos is Helping the IOC Redeploy its Budget from Run to Digital]]>

Four years ago, at the time of the London 2012 Olympic and Paralympic Games, NelsonHall reported on the work Atos does for the International Olympic Committee (IOC) through its Major Events unit (see our previous commentary here). This week we visited its center in Barcelona to get an update on the work it is doing for the Rio Games, which start next month.

The Olympic Games remain a fantastic opportunity for Atos to demonstrate it can handle complexity and scale for a very visible event. The numbers are humongous: 4bn viewers, 300k accreditations, 70k volunteers, 30k media members, 10.5k athletes - and also on the IT side: an expected 1bn security alerts, 200k hours of testing, 250 servers (equivalent to 1,000 physical servers) and 80 applications.

Major Events is a relatively small unit within Atos (we estimate revenues <€100m), with activity fluctuating significantly from one year to the next in terms of headcount and revenues. Major Events has diversified its client base from the IOC to other international sporting events, including the 2015 Pan American Games in Toronto. The unit is Spain-centric for historical reasons: Atos, then SEMA Group, started servicing the IOC for the 1992 Barcelona Olympic Games, and in 2012 Atos acquired MSL Group, a Madrid-based scoring and timing company with sports domain experience.

In addition to managing scale, Atos Major Events manages uncertainty: at the time of its contract renewal (through 2024) in late 2013, the company did not know where the Olympics would take place in 2022 (Beijing) and 2024 (still TBD). The location impacts Atos significantly from a delivery perspective; e.g. for the Sochi 2014 Winter Games, Atos faced an IT labor shortage in Sochi and had to source personnel across Russia and in Russian-speaking countries (i.e. Romania and Serbia). For the 2018 PyeongChang Winter Games, Atos Major Events faces a similar challenge and will be relocating IT personnel from Seoul, 200km away. In total, the financial impact is significant (up to 20% in additional costs), all within the context of a fixed bid made eight years before the event. Nevertheless, Atos highlights that its margins on Major Events are positive.

Atos Major Events provides a full IT outsourcing service. This includes a SIAM role, working with ~30 technology partners (which it did not select, but with which it has gained years of experience working jointly). In addition to its SIAM role, Atos provides systems integration services and software products (Games Management System, including a volunteer portal, sport entries and qualifications, an accreditation service, and workforce management), as well as security services. Testing, of course, is a priority: “when we are finished testing, we start testing again”.

IOC Budget Shifting from Run Services to Digital

Reflecting a broader market evolution, the Rio Games take place in the context of shifting budgets: the IOC is looking to drive down costs on run services. IaaS (on Canopy private cloud) is part of this change, with Atos using a Canopy datacenter in Eindhoven, Netherlands for the 2018 Winter Games. The biggest savings will come from removing the need to migrate 1k physical servers to a new onshore datacenter for each Games; there is also a very significant space gain. For the Rio Games the datacenter is, of course, located on the other side of the Atlantic, and Atos Major Events will be using dedicated leased lines for critical applications.

Delivery is also changing: the company will deploy its last onsite Integration Center (mostly providing testing services) for Rio 2016. Going forward, this center will be located in Madrid. As for Canopy/IaaS, the creation of a centralized remote center in Madrid will remove equipment migration needs, and associated costs. And Atos is moving back its application management work (~25 FTEs supporting its software products) from the host city to Barcelona.

What will remain in the host city is the Technical Operating Center (TOC), a command and control center providing IT infrastructure management, service desk, project management, and security services. The TOC is significant (500 personnel from Atos, the IOC, and technology partners, over three shifts, operating 24/7 during the Games) but still needs to be onsite in the host city at this point.

The IOC is rebalancing its budgets towards digital, starting with mobility. At the London 2012 Games, just 1% of information was accessed through mobile; in Sochi, this number reached 80%! Rio will be the Games where visitors attend one competition in one venue while accessing results of another competition on their smartphones. In total, ~8bn devices will at some point during the Rio 2016 Games access information provided by Atos Major Events.

In addition to mobility, Atos Major Events is working on integration with social media, and is investing in its media player (for streaming video, audio, and data). It is also refreshing its software products to make them more user-friendly for the different communities, prioritizing the media.

What Else Will We See Next?

Digital will continue to be a priority for the IOC, extending from mobile services to wearables and IoT (and therefore big data).

Another big digital push is services to the media and broadcasting industry. Provisioning of some level of media content is part of the plans.

To some degree, Atos is leveraging Atos Major Events capabilities in other units: certainly, in security, Major Events and the Big Data & Security unit are collaborating on methodologies, common IT architectures, and security scenarios.

There is also an element of cross-selling with the usage of Atos Bull SIAM software products and Bull Hoox encrypted phones. Looking ahead, Atos is considering using software products from its Unify subsidiary.

Our understanding is that Major Events is currently largely self-contained, drawing on the wider Atos, apart from the security collaboration, mainly for sourcing talent, for instance around testing. Will we see more experience sharing from Atos Major Events to the wider Atos? As Atos focuses more and more on being an integrated firm to accelerate organic growth, this may happen. We also expect to see Major Events benefit from Atos’ investments in automation and AI over the next few years.

We would have liked to have heard more about plans around big data, analytics, AI, and content, but suspect that Atos is contractually constrained from disclosing much about these.

In summary, the Olympic Games are a wonderful opportunity for Atos to showcase its capabilities around SIAM, project management, testing and security services, and to demonstrate it successfully handles scale, complexity and uncertainty, each time in a new location, every four years.

]]>
<![CDATA[M&A Activity (Part 3): Further Scale and Digital Remain Priorities in 2016]]> In December 2015, we published two blogs about M&A activity in the IT services industry in 2015 (here are the links for Part 1 and Part 2). This blog examines M&A activity in IT services in Q1 2016 and sets our expectations for the rest of this year.

In short, 2016 started off with a bang, with two very large IT services acquisitions announced in the first quarter:

  • Leidos acquiring Lockheed Martin’s IS&GS unit for $6bn
  • NTT DATA acquiring Dell Services for $3.1bn.

By comparison, the whole of 2015 saw just one multi-billion-dollar acquisition announced: that of IGATE by Capgemini, for $4.5bn. We expect to see more large deal activity this year.

Atos and CGI Likely Bidders for Large Transactions in 2016

Among all IT services vendors, Atos and CGI are the most likely buyers: their business models are based on inorganic growth.

  • Atos has clear growth ambitions. Its net cash position (estimated by NelsonHall) is ~€200m after the Unify acquisition. The U.S. continues to be a priority, in particular Managed Services, adding to the scale brought in by Xerox ITO
  • CGI now has net debt under control (estimated at ~CDN $2.0bn) and can borrow up to CDN $1.7bn. CGI’s acquisition targets center around software IP, U.S. commercial, and U.K. commercial.

Meanwhile, three other acquisitive vendors, Leidos, NTT DATA and Capgemini, have put a temporary hold on their M&A activities. Leidos and NTT DATA will obviously focus on finalizing and integrating their acquisitions, as well as on reducing their net debt (~$3.4bn and ~$6.5bn respectively). Capgemini has lower debt (~€1.8bn) but less appetite for debt leverage than, for instance, CGI, and still needs to integrate IGATE and prove this acquisition is working. The company has denied any interest in acquiring Hexaware.

TCS, Cognizant and Infosys have the cash to make large acquisitions. TCS does not have a track record in large transactions and does not need one: it is still enjoying industry-leading growth in spite of its size ($16.3bn in revenues in calendar year 2015). Cognizant has also enjoyed industry-leading growth but appears more open to large acquisitions, even after TriZetto. For both Infosys and Wipro, inorganic growth is key to their 2020 revenue targets: Infosys’ target is $20bn (up from $9.2bn in CY 2015), and Wipro’s is $15bn (up from $7.2bn). Both have experience in small to mid-sized acquisitions; neither has experience of integrating a large one.

CSC is in a different situation: acquisitions are a key component of its turnaround. Having acquired UXC to gain scale in Australia, it is now in the process of acquiring Xchanging which will bring in insurance software assets, inter alia. We expect to see more mid-sized acquisitions from CSC.

Finally, the network of companies that is Deloitte continues to make small acquisitions across the globe, many of them digital related.

So what themes will prevail in 2016? In short, all the current hot topics will remain.

Gaining scale in India

Mphasis, Hexaware and Zensar are likely targets in 2016. And PLM service vendor Geometric Ltd, whose largest client is ISV Dassault Systèmes, is also rumored to be up for sale. Valuation multiples in India defy gravity, but firms like Hexaware and Mphasis are within reach, at ~$1bn-$1.5bn.

Mid-sized deals in U.S. Commercial

As we have noted above, the likes of Atos, CGI and CSC, as well as some of the India-oriented service providers, are interested in mid-sized vendors with a presence and IP in specific U.S. commercial industries, including utilities (but not energy, although there will be some fire sale opportunities) and healthcare.

BPaaS, or at least a BPaaS aspiration, is likely to be a feature of some of these deals. An early example this year is Wipro’s announcement in February that it is to acquire HealthPlan Services for $460m.

Digital Capabilities and RPA IP: Small to Mid-Sized Acquisitions

Looking at smaller acquisition activity, obviously attractive targets will continue to be firms, often privately held:

  • With digital services capabilities, including in digital marketing, UX, cyber security, and SaaS implementation services. In particular, we expect to see M&A activity around cyber hot up this year
  • That have IP around RPA or cognitive intelligence.

Many of these targets have headcounts in the 50 to 200 range and are local players. Competition for these firms is high and includes the largest global IT services vendors, with Accenture having led this drive for the last four years.

The hunt even extends to very small firms. Giants such as Accenture and IBM are acquiring firms with specialisms in perhaps digital strategy or SaaS services that have fewer than 100 employees.

The market is getting further crowded; telecom service providers continue to acquire in security while the advertising sector has expanded its M&A scope from UX to SaaS services.

And what will we see in the mid-term?

IoT, IT/OT and Big Data Will Become Increasingly Important in the Mid-Term

IoT, and also the integration between IT and Operational Technology (OT), will drive a lot of M&A investment in the years to come, initially around IoT platforms, with the intent to reach scale, create a vertical-specific IoT platform, or gain point capabilities (e.g. device security testing, or creating device-specific apps). In all likelihood, acquisitions will be small in scale; an early example is that of Radius by Luxoft.

On a larger scale, firms that have IP around big data will be attractive (while it was not an IT services acquisition, IBM’s $2bn purchase of The Weather Company was an interesting move that will prove its value in the longer term).

]]>
<![CDATA[The New CSC Targeting Topline Growth Within 3 Years: Is Organic Growth Possible?]]> CSC has just laid out the financial targets of the standalone business that will retain the CSC moniker when the U.S. federal company, CSRA, breaks off. As well as CSC’s global commercial business, it includes non-U.S. public sector businesses (~$700m revenues in FY15).

In its FY15 (ending March 31, 2015) this part of CSC achieved revenues of $8.1bn, and an adjusted operating margin of 10%.  

However, H1 FY16 revenues were just $3.55bn, and guidance for FY16 is $7.5bn. So this is a company still in negative growth, with no sign of topline recovery in either division: GBS revenues were down 13.4% y/y (down 5.7% in CC) to $1.8bn and GIS down a painful 19.8% (down 13.2% in CC) to $1.754bn.

Yet management is now talking about a resumption of organic growth (of 1%-2% in constant currency) by H2 FY 2017, with acquisitions expected to bring an additional 1% - 2% per year. Is organic growth likely? We think not.

While we believe CSC may well resume topline growth by H2 FY17 (for the first time in many years) we believe this will be driven by acquisition activity.

Since the arrival of Mike Lawrie as CEO, there has been a sharp improvement in profitability – and the drive continues. For example, over the next three years CSC is targeting a margin improvement of 125 to 175 bps from delivery and workforce optimization, and in procurement it is looking to take out another $300m in spend.

But achieving topline growth in the legacy business? Let’s look briefly at the current portfolio.

GIS: still impacted by red contracts; may shed its data centers

While the number of red contracts in Global Infrastructure Services (GIS) is far fewer than the 45 when CEO Mike Lawrie arrived, a handful still remain – and their impact continues: they will represent a revenue decline of $200m to $250m in FY 2017.

GIS has changed its market approach, only going after large deals very selectively. But strengthening the sales culture, for both hunting and farming, and account management is not something that can be done speedily, particularly in a global organization like CSC. The company has increased sales-related expenses to 5% of revenues and claims it is both retraining and hiring aggressively. However, it is hardly an employer of choice currently.

In recent years, GIS has standardized and streamlined its portfolio, and repositioned from large asset transfer deals to smaller deals, in line with a general market shift. CSC has sought to reduce delivery fragmentation across clients, and drive hardware, software, tool and process standardization. As it admits: “[previously] we had volume but we did not have scale”. This will help in pricing – but will it be enough to win the new business needed to drive topline growth?

In what would be a dramatic shift to an asset-light model, CSC is now considering shedding its large estate of datacenters and moving to a co-location partner model.

GBS: Turning around US consulting and growing Celeriti Fintech both key

Within Global Business Services (GBS), the consulting unit has recently seen mixed performance in terms of topline growth and profitability. In Q2 FY16, the U.K. was back to growth (18% in CC) whereas U.S. consulting was down 5%. CSC is confident it can replicate the success of its U.K. consulting practice in the U.S. We are not convinced.

Elsewhere, GBS is expecting slight organic topline growth (up to 2%) in its Industry Solutions and Services (ISS) business in the banking, insurance and healthcare/life sciences sectors.

Key to this will be the JV with HCL Technologies (‘Celeriti FinTech’) in which CSC has put Celeriti and Hogan, and which addresses modernization opportunities in the banking sector.  It is too early to tell how successful this JV will be - but speed is of the essence, both in the platform development and in the sales efforts.

CSC did not explain at the investor day how it is going to address the fast decline (~7% in CC in H1 FY16) in its application management and software testing businesses. Traditional application management continues to prove tough, even for some of the larger IOSPs, and the AppLabs acquisition has not helped CSC achieve the kind of growth in software testing that other vendors have been enjoying recently.

“Next-Gen” Offerings: Targeting 30% CAGR

CSC claims its “next-gen” offerings will represent ~$700m of its FY16 revenues (or just over 10%). They comprise:

  • Cloud $210m
  • Cybersecurity $150m
  • Big data $80m
  • “Other next-gen”: $260m.

A targeted 30% CAGR means revenues of over $1.5bn by FY19 – excluding any contribution from acquisitions. And here the targets for the legacy business get a little cloudy, particularly in “other next-gen”, as does what is in scope in “cloud” (e.g. does it include BPaaS?).

Overall, the aspiration to achieve organic revenue growth seems optimistic.

Acquisitive Growth Will Reshape the Portfolio

CSC is essentially a company that continues to look to reinvent itself. We believe any profitable growth in the next few years will be dependent on acquisitions.

The four that CSC has recently closed or is actively considering (we have written separately about all of them in other blogs) indicate where CSC is looking to reshape its portfolio:

  • Fixnetix and (if it completes) Xchanging will boost the ISS and industry-specific BPS business in the BFSI sector
  • UXC (again, if it completes) brings in additional scale in Australia, plus useful practices for ServiceNow and Microsoft Dynamics
  • Fruition Partners brought in ServiceNow integration capabilities.

Together, these will mean an investment of ~$1.2bn – above CSC’s guidance of acquisitions accounting for 15% of its capital allocation.

Before, CSC was talking about acquiring in areas such as cyber (for commercial enterprises, not just federal). The emphasis now appears to be more strongly on GBS, and on industry IP, domain expertise and BPS in a few target sectors. While CSC has longstanding experience in both the insurance software business and insurance BPO, it has not historically leveraged the former to build a BPS business: this would mean a shift in focus.

Another area where we might expect to see inorganic growth is in analytics.

We recognize that organic topline is not the Holy Grail when it comes to shareholder value: CGI provides a great example of a company that is superb at managing and integrating very large acquisitions every few years without achieving organic growth. In comparison, CSC’s track record in acquisition is mixed, and it does not have CGI’s “Management Foundation”. 

But CSC knows it needs to move fast. Will it reach $8.5bn revenues by FY 19? Possibly. Will it achieve this through organic growth? Probably not.

Dominique Raviart and Rachael Stormonth

]]>
<![CDATA[HP Enterprise Services to Strip Out $2bn of Annual Costs in Next Three Years in Pursuit of Margin of 7-9%]]> HP Enterprise Services has announced Q2 FY 2015 results, for the period ending April 30, 2015:

  • Revenue was $4,817m, down 15.5% y/y, and down 10% in constant currency (CC), reflecting key account run off and weakness in EMEA
  • Segment earnings before taxes (EBT) were $194m, a margin of 4.0%, up 143 bps y/y.

Q2 FY 2015 revenue by service line (with y/y revenue growth) was:

  • IT Outsourcing $2,871m (-20.2%, -10% in CC)
  • Applications and business services $1,946m (-7.6%, -2% in CC).

HP ES contributed 18% of HP Group revenue and 8% of Group EBT (up from 5% last quarter).

HP Group is nearing the completion of its 2012 restructuring plan. In Q2 FY 15, ~3.9k people exited HP, bringing the total reduction to date to ~48k. The program expects a total of 55k people to exit by the end of FY 2015, so a further 7k departures over the next two quarters.

HP has maintained full FY 2015 guidance for Enterprise Services of a revenue decline of between 4% and 6% on a constant currency basis, with an improvement in H2.

So where are the positives in HP ES' performance this quarter?

  • A significant improvement in revenue performance in the Apps and Business Services segment, with a CC y/y decline of just 2%. This is led by the BPO business. And some geos are showing flat to slight CC growth
  • Signings were up year over year, even without the $2bn Deutsche Bank deal closed at the beginning of the quarter (see our commentary here).
  • And “Strategic Enterprise Services” signings continue to grow… though no details are provided.

But the problems continue at HP ES’ ITO business. It not only continues to be impacted by contract runoff from three large accounts, but is also being challenged by the evolution of the market. Meg Whitman refers to “risk in the longer term sustainability of this profit level if we don’t do further cost reductions”. As such, the current intention is to streamline HP ES and take up to $2bn of gross annualized costs out of the business over the next three years, in pursuit of a longer-term EBT margin target of 7% to 9%. The likely charge represents around 9% of HP ES overall revenues – and 14% of the revenues of the ITO business.

The restructuring actions in HP ES and in particular ITO will include initiatives such as further offshoring, data center automation, pyramid management… the same actions highlighted by CSC earlier this week.

Nevertheless, Whitman has made a clear statement of commitment to the future of HP ES: “the Services business in ES (and the) TS Consulting businesses are becoming more strategic to the future of Hewlett-Packard Enterprise… increasingly, services is becoming the tip of the spear”.

]]>
<![CDATA[Atos Strategy Update: Bull and Xerox Acquisitions Examined, Indian Offshoring Still Work in Progress]]> NelsonHall recently met with the management of Atos to discuss the acquisitions of Bull and Xerox ITO and the progress being made by Atos in adopting Indian offshoring. Note that this meeting took place before Atos' Consulting & Systems Integration analyst day.

Bull: Is An Expanded Portfolio Relevant?

With its acquisition of Bull, Atos has acquired an IT firm with a significant level of hardware and software that expands the traditional, more IT services-centric portfolio of Atos. Bull’s portfolio ranges from x86 servers to HPC, from hardened phones to its Evidian line of security software products, and its line of GCOS mainframes.

Key questions since the acquisition are whether Atos will benefit from this extended portfolio, how relevant it is, and how much R&D effort it requires from Atos.

We gathered from our discussions:

  • Revenues from hardware represent ~2.5% of Atos revenues. Within hardware products, Atos will continue most offerings, with alignment around the broad themes of big data and security, e.g. HPC and appliances (bullion servers), which represent a NelsonHall-estimated €150m to €175m in revenues. Atos will continue to support other product lines without pushing hard on their commercial development
  • Within the area of security, Atos has aligned its legacy security portfolio with that of Bull and wants to continue investing in it
  • Atos is confident it can maintain its existing level of R&D investment, which it estimates at 2% (~€200m) of its revenues. Excluding Worldline, we estimate Atos will spend ~€50m to €70m in R&D on Bull hardware and software products (Bull as a standalone firm spent 6% of revenues on R&D), representing less than 1% of Atos revenues.

Can Atos compete effectively against IBM, HP, Fujitsu and Dell in terms of client reach and R&D investment? Atos is counting on its salesforce to grow revenues of the former Bull products (e.g. HPC and bullion appliances), primarily in Europe and also in French-speaking Africa (where Bull has a client base), as well as to address the engineering departments of its manufacturing clients (especially around HPC and in-memory appliances, for product design and simulation). Time will tell whether this strategy works, but Atos highlights that the combined line of Bull products was profitable.

Xerox ITO Strengthens Presence in the U.S.

While the Bull acquisition was perhaps a surprise, the planned one of Xerox ITO was more self-explanatory:

  • Atos had in 2014 highlighted its priority to grow in the U.S., and Xerox ITO, with its $1.5bn in revenues (93% from North America), fulfills this objective. The company considers Xerox ITO a reverse take-over and a base for organic growth: cross-selling of project services (C&SI) to Xerox ITO clients will be a priority. Over time, Atos wants to develop a partnership approach similar to the one it formed following the 2011 acquisition of SIS from Siemens; details are unclear at this point, but this will include joint investments in IP and joint go-to-market
  • IT infrastructure management (the Managed Services business unit of Atos) has been a priority: with the acquisition of SIS, Managed Services became the company’s largest unit (51% of revenues in 2014) and is also more profitable than C&SI, despite some dilution coming from Bull’s own IT infrastructure management business. Xerox ITO will strengthen Managed Services (and also Canopy) while also driving up its profitability.

Presence in Low-Cost Countries: A Work in Progress

Finally, Atos shed some light on its offshore and nearshore presence. Growth in low-cost countries remains a priority for both C&SI and Managed Services. The company announced in its Q4 2014 results that a relatively low 21% of its headcount (~18k personnel) is now based in nearshore and offshore countries. NelsonHall estimates growth in nearshore/offshore activity in 2014 at +56%, very high indeed. We suspect this also includes activities in emerging and fast-growth countries that service local markets (Bull in Africa and Poland, for example).

C&SI represents the largest share of this offshore/nearshore presence (38% of C&SI headcount; NelsonHall estimate: 13k). We estimate that Managed Services has a much lower low-cost country ratio (~15%), but Atos highlighted that it accelerated its transition to low-cost countries during 2014. The rough arithmetic implied by these ratios is sketched below.
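
As a rough consistency check (a sketch using only the headcount figures and ratios quoted above; group and C&SI totals are implied, not disclosed here):

  # Consistency check of the offshore/nearshore ratios above.
  # Inputs are the estimates quoted in the text; totals are implied.
  offshore_headcount = 18_000   # ~18k in nearshore/offshore countries
  offshore_ratio = 0.21         # 21% of group headcount
  group_headcount = offshore_headcount / offshore_ratio

  csi_offshore = 13_000         # NelsonHall estimate for C&SI
  csi_offshore_ratio = 0.38     # 38% of C&SI headcount
  csi_headcount = csi_offshore / csi_offshore_ratio

  print(f"Implied group headcount: ~{group_headcount / 1000:.0f}k")
  print(f"Implied C&SI headcount: ~{csi_headcount / 1000:.0f}k")
  print(f"C&SI share of the offshore base: {csi_offshore / offshore_headcount:.0%}")

This implies a group headcount of ~86k, a C&SI headcount of ~34k, and C&SI accounting for ~72% of the offshore/nearshore base.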

Atos highlights its offshore presence and capability in India, but key questions remain:

  • Offshoring is about more than pure headcount in India. It also implies the centralization of service offerings and IPs, a relatively free flow of personnel across countries, approaches such as continuous improvement and expertise in contract transitioning, as well as training. We think Atos has adopted, or is currently adopting, this centralized approach
  • When will Atos reach the inflexion point where offshore adoption stops driving revenues down and actually increases them? The answer is not clear at this point.

What is clear is that Atos takes its presence in low-cost countries seriously. However, unlike Capgemini, which has fully adopted Indian offshoring, Atos is trying to find the balance between its onshore presence, further specializing its in-country personnel, and its offshore presence, where cost is the number one client requirement.

We emerged reassured from our meetings with Atos. With a net cash balance that is going to remain positive after the acquisition of Xerox ITO, Atos still has financial freedom to continue its external growth strategy and invest in offerings to fuel organic growth.

NelsonHall will shortly publish commentary on Atos' Consulting & Systems Integration analyst day.

]]>
<![CDATA[Accenture to Acquire Agilex to Enhance Digital Capabilities and Agile Delivery for Federal Sector]]> Accenture Federal Services (AFS) is to acquire Agilex Technologies, a privately-held provider of digital solutions for the U.S. federal government based in Chantilly, VA. Terms of the transaction were not disclosed. 

The acquisition will enhance Accenture’s digital capabilities in analytics, cloud and mobility for federal agencies. It will also add agile delivery expertise: Agilex brings capabilities in agile software development for digital solutions. The company currently serves a number of federal departments and independent agencies, such as the VA, DoD, DHS, and the Department of Commerce. Commercial sector clients have included Amtrak.

Agilex was founded in 2007 by the late Robert La Rose (who had previously founded Advanced Technology Inc. and Integic, both of which were subsequently acquired), Jay Nussbaum (ex Citibank and Oracle), and John Gall, and quickly attracted senior talent to its leadership. The company offers services around:

  • Mobile applications for activities such as field inspection, emergency response management, performance dashboards, biometric identification, asset management, case management, personal productivity, etc.
  • Healthcare IT - for example, Agilex was involved in the deployment of the NHIN CONNECT Gateway. Also m-health - for example, in May 2014 it was awarded a contract by the VA to develop and implement an enterprise web and mobile application image viewing solution
  • CRM solutions.

Agilex has grown from 20 employees in 2007 to about 800 today. Nussbaum and Gall will leave when the acquisition closes, while the company’s leadership team will be integrated into AFS.

So why the acquisition? 

  • AFS is already one of the largest U.S. federal systems integrators - this is about continuing to evolve its capabilities to be at the forefront of newer areas of demand; quite simply, Agilex brings in capabilities around digital technologies, and digital is clearly among the top priorities of the government sector
  • And governments, not just in the U.S., are looking with much more interest at agile delivery as they move away from massive monolithic projects (for example, agile delivery has been a key element in the U.K. in the development of a new Universal Credit system for the DWP)

Accenture’s 2013 acquisition of ASM Research expanded its presence in the military healthcare market (DoD and VA) - and Accenture has worked alongside Agilex in projects at the VA. 

]]>
<![CDATA[ITO Spending Growth Dips Slightly & Bookings Flat in 2014, but Expect Improvement in 2015]]> This week NelsonHall held its quarterly IT outsourcing (ITO) Index webcast. We have conducted these calls for the last six years to monitor developments in ITO from a quantitative perspective. When we introduced the Index, ITO had largely moved from full outsourcing to a selective outsourcing approach, and Indian vendors were deploying their land-and-expand strategy, with the occasional one-off mega-deal (TCS comes to mind). The Index has been tracking market developments closely ever since, though we have been collecting and analyzing ITO contract data for many years before that.

Background

The data shows a steady (but non-linear) decline in Total Contract Value (TCV) from 2002 onwards. The level of new-scope contracts declined from ~80% towards ~40%, signaling fewer new deals. Then came the subprime-driven recession of 2008-9, which triggered a vast level of ITO renegotiations: in 2009, bookings rose to a very high level, but the share of new-scope contracts was low (~20%). Unlike 2001, when the internet bubble burst, the 2008-9 crisis was about renegotiations of existing contracts, not about new deals.

Contract signings were high during 2009 and 2010, but booking levels then declined to ~$32bn, their lowest level since 2008. Meanwhile, the level of new-scope contracts continued to be low. In short, the market is quiet, with few transactions, mostly renewals and recompetes. This signals a maturing market, also marked by the impact of offshoring (which is reducing prices and TCVs very significantly) and, increasingly, by cloud computing (in particular public cloud).

About three years ago, NelsonHall complemented its contract-based ITO Index with a quarterly spending analysis of IT services, professional services (i.e. consulting and systems integration) and ITO. The two approaches are complementary: the spending analysis provides a quarterly view of how ITO spending is evolving, while the contract signings analysis provides more of a 12- to 18-month forward view of how ITO spending will change.

What does our short-term spending analysis tell us?

Spending in IT services has continued to grow, albeit at low levels (~2% in Q4 and about the same for full-year 2014). Growth is driven by professional services (+3% in Q4 2014 and ~4% for the full year).

Meanwhile, for the first time since Q4 2012, ITO spending growth was in positive territory in Q4 2014 (up by almost 1%), though it was down 1% for full-year 2014. This final-quarter improvement in spending growth results from better economic conditions in mature countries.

What does our 12- to 18-month bookings analysis tell us?

During 2014, ITO bookings were flat overall, as well as in North America and Europe specifically. Activity in fast-growth countries (India, Brazil, China) was anecdotal.

An important KPI is the level of new-scope contracts (as opposed to existing-scope contracts): an estimated 40% of contracts with a TCV over $100m in full-year 2014. This is better than 2013, when new-scope contracts accounted for ~35% of bookings (and 30% in 2012). This level is at the higher end of the traditional range and is good news. A sketch of how this KPI can be computed follows below.
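
For readers who want to replicate this KPI, a minimal sketch of the computation from a set of contract records (the records and field names below are illustrative, not NelsonHall's actual schema or data):

  # Illustrative computation of the new-scope KPI from contract records.
  # Records and field names are hypothetical, for illustration only.
  contracts = [
      {"tcv_usd_m": 250, "scope": "new"},
      {"tcv_usd_m": 400, "scope": "existing"},  # renewal/recompete
      {"tcv_usd_m": 150, "scope": "new"},
      {"tcv_usd_m": 80,  "scope": "new"},       # below threshold, excluded
  ]

  # Only contracts above the $100m TCV threshold are counted.
  large = [c for c in contracts if c["tcv_usd_m"] > 100]
  new_scope_share = sum(1 for c in large if c["scope"] == "new") / len(large)
  print(f"New-scope share of large contracts: {new_scope_share:.0%}")

The same share can alternatively be weighted by TCV rather than by contract count, which gives more influence to mega-deals.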

What does NelsonHall forecast for 2015?

The outlook for IT services in 2015 remains mixed, with improving economic conditions driving some spending. For ITO specifically, the higher level of new-scope contracts should also support spending.

However, the economic environment in mature economies is only somewhat better. It is positive for India, unclear for China and Brazil, and clearly negative for Russia. In addition, offshoring will continue to drive prices down, resulting in lower spending.

We are therefore predicting modestly higher growth in spending for IT services overall (2.5-3.5%), professional services (4-5%), and ITO (1-2%).

You can listen to a recording of this week’s ITO Index webcast here. NelsonHall regularly blogs about the ITO industry here.


]]>
<![CDATA[SOC 3.0 and Proactive Security Management: the HP Aspiration]]> NelsonHall recently attended HP’s security analyst day in London. The session provided a deep dive into HP's threat intelligence and the application of this intelligence across its security product line.

Concerns about security issues are expanding beyond CSOs/CISOs to the rest of the C-suite, even commanding the attention of CEOs. HP highlighted:

  • Conversations with clients now focus primarily on the business issues of security, questioning the increasing cost of security versus the level of protection delivered
  • The increasing complexity and difficulty - and cost - of resolving threats.

The increased importance of IT security is a consequence of:

  • Attacks on organizations becoming more damaging (recent examples include Target, whose CEO was removed after malware was found to have stolen details of 40m customer credit cards, and eBay, where personal information on 233m customers was stolen)
  • The transformation of IT infrastructures to cloud and mobile devices
  • The need to comply with increasing regulation (SOX, Basel III, GLBA, PCI, etc.).

To illustrate the increasing attention being paid to cyber security: after the recent attack in which customer contact information was taken from 76m households and 7m small businesses, JP Morgan’s CEO stated that the bank will likely double its level of cyber security spend within the next five years.

HP highlighted some of the innovation it is looking to apply to security operations centers (SOCs), describing three levels of SOC:

  • SOC 1.0, ‘Secure the Perimeter’: the base level of security analytics employed today by most MSSP vendors
  • SOC 2.0, ‘Secure the Application’: HP detailed the monitoring of DNS records within security information and event management (SIEM). Monitoring DNS yields a much higher number of events than the classic model (21bn vs 4.5bn within HP alone) and gives deeper insight into application security. The approach is currently in an internal beta phase at HP; 25% of the malware found so far is new and had not been detected by traditional methods. HP also detailed a case in which this style of DNS records search was used for an external client, using historic logs to capture a number of previously unknown vulnerabilities. (A simplified illustration of this style of DNS monitoring follows the list below.)
  • SOC 3.0, ‘Secure the Business’: the aspirational SOC 3.0 level uses predictive analytics and HP’s threat database to identify the types of threat a client experiences, and then proactively works to reduce the number of threats.
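
To give a flavour of the DNS-based monitoring described above, here is a generic, simplified sketch of scanning DNS query logs for suspicious domains. This illustrates the general technique, not HP's implementation; the log format, threat list, and rare-domain heuristic are all hypothetical:

  # Generic illustration of DNS log monitoring for a SOC.
  # Not HP's implementation: the log format, the threat feed, and the
  # rare-domain heuristic are hypothetical stand-ins.
  from collections import Counter

  dns_log = [
      ("10.0.0.5", "intranet.example.com"),
      ("10.0.0.7", "xj3k9q.badcdn.example.net"),  # never seen before
      ("10.0.0.5", "intranet.example.com"),
  ]
  known_bad = {"xj3k9q.badcdn.example.net"}       # stand-in for a threat feed

  domain_counts = Counter(domain for _, domain in dns_log)
  for client, domain in dns_log:
      if domain in known_bad:
          print(f"ALERT: {client} queried known-bad domain {domain}")
      elif domain_counts[domain] == 1:
          # Rarely queried domains are a common trigger for analyst review
          print(f"REVIEW: {client} queried rare domain {domain}")

In a production SIEM the same pattern would run over billions of events, with live threat intelligence feeds and statistical baselines rather than a hard-coded list.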

HP describes its internal SOC as currently at level 1.5; the monitoring of DNS records has not yet been rolled out across the company. Reaching level 3.0 - which is about proactive security management - will be a multi-year journey (around five years?), requiring a more sizeable threat database and a large set of use cases. HP will roll out its central threat database to more partners and receive information from as many clients as possible, then utilize big data analytics to discover trends in the billions of events monitored. And of course, the imminent break-up of HP Group into HP Enterprise and HP Inc. will add to the complexity of servicing both new HP companies.

(NelsonHall will be publishing a market assessment of managed security services in Q4, along with detailed vendor profiles of selected key vendors, including HP.)

]]>