NelsonHall: IT Services blog feed https://research.nelson-hall.com//sourcing-expertise/it-services/?avpage-views=blog NelsonHall's IT Services program is a research service dedicated to helping organizations understand, adopt, and optimize adaptive approaches to IT services that underpin and enable digital transformation within the enterprise. <![CDATA[NTT DATA Lifts the Veil on Business Consulting]]>

 

NTT DATA Group recently held its virtual consulting day. In May, we reported on the new NTT DATA after its merger with NTT Ltd., the creation of NTT DATA Japan and Inc., and its priorities. At the time, we commented: “NTT DATA Inc. remains a quiet giant whose capabilities are not yet well-known” as NTT DATA’s management discussed how it had embedded its Business Consulting capabilities within the geographies but provided little detail on plans for this part of its portfolio.

However, the recent event provided additional details. Key takeaways were:

  • Business Consulting is at scale: it has ~15k consultants across 50 geographical units. NelsonHall estimates it represents 8% of NTT DATA Group’s headcount (including Japan and Inc.). This is in line with the industry rule of thumb. It also compares well with Capgemini’s10k consultants (although we suspect the scope is different)
  • Business Consulting is strategic to NTT DATA. As one NTT DATA exec had mentioned some months ago, Business Consulting is part of the solution, and NTT DATA wants to drive a consulting-led approach to its clients
  • First created in 1998 in Japan, the first international geography for Business Consulting was the U.S. NTT DATA conducts its Global One initiative to drive coordination, service portfolio consistency, and asset alignment across geographies. The company aims to create an integrated culture and common assets. We asked about asset examples: we suspect most assets are systems integration-related, with much less around consulting so far. NelsonHall will investigate further
  • Business Consulting is focusing its portfolio on seven organizational priorities:
    • Business and Industry Strategy
    • Customer Experience
    • Data & AI Strategy
    • Organization & Talent Management
    • Supply Chain
    • Digital Strategy
    • Sustainability
  • NTT DATA emphasizes its capabilities in combining domain knowledge with its technological expertise, including GenAI. Sustainability is also a priority and will help the Group reach its $1bn objective in sustainability revenues by FY27
  • The organization’s priority verticals globally are BFSI, Automotive, Healthcare & Life Sciences, Consumer Products & Retail, TME, Manufacturing, and Public Sector. Energy & Utilities and Professional Services are the main absentees from this list. Each region may have additional vertical priorities specific to that geography
  • One UKI banking client commented that NTT DATA’s Business Consulting fees ‘are sustainable’. The client went on to say how larger and more well-known consulting competitors need to leave the project after the initial consulting phase. With its 2k practitioners, India offers a lower-cost alternative, already representing ~13% of Business Consulting.

The NTT DATA Business Consulting event was intended to be an introduction to the business. The virtual event format had several limitations, including not being able to drill down on the offerings and go beyond the surface.

Surprisingly, Business Consulting did not indicate its strategy and roadmap. No mention of new offerings, although it was clear that GenAI and sustainability are, unsurprisingly, significant priorities. No potential M&A activity was discussed, though we suspect that NTT DATA will look for inorganic growth to expand its capabilities in specific geographies.

The message from NTT DATA Business Consulting’s business was straightforward: ‘We are part of NTT DATA, we are at scale, and we can play an essential role in our clients’ transformation’.

]]>
<![CDATA[Virtusa: Mastering LLM Testing Complexity]]>

 

Virtusa recently briefed NelsonHall about how it conducts GenAI testing. With the emergence of LLMs, Virtusa has seen a rising interest in understanding how to validate them. However, testing LLMs is not easy, as traditional testing approaches are not relevant to LLMs. It requires reinventing software testing and looking beyond the output of a transaction.

Non-Deterministic LLMs Challenge How Testing Is Conducted

Welcome to the world of leading-edge technology and complexity! LLM testing is not easy and differs from testing other AI models: LLMs are non-deterministic (i.e., for the same input, they may provide different responses); other AI models, such as ML, provide the same output for the same input.

The non-deterministic nature of LLMs raises several challenges for testing/QE. The broad principle of functional testing is to validate that a specific transaction on a web application or website provides the intended result, e.g., ordering a good on a website and validating that payment has been processed and completed. However, with GenAI, the output is dynamic and can only be broadly defined. For example, testing a generated response to a question, summary, or picture under a traditional approach is not working, as there is no right or wrong answer. Also, several answers can be correct.

As part of its efforts to deal with this complexity, Virtusa has organized its capabilities under the Helio Assure Framework, which covers LLM data, prompts, models, and output.

Data Complexity

Data validation is a starting point for any LLM project. Virtusa offers traditional data validation services, such as checks around data integrity, consistency, and schema/models.

Virtusa also conducts statistical assessments specific to data used for training AI models; for example:

  • Data outliers, i.e., identifying data that deviates from the rest of the dataset
  • Data skewness review, e.g., detecting a data distribution asymmetry. Several statistical models indeed require normally distributed data.

Beyond data and distribution validation, both well-understood activities, Virtusa emphasizes two approaches:

  • Data bias detection
  • Unstructured data validation (going through semantic search, grammatic search, and context evaluation).

Of these two, data bias detection is the most difficult, mainly because bias identification varies across cultures and contexts and is challenging to automate. Virtusa continues to work on data bias detection.

Prompt Validation

For prompt validation, Virtusa relies on several approaches, including bias checks, toxicity analysis (e.g., obscenity, threats), and conciseness assessments (e.g., redundant word identification, readability). Virtusa highlights that prompt templatization, through a shared repository of standard prompts, also mitigates security threats.

Virtusa also uses adversarial attacks to identify PII and security breaches. Adversarial attack is the equivalent of pen-testing in security, initially developed in ML. The approach is technical and rapidly evolving as LLM vendors finetune their LLMs to protect them from hackers. Nevertheless, it includes methods such as prompt injection and direct attacks/jailbreaks.

LLM Accuracy Evaluation

For evaluating AI models such as LLMs, which is particularly challenging, Virtusa relies on a model accuracy benchmarking approach, creating first a baseline model. The baseline is an LLM whose training is augmented by a vector database/RAG approach relying on 100% reliable data (‘ground truth data’). It will evaluate the accuracy of LLMs vs. this baseline model.

The Roadmap is Creative Content Validation and LLMOps

Virtusa has worked on GenAI creative output/content validation, looking at three elements: content toxicity, its flow (e.g., readability, complexity, and legibility), and IP infringement (e.g., plagiarism or trademark infringement). Virtusa uses computer vision to identify content patterns present in an image or a video, classifying them into themes (clarity and coherence vs. the intent, blur detection, and sometimes assessing the relevancy of the images/video vs. its objectives. We think the relevancy of this offering for social media, education, marketing, and content moderation is enormous.

We think that GenAI is the next cloud computing and will have significant adoption: enterprises are still enthusiastic about what GenAI can bring, though recognizing they need to pay much closer attention to costs, IP violation, data bias, and toxicity. Governance and FinOps, to keep cost and usage under control, are becoming increasing priorities. GenAI vendors and other stakeholders are eager to move from T&M to a usage-based consumption model and want to monetize their investments.

]]>
<![CDATA[Hitachi Digital Services: Pursuing Market Recognition with Hitachi Group’s Backing]]>

 

Hitachi Digital Services recently held an Advisor & Analyst event in London. The company is rolling out its communication plan to increase client and industry awareness. It goes to market, selectively, with the larger Hitachi Group in the U.S. and U.K., targeting a few industries.

So, who is Hitachi Digital Services? The mid-sized IT and ER&D services vendor spearheads Hitachi Digital’s international business with a NelsonHall estimated headcount of around 6,000. Its sister companies include GlobalLogic (product engineering services) and Hitachi Vantara (enterprise storage). In Japan, Hitachi Digital has DSS, a significant business, with ¥ 2.6bn (USD $19bn) in revenues (we assume this includes some hardware and software).

Hitachi: an ERP Background

The company’s sectoral background, reflecting its Hitachi Group ownership, is asset-intensive industries, with offerings ranging from enterprise application services and software development to data, analytics, AI/GenAI, and ER&D services. It tends to target the mid-market. i.e., companies with revenues of $2bn to $3bn, though it also has some large enterprise clients, such as the North American operations of a tier-one Japanese automotive OEM.

Hitachi Digital Services’ initial core offering was ERP services (Oracle and SAP), adding Salesforce, ServiceNow, and Workday to its portfolio over time. We estimate enterprise application services today account for around one-third of its global revenues.

The company’s Hitachi Group heritage is also shown in its ER&D services activities, spanning product engineering services (e.g., embedded software, command & control systems) and Industry 4.0/IoT. Hitachi Digital Services takes a Hitachi Group-wide approach to Industry 4.0 (‘Industrial IoT’), with Hitachi Group providing industrial automation and manufacturing software (e.g., a Flexware MES product popular among Japanese automotive OEMs). The company has a Factory Lighthouse with a greenfield factory set for Hitachi Rail in Maryland. With Hitachi Group having ~800 manufacturing plants, HDS’ potential activity in this area is immense.

Expansion to Software Development, Data & AI, BFSI

Outside those two offerings, Hitachi Digital Services is also active in software development (‘Cloud ’Engineering’), data & analytics, and application maintenance & support (Managed Services). The company emphasizes its Hitachi Application Reliability Center (HARC) methodology, which it has deployed in three delivery centers (Hyderabad, Dallas, and Tokyo). HARC provides best practices across processes, people, and tools and promotes a combined software development and support team approach. HDS highlighted several times the importance of SRE and FinOps in its HARC methodology.

Hitachi Digital Services has expanded from Hitachi’s asset-intensive industries to asset-light sectors such as BFSI, where the company has several specialized niches, helping, for instance, a closed books reinsurer deploy data processing and analytics. The client wanted to formalize its expertise to help it scale the business through M&As but is impacted by having its business logic and expertise residing in Excel spreadsheets and the heads of employees.

As you would expect, Hitachi Digital Services is investing in GenAI, which is part of Hitachi Group’s investments there. Internally, the company has deployed several GenAI use cases, e.g., converting Fortran code to Python, taking a reverse engineering approach. Externally, Hitachi Digital Services focuses on sector-specific GenAI use cases, such as generating user manuals for automotive OEMs and designing predictive models (based on a combination of ML and LLMs).

Another offering that Hitachi Digital Services promotes is sustainability around carbon emissions and the circular economy (the three Rs of the lifecycle: reuse, refurbish/repurpose, recycle). HDS has consulting capabilities (with an expertise center in Lisbon) and several IPs. The IPs go beyond carbon emissions accountability and include identifying the scope three emissions of a supplier or a product. HDS highlights that depending on the vertical, clients have different sustainability needs. Manufacturing is about the supply chain and product traceability. Financial services, telecom, and data center/colocation vendors focus on data center emissions. The hospitality, retail, and real estate industries favor building energy management.

The Priority Is Client Recognition in U.S. and U.K.

A primary objective of Hitachi Digital Services is to expand its client recognition in the U.S. and U.K., its two largest markets. The company believes it can continue its current momentum (double-digit revenue growth in a slowing market) by rolling out its capabilities with existing clients in its core markets, keeping its vertical and service portfolio approach. HDS favors the creation of technical accelerators and, selectively, software products (e.g., sustainability). GenAI continues to be a priority.

NelsonHall expects further collaboration with the larger Hitachi Group, primarily for GTM. Accelerators and products are also an essential element of Hitachi Digital Services’ growth strategy, with the larger Hitachi Group spending ~$5bn annually on R&D, of which Hitachi Digital Services is a ‘significant recipient. Expect the company to do more of the same to accelerate its growth with Hitachi Group funding innovation.

]]>
<![CDATA[TCS ERP on Cloud: Focusing on SAP Modernization and Monitoring]]>

 

We recently talked with TCS about its ERP on Cloud offering. The SAP ecosystem has been going through intense change, with the planned end of SAP ECC support from December 31, 2027, and transformation with S/4HANA and recent Clean Core SAP initiatives. The change is driving accelerated ERP adoption, as demonstrated by SAP’s increasing revenue growth driven by SaaS applications. To accommodate this increased pace of change, TCS recently amended its ERP on Cloud offering.

TCS has positioned ERP on Cloud at the intersection of SAP cloud infrastructure and application services with bundled services. The offering is firmly focused on the cloud infrastructure with, for instance, migration of SAP ECC to the cloud targeting hosting modernization. ERP on Cloud also comprises the provisioning of development and test environments, as well as monitoring. It also bridges with application services and S/4HANA systems integration/transformation services. While the focus is on SAP opportunities, TCS also offers related services for other ERP and custom application production environments.

TCS’ ERP on Cloud offering is part of TCS’ Products and Platforms. While TCS’ IP investment is known for its software product portfolio, it also hosts offerings such as ERP on Cloud, i.e., bundled application and cloud infrastructure services, targeting large enterprises and the mid-market.

TCS’ immediate priority for ERP on Cloud is to scale the offering. The growth opportunity is significant, fueled by the end of ECC support and the S/4HANA transformation. The growth is also necessary to help TCS continue bringing automation across its various ERP on Cloud offerings and lowering costs.

Four Specialized Offerings

TCS’s flagship offering is around SAP migration to the cloud. With this offering, the company offers a lift-and-shift migration. The offering is technical, targeting the migration of databases and OS. Common client scenarios for this offering include organizations facing middleware that is no longer supported by their respective ISVs. The company highlights the IP’s scalability and that it can accommodate any middleware. TCS provides the necessary middleware refresh, minimizing client investment while benefiting from cloud hosting and hyperscaler innovation.

TCS started its ERP on Cloud journey with environment provisioning, whether for SAP PoCs, development and testing, specific usages such as document archival, or even large production environments. TCS has worked on accelerating instances deployment on the cloud and has pre-installed cloud templates to provision SAP Basis. With the rise of FinOps, TCS promotes a right-sizing approach to control spending while reaping the benefits of public cloud.

Complementing its lift-and-shift migration to the cloud offering, TCS offers greenfield S/4HANA transformation. The company provides pre-configured templates with ~120 standard processes to accelerate the deployment. Most processes support back-office functions (e.g., order to cash, procure to pay). They also address several industry-specific templates for processes in discrete manufacturing sectors (e.g., plan to produce, quality management, maintenance management). TCS has also localized these templates for several countries, including U.S., U.K., India, UAE, China, and Indonesia. TCS estimates that this offering helps to reduce implementation, targeting 16 weeks of deployment time. For this offering, TCS is an SAP-Qualified Partner-Packaged Solutions, targeting the mid-market with its pre-configured templates.

TCS also provides SAP environment monitoring and management. The company has its TCS Enterprise Manager IP for multi-cloud application and cloud infrastructure monitoring, also integrating with ITSM tools (e.g., ServiceNow). TCS is investing significantly in automation with AI, deploying SAP updates automatically, and conducting production data and ITSM pattern analysis. TCS Enterprise Manager is ERP on Cloud’s fastest-growing offering. Client demand is SAP-centric, but expanding to other ERP platforms and custom applications, filling an application monitoring market gap.

The Road Ahead

Naturally, TCS is looking for additional productivity gains and automation to reduce costs further; accordingly, it is investing in automation and has grouped its IP and automation efforts under the ERP Enablers category. An example of a recent investment includes a library of IaC configuration files to provision cloud instances. Another example is a data migration and source and target validation tool dealing with a heterogeneous set of now unsupported databases.

Geographic expansion is also a priority. The current SAP momentum should help. Organizations are accelerating their transition, whether lift-and-shift or transformation. They require a standard and industrialized service to mitigate risk in the context of tight budgets. TCS’ emphasis on innovation and service repeatability should help.

]]>
<![CDATA[Wipro Brings Depth to GenAI Use Cases for QE]]>

 

Wipro recently briefed NelsonHall on its GenAI investments for quality engineering, discussing the creation of use cases and sharing the thinking behind some of its decision-making.

Wipro’s GenAI investments for QE are part of the company’s ai360 program, a $1bn investment that includes activities in developing use cases, training, and GTM. The launch of its QET GenAI Platform is part of this initiative.

It has identified ‘quick win’ use cases, including:

  • Automated requirement analysis, test scenarios, case and script generation from user stories
  • Test design recommendations
  • Synthetic test data generation
  • Knowledge management
  • Data transformation and validation.

Like its peers, Wipro highlights the benefits of standard prompts, e.g., LLM’s output accuracy, lesser output variability, and capturing the client’s application and testing context. Wipro has created libraries of standard prompts, classified by role (UX designer, developer, tester, architect, BA, and application support) across the software development lifecycle.

RAG and Prompt Engineering

Beyond prompt engineering, Wipro wants to improve the accuracy of the LLMs. Rather than fine-tuning LLMs (through training the models on additional training data sets), it has chosen the retrieval-augmented generation (RAG) approach, which essentially relies on creating vector databases of the client’s testing artifacts. With the RAG approach, Wipro believes it takes a more relevant method to include the specific context of the client’s applications. To that extent, the company has created a tool that goes through various document formats (e.g., .doc, .pdf, .ppt) and creates a data set in a vector database.

For several use cases (e.g., test script generation, test data), Wipro wants to be LLM-agnostic and will connect with GenAI COTS (e.g., ChatGPT 3 and 4 and Azure OpenAI). It supports most test execution tools and languages (e.g., Selenium, Eggplant, Appium, and Playwright).

Looking to the future

The company is developing several GenAI use cases targeting specific tasks. Examples of these include locating an error in a Selenium script or writing a VB macro to migrate data from ALM to JIRA. Wipro is building a repository of use cases covering testing activities, taking a bottom-up approach.

There is a clear focus on helping clients beyond the interest stage and consulting engagements to PoCs and deployment. To facilitate client adoption, Wipro is looking to make its GenAI services enterprise-grade with assured data privacy and security. Options offered by the company include hosting on the client’s premises or its own.

Investments in GenAI will continue to be a priority in the foreseeable future. The company recently invested in data transformation and validation. Wipro plans to bring further depth to its user story analysis; it is exploring how to make user stories more standard and consistent within an enterprise. Current writers of user stories tend to have their own style. Wipro believes that GenAI can bring some standardization while increasing overall user story quality. The company also wants to go into more depth regarding automated root cause analysis beyond traditional defect classification.

Bringing an enterprise-grade service

Beyond LLM use case depth and standardization, Wipro believes that it will differentiate its value proposition by offering an enterprise-grade service. The company highlights it has taken several steps in this direction.

Wipro provides access to LLMs through its ai360. ai360 wants to ‘guardrail’ LLMs and systematically monitors and controls LLM usage. It ensures:

  • The right usage, for instance, taking a persona-based approach and providing access to the right model
  • Cost control, in a FinOps approach
  • Reporting for corporate and regulatory compliance purposes.

Wipro has also worked on decreasing the time to create test scripts from an initial 15 minutes to one minute, relying on proprietary Python test script libraries.

The company highlights it has also progressed well in LLM output consistency. It finds the LLM output/responses to English language prompts can be unreliable. To overcome the challenge, Wipro created a library of UML models for specific processes (e.g., completing an online purchase transaction). It will update the UML libraries for each client and subsequently create test cases and scripts. With this approach, the company believes it can also increase test coverage.

Wipro points out that clients hesitate to move from demos and PoCs to deployment. The company believes its enterprise-grade approach will help organizations make the move and will continue to invest in it.

]]>
<![CDATA[NTT DATA Inc. Shares Priorities After Merger with NTT Ltd.]]>

 

NTT DATA held its first EMEA Latin America (EMEAL) Analyst Day last week since the merger of NTT DATA and NTT Ltd. announced in June 2022. The event provided some perspective on the new structure of the firm, its capabilities, and its short-term priorities.

The new NTT DATA Group is a much-enlarged firm with $31bn in revenues and around 150,000 employees. The group comprises two standalone companies:

  • NTT DATA Japan, with $13bn in revenues
  • NTT DATA Inc., the $18bn international business, co-owned by NTT DATA (55%) and Japan's NTT (45%). The unusual ownership structure reflects NTT Ltd's former ownership by NTT and a business model difference between the two entities.

It is possible that, in time, the two companies will eventually be integrated into one global organization.

NTT DATA Inc. has started its integration first at the geographic cluster level (U.S., EMEAL, and APAC outside of Japan). Historically, NTT DATA looked more like a federation of companies loosely aligned with several integrated practices (such as SAP services). The new geographic cluster structure will help the firm integrate businesses in EMEAL, while preserving its onshore and consulting heritage.

An Application, Infrastructure, and Network Integrated Firm

The 2002 merger with NTT Ltd. expanded NTT DATA Inc.'s capabilities to include data center colocation, network, connectivity, and IT infrastructure. Most of the NTT Ltd. portfolio is services. It also includes a significant network product reselling business.

NTT DATA Inc. has kept the data center colocation business independent, highlighting its CAPEX-heavy business model. The company claims to be the third-largest data center/colocation provider, with 80 data centers. Potential synergies with the rest of NTT DATA Inc. are limited, although there may be occasional joint GTM for server and computing projects.

The integration has focused on connectivity and network services capabilities, along with those in IT services. The aim is to sell integrated IT services, network services, and connectivity, something that few competitors have achieved, through a business (and consulting-led) approach rather than emphasizing connectivity and technology only. An example is private 5G, which the company sells under an Industry 4.0 approach to automotive OEMs rather than through a more technical 5G equipment resale and integration approach.

Priorities: Innovation and Assets

Priorities for the next two years highlighted at the event include innovation and the Asset-Based Business (ABB) initiative.

NTT DATA Group has aligned its innovation efforts across the three geographic clusters. NTT DATA Group's Innovation Headquarters is driving this initiative, prioritizing its R&D spending across:

  • Mainstream services (70% of spending)
  • Growth (20%)
  • Emerging (10%).

GenAI is a priority, with NTT DATA Group using its parent company's investment in its own LLM, Tsuzumi. The company is actively deploying GenAI and Tsuzumi across the SDLC. Here, we noted in conversations an acknowledgment that GenAI's gains are likely to be 10-15% in the mid-term, well down from optimistic claims in the industry of 30-50%. Automating the full SDLC with GenAI will require a long-term investment in addressing gaps.

With its Asset-Based Business (ABB) initiative, NTT DATA Inc. has identified products, solutions, and accelerators built by local and vertical units. It wants to bring a structured approach to its assets, identifying and assessing them, prioritizing and allocating investments, and accelerating its growth through commercial focus and cross-selling into accounts through Inc.'s account managers.

ABB's ambitions go beyond monetizing software assets. The unit wants to structure its portfolio, where possible, into a suite of modules and products under the Syntphony umbrella brand rather than standalone products. Beyond sales (mostly subscriptions), NTT DATA Inc. wants to use ABB's portfolio to differentiate its services. The company targets project sales rather than standalone product sales. NelsonHall estimates ABB accounts for around 6% of NTT Data Inc.'s total revenues (including software and services). ABB has bold growth ambitions to account for over 10% of revenues eventually.

Getting the Word Out

Getting the word out is also a priority. NTT DATA Inc. remains a quiet giant whose capabilities are not yet well-known. For instance, the company has a much bigger scale in its global delivery network than expected, with around 66,000 employees in India and another 16,000 or so based in nearshore locations, including Czechia, Romania, South Africa, and the Philippines. An immediate priority is standardizing tools and processes across centers to harmonize its delivery.

The integration journey is ongoing. In EMEAL, the company has selected its innovation portfolio priorities: GenAI, sustainability, and ABB. NTT DATA Inc. has a very broad portfolio, including SAP and Salesforce services, data & analytics, application cloud migration, and its much-enlarged IT infrastructure and network services.

We expect NTT DATA Inc. to continue to review its portfolio regarding investment and GTM priorities.

Topline growth is a stronger priority than margin expansion: NTT DATA Inc. considers itself the growth unit of the NTT DATA Group.

]]>
<![CDATA[Eviden’s Quality Engineering AI Journey]]>

 

NelsonHall recently talked with Eviden, Atos’ consulting and application services business, about its QE practice, Digital Assurance.

Digital Assurance has 5k quality engineers, 65% offshore, reflecting a high leverage in North America (due to its Syntel heritage) counterbalanced by Atos’ large European public sector client base. The practice has aligned its service portfolio around high-growth offerings such as testing applications for digital projects, migration to the cloud, testing Salesforce migration projects from Classic to Lightning Experience, and SAP.

Beyond these technologies, Digital Assurance has focused on AI, initially traditional AI with ~45 pilots underway, and then around GenAI in 2023-24.

AI/GenAI as priorities

Eviden currently has five primary GenAI use cases relevant to testing being deployed on its GenAI QE Platform:

  • Test strategy generation
  • Ambiguity assessment and scoring
  • Test case creation
  • Test data
  • Test script automation.

One of the demos we attended was ambiguity assessment and scoring, where Eviden evaluates the quality of a user story/requirement. Other demos such as automated test case and test script generation provide several insights regarding the current art of the possible.

GenAI quick wins

GenAI provides quick wins that do not require significant ML model training.

An example is assessing the quality of user stories. Commercial LLMs will work out of the box and can be used as-is without further training. But LLMs only work if the input data (user stories in this example) follow best practices, e.g., are detailed enough and have clear acceptance criteria. If those fail, the LLM will reject the user stories.

Prompt engineering rather than data finetuning

Eviden is finding that the pretraining provided by the hyperscalers is good enough for most use cases, and is not currently contemplating conducting clients’ data training.

Eviden sees a need for structured prompt engineering, i.e., providing the LLM model with the right instructions. It is building repositories of standard/proven prompts. In addition, Eviden will adapt the prompts to the specificities of each application, e.g., table structures and user story patterns. Digital Assurance estimates that adapting prompts to the client’s applications will only last a few weeks. This approach is time-sensitive and provides quick wins, for instance, around automated test script generation.

Combining traditional AI and GenAI

Eviden is combining GenAI with more established impact analysis AI models (e.g., predicting the impact of code change/epics on test cases) and is conducting GenAI processing once it has done so with predictive AI model investments. The ecosystem approach goes beyond other AI models, and Eviden points out that it is deploying container-based delivery to execute GenAI models independently to shorten time-to-process.

The beginning of the GenAI journey for QE

This is just the start of the GenAI journey for Eviden’s Digital Assurance practice. The company is deploying early GenAI use cases and deriving its first best practices. Eviden also points out that human intervention is still required to assess GenAI’s output until GenAI reaches maturity. Even with GenAI, the testing industry is far from autonomous testing or even hyper-automation.

Eviden is working on other GenAI initiatives, including around Salesforce and SAP applications. For instance, Digital Assurance has used GenAI in SAP to generate a repository of ~250 T-codes (SAP transactions) with relevant test scenarios and cases.

Eviden is also exploring migrating to open-source tools away from SAP-recommended COTS for their regression testing needs. The migration goes beyond changing text execution tools and migrating test scripts. This is not the first time we have seen interest in moving away from commercial tools, but historically this has not materialized in massive migration projects. GenAI will ease the process.

]]>
<![CDATA[Sopra Steria’s Update on GenAI Experience: Focus on Software Engineering]]>

 

During the summer of 2023, we met with Mohammed Sijelmassi, Sopra Steria’s CTO and Digital Transformation Office head and discussed how Sopra Steria was deploying internally and helping clients externally with GenAI and LLMs. Earlier this month, we met with Mr. Sijelmassi again to assess Sopra Steria’s progress on its transformation, its work with clients, and use cases.

Use Cases: Knowledge Management & Reverse Engineering Remain the Priority

Sopra Steria currently has two use case priorities: conversational agent/knowledge management to complement self-service portals and bring a higher quality of chatbots, and reverse engineering for creating documentation out of existing applications: documentation generation is time-consuming and is a popular candidate for GenAI.

Code generation is next. Sopra Steria finds that code migration between object-based languages and Java version migration (e.g., to JDK 17 LTS) works fine: the challenge is to increase code quality, create concise code, and reuse software components (LoCs) where possible. Sopra Steria is identifying internally reusable components of high quality that can be used across the company’s software developers.

Areas of mid-term investment include COBOL migration to an object-based language. There are challenges in migrating between languages relying on sequential instructions (e.g., COBOL applications) to object-based programming languages. Sopra Steria is working with IBM and is taking a reverse engineering approach to identify specifications and generate code. It also points to proprietary programming languages (e.g., SAP ABAP) where the volume of LoCs in the public domain is insufficient to have best practices.

Prompt Engineering vs. Data Finetuning

Mr. Sijelmassi also provided some light on the LLM pre-training vs. client-specific (finetuning) discussion. The company believes that the quality of pre-trained LLMs is high enough for its own and its clients’ software engineering needs. Therefore, Sopra Steria is currently focusing on prompt engineering and creating libraries of standard prompts.

The company is exploring how to finetune LLMs on its own data and specially for its software products. Sopra Steria is selectively working with clients on this and has internal pilots around its HR Software and Sopra Banking products. The company finds that training LLMs on client-specific data is a relatively limited engagement (with costs ranging from €100k to €1m, depending of the size of available data). However, it points out that the finetuning exercise relies on the quality of existing data and its curation to avoid issues such as data bias. In other words, the finetuning effort will truly depend on the quality of the client’s data estate.

Internal Deployment is the Priority

Sopra Steria has embedded GenAI into its software development tools (Digital Enablement Platform) to improve productivity: so far, the company has purchased 10k Github Copilot licenses, which it is rolling out gradually, 1k per month in a phased approach aiming to build experience and gradually derive best practices.

Achieving developer buy-in is of course critical: in early internal surveys, Sopra Steria has found most developers believe that GenAI brings productivity gains and increases software quality. It expects different levels of GenAI adoption across its developers, with freshers/recent graduates and experts driving adoption. The company is accordingly targeting middle-aged developers with training and change management programs to raise GenAI awareness. Sopra Steria also intends to create career paths where developers can become experts rather than move to management positions.

Sopra Steria is targeting 10k licenses deployed and its current 58k employees by the end of 2024. 2024 will be the year of deployment and best practices.

Vertical Use Cases Are Next

Consulting with Sopra Steria Next still accounts for Sopra Steria’s bulk of GenAI activities. Sopra Steria Next continues to identify new use cases. Recently, it deployed Copilot 365 services to help clients use it and equipped 300 consultants to collect feedback.

Beyond best practices and use case identification, Sopra Steria Next highlighted its work around:

  • GenAI and its impact on the environment (energy consumption)
  • Selecting LLMs, differentiating between products and open-source models
  • TrustAI.

Also, Sopra Steria Next is exploring vertical use cases, initially targeting conversational agents. The company has a project with a retail bank to use GenAI and help agents cross-sell products. It is also cross-pollinating the same approach to similar industries such as retail & distribution.

GenAI To Have a Contract Impact bv 2025

2024 is primarily a year for internal deployment. Sopra Steria expects to derive productivity gains from 2025 onward and embed them in its contract discussions. Depending on client maturity the company is looking to see productivity gains from 10% to 25%. Of course, this will not translate to an equivalent price drop: additional costs in terms of license, GPU, and training need to be factored in.

Beyond productivity, Sopra Steria’s investments in GenAI demonstrate two points:

  • GenAI will go beyond the hype and bring productivity gains
  • GenAI brings two easy-to-deploy use cases that are quick wins. Other use cases require investment and will take time to be rolled out.
]]>
<![CDATA[IT Services Predictions: 2024 Will Be a Year of Transition]]>

 

2023 was a year of disruption after the 2021-22 digital catch-up. As the year unfolded, IT services spending slowed down, initially in the U.S. in the financial services, telecom, and high-tech sectors.

We expect 2024 to be a year of transition with a modest rebound in IT services spending, continued consulting interest in GenAI, a rebound in cloud infrastructure adoption after a slowdown in 2023, and an uptake in discretionary spending. U.S. spending will rebound and outgrow U.K./Europe.

A Modest Rebound in IT Services Spending

IT services spending growth in 2024 will increase by a modest 1 pt to +4%, driven by managed services/IT outsourcing spending in H2 2024.

Spending in the U.S. will rebound in H2 despite much uncertainty due to the U.S. elections, both in terms of stock market perceptions and also the possibility of squeezed federal government spending. In contrast, Europe will remain soft, as Germany suffers from its exposure to the quiet Chinese manufacturing sector. More information here for our subscribers.

Digital and GenAI Drive Consulting

Consulting remains a cyclical activity, and revenues in key sectors such as financial services, traditionally large buyers of consulting services, and telecom have been declining. A positive impact is the very high interest by firms across sectors in understanding the potential applications of GenAI in their organization. Interest also remains around UX/UI, SaaS, and front-office applications, though it is slowing down. Industry 4.0 generally remains solid but will be impacted by the manufacturing slowdown we predict for 2024. Sustainability has mid-to long-term potential.

IT infrastructure Services: Cloud Rebounds

Despite a slowdown in 2023, notably in the high-tech and financial services sectors, the migration of IT infrastructure and applications is a secular shift. NelsonHall sees no market saturation for public cloud migration in the short term. Security will continue to thrive.

Transformation of the digital workplace accelerates with a focus on Experience Management Offices (XMO) for centrally coordinating the UX and deploying XLAs such as user satisfaction in contracts. Vendors will continue to verticalize their persona-based offerings. e.g., nurses and doctors in healthcare, plant operators in manufacturing. Clients turn to vendors for guidance and OCM on GenAI, with agent assist utilizing KM as a typical early use case.

Application Services: Agile and Data/Analytics/AI Continue to Dominate

Data, analytics, and AI will be 2024’s priority, with GenAI (and traditional AI and IoT) driving spending growth. Another major theme is hosting data on the cloud. Key themes for other service lines include:

  • ADM: agile transformation continues to be the major theme. Along with APIs, LC/NC tool usage, and sustainability. Many organizations will seek advice from their vendors about the role of GenAI in ADM (e.g., documentation, code migration, code development)
  • Quality engineering/testing: continuous testing (i.e., agile testing transformation) is, like in ADM, a major theme along with functional automation. Traditional AI promises to automate the requirements/user stories-test case-test script cycle. GenAI will continue to monopolize the boardroom’s attention
  • S/4HANA transformations with phased roll-outs will continue to be solid. Organizations will also assess the benefits of SAP initiatives such as Clean Core and BTP to lower maintenance costs and rearchitect applications. The agile transformation will continue
  • Salesforce has made a strategic shift and now favors profitability over revenue growth. The company expects a 10% CC revenue growth (primarily organic) in 2023 and pushes its GenAI agenda. MuleSoft remains the core driver, while core Sales and Service Cloud grow faster than more recent products such as Commerce and Marketing Cloud. 2024’s key question will be whether Salesforce will push again on revenue growth for its subscription and service ecosystem.

U.S. Will Grow Faster than Europe

Growth in discretionary spending is likely to resume from H2 2024, led by the U.S. once inflation rates start declining and businesses have greater visibility on the likely outcome of the presidential election.

U.K./Europe’s recovery will be delayed by a semester (H1 2025). There will be increased uncertainty in the U.K. with a general election in H2, in both commercial and government sectors. Germany will continue to be impacted by less manufacturing equipment sales to China. France will have anemic growth, also impacted by a manufacturing slowdown.

2024: A Year of Transition

2024 will be a year of a transition from the slowdown of 2023, with the U.S. regaining its role as a traditional driver in IT services spending. IT services spending will return to stronger growth from 2025 as organizations reignite discretionary spending in digital, cloud, and security. Notably, GenAI and AI bring the potential to fundamentally transform how businesses operate across functions; none will want to be left behind.

 

To keep up to date with NelsonHall's IT services research and thought leadership in 2024, subscribe to our IT Services Insights newsletter on LinkedIn.

]]>
<![CDATA[Tech Foundations Update: A Multi-Year Transformation]]>

 

The CEO of Atos’ Tech Foundations business recently updated industry analysts on its transformation program.

We have commented about the turbulent period Atos Group is going through in splitting into two businesses (Tech Foundations and Eviden). The planned sale of Tech Foundations to Czech billionaire Daniel Kretinsky’s EP Equity Investment (EPEI) vehicle is going ahead, with completion now expected in early Q2 2024. Tech Foundations will retain the Atos brand name.

We were interested in portfolio developments at Tech Foundations and progress in its portfolio transformation and reducing the level of problematic (‘red’) contracts.

In terms of portfolio simplification, Tech Foundations is exiting its BPS activity in the U.K. And in October, it finally exited Unify, its UCC business, and divested to Mitel Networks. It has also significantly ramped down its VAR business to reduce its low-margin activities. The focus now is primarily on offerings such as digital workplace that it can provide in an industrialized manner, and on helping clients manage the complexity of hybrid multi-cloud.

There has also been some portfolio development in these areas; for example, digital workplace sustainability in office equipment: Tech Foundations is launching Atos Tech 4 Good Assistant, a dashboard to help end-users monitor their environmental performance and uninstall unused applications or switch off their laptops. It has also launched a dashboard, Sustainable Workplace, at the enterprise level.

Tech Foundations is also deploying its Technical Services unit in new geographies, primarily the U.S. and Spain. The unit has launched a consulting service targeting workplace and data center transformation and service integration. TS is its highest-growth unit and expands its position in build services rather than run services, which tend to have lower margins. Peers such as Kyndryl and DXC have similar initiatives.

In terms of red contracts, there has been substantial progress, often through exits rather than renegotiation. In 2021, these represented 13% of its contracts. That proportion reduced to 8% in 2022 and is now approaching 6%, much closer to the level of some major IT infrastructure service-centric peers. This continues, with some of the exits being major moves; e.g., the recent NEST exit in the U.K., a contract that was worth £1.5bn over 18 years.

There is a renewed focus on strengthening client intimacy, including a new client advisory group. The focus is driving conversations at the CXO level with priority clients in each region to understand and address critical client challenges through co-creation with Atos and its partner ecosystem.

Tech Foundations’ recovery is, of course, a multi-year journey. Recent financial results indicate some progress. In Q1-Q3 2023, its revenues were down 4.4% y/y in CC, impacted by the ramp-down in VAR and BPS activities. This performance lies between Kyndryl (-1.7%, NelsonHall estimate) and DCX GIS (we estimate -9.3%). For FY22-24, Tech Foundations is targeting 0% to 2% annual growth, with a rebound commencing in 2025.

Taking the Tech Foundations business private might be helpful. Its multi-year transformation will probably be more suited to an investor with a multi-year investment horizon than the quarterly mindset of many shareholders in public companies. We would expect EPEI to invest to help Atos expand its service portfolio. Security will be a priority even if the unit retains its security monitoring capabilities following the split. Application migration to the cloud is another potential acquisition area, complementing its expertise around infrastructure migration to the cloud.

The dust should settle for Tech Foundations (the new Atos) in H1 2024.

]]>
<![CDATA[Sopra Steria Articulates its Expectations for GenAI]]>

 

Sopra Steria is the latest vendor to announce a significant investment in GenAI/LLMs and its incorporation into its software engineering platform (Ingine).

We recently met with Mohammed Sijelmassi, who holds two roles at Sopra Steria: CTO and Digital Transformation Office. We discussed how Sopra Steria is deploying internally and helping clients externally with GenAI and LLMs.

There is undoubtedly a sense of urgency: Steria acknowledges that GenAI will likely disrupt the IT services industry within two years, and its clients’ industries. The company has seen its delivery teams using GenAI and wants to accelerate this initial movement at the group level. Accordingly, Sijelmassi is sponsoring an AI use case program around three priorities: application services and its software products; educating clients on using AI; and internally for sales and support functions.

Immediate Priorities include Level 1 Support and Software Engineering

Sopra Steria is assessing LLM use cases in application services. An immediate opportunity is level 1 application support, with LLMs complementing other tools, such as self-service portals and chatbots, to deflect inbound calls to the service desk. The intention is to use GenAI together with chatbots to bring higher-quality responses.

Other immediate opportunities are related to software development and reverse engineering, e.g., for creating software documentation from existing applications, reading through lines of code to understand what an application does, annotating lines of code with comments, or identifying the application’s requirements (with the intent of generating test cases).

Sopra Steria is also assessing use cases in software development; next year, following several months of preparation, the company will train LLMs on its code repositories, including its coding and architecture best practices and standards.

Governance Issues

GenAI will bring governance issues. The increased adoption of AWS CodeWhisperer, GitHub Copilot, Duet AI for Google Cloud, and Hugging Face will increase AI-generated code proliferation. This raises challenges such as:

  • The proliferation of code will impact its readability and degrade the application performance, and potentially impact the CPU and memory of a mobile device
  • IP issues. Sopra Steria wants to avoid code and business logic developed specifically for a client being reused by a different organization. This means controlling the training data, separating code and business rules owned by clients, and ensuring best development practices are embedded in the training data.

LLMs have implications beyond governance and will impact junior roles such as level 1 application support. The company will increase its new graduates’ upskilling investment and migrate them quickly to level 2 roles. GenAI might have an impact on the age pyramid.

Sopra Steria is embedding LLMs into its software engineering platform Ingine to spread GenAI usage rapidly. Ingine also includes enabling tools and services such as a development environment, DevOps, security, and testing tools accessible through an internal marketplace. Sopra Steria plans to have Ingine GenAI-enabled by the end of 2023.

Mid-Term Initiatives

Beyond software development, reverse engineering, and support, Sopra Steria is looking at mid-term application services initiatives, such as:

  • Code migration, e.g., from COBOL to .NET or Java. This will come later, as programming language migration requires transformation beyond code and includes reengineering the application based on different architectural principles. Sopra Steria is exploring how to use reverse engineering to create software specifications and automatically generate new code
  • QA/testing.

Sopra Steria is addressing increasing client demand for workshops to explore current GenAI use cases and understand the IT implications around application architectures through its consulting arm, Sopra Steria Next. The company is working with its clients on the EU’s AI Act. It believes that demystifying fears or fantasies around GenAI will take another year for its clients. Sopra Steria Next will help here.

The company is also deploying LLMs for several of its software products. For instance, it is currently combining GenAI and ML for its HR software products, such as identifying changes in payroll and HR regulation, reviewing its software products to look for defects, and improving quality.

Productivity Gains Ease the Talent Shortage

Sopra Steria envisions the coexistence of large generic and slower-performance LLMs with broad capabilities, and more focused ones specialized in specific topics and use cases requiring low latency. Examples are use cases where real-time applications, such as manufacturing or defense applications, favor speed rather than breadth of knowledge.

There is no doubt that GenAI will bring productivity gains to the IT services industry while presenting it with enormous challenges, not the least of which is demanding significant investments in training. Sopra Steria’s stance is that GenAI will ease the pressure on the shortage of IT talent.

]]>
<![CDATA[Cognizant Looking to Double its International Market Business]]>

 

Cognizant's non-U.S. business, Cognizant Global Growth Markets (GGM), recently held an analyst and advisor day and discussed its regional priorities.

New CEO Ravi Kumar has initiated a restructuring plan ('NextGen Program') to reduce Cognizant's cost base to fund investments.

While Cognizant had guided on flat topline growth in 2023, Q1 saw a 9% y/y growth in LTM bookings, indicating the company has started to gain momentum in IT outsourcing deals.

Where Cognizant in the U.S. is impacted by a large contract ramp-down, Cognizant GGM's priority is topline growth, with an aspirational goal to double revenues (without specifying a date). Cognizant GGM generated revenues of $5bn in 2022, spread almost equally between the U.K., Continental Europe, and RoW. Together, these international (non-U.S.) revenues represent 26% of Cognizant's total revenues (compared with around 47% for TCS), so the U.S. will continue to drag on overall company performance.

Replicating its success in the U.K.&I

In Europe, Cognizant GGM is looking to replicate its success in the U.K&I, its fastest-growing geography for several years (21.1% CC topline growth in 2022), driven recently by public sector awards such as those with Defra.

Cognizant GGM believes its deployment of a vertical GTM in the U.K. helped it scale rapidly and reaffirmed its positioning at the intersection of vertical and sub-vertical knowledge and technology expertise. Cognizant has grouped its business into six regions, presumably intending to develop a verticalized GTM in the larger clusters.

Consulting will play a significant role. Cognizant’s ambition is to double its consulting activities to 9% of GGM revenues as it looks to drive client intimacy, expand its reach to LoB executives, and negotiate sole-source contracts. Within consulting, Cognizant also emphasizes its sustainability advisory services.

Cognizant GGM is focusing on its top 100 clients to drive deeper relationships. This is a very significant refocus, as GGM has ~2k clients. Depth of the relationship and client selectivity will prevail over the breadth of the client portfolio. This approach is consistent with Cognizant's push in consulting.

Cognizant continues to build its international onshore and nearshore presence, with recent onshore centers in Leeds, U.K.; Adelaide, Australia; and Halifax, Canada, while deploying its digital studio network globally (out of ~60 delivery centers). In total, Cognizant GGM has ~4k employees working in its digital studios out of a total of 86k employees servicing international clients.

Another strand of the push for accelerated growth is investments with partners. For instance, Cognizant GGM emphasizes its Enterprise Platform Services, believing Europe is more advanced than other corporate regions. The unit emphasizes specialized offerings and joint GTM with SAP, Oracle, Salesforce, Pega, Workday, Adobe, and cloud migration (e.g., SAP S/4HANA and Oracle Cloud). Beyond ISV practices, Cognizant has selected three broad themes (revenue management, SaaS application interoperability, and omnichannel) across ISVs. Again, the approach resonates in promoting a business approach rather than a technology sale.

Portfolio investments

Cognizant is aligning its service portfolio around a dual cost savings and transformation agenda. The company is targeting large IT outsourcing deals. It is also ready to accept build-operate-transfer deals to help clients grow their technology expertise through captives in India.

Cognizant GGM is preparing for a market rebound around Q4 this year. Platforms remain a priority. In the U.S., TriZetto remains a cornerstone of Cognizant's IP-based business, driving software product sales, C&SI services, and BPaaS services. Outside the U.S., GGM is targeting opportunities in English-speaking countries with a healthcare payer model similar to the U.S. model.

Cognizant GGM is also deploying IPs relevant to a growth agenda. For instance, through its Zenith and TQS acquisitions, it gained several IoT-related IPs such as Apex (industry 4.0), 1Facility (smart buildings), and its Sustainability accelerator. Cognizant also has its Meritsoft product for post-trade processing.

Unsurprisingly, AI is a priority. Cognizant was early among the major IT services providers in announcing its launch of an enterprise-wide generative AI platform (Cognizant Neuro AI, on May 15), an extension of platforms including Neuro IT Operations (AIOps), and Skygrade (application cloud migration, monitoring, and management).

Engineering services and Germany

Cognizant is making bets on two (out of six) geographical clusters: the Nordics and Germany. In Germany, the unit has done well in financial services, manufacturing and life sciences. It is looking to expand its presence in the large automotive market to rebalance its presence in Europe's largest economy. Cognizant's 2021 acquisition of ESG Mobility, now Cognizant Mobility, brought in 1k employees and next-gen ER&D capabilities around electric, connected, and autonomous vehicles, and significant relationships with automotive OEMs and tier-one suppliers.

Cognizant's prioritization of ER&D services goes beyond Germany: it has made several acquisitions in this area, including Mobica (2023, U.K., 900 employees, including ~500 in Poland) and TQS (2021, Ireland, 200, data historian analysis), complementing the large Zenith acquisitions (2019, Ireland, 800, industry 4.0 for pharmaceutical firms). Zenith brought specialized capabilities combining sub-vertical and technology expertise. With ER&D services being more verticalized than IT services, Cognizant’s dual positioning is key.

Expect further IP investments and M&As

What's next? Despite its recent slowdown in M&As, Cognizant signaled an appetite for investments in its service portfolio and a focused approach to geographical expansion, signaling growth aspirations rather than a restructuring story. While its dual positioning on vertical and technological knowledge is common, Cognizant demonstrated concrete examples beyond traditional consulting and systems integration (C&SI). We think the next step is building more IP, e.g., industry solutions in C&SI, and continuing the AI momentum. Also, expect more tuck-in acquisitions in Cognizant GGM. We think Nordics is the next candidate.

]]>
<![CDATA[TCS Takes Systematic Approach to Salesforce Verticalization with Crystallus]]>

 

We recently talked with TCS’ Salesforce practice about its verticalization initiatives.

Product verticalization has been one of Salesforce’s key strategies (along with Customer 360/cross-selling and geographic expansion) since 2014, when it launched its Industries business unit. Like SAP, Salesforce acknowledges that organizations spend time and effort customizing their enterprise applications, and so it has broadened and deepened its vertical cloud offering, strengthened by its acquisition of Vlocity in 2020.

Despite its vertical push, Salesforce continues to rely on IT services partners to complement its vertical capabilities, acknowledging the role of its systems integration partners. Yet, the role of the service partner raises many questions about the nature of their vertical offering. Should a partner’s vertical offering be a product (sold with a subscription) or a solution (provided with the service)? Should it have functionality and an enhancement roadmap or be project-led? Should the partner offer point functionality, integrate with Salesforce applications, or provide a more comprehensive sub-vertical solution? Should the solution be available on AppExchange and go through Salesforce certification?

Our discussion with TCS’ Salesforce practice helped clarify what clients should expect and its verticalization effort with TCS Crystallus™ on Salesforce.

With Clay Maps, TCS combines a transformation methodology with systematic sub-process mapping

In the past two years, TCS has articulated its Salesforce verticalization strategy through its Clay Maps, which have two components:

  • A systematic mapping of key sub-processes, targeting sub-verticals under disruption. The company has mapped these sub-processes with Salesforce functionalities across Clouds, going beyond erstwhile Vlocity to the full range of sectors (Salesforce Customer 360)
  • A transformation methodology based on identifying the client’s goal and how this goal impacts the client’s business and, eventually, processes. TCS calls this their ‘go-to-market plays’. Examples include fraud prevention and detection for claims management in insurance and wellness insurance for healthcare payers.

TCS Crystallus: Adapting to Client Demand

Based on Clay Maps, and as part of its TCS Crystallus on Salesforce initiative, TCS created 90 artifacts ranging from PoVs to demos and solutions. Crystallus includes:

  • For each sub-industry: a problem statement, impacted processes and code remediating the problem statement. TCS has ~30 of these go-to-market plays
  • Approximately 30 pre-configured applications (based on TCS’ best practices) and 10 PoVs that eventually will become pre-configured applications
  • 20 functionality white spaces for which TCS has developed reusable code components.

TCS Crystallus goes across telecom, media, technology, manufacturing, life science, healthcare, retail & CPG, public sector, E&U, BFS, travel & hospitality, and professional services.

TCS takes a pragmatic approach to its Crystallus artifacts: it will initially design PoVs, invest in a demo, and then a reference architecture and a solution, depending on client demand.

An example of a Crystallus sub-vertical solution is for healthcare providers. TCS has developed several roles such as the clinical educator journey, where the educator looks at patient records, enrolls the patient in care programs, and shares knowledge articles with patients. The Crystallus solution also provides dashboards and drill-down capabilities.

The company has designed a roadmap for its most successful solutions and will enhance them, primarily based on upfront investments rather than project-led developments. Overall, TCS provides the solutions as part of the service, although it does not rule out selling them for a subscription or a license if the client asks for the standalone solution.

TCS asserts it has also anticipated the future. Should Salesforce develop its own sub-vertical process or point solution, TCS will work with the client transitioning to the off-the-shelf functionality and remove dead custom code. TCS highlights that it has designed its TCS Crystallus solution as modular and will integrate with the client’s applications. The practice asserts that, as a top five Salesforce partner (based on certifications), it has access to Salesforce’s product roadmap, which it reviews.

TCS asserts that Crystallus is for the long term, and that further sub-vertical expansion is part of the journey. The company launches assets in the verticals brought by Vlocity/Salesforce Industry Cloud, e.g., in media, energy & utilities, and healthcare providers, with recent solutions for gas and energy transition and new Salesforce Cloud (e.g., CPG). TCS increasingly wants to design solutions across Salesforce products (‘multi-cloud’). Salesforce finds that the more Cloud products its clients have, the more loyal they are. This makes sense and correlates with Salesforce’s claims that its clients use it for creating a front-office platform. We think that clients will be even more loyal if they find sub-vertical Salesforce solutions that reduce customization work and maintenance costs.

]]>
<![CDATA[Atos’ Tech Foundations Accelerates its Transformation]]>

 

Atos recently held its Capital Market Day for Tech Foundations. It is a year since the company announced its intention to spin off its high-growth units into a new company, Eviden, and keep its historic IT infrastructure services (and U.K. BPO capabilities) in an entity called Tech Foundations. Alongside the spin-off, Atos initiated a massive €1.1bn restructuring plan for Tech Foundations. The plan came on top of an ongoing restructuring plan in Germany for €180m.

A year ago, the outlook for Tech Foundations was mixed, with uninspiring objectives to reach positive organic growth in 2026 and an operating margin above 5%.

After its profit warning in July 2021, Atos had indicated that its IT infrastructure services business was under pressure and could not reduce its data center costs as fast as declining revenues from clients migrating to public cloud. The scale of Tech Foundations’ challenge in its core infrastructure business was considerable and included a large burden of underperforming contracts, representing around €750m in revenue.

Restructuring Is the Priority…

One year into the reinvention, financials are gradually improving, and Atos shares a more positive outlook. Tech Foundations is now targeting €5.0bn in revenues by 2024, significantly higher than the previous guidance of €4.1bn. Positive organic growth continues to be expected from 2026. And the operating margin target range has increased from around 5% to 6-8%.

Tech Foundations has made progress in some aspects of its turnaround: for example, it has addressed two-thirds of its red contracts. Progress with personnel downsizing has been slower: so far, 900 positions have been cut, primarily onshore, out of the total 7,500 planned. This reflects Atos’ large employee presence in Continental Europe, which has strong labor laws. Tech Foundations aims to increase its offshore ratio by 10 pts, reshape its pyramid, and gain further efficiencies from automation. The business is ramping down its little-profitable or non-synergistic non-strategic activities (~€900m), primarily its standalone VAR and several U.K. BPO activities.

… With Commercial Momentum

Tech Foundations had lost some of its commercial rigor. It is now looking to reignite its commercial activity. The business should benefit from the current market appetite for in-year cost savings and managed services contracts. Tech Foundations is focusing on the usual suspects. With cross-selling in mind, it focuses on add-ons in its top 100 accounts, targeting 13% of additional revenue. It has also expanded its large deals (€30m+) team, creating 90 new positions, and looking to standardize its large deal processes to drive repeatability. Tech Foundations is also regionalizing its IT outsourcing approach, giving more independence to the geographies for their service portfolio priorities. The unit has changed its incentivization scheme, with bonuses based on the first three years of the contracts and KPIs focused on profitability rather than organic growth.

There is also a much closer focus on contract performance, with a team monitoring contracts weekly.

Continuing to develop partnerships with the hyperscalers is naturally important, with AWS as a priority: the current AWS-related pipeline is €0.5bn and is expected to double within the next 12 months. The AWS CloudCatalyst partnership includes a commitment for AWS to provide training for 20k certifications. Another element of the partnership is that AWS will utilize Atos’ servers left free by clients migrating to the public cloud, helping Tech Foundations reduce costs.

Four Units with Different Priorities

Atos is refreshing its organizational structure. Tech Foundations now has four units: Hybrid Cloud & Infrastructure (HCI), Digital Workplace (DW), Technology Advisory & Customized Services (PS), and Digital Business Platforms (DBP):

  • HCI is Tech Foundations’ largest business (€2.1bn). It faces revenue decline and is about reinvention (with new services such as hybrid and multi-cloud orchestration, edge computing monitoring, and enlarged partnerships), proactively helping clients migrate to the public cloud and a service catalog. Of those initiatives, CloudCatalyst with AWS is probably the highest priority for getting HCI back to growth after a couple of years of transformative decline
  • DW (€1.2bn) is in a better position, enjoying revenue growth. And with UCC/Unify set for divestment, the unit is offloading a declining business. It emphasizes its transformation capabilities and existing or new offerings, e.g., XLAs, IoT/connected device management, and as-a-service offerings. It is leveraging gen AI for service desk level 1 automation
  • PS (€0.9bn) is another Tech Foundations’ growth driver. The unit operates in an apparently unattractive market, staff augmentation in IT infrastructure services and, to a lesser extent, in application services. However, PS is Tech Foundations’ highest growth business and is highly profitable. The unit is introducing new offerings, such as technology advisory services, having launched innovation labs (Inno’Labs) to ideate with clients and create PoCs. It also created expertise networks (Tribes) to share knowledge across its 8k employees
  • DBP prefigures the future of Tech Foundations and includes its hosted platform business. The business is small (€0.3bn) but includes the visible Sports Event unit known for its work for the Olympics/Para Olympics and the UEFA. It also has a digital ID business (with contracts in Togo and Morocco) and sustainability software products. DBP is a disparate business that goes beyond Tech Foundation’s IT infrastructure-centricity. Expect Tech Foundations to scale this business strategically.

Ensuring Best Practices Are Engraved at Tech Foundations

The future of Tech Foundations depends on ensuring its commercial, delivery, and service portfolio priorities are engraved into the company’s DNA. Achieving this is clearly a priority in the highly competitive IT services infrastructure market.

The business focuses on transformation contracts and being highly selective rather than only addressing run services.

Investment in automation continues: Tech Foundations was reassuring, showing early deployments of generative AI use cases.

The mid-term future of Tech Foundations lies in portfolio development. Relevant areas include cybersecurity services, particularly SOC/managed security services, and application services around native cloud development and migration. In both of these, AI and generative AI will play important roles.

]]>
<![CDATA[IT Services Firms Turn to Portfolio Selectivity]]>

 

The IT services industry has entered a new growth cycle, with increased emphasis on portfolio management and vendors increasingly divesting their low-growth, low-margin businesses.

IBM was an early example of this trend with its Kyndryl spin-off. Several of its competitors are also following this pattern, including Atos. At the same time, in the Nordics, Tietoevry will go one step further: the company will divest its infrastructure and half of its application services business. A much smaller vendor, NNIT in Denmark, sold its IT infrastructure business to private equity in May 2023. In these four examples, the companies are targeting higher revenue growth and margins by offloading their low-growth, low-margin activities.

The move may sound obvious. But portfolio selectivity certainly wasn’t the dominant message in the past twenty years. When communication service providers or hardware vendors entered the industry from the late 90s to the mid-2010s, their message was about end-to-end services and bundled (‘integrated’) services. Most major IT services vendors offered the full range of IT infrastructure and application services and expanded selectively to BPO and ER&D services. That approach still works well for some leading IT services vendors such as TCS but increasingly less so for a number of others.

Tuck-In Acquisitions Will Continue Apace

A consequence of the current portfolio focus and rationalization is that large M&As are decreasing. Instead, most tier-one IT services vendors are increasing their emphasis on tuck-in additions in areas such as digital, cloud, and security, and to a lesser extent in traditional areas such as consulting and SAP services, frequently in support of geographic expansion. Accenture, NTT DATA, and, more recently, IBM Consulting have been the most active in this respect.

In addition to the areas listed above, we expect selective expansion in growth areas such as AI and data, and further investment in front-office applications such as marketing automation. Also, we expect selective investments in ER&D services, targeting connected products and digital manufacturing/industry 4.0. And with rising defense budgets globally and more digital products, defense IT should also be on the agenda. Yet, for sovereignty and security reasons, the increased defense IT spending will benefit only firms partnering with local defense firms.

Acquiring small but high-growth specialist firms currently makes sense. Yet, in previous cycles, many IT services vendors targeted mid-size to large competitors rather than tuck-ins to avoid talent exodus resulting from a small and flexible firm being absorbed into a larger and more process-oriented entity.

Overall, this growth cycle is about organic growth and increased margin. The leading vendors will avoid using their cash on major expansions into new, more speculative areas and stay close to their roots.

PEs Looking to Exit IT Services Investments

Private Equity attitudes towards IT services firms have also changed. With (still) rising interest rates, access to M&A funding has become more costly, primarily impacting PE. We will see less PE activity around acquiring IT services firms in the next few years, as highlighted by the failed DXC take-over. Valuations should go down. In the meantime, several PEs will want to exit their IT services investments (e.g., Virtusa, Coforge, Hexaware, Expleo, Inetum, Engineering).

]]>
<![CDATA[Qualitest Acquires Q Analysts to Address Emerging ‘AI-infused Device’ Market]]>

 

We recently talked to Qualitest about its latest acquisition, Q Analysts, its sixth since 2019. Qualitest has been on an accelerated transformation journey under the ownership of PE BridgePoint. Q Analysts further strengthens Qualitest’s capabilities in next-gen digital QA, with expertise in testing AI-based devices such as AR/VR/MR headsets and generating data for training AI models.

Qualitest Has Accelerated its Transformation

Qualitest has bold growth ambitions targeting $1bn in revenues by 2026, and in support of this, it has further shifted its delivery network to India to gain scale. The QA Infotech and ZenQ acquisitions helped significantly in this respect. NelsonHall estimates that Qualitest has ~45% of its headcount in India, or ~3,400 FTEs. We expect this India-centricity to increase further.

Qualitest has also further verticalized its GTM, its salesforce now being organized around the following industry groups: technology, BFSI, healthcare & life sciences, telecoms, utilities, retail, and media & entertainment. In parallel, Qualitest has expanded its focus from its core technology clients to BFSI (now 30% of revenues, on par with technology). It recently strengthened its healthcare and telecom expertise with the Comply and Telexiom transactions.

The company is specializing its service portfolio and, at the same time, investing in automation. Continuous testing remains a priority. The 2019 acquisition of data science company AlgoTrace jumpstarted Qualitest’s expertise in AI-based testing. NelsonHall believes that AI-based automation will disrupt the QA industry by automating the generation of test scripts and breaking the lengthy requirement-test case-test script cycle by removing the test case phase.

Q Analysts Brings Specialized Testing Services for AI-based Connected Devices

Qualitest has developed its digital services portfolio beyond traditional mobile app testing, introducing next-gen offerings. The acquisition of Hyderabad-based ZenQ brought in capabilities around blockchain and testing connected devices such as smart homes, pet care, fetal health, and drones.

Now, the acquisition of Q Analysts further increases Qualitest’s investment in digital testing offerings, looking at products described as ‘AI-infused devices,’ i.e., AR/VR devices and virtual assistants.

Q Analysts currently services tier-one technology firms engaged in AR/VR/MR, wearables, and virtual assistant devices. The company has ~600 employees, is headquartered in Kirkland, WA, and has offices in Santa Clara, CA. Q Analysts has testing labs in Kirkland, Santa Clara, and Antananarivo, Madagascar. It has structured its portfolio around two activities: testing of ‘AI-infused devices’ (60% of revenues); and generating training data for these devices (40% of revenues).

The company has worked on AR/VR testing activities, often at the prototyping stage. It offers a full range of services, from devices to mobile apps, web applications, usability testing, and back-office integration testing. As with mobile devices, AR/VR devices bring specific QA activities, such as assessing the performance of an application on a device and estimating the impact of running this application on the device’s battery.

Q Analysts highlights its expertise goes beyond device testing. The company’s sweet spot is assessing image and video rendering on the device. The company has invested in its workforce to identify rendering issues such as image refresh rate or pixelization, a capability only a trained human eye can spot.

The company continues to invest in visual testing in the AR/VR/MR space. For example, the company tests new technologies such as foveated rendering (i.e., the devices have in-built inward-facing cameras to track a user’s eye movement and render images of higher resolution where the eye is focused) to minimize energy consumption and make device batteries last longer. The company considers visual testing to be key and requires advanced visual and technical skills.

Q Analysts’ second activity is generating training data or ‘ground truth data services’, a term borrowed from the meteorology industry. The company will generate training data in its labs and capture images and movements required using cameras and LiDAR scanners. Q Analysts’ know-how comes into play by generating datasets based on its client’s demographics and providing several real-world simulated set-ups, such as living rooms and offices and other variances (such as furniture and interior decoration). Q Analysts also provides related specialized services such as manual 2D and 3D image tagging to help train AI models.

High Potential Ahead

Qualitest has big ambitions for Q Analysts based on the expectation that demand for connected ‘AI-infused devices’ will expand from its product engineering niche. The use of AI-infused devices will become increasingly common across industries; for example, retail (virtual try-on), healthcare (physical therapy and 3D models), and energy & utilities (digital twin-based training). Longer term, Q Analysts targets the metaverse, expanding from its AR/VR and other AI device niche to the larger virtual world opportunity.

Complementing Q Analysts’ specialized capabilities, Qualitest brings increasing expertise in AI-based automation, including computer vision testing and connected device test automation. Client demand looks limitless, and Qualitest is building its next-gen testing expertise to address that demand year after year.

]]>
<![CDATA[BT Merges Global & Enterprise Units to Simplify GTM and Save Costs]]>

 

A New Focused Division 

BT has announced the merger of its two ICT services units, Global and Enterprise, into a new division, BT Business. The new division will have pro-forma revenues of £8.5bn and an EBITDA of £2bn. The current CEO of Global will lead BT Business.

With this move, BT wants to unify its B2B market and focus its capabilities on connectivity, unified communication, networking, and security. It also wants to simplify its GTM for its corporate and public sector clients and remove duplication across Global and Enterprise.

Cost savings are an important element of the merger, with BT targeting cost savings of £100m by FY25 through consolidating management, support functions, product portfolios, and IT.

Financial Pressures

For the 3Q ending September 2022, the BT Group reported revenue of £10.4bn, up 1% due to growth in its Consumer and Openreach segments,  and partially offset by legacy declines in large corporate customers in Enterprise and lower equipment sales in Global. Large corporate accounts and the decline of legacy products continue to present a challenge to the BT Group.

Global and Enterprise suffered from the pandemic, with revenues down by 14% and 8% respectively in FY21 (the year ending March 31). However, the two units did not benefit from the digital and cloud catch-up after the pandemic. Revenues were still down in FY22 (by 10% and 5% respectively) and in H1 FY23 (by 2% and 5% respectively).

The decline of Global and Enterprise reflects, unsurprisingly, portfolio changes. Global suffered from lower equipment sales and divestments (in Spain, Latin America, and France). Enterprise has suffered from the decline in legacy services, such as fixed telephony, despite mobile and VoIP growth in the SME and SoHo segments.

EBITDA struggled under pressure despite efficiency actions taken by BT, from its FY 22 in March, where EBITDA was down for Enterprise and Global, by 4% and 23% respectively, YOY. The decline continued into the results for the six months reported in September 2022, with Enterprise (23% down) and Global (5% down) for the six months YOY.

Fiber Deployment

The creation of BT Business is part of a larger cost savings program, with BT targeting £3bn in cost optimization by FY25. BT announced in November 2022 it wanted to save operational costs to fund its investment in deploying fiber options, OpenReach, throughout the U.K. The company targets 25m home and business customers by December 2026, up from 9m in December 2022. The investment comes at a time when BT, like many other European firms, faces rising energy costs. BT’s needs for investments do not stop with fiber deployment. The company is also investing in deploying 5G.

BT Business Outlook

Global and Enterprise had overlapping offerings and also suffered from internal competition on the large U.K. accounts. The merger should help BT Business simplify its GTM and achieve cost savings. It should help the new division to invest more in specific areas, e.g., digital, cloud, and security for large enterprises.

The merger will, however, probably not solve BT Business’ exposure to traditional voice and equipment resale, whose revenue decline has been long-lasting. BT Business will need to develop high-growth offerings through M&As to increase its service mix. Also, given the growing overlap between telecom and IT services, we think that BT Business will need to further lower its cost structure by increasing its India delivery network.

BT Business keeps an important SoHo and SME business, which intrinsically have different dynamics than ICT services to large enterprises. BT has a good track record in packaging services and offerings to its clients. The deployment of fiber and 5G should help the company gain market share in this customer segment. The question is whether this customer segment should be part of BT Business or BT Retail.

Certainly, between the two divisions, better clarity on customer focus and reduction of duplicative services and roles like solution and deal architects, financial analysts, and bid managers will provide the focus that the two units need and assist in meeting BT Group’s financial goals.

The new BT Business division will report as a single unit from April 1, 2023.

]]>
<![CDATA[Tata Elxsi sees the Metaverse as the Evolution of AR in the Next Decade]]>

 

We recently talked to Tata Elxsi, the engineering and R&D services vendor headquartered in Bangalore, India. We discussed where the adoption of AR/VR is currently among enterprises, how the metaverse differs from AR/VR, and which B2B use cases enterprises will prefer in the short to medium term.

The current demand for virtual reality (VR) is primarily driven by the gaming industry. With the advent of Augmented reality (AR), there is currently strong demand by enterprises in support of training; for example, for equipment setup, and remote maintenance & repair. Demand here was boosted by the pandemic and remains steady, partly because of reduced factory floor personnel and operator attrition. While metaverse use cases often take part in a virtual world, the metaverse is not VR.

Metaverse Will Encompass AR

Recent news from Meta has been mixed, with the company suffering from slow user acceptance. Yet, like Meta, Tata Elxsi sees a promising future for metaverse adoption by enterprises over the next decade. It highlights that metaverse use cases will cover a broad range of functionalities such as payments (enabled by blockchain), digital twins, and training. From a conceptual perspective, the range of possibilities is limitless.

An example of a digital twin use case, in the automotive industry, is vehicle twins, which display sensor data for things such as engine temperature that can be monitored and used for prescriptive and predictive maintenance. Tata Elxsi sees digital twins being widely used across sectors, such as car leasing firms, banks, and OEMs expanding their financial services arms. This digital twin use case already exists, and its adoption is rising independently of the metaverse.

So where does the metaverse help, compared to AR? Tata Elxsi sees the metaverse as multi-channel, therefore integrating AR and other interaction means (e.g., web browsers and .exe files on desktops). It also highlights that the metaverse has a richer ecosystem and can handle several individuals (as opposed to only one in AR applications). The metaverse also includes virtual passive objects and AI-governed avatars. Tata Elxsi believes that 40% of assets and persons in the metaverse will be virtual.

Tata Elxsi believes that the metaverse opens new use cases, such as virtual corporate events, which AR cannot currently handle. For the firm, UX will play a big role in driving the usage of virtual events.

The Metaverse Has a Ten-Year Horizon before Enterprise Adoption

Tata Elxsi sees, over the next decade, technologies aggregating and complementing each other to form metaverse applications. A key element of the metaverse’s success will be standards and reusability, driving standardization of technology and faster implementations.

The emerging notion of AR Cloud will bring standard architecture and technologies that can be reused as part of AR and within a metaverse. An example of a standard common feature is the GPS-based location of a virtual building in the metaverse. Tata Elxsi points out the potential use of virtual buildings as training centers where employees gather to attend classes and workshops.

The extent of widespread adoption of metaverse platforms and AR Cloud will depend on several requirements, including the deployment of 5G to handle the bandwidth-intensive AR immersive experience. The weight of AR devices, which will decrease from 1kg to 20 grams in the next few years, will drive user adoption and price reduction.

On a ten-year horizon, Tata Elxsi highlights that the evolution of metaverse platforms will be self-governed and cannot be directed by one specific company (meaning Meta, in NelsonHall’s opinion). The metaverse will be a result of a mixture of new or yet-to-be-discovered technologies. Tata Elxsi is maintaining a flexible approach to the metaverse and will continue to evaluate technologies and use cases.

]]>
<![CDATA[Cognizant Drives Salesforce Marketing Cloud Specialization with Lev]]>

 

We recently talked to Cognizant about its Salesforce Marketing Cloud capabilities.

Within the Salesforce portfolio, Marketing Cloud, along with Commerce Cloud, is a high-potential product that will eventually outgrow the more mature Sales and Service Clouds. More than any other Salesforce product, Marketing Cloud has grown through M&A, notably ExactTarget (that came with B2B marketing ISV Pardot) and Datorama (analytics for marketers).

Cognizant primarily built its Marketing Cloud capabilities with its 2020 acquisition of Lev, with Cognizant transferring its Marketing Cloud practitioners to Lev. As Cognizant’s Marketing Cloud practice, Lev has now reached around 600 consultants globally and is one of Salesforce Marketing Cloud’s largest service partners.

Lev’s preferred entry point with Marketing Cloud projects is consulting, with a maturity assessment of the client’s Marketing Cloud instance, followed by creating roadmaps for the transformation program to help the client exploit more Marketing Cloud features and functionality. Lev also offers organizational audits, process improvement, and license/subscription expense rationalization.

Addressing Marketing Cloud’s Large Portfolio

Lev’s capabilities span the full range of Marketing Cloud sub-products, ranging from the core Engagement sub-product (initially based on ExactTarget) to the emerging Customer Data Platform, Personalization (the former Interaction Studio), Intelligence (Datorama acquisition), and Account Engagement (Pardot acquisition).

Most of Lev’s work is around the Engagement sub-product, primarily Email Studio, Journey Builder, Mobile Studio, and Ad Studio. Lev works with clients to transform their email initiatives, automating email campaign management triggered across the customer journey. It uses Contact Builder to clean data and remove redundant accounts to improve data consistency.

Lev supports client scenarios such as organizations expanding multi-channel communications (e.g., expanding from email ISV to SMS and WhatsApp) and migration from legacy email service providers. In both scenarios, Lev will focus on migrating/transforming assets such as email templates, content areas, images, and documents to the Engagement sub-product.

Expansion in BPS

Lev has gone beyond IT services with Engagement and expanded to:

  • Digital advertising agency activities such as campaign management BPS
  • Creative services such as email visual design, copywriting, and content strategy and creation
  • Paid ad management, helping clients define their campaign’s objectives, segment their audience, purchase the right advertising online, and monitor their effectiveness
  • Translation and localization services for clients with multi-country campaign needs.

Lev sees campaign management and creative services as one of the entry points into an account.

Lev has developed two products that complement Salesforce’s Marketing Cloud:

  • Campaign Studio, which helps organizations migrate marketing assets to Engagement and monitor campaigns at the enterprise level, and shorten time to create complex emails, audiences, and send schedules
  • Abandoned Cart, which monitors abandoned carts on Commerce Cloud and can trigger email reminders.

Increased Joint Initiatives with Cognizant in Sales and Delivery

So, what next for Lev?

Lev continues to invest in emerging Marketing Cloud products:

  • Personalization, formerly known as Interaction Studio, for tracking customer behavior, identifying customers to connect unknown and known customer profiles, and connecting with Sales Cloud to enrich customer profiles
  • Intelligence, formerly known as Datorama, to offer a marketing reporting tool that can be used by marketing professionals and provide specific KPIs such as cost per lead or cost per click
  • Salesforce Genie, formerly Customer Data Platform, for unifying the customer data profile, and enabling declarative segmentation and activation.

The emphasis is on helping organizations adopt enterprise-wide Marketing Cloud programs, focusing on marketing asset and data model standardization and application integration with Salesforce Sales Cloud and third-party applications. Lev believes that Cognizant’s capabilities around API and MuleSoft and overall data expertise will help while it specializes further in the marketing domain.

Lev is also looking to benefit from Cognizant’s scale and recruitment engine. It has now coordinated its sales and marketing activities with the larger Cognizant. Look to see some offerings verticalized, in conjunction with Cognizant’s sector units, other Salesforce practice units such as ATG, and of course, Salesforce’s vertical Clouds. Any such verticalization journey will require close internal and external coordination.

]]>
<![CDATA[Cigniti Develops iNSta on Automated Script Creation & Maintenance with High Potential]]>

 

Software testing continues to be an industry of contrasts: the primary activity, functional testing, remains a human-intensive activity, despite the accelerated adoption of continuous testing (i.e., bringing functional automation as part of DevOps).

But testing has also grown in a highly specialized set of activities, earning the name of Quality Engineering (QE), ranging from support activities, test data and test environment management, shifting both left (early in the software lifecycle) and right (to application monitoring and now site reliability engineering).

Nevertheless, the most exciting event in QE remains the usage of AI to automate the creation and maintenance of test scripts. We think that despite somewhat limited adoption, automated script creation has the potential to redefine the QE industry.

Test Script Maintenance Will Become Easier

We talked to Cigniti about its recent investment in its iNSta IP to automate test script creation and maintenance. Cigniti repositioned iNSta two years ago, from a testing framework, as its primary automation point, aggregating all automation and specialized services. The company promotes it as an 'AI-enabled low code scriptless test automation platform'.

Now the company has enriched iNSta with its core Intelligent Recorder.

iNSta's Intelligent Recorder will create test scripts on the fly when a user goes through a transaction in an enterprise application. It will identify UI objects and build an object library to maintain test scripts. Intelligent Recorder will scan the UI for each release and identify changes in the UI. The maintenance of such test scripts is, we think, of high importance. Cigniti finds that 5% of test cases are outdated or not in sync with the current release and will lead to false positives or testing failures. The company continues to add incremental enhancements: should Intelligent Recorder fail to recognize that an object has changed, it will use computer vision to compare screen images of two different releases, identify the objects that have changed, and amend its object library.

Cigniti also accelerated the speed of execution of iNSta, relying on conducting test script execution in parallel across several VMs and containers. The company will add VMs/containers automatically through a scaling-out approach. With this offering, Cigniti wants to address development organizations operating in agile/DevOps with requirements for short testing timelines. It also targets applications that require extensive use of AI, which typically slows down test execution.

Cigniti also complements iNSta with automated test script creation, using NLP technology to translate English-written test cases from Excel and ALM into test scripts. Cigniti has created a dictionary and will custom dictionaries for its clients. The company finds that its English language translation AI model brings more benefits than the Gherkin language, as BDD requirements are done by testing specialists and not by business users. Nevertheless, Cigniti is also integrating its BDD framework in iNSta.

A strength of iNSta is that the Intelligent Recorder and the NLP translation are interoperable, and users can go back and forth between the two approaches. This maximizes, we think, the possibility of automation and helps with test script democratization.

New Opportunities with E2E Testing

AI is also opening QE to new testing opportunities. To a large extent, functional testing tools such as HPE/Micro Focus UFT and open-source Selenium have focused on one application technology. Still, they cannot operate across mobile apps, web, client-server, and mainframe applications.

Cigniti has expanded iNSta's Intelligent Recorder from web applications to mobile apps and client-server applications. This opens more automated test script opportunities. It also opens up business process/E2E automation opportunities. Several industries, including telecom, banking, retail, and government, have processes operating across different application technologies. Until now, E2E testing had to be manual or relied on RPA tools/bots.

Cigniti also intends to host iNSta on the cloud and sell it as a PaaS tool to favor its adoption. In the meantime, it will expand the Intelligent Recorder to SaaS applications (e.g., Salesforce), mainframe applications, and APIs.

We think the QE industry now has the technology to challenge the requirement-test case-test script model, and now is the time to focus on organization adoption. Cigniti highlights that initially, clients are hesitant to adopt iNSta due to organization and skill changes. We expect Cigniti to spend time with its clients evangelizing the market, relying on its QA consulting unit, and helping clients on OCM. More than ever, testing (and IT) is also about people's buy-in. We think tools like iNSta will help testers focus on more gratifying tasks such as analysis and remediation. This is good news for the industry.

]]>
<![CDATA[ValueMomentum Continues to Invest in Continuous Testing & Shift-Right]]>

 

We recently talked to ValueMomentum about its QE approach to product-centric development and testing. The company is helping its insurance clients improve the quality of their applications using agile best practices and DevOps tools. In support of this, ValueMomentum has refreshed its automation approach and created a continuous testing platform articulated around design (shift-left), execution (through automation), and monitoring (shift-right).

Most tier-one QA vendors today have their own continuous testing platforms, and these have become the backbone of test automation. Indeed, such platforms currently aggregate most of the existing automation and IP, running automation as part of each release cycle. These continuous testing platforms differentiate by adding new automation features to core automation.

Making Continuous Testing the Aggregation for Test Automation

ValueMomentum is investing in continuous testing through methodologies and adding new automation features. The company uses BDD, for instance, as it believes the Gerkhin language remains the best alternative for business users to write test cases that are automatically converted into test scripts, thereby reducing ambiguity in requirements. The company complements its BDD centricity with pre-defined business process diagrams for the insurance industry using MBT (Mantiz).

During the shift-left phase, to promote quality in the development phase of the project lifecycle, ValueMomentum has integrated code-related services into its continuous testing platform (e.g., unit testing and code review); test support services (e.g., test data management and service virtualization; AI-based analytics (such as code coverage and test impact analysis, and static code analysis); and non-functional (e.g., testing, and automated vulnerability assessment). As an example of its investment in AI, ValueMomentum is fine-tuning its defect prediction AI model by increasing data sources, from past defects to code changes in the release and developers’ coding quality.

In the testing environments, once the application release is completed, ValueMomentum uses a mix of full functional test automation (E2E testing) complemented by exploratory testing to maximize the chances of catching bugs before production.

Shift-Right Is the New Frontier. AI Will Help

Shift-right continues to be one of the open frontiers in the QE industry. Feeding back production information to the dev and test teams in an automated manner is still a challenge. AI is increasingly being deployed, but there remains considerable growth potential for its use.

ValueMomentum is accordingly investing in shift-right. Beyond APM tools for application monitoring, the company uses AWS tools for cloud applications, e.g., Canaries (monitoring of the performance of end-user devices), A/B testing (usability research), Game Day (simulating a failure or an event to test applications, processes, and team responses, similar to chaos testing), and rollbacks (redeploy a previous application release using CodeDeploy).

And indeed, ValueMomentum is gradually making its way to Site Reliability Engineering (SRE), where production specialists monitor applications and work with developers to remediate application issues quickly. For now, ValueMomentum is taking an AWS approach, relying on point solutions tools that AWS provides. It is fine-tuning AI use cases such as test case recommendations, defect triaging, defect-to-test case mapping, test case optimization, system comparison, and test case search. This is just the beginning of AI in shift-right for QE.

]]>
<![CDATA[TCS Deploys SRE Services to Cloud Application Testing]]>

 

TCS recently briefed NelsonHall on its approach to site reliability engineering (SRE) in the context of quality engineering (QE).

SRE emerged almost a decade ago as part of the shift-right move, targeting production environments beyond traditional IT infrastructure activities such as services desk and monitoring activities. While no definition of SRE has fully emerged, TCS points out that SRE focuses on two topics: resiliency and reliability, through with observability and AIOps, automation, and chaos engineering as key services.

TCS prioritizes cloud-hosted applications for its SRE services, as cloud hosting increases the likelihood of application outage since applications that have been migrated were not initially designed and configured for cloud or multi-cloud hosting.

Generally, there has been very little SRE in QE activity, even though the industry has emphasized shift-right for several years. The shift-right notion in QE refers to feeding back production information to dev and test teams, breaking down the traditional silos between build and run activities. And in activities such as application monitoring (relying on the APM tools) and associated AI use cases (to make sense of APM-triggered events), the classification of defects found in production, and in sentiment analysis, have become common.

We think shift-right activities can still be improved, building on monitoring activities. Chaos engineering is a good example of a developing proactive service. More importantly, the feedback from production to dev and test needs to be improved, and we think SRE will help here.

Observability/Monitoring, AIOps, and Chaos Engineering

TCS' approach to SRE relies on application monitoring, AIOps, automation, and chaos engineering. Application monitoring ('observability') remains at the core of TCS' portfolio. For this, the company will deploy APM tools, collect logs and traces, and provide reporting. One of the challenges in application monitoring is data dissemination across different applications and databases. Accordingly, data centralization is a priority for TCS.

Once it has collected monitoring data, TCS deploys AI models (AIOps) to automate event detection and correlation and eventually move to a prediction phase. TCS' main AI use cases are predictive alerts, root cause analysis, event prioritization, and outage likelihood. The company will use third-party tools such as Dynatrace (combined with application monitoring) or deploy its own IP, depending on the client's tool usage.

For deployment and recoverability, its next step after AIOps, TCS will complement application deployment with automated rollbacks and ticket creation. At this stage, when facing application defects, the SRE team will also involve the development teams to conduct RCA and fix application defects.

TCS will also conduct chaos engineering. Chaos engineering complements performance engineering and testing in that it evaluates applications' behavior under more strenuous conditions. With chaos engineering, TCS will conduct attacks such as instance shutdown, increased CPU usage, and black holes to assess how the applications being tested behave. TCS has integrated tools such as Gremlins and Azure Chaos Studio in its DevOps portfolio to embed chaos engineering as part of continuous testing.

Demand Is Still Nascent

TCS typically deploys SRE teams of six engineers for monitoring applications. It highlights that SRE adoption is still nascent, and it will lead such programs with marquee clients initially.

In broad terms, the future of SRE lies in DevOps and becoming part of continuous testing, where all activities are scheduled and automated, for new build/release execution. TCS is an early mover in this area and is currently honing its tools and consulting capabilities. Platforms combining tools and targeting comprehensive services as part of continuous testing are the company's next step.

]]>
<![CDATA[NTT DATA to Acquire Apisero to Double its Salesforce Practice]]>

 

We recently talked to NTT DATA about its pending acquisition of Apisero, announced last month.

NTT DATA has been through significant changes recently with its merger with NTT Ltd. NTT Ltd. grouped a wide range of network and connectivity services, hardware and related services, data center hosting, IT infrastructure services, and resales. The resulting NTT DATA is now a giant with revenues of ¥3.5tn (~$26.2bn) and 180k personnel, larger than Fujitsu’s Services unit. NTT DATA has largely unified its brands over the years while maintaining the NTT DATA Services brand for its North American operations.

The company continues its M&A activity, with Apisero bringing scale in digital and cloud. Apisero is a MuleSoft and Salesforce consulting partner headquartered in Chandler, AZ, with additional offices in Vancouver, Strathfield, Barcelona, Dubai, and India. The company services U.S. mid-sized firms and has approximately 2,000 specialists, including around 1,500 MuleSoft practitioners and around 500 Salesforce consultants. Apisero has an India-centric delivery model, with 90% of its employees based in India (in Pune, Mumbai, Delhi, Kolkata, Ranchi, Bangalore, Hyderabad, Guwahati, or Chennai). NTT DATA highlights that Apisero is enjoying very strong growth (NelsonHall estimates around 30% topline growth), outgrowing even Salesforce, which continues to benefit from robust market demand. In its latest quarter, Salesforce reached the same revenues as SAP.

Apisero is a strategic acquisition for NTT DATA as it will almost double its size in the key Salesforce service market. We estimate that the combined Apisero NTT DATA will have around 5,000 Salesforce practitioners (including MuleSoft) globally: Apisero will definitively place NTT DATA among Salesforce’s largest partners.

Apisero will also significantly strengthen NTT DATA’s capabilities in MuleSoft’s API-based integration niche. Salesforce has positioned MuleSoft as the glue for integrating its Cloud products, especially around Customer 360, aggregating customer data from Salesforce and external applications. And, of course, MuleSoft continues to expand outside the Salesforce ecosystem. While Apisero will bring mostly professional services to NTT DATA, it also has several MuleSoft-certified connectors for ISVs, whether significant SAP Hybris and Splunk or niche, Redox (EHR) and Metrc (marijuana industry).

A Game-Changer for NTT DATA in North America and India

More broadly, Apisero will be a game changer for NTT DATA in North America. It will quadruple its headcount in North America/India to around 2,700 and rebalance NTT DATA’s delivery network to India, primarily around MuleSoft.

Finally, Apisero will bring to NTT DATA North America around 500 Salesforce consultants, primarily around Sales Cloud. Even though Sales Cloud is one of the more mature Salesforce products, it has continued to enjoy 15-20% organic growth. Its potential remains important, including in the SME sector.

A Recruitment Engine

NTT DATA’s short-term priority is to let Apisero continue with its high growth and disseminate Apisero’s best practices across the group. In one example, Apisero will bring in an automated and structured recruitment and upskilling engine primarily in India, which will help NTT DATA to scale up faster.

NTT DATA shows the offshoring potential for MuleSoft’s technical activities; the company is looking to expand from the U.S. and sell MuleSoft offshore services to its client base globally.

Meanwhile, NTT DATA continues to be busy with its existing Salesforce capabilities. It recently benefited from integrating NTT Ltd.’s operations, which brought a Salesforce service business in South Africa through the legacy Dimension Data.

NTT DATA will now need to digest its recent acquisitions: expect to see a pause in M&A activity while it focuses on sharing best practices, offerings, and its delivery organization across the Salesforce practice in its various geographies.

]]>
<![CDATA[Infosys Structures its Sustainability Engineering Portfolio]]>

 

We recently talked to Infosys about how its Engineering and R&D services portfolio addresses sustainability and the circular economy, given the strong client interest in sustainable product design.

Infosys has accordingly evolved its sustainability engineering offerings from a series of capabilities into a portfolio that is articulated along three product lifecycle phases: design & development; manufacturing & operations; and aftermarket services, including product retrofitting and recycling.

Product Design and Development Is Core for Sustainability Engineering

Product and design development is at the core of any sustainability offering. Infosys’ design and engineering experience in this space ranges from energy-efficient products to sustainable products. An example of recent work is a radiant baffle, which Infosys initially designed for its internal needs. The radian baffle cools temperature by removing heat, mainly through heat transfer radiation. Infosys argues that the radian baffle internally reduced per capita energy consumption by 26%. The company points out that A/C equipment accounts for 40-50% of a building’s energy consumption in India, and Infosys  is now marketing it externally.

Infosys is also active in designing sustainable products. An example of a recent project is a green scoring dashboard that collects product-related components and parts information. It uses an AI model to determine a ‘green score’ based on the component impact on the environment and health.

Regulatory compliance is an important driver for sustainable product design. Infosys created its full material disclosure (FMD) dashboard designed for the chemicals industry. Chemical suppliers declare on the FMD dashboard the nature of chemicals supplied to the client. Infosys used FMD to identify restricted substances and for reporting purposes.

In another recent project around traceability, Infosys developed a crop transaction application suite for farms in India. The applications start with farm registration, recording the crop transaction and payment, up to the warehouse management and logistics stages, with integration into the ERP systems. The suite thus traces crops across their transaction phases.

Tapping the Immense Brownfield Opportunity

Beyond product design and development, sustainability has immense potential in brownfield manufacturing and operations. In this space, Infosys focuses on resource frugality, primarily around energy consumption and water management. The company draws on its own experience from achieving carbon neutrality in 2020 in its campuses/delivery centers. It monitors, in real-time, equipment such as chillers and HVACs, generators, and elevators, along with sewage and treatment plants. Much of its effort has been collecting data from facilities (through IoT systems), analyzing them, and building predictive models. Infosys now forecasts its energy and resource consumption based on models that include weather and conducts budget deviation analysis.

An example of a brownfield project is with a cable manufacturer that had designed a new electric cable coating that increased electricity transmission by 15-25% and was incorporated into its new product. It wanted to address its client base of aging installed wires (70% are 25 years old). Infosys designed a robot prototype installed on the power lines that take photos to identify dirty and deteriorated areas and then conducts wire cleaning and coating.

IoT has a key role in sustainable manufacturing operations. An example of a project is where Infosys collects data from sensors and equipment on a given production line. Based on the data, the AI models will determine when the production line is not running and, when applicable, turn off the air handling unit, to save energy costs.

Aftermarket Services: Reuse and Recycling Are Next

In aftermarket services, IoT use cases have centered around remote monitoring and predictive maintenance. Other potential use cases include focusing on reusability, repurposing, and recycling.

In aircraft decommissioning, Infosys is working on identifying components that can be resold and reused and triaging the level of recyclability of other components. Regulations in aircraft decommissioning require parts to be identified and traced from decommissioning to reinstallation into a new plane. Infosys is developing a blockchain solution for this. It is also working with SAE International on developing electronic transaction standards for the aerospace industry.

Infosys has largely completed its identification of sustainability capabilities and is now looking to fill gaps in its offerings and develop service repeatability with methodologies and accelerators. Infosys is starting with maturity assessments to help clients understand where they stand in their product sustainability journey and where they need to invest.

]]>
<![CDATA[TCS Emphasizes Neural Manufacturing in Support of Digital Manufacturing Initiative]]>

 

We recently talked to TCS about the company’s involvement in connected plants, TCS’ terminology for digital manufacturing.

TCS has a broad connected plants portfolio, ranging from manufacturing IT systems, MES, Industry 4.0 and connected supply chain, to industrial control systems, and automation. The development of this portfolio currently emphasizes further specialization and digitalization. The specialization goes deep with, for example, a recent offering focusing on supply chain integration for discrete manufacturing clients that operate under a batch production mode.

Along with increased specialization, digital is a high priority for TCS. Areas of focus here include AI, digital twins, and edge-based automation. It recently launched its Neural Manufacturing initiative to spread AI use cases among manufacturing plants.

Neural Manufacturing covers the data life cycle from data collection and classification, to analytics and AI use cases, and knowledge management. TCS has created three modular solutions covering these areas: DMP, InTwin, and CPOA.

TCS Digital Manufacturing Platform Focuses on Data Classification & Dashboarding

Digital Manufacturing Platform (DMP) collects data from sources such as data historians, sensors and equipment, and RFID tags, connecting with applications such as MES through interfaces or APIs.

However, DMP goes beyond data collection. It also focuses on data classification by creating data models, metadata, ontology, and instances. Classification is primarily manual at this stage, with TCS working on its automation, though DMP can already upload spreadsheet-based asset hierarchies and graph tools. Complementing data classification, DMP has ~50 standard dashboards aligned by use case, and include standard metric and descriptive analytics.

TCS InTwin Digital Twin Platform Has ~120 Standard Algorithms

While DMP provides access to data, InTwin provides digital twin functionality using AI to help users make sense of their manufacturing data. With InTwin, TCS has created ~120 standard AI algorithms around standard use cases such as prescriptive analytics, anomaly detection, what if analysis, and image analytics. TCS helps organizations prepare the data for building and enhancing these AI models, including selecting which data to use or creating synthetic data as necessary.

AI models continue to be a priority for TCS. It highlights that organizations’ demands are expanding from point solutions, e.g., anomaly detection, to more comprehensive digital twin projects such as equipment simulation. The company can recreate the behavior of specific equipment and conduct what-if analyses.

TCS Cognitive Plant Operations Advisor Solution Supports Plant Operatives

Finally, with its Cognitive Plant Operations Advisor (CPOA) solution, TCS is targeting the world of manufacturing knowledge management. While many organizations have engaged in AI pilot activity to derive data from their manufacturing operations, few have engaged in knowledge management beyond document digitization. TCS uses several techniques to create this knowledge. The company uses NLP for semantics, captures knowledge from different data sources such as drawings, and uses methods such as fault tree models for root cause analysis.

Further Specialization and Investment are Underway

TCS continues to enhance its Neural Manufacturing software suite. The company has already moved beyond the development of point use cases and accelerators and has formalized its capabilities into more expansive software products.

And indeed, NelsonHall finds that the use of AI in digital manufacturing is relatively embryonic. AI opens many possibilities around data and a better understanding of the behavior of equipment and plants, bringing new possibilities such as equipment and plant simulation. The number of use cases is fast expanding. We are glad to see TCS is making the necessary investment in this strategic space.

]]>
<![CDATA[Testbirds Prepares for Hypergrowth]]>

 

We recently talked to Testbirds, the largest Europe-headquartered crowdtesting firm, founded in 2012. We found Testbirds upbeat after the pandemic. The company had an excellent year in 2020, achieving revenue growth of 30% as organizations, challenged by closed offices, turned to Testbirds to conduct crowdtesting of their digital initiatives. This was followed by another excellent year in 2021, with revenue growth reaching 40%, led by digital projects, and Testbirds is expecting similar growth for this year. In parallel to this continuing sales momentum, Testbirds has reached operational breakeven and is currently funding its expansion organically. The company continues to recruit and now has ~600k crowdtesters in its community.

Expansion in Europe, Now Targeting U.S.

Expansion remains a priority for the company, which is increasing its office locations in Europe with new facilities in Leipzig and London, complementing its existing presence in Germany and the Netherlands, and to a smaller extent in France and Italy. Testbirds is structured into regional hubs, Leipzig and London being sales and project management centers serving clients in the German and English languages respectively. London is also a hub for project management, delivery, and sales and marketing activities to the U.S.

In addition to its direct sales activity, Testbirds wants to grow its indirect channel, increasing the level of work with partners. The company recruited a channel head in 2020 and expects its indirect channel to contribute revenues in 2022.

More Consulting and Specialized Services

Testbirds highlights that its indirect channel strategy will somewhat change its value proposition as partners will deliver the crowdtesting project management and analysis work themselves. Consequently, Testbirds has already changed its portfolio. In addition to offering crowdtesting project management and execution, the company is also now highlighting capabilities such as consulting and methodologies for advising clients on their crowdtesting goals and approaches. With this consulting-led service, Testbirds looks to accompany clients across their digital product journey. It has aligned its service portfolio around consulting, from defining a digital product concept to prototyping, development, testing, and release.

Beyond its consulting approach, Testbirds has expanded its offering beyond quality assurance and usability testing to online surveys, market research, and customer feedback. While QA remains core to its value proposition, the company is expanding in usability research and testing.

Testbirds highlights the specialized offerings of its Testbirds Exclusives brand. It recently launched its payment testing service, addressing online, offline, and in-store PoS. The company has set up a dedicated offering that can be provided on a standalone basis, focusing on European regulations on authentication or, more broadly, covering the customer journey, from product order to payment and returns management.

Alongside payment, Testbirds is promoting its offering verticalization, usually in the field. Examples include connected home equipment testing or EV charging station testing. Usability testing plays a key role in such verticalization.

Incremental Automation

Testbirds continues to invest in the Nest, its platform used by crowdtesters, its project managers, and clients. A recent example of incremental functionality is its companion app, which allows crowdtesters to log defects and screenshots directly from their mobile devices. The companion app simplifies crowdtesters’ work by avoiding going through a PC to log defect screenshots.

The company continues to invest in AI, using ML for mining defect comments and classifying defects into categories. It continues its work around defect prediction and automatically transcribing video voice into transcripts. While we initially expected AI to bring automation and efficiencies to crowdtesting, Testbirds finds that deploying AI use cases has been slower than expected.

So what’s next for Testbirds?

The company believes it has reached the inflection point where demand will move to hypergrowth. It has hired sales executives and counts on its indirect channel to grab this rising demand. The company has reorganized its service portfolio, driving specialized services. In parallel, Testbirds believes it has structured its execution to make its service repeatable. The company also pushes defect analysis work to its community through the Bird+ program to drive efficiencies. Finally, Testbirds is now opening again to further private equity funding. The company believes it will enter a hypergrowth cycle and external funding will help scale up.

]]>
<![CDATA[Cigniti Acquires RoundSqr to Accelerate its Digital Ambitions]]>

 

We recently talked to Cigniti about its digital ambitions and its acquisition of RoundSqr.

While remaining focused on quality engineering, Cigniti has quietly expanded its capabilities to RPA over the past three years. This extension is logical: RPA shares much with testing, relying on creating and maintaining bots or test scripts. This is the start: Cigniti has broad ‘digital engineering’ ambitions, and RPA was the first step.

With its recent acquisition of RoundSqr, Cigniti has taken another step in its digital strategy. RoundSqr has ~100 employees and revenues of ~$2.8m in its FY22. The company has ~30 clients, most of which are in the U.S., U.K., Australia, and India.

RoundSqr started as a digital company and currently offers data, analytics, and AI services. The company is also active in web and mobile application development services, including architecture design and APIs.

RoundSqr strategically invested in AI, particularly in AI model validation and computer vision. The company brings in a methodology and expertise to model validation. RoundSqr has also developed an IP called Zastra that helps with computer vision-related annotation services.

AI Is Strategic to QE

RoundSqr highlights that testing of AI models is primarily restricted to evaluating their accuracy and relies on separating data into training and testing sets; it looks to take a more comprehensive approach across the model itself and its data.

The company evaluates AI models across six parameters, namely Stability (conducting testing several times on the same data); Sensitivity (mitigating the impact of noise and extreme scenarios on the output); Data leakage (using non-training data when building the model), Performance (the model will have the same outcome even if the data is changed), Bias, and Predictability.

Beyond the AI model, we think RoundSqr’s AI capabilities will be instrumental to Cigniti’s QE activities. Organizations have started using AI to conduct focused testing to identify areas where they expect bugs. But AI is also relevant for automating test script creation and maintenance. The offerings are getting ready, and client adoption is now starting. We think AI has the potential to revolutionize the QE industry if it removes human intervention around test scripts.

RoundSqr Brings Computer Vision Annotation IP

Zastra, the IP that RoundSqr has built over the past 18 months, is a computer vision product that targets image tagging and annotation, the action of identifying objects, people, or living organisms in a picture. Zastra can provide the necessary steps for identifying objects, including image classification, object detection, and semantic and instance segmentation. RoundSqr targets several sectors with Zastra, primarily manufacturing, medtech, and utilities. Its use cases include defect detection, track and trace, CT and MRI scans, and satellite images.

Zastra links nicely, we think, with QE in the UX testing area. The role of testing has primarily revolved around testing the functionalities of an application. However, testing image rendering, e.g., on a website, has been far more limited, mostly around pixel-to-pixel comparison. We think AI models open new use cases for websites and digital technologies such as AR/VR and quality control in manufacturing plants.

RoundSqr’s product roadmap for Zastra includes synthetic data generation and audio annotation. The company will also expand its hosting options beyond AWS to Google Cloud Platform and Oracle Cloud.

Revenues of $1bn by FY28

This is the beginning of the journey. The priority for Cigniti and RoundSqr is now cross-selling and accelerating further organic growth.

However, RoundSqr alone is not sufficient for Cigniti to reach its $1bn revenue target by FY28, up from $167m in FY22. To achieve this objective, the company will rely on both organic growth and M&A.

Future organic growth will come from further expansion of its service portfolio to digital offerings such as data, AI and ML, blockchain, cloud computing and IoT. The company also plans to grow within engineering and R&D services, both industry 4.0/digital manufacturing and product/platform engineering services. Cigniti targets connected devices, taking an AI-based approach.

Cigniti’s client base includes BFSI, healthcare, medtech, travel and hospitality, and retail. The RoundSqr acquisition further strengthens Cigniti in BFSI. It also brings further focus on ISVs, and the supply chain and manufacturing functions, which Cigniti sees as having great growth potential.

To support its portfolio expansion, Cigniti will need to continue to acquire. Acquisitions such as RoundSqr will bring further specialization and are precious. Cigniti will, however, need a transformational transaction. Watch this space.

]]>
<![CDATA[Qualitest Acquires ZenQ, Expands Portfolio to Digital & Product Engineering Testing]]>

 

We recently talked to Qualitest regarding its acquisition of ZenQ.

ZenQ is the latest in a series of recent acquisitions by Qualitest, under the ownership of PE BridgePoint. The company acquired four firms in 2021:

  • QA InfoTech (QAIT) in Bangalore, doubling Qualitest’s presence in India
  • Olenick in the U.S.
  • Telexiom in Germany
  • Comply, an Israeli company that added a specialized capability in healthcare regulatory compliance.

The latest addition, the Dallas-headquartered ZenQ, aligns with Qualitest’s objectives to build digital transformation capabilities. It strengthens Qualitest in DevOps/continuous testing consulting and brings specialized digital expertise such as AI and blockchain. Finally, it opens Qualitest to the world of product engineering QE, around high-growth areas such as connected devices/IoT, including AI-intensive equipment such as drones.

Continued Expansion in Digital

ZenQ brings capabilities in digital, including blockchain testing. The company has worked primarily for ISVs across various verticals and use cases. Blockchain QE adds a niche high-growth area of expertise to Qualitest’s expanding digital testing portfolio. The company has already expanded to RPA/bot testing and application migration to the cloud testing. Also, its December 2019 acquisition of Israeli start-up AlgoTrace helped kickstart its AI offerings, focusing initially on data science. Since then, Qualitest has expanded its AI analytics and automation portfolio in visual testing and test case optimization areas.

Qualitest Enters Connected Device Testing

Importantly, ZenQ adds expertise around connected devices across various products, including drones, petcare and medtech devices, smart home and logistics products, and solar panels. The company is active in product engineering, in specialized services such as communication protocol QE and interoperability. This brings Qualitest to a new world of bundled hardware and software, where software (e.g., embedded software, mobile apps) plays an increasing role and where Qualitest has its roots. With ZenQ, Qualitest expands to hardware testing, where lab-based automation emerged only a few years ago.

Importantly, connected product testing also brings AI, notably computer vision, e.g., for use cases such as inspecting the quality of goods produced in a manufacturing plant, monitoring the health of forests and crops, or animal geo-fencing. Qualitest has experience in this space and has developed its AI-based IP Test.Validator for image recognition.

Further Scale

In addition to its portfolio expansion toward digital QE, ZenQ reinforces Qualitest’s capabilities in three countries:

  • Its onshore presence in the U.S. and Canada (Toronto)
  • Its delivery capabilities in India, adding Hyderabad to Qualitest’s presence in Bangalore and Noida.

In total, ZenQ has ~700 employees.

The integration journey for ZenQ and Qualitest is in its early stages. Cross-selling is a priority. From a portfolio perspective, expect Qualitest to bring further quality engineering and AI capabilities to ZenQ’s projects. For Qualitest, assuring product engineering is a new field with tremendous growth potential, and we expect the company to invest in QE automation in this space.

Meanwhile, Qualitest still has bold growth ambitions. The company has aggressive plans to reach $1bn in revenue in the next two years. Further acquisitions to gain scale both onshore and offshore and expand the portfolio to digital are likely.

]]>
<![CDATA[Capgemini’s Sogeti Positions QE in the World of IT Sustainability]]>

 

There is a big divide between IT sustainability and quality engineering (QE). In IT, sustainability is emerging from a carbon emission niche, expanding from a consulting to an execution phase. In QE, the focus remains primarily on functional automation with continuous testing/DevOps and AI as primary drivers. In short, the two have little in common.

As such, we had not anticipated that QE could soon become part of sustainability initiatives. However, Sogeti, part of Capgemini, recently briefed NelsonHall on how it is adapting its QE offering to sustainability with QE for Sustainable IT.

Measuring carbon footprint at the transaction level

Sogeti has designed QE for Sustainable IT, targeting the environmental side of sustainability (which also includes economic and social aspects). The company promotes a stepped transformation of IT rather than through big bang approaches. It highlights that once a client has started measuring its carbon footprint, it implements its strategy primarily by reducing its application estate and migrating its applications to the cloud.

Sogeti wants to offer a different approach to transformation, looking at the transaction level. The company will initially conduct its Green quality assessment, relying on its VOICE model, to understand the client’s sustainability objectives. Sogeti will then identify the most used ones in the production environment. It will then estimate how each transaction impacts the usage of hardware and networks (e.g., CPU, storage). Once done, the company will calculate the carbon footprint of each ERP transaction in production environments in the past 12 months. Once the applications have been transformed, Sogeti will reculcate the carbon emissions and measure its progress.

Where does QE fit within IT sustainability?

With its test case/test script approach, Sogeti highlights that QE already has the required experience and tools. The company will conduct the transaction, using functional test execution tools to measure the usage of hardware and networks. It will then capture each transaction’s hardware and network usage using APM tools.

Sogeti has worked with its development peers on the transformation side. The development teams will work on the code related to the ERP transaction, streamline the code, and remove dead code.

Sogeti looks to extend beyond this transformation phase and become a “sustainability quality gate”, mirroring the traditional role of testing in deciding if an application in development can be deployed in production environments. To do so, the company is currently working with a partner to build accelerators, e.g., a sustainable static code analysis to measure the “sustainability technical debt” of an application. The tool relies on checking if developers used sustainable development best practices.

This is just the beginning of Capgemini’s QE journey into sustainability. It sees increasing traction, thanks to regulatory pressure and consumer expectations, to reduce the carbon footprint of enterprises.

Capgemini’s roadmap for QE for Sustainability goes beyond ERP applications. The company wants to expand to other COTS and custom applications. With Capgemini’s CEO driving the company’s sustainability effort both internally and to external clients, expect to see more of these offerings in the next few months.

]]>
<![CDATA[NTT Combines NTT DATA & NTT Ltd., Streamlining ICT Operations]]>

 

NTT DATA recently announced the long-planned merger of its international business with NTT Ltd., the overseas ICT unit of Japanese telecoms giant NTT. The combined NTT DATA and NTT Ltd. will have revenues of ¥3.5tn (~$26.2bn) and 180k personnel.

With this move, NTT unites its two ICT units into a single entity, driving its vision of One NTT. The merger will remove some overlapping capabilities and should help drive revenue synergies by FY25. The larger NTT DATA organization will spearhead NTT's technology presence in enterprises.

The merger comes when NTT DATA has completed one cycle of its strategy for overseas growth. After several significant international acquisitions (Dell Services, everis, itelligence, and Value Team), the focus turned to cost savings, portfolio management, and through its Global One initiative, driving coordination across the firm. In FY21 (the year ending March 31, 2021), NTT DATA's overseas business reached an EBITA margin of 6.5%, a notch below its 7.0% target.

The addition of NTT Ltd. Doubles NTT DATA's Overseas Presence

With around $10bn in revenues and 38k employees, NTT Ltd. more than doubles NTT DATA's international business to $18bn.

One of the primary benefits of the merger centers on cost synergies. NTT DATA expects to achieve ¥30bn (~$150m) in savings by FY25 (the year ending March 31, 2026), improving margins in its overseas business. NTT DATA now expects to reach an EBITA margin of 10% in its overseas business, a 50% improvement. The absorption of NTT Ltd. into NTT DATA (and its transformation) will take time, with the synergy target suggesting a fast two-year effort from the day the merger is operational in July 2023.

NTT Ltd. Brings A Diverse Network and IT Infrastructure Portfolio

From a service portfolio perspective, NTT Ltd. brings a wide range of network and connectivity services, hardware and related services, data center hosting, IT infrastructure services, and resales. The company is particularly well known outside Japan for its Dimension Data business, a significant acquisition in 2o19, which brought in around $4.0bn in revenues and 11.5k personnel at that time.

Dimension Data, itself a regular acquirer, had strengthened its network capabilities through multiple transactions, including Nextira One. It had also expanded into application services in APAC, mainly in ANZ and Singapore.

NTT Ltd. brings a diversity of attractive and perhaps some less attractive IT and telecom/network services capabilities. The company started unifying its capabilities back in 2019 when it regrouped all its units under the NTT Ltd. brand. This transformation continues, with NTT Ltd. still going through a portfolio and cost transformation to drive margins up.

NTT Prioritizes the U.S. for Further Expansion

Last October, NTT DATA revised its mid-term plan, targeting ¥4.0tn in revenues in FY25. The services portfolio will focus on five primary offerings: Cloud, Data and Analytics, Security, ADM, and Enterprise Applications Services. Similarly, the target markets are five industries: healthcare & life science, automotive, insurance, telecom, and banking.

The structure of the new arrangement is interesting: parent company NTT will own 45% of NTT DATA's overseas business, with NTT DATA controlling the remaining 55%. It is unclear why NTT did not increase its share in NTT DATA directly rather than holding a stake in the overseas business. NelsonHall expects a change in the capital structure in the mid-term.

Acquisitions will be on the agenda. NTT DATA feels that, despite its ¥476bn FY21 (~$3.5bn) revenues in the U.S., it still has room for further expansion, with an ambition to expand its capabilities in digital.

Indian offshoring is also likely to be on the agenda. NTT DATA, despite the Indian presence brought by KEANE and Dell Services, still needs to grow in the country, especially considering its U.S. ambitions. Expect to see acquisitions to expand delivery capabilities in India.

]]>
<![CDATA[Atos Announces Major Restructure: Analysis]]>

 

On June 14, 2022, Atos announced the unexpected separation of its IT infrastructure services unit, Tech Foundation, from its BDS and Digital units. With this move, Atos has aggregated its high-growth and high-profitability units into a new company, Evidian. Its infrastructure services capabilities will stay in legacy Atos, with the objective of stopping the revenue decline and improving its profitability. This announcement raises several questions, which Dominique Raviart, NelsonHall’s IT Services practice manager, addresses in this video blog.

]]>
<![CDATA[EPAM Exits Russia, Accelerates Delivery Diversification]]>

 

The war in Ukraine has brought EPAM Systems (EPAM) under the spotlight. Although headquartered in Newton, PA, EPAM has a delivery network heavy in Central and Eastern Europe. It was founded in 1993 in New Jersey and specializes in service development, digital platform engineering, digital product design, and custom software. However, with its first offshore development center opening in Minsk, Belarus, in 1995, EPAM then expanded into Russia and Ukraine for delivery. In February 2022, on the eve of the invasion of Ukraine, 58% of the company's headcount remained in these three countries. Ukraine was the largest country with ~14k personnel at the end of 2021.

Reflecting on this situation, EPAM is adjusting its support model, moving people out of the region, and establishing brand new sites. The company acted quickly by deciding to exit Russia last month, but the relocation from the region continues EPAM’s diversification strategy already in play. There have been growing geopolitical and social uncertainties across the region. In 2014, with the Russian annexation of Crimea and the following Donbas invasion, EPAM started diversifying its delivery network to other countries in Central and Eastern Europe, Latin America, and India.

Yet despite the most recent developments in the region, the company announced excellent financial Q1 2022 results. During the quarter, EPAM generated revenues of $1.17bn, a year-over-year increase of 50.1% (40.1% organically). Profitability remained higher: its adjusted operating margin was 16.1%, down 1.4 pts.

Moving employees at scale within and outside Ukraine

The company is helping to relocate most of its employees in Ukraine during Q2 2022 to billable positions. The impact of the relocation effort goes beyond travel and setup costs, with EPAM highlighting that relocated employees are now in more expensive countries and have increased their wages. EPAM is currently renegotiating with clients to adjust rate cards.

Amid the war-torn conditions, the company helped move thousands of people from east to west inside the country and abroad (e.g., Poland, Hungary, Turkey, and Serbia, and across countries). Approximately 2k Ukrainian employees relocated abroad.

Accelerating delivery diversification

With phase 1 of its relocation strategy well in progress, EPAM launched phase 2 in parallel and has accelerated its delivery diversification effort to India, Mexico, and Colombia and across locations in CEE and Asia. By 2022, EPAM expects Ukraine and Belarus to account for just 30% of its capacity.

One of EPAM's challenges is to grow its presence in India. Its Alliance Global Partners acquisition in 2015 brought an estimated 1k personnel to India. The company recognizes it took time to learn about the market and has started developing the Indian talent market, doubling its presence in the country in 2021 to 4.3k personnel. This is the beginning of EPAM's expansion into India. Mexico, with its ~1k employees, will complement India.

Next steps: revenue growth and profitability

EPAM is now turning its attention to revenue growth and profitability, both of which the company has indicated will be under pressure in Q2, with y/y organic growth around 28% and its operating margin in the 3-5% range.

The utilization rate will be down in Q2. EPAM highlights that in Ukraine, despite the challenges, it did not record a drop in utilization as would have been expected. Q1 2022 utilization was 78.4% compared to 76.8% in Q1 2021. However, the company suffered from a "considerably lower utilization" level for employees remaining in Russia. The situation is also complicated in Belarus: EPAM plans to stay in the country, but because "a defined number of clients" are looking for alternate delivery, utilization rates are under pressure.

In the coming months, as EPAM gradually moves from delivery resiliency to revenue growth, it will start business expansion, initially within its current client base. EPAM expects a fast rebound, and is targeting H2 2022. However, it has not provided guidance for the full year. Nevertheless, we think the company is doing very well in mitigating the circumstances of the Russian-Ukraine conflict.

Outlook

EPAM is more than a financially-sound firm. The company is demonstrating its commitment to Ukraine, pledging $100m in aid to help its employees and relatives with a wide range of requirements. Also, EPAM established the Ukraine Assistance Fund to support humanitarian aid organizations that provide direct assistance to persons in need across Ukraine. This fund exists in addition to and apart from the $100m humanitarian commitment.

]]>
<![CDATA[Infosys Cobalt Cloud Focuses on Transformation, Security & Sustainability]]>

 

NelsonHall recently attended the ‘Infosys Cobalt World Tour’ conference in New York and, as the world begins to open up again, it was great to engage with Infosys executives face-to-face.

Infosys presented its Cobalt Cloud strategy and use cases to increase awareness of the benefits to the marketplace and highlighted that its cloud approach is focused on achieving business objectives rather than simply moving current solutions to the cloud.

To set the context, Infosys launched Infosys Cobalt in 2020, which includes services, solutions, and platforms to enable cloud-powered enterprise transformation. The Infosys Cobalt Cloud Community currently provides a catalog of 225 industry cloud-first blueprints and 35k cloud assets curated from Infosys' experience in delivering cloud programs for G2K enterprises. The Cloud Community includes Infosys experts plus partners, clients, academic institutions, start-ups, gig workers, and cloud developers.

Expediting the move to the cloud

Infosys takes a three-layered approach to the cloud through Infosys Cobalt to develop new business capabilities to meet emerging business needs and faster time to market. It also aims to reduce multi-cloud complexity through a secure cloud platform, bringing elasticity to the resource layer. The three approaches are:

  • Consumption Layer (Business Services): Infosys sees a new paradigm in the consumption layer, utilizing data analytics to derive business outcomes within the organization, including industry-specific solutions
  • Platform Layer (Technology Platforms): Infosys aims to move clients up the value chain, including helping them transform data lakes to the cloud, refactoring apps to be cloud-native, and using PaaS
  • Resource Layer (Cloud Resources): most clients start here with IaaS and cloud for consumption, network, and storage, and establishing virtual private cloud and connection between private, public cloud, and on-premise. This approach includes accelerating migration, taking native services from hyperscalers, and building on top of the cloud platform.

Infosys enables clients to create services consumable within their enterprise utilizing multi/hybrid cloud services, with platform technology enabling leaner operations with a heavy focus on engineering. Automation and IaC enable a developer-centric model that extends from DevOps to DevSecOps to NoOps in an agile manner. Key assets utilized in cloud platform engineering include Polycloud Platform, Cloud Automation Café, and Security Reference Architecture.

Enhancing security

Infosys enables enterprises to build cyber-resilient and compliant cloud ecosystems by adopting their ‘Secure by design, ‘Secure by scale’ and ‘Secure the future’ approach. Infosys assures ‘Digital trust’ through a structured execution process of diagnose, design, deliver and defend. From a Cobalt perspective, the blueprints and assets provided to their clients have regulatory and security compliance built into its solution and technical and financial governance. The security strategy utilizes strategic partnerships and pre-negotiated contracts in a platform security stack.

Sustainability is top of the agenda

Infosys looks to enable and accelerate sustainability solutions and drive impact through a business-to-business model and unlock long-term sustainability thinking across global enterprises. It aims to deliver the following benefits to its clients:

  • Making an impact on the triple-bottom-line of people, profit, and prosperity
  • Attracting a new wave of sustainability-minded clients, supply chain partners, and employees
  • Enhancing ESG attractiveness to investors and brand reputation
  • Securing resiliency in uncertain conditions.

Cloud is providing a vehicle for achieving carbon neutrality for its operations. Infosys offers Smart Buildings and Spaces services that enable the physical workplace to become digital by installing and managing Internet of Things (IoT) devices and sensors. Water management, carbon monitoring and control, solid waste management, energy assessment, and greenfield building consultancy are crucial sustainability competencies.

Outlook

Enterprises are accelerating their migration to private, public, and hybrid multi-cloud environments to satisfy greater demands for flexibility, scalability, resiliency, and security. This includes migrating on-premise infrastructure to hybrid cloud, including legacy application modernization to cloud-native systems. We expect Infosys to continue to build assets and cloud-first blueprints. In addition, in support of the Polycloud platform, we expect continued investments in the smart catalog and cloud-native services. The Cobalt suite of tools and assets enables enterprises to begin their cloud journey quickly and effectively with security in mind and an eye toward carbon-neutral outcomes.

]]>
<![CDATA[Amdocs Looks to Reinvent Agile QA]]>

 

The world of quality assurance (QA) is continually evolving, alternating between cycles of centralization and decentralization. QA became part of testing CoEs in the 2000s, driving process standardization, test coverage and automation. More recently, in its latest organizational model, it has become part of the agile development structure and is spread across agile projects. Quality engineers work alongside developers in agile teams of three to seven specialists, focusing on test automation and targeting the holy grail of QA: in-sprint automation.

We recently spoke with Amdocs’ Quality Engineering (AQE) organization about how the unit is embracing this trend. While Amdocs is well known for its software products for communication service providers (CSPs), the company now primarily operates under an IT service model, with AQE enjoying rapid growth. For example, AQE recently won a significant standalone testing contract from a tier-one CSP. The scope is large and involves ~200 applications, including new builds and applications in maintenance. The company will scale up to several hundred quality engineers at peak time. AQE is approaching the project by implementing a new organizational model based on agile and continuous testing principles.

Amdocs adapts function points for agile QA

For this project, AQE reinvented the function point estimation model for QA that is common in software development. The unit uses certification points to estimate the time and effort required to complete a QA activity. Beyond functional testing, the pricing model also covers non-functional and other areas such as test environment provisioning.

The function point-like approach is not new (a few vendors already took that route back in the mid-2010s) and has both advantages and disadvantages. On the positive, it has helped CSPs and vendors move past a T&M model to mitigate risks in fixed-price projects. Yet function points had drawbacks, e.g., counting function points took time and were manual, with experts sometimes diverging on their function point estimates. AQE aims to resolve this challenge by automating most counting of new functional features using agile program increment (PI). Also, AQE provides its estimate two months before the PI gets to development, giving clients visibility of upcoming costs to refine the scope of PIs.

Redefining agile QA teams

In the organization space, AQE is promoting a different approach, incorporating both centralized and decentralized aspects. The idea is that rather than embedding QA into an agile development team, AQE relies on a separate team of functional and technical experts, independent from the agile development unit.

For example, for the abovementioned project, AQE created standalone atomic QA teams to provide a broad spectrum of quality engineering activities, from functional to non-functional and quality engineering. In addition, AQE employed its automation IP and accelerator portfolio to increase the level of test automation.

By covering processes and analysis, AQE’s organizational approach goes beyond just setting up standalone QA expert teams. The organization highlighted that, as part of this project, it discovered that the client had focused most of its QA activities on integration testing.

AQE took a broader pespective on the project, helping the client shift from integration testing to E2E testing. In addition, AQE introduced unit testing among developers, thereby detecting defects earlier in the lifecycle.

AQE’s targets for the client include improving velocity by 80%, achieving cost savings of up to 50%, moving from quarterly to monthly releases, increasing resiliency, and improving customer satisfaction rankings. They demonstrate that QA is having an increasing and quantifiable impact on business outcomes.

]]>
<![CDATA[2022: A Bumpy Year for IT Services Spending]]>

 

NelsonHall is currently updating its bi-annual IT services forecasts, and here’s a quick look at some of the headlines for 2022.

COVID-19 no longer a threat to IT services spending growth

The world has evidently changed since our November 2021 update. The pandemic now seems largely under control, despite spikes in China. India, the world’s IT services hub, is gradually opening its offices. IT services providers now have substantial experience of operating primarily in a work from home environment, as indeed have their clients. Should there be another widespread spike in cases coming from a new COVID-19 variant, delivery disruption will be minimal.

A deteriorating economic environment

IT services spending is cyclical and dependent on GDP growth expectations. The macro-economic environment has deteriorated with the February 24 Russian invasion of Ukraine, amplifying existing trends at play. In its January 2022 World Economic Outlook Update, the IMF warned, beyond COVID, of “rising energy prices and supply disruptions” leading to higher and more spread inflation globally. In January, the IMF revised down its global GDP growth estimates.

There is no doubt that the Russian invasion of Ukraine, with its impact on energy, will accelerate trends at play before the invasion. In late April, we expect the IMF to further reduce its GDP growth expectations, for both North America and Europe.

Digital drives higher spending

However, the connection between IT services spending and GDP growth has changed. In a previous blog, IT Services in 2022: Entering A New Growth Phase, we highlighted that until 2019, IT services spending typically grew around 100 bps slower than GDP growth. From 2021 onward, we expect IT services spending to grow faster than GDP growth, driven by digital, security, and cloud. IT services spending growth in 2022 will be similar to 2021, between 6% and 7%.

Prices will remain under control despite the talent shortage

The hiring effort supports our growth hypothesis. Despite the many discussions about the lack of talent in the industry, data from the top eight India-centric IT services vendors(1) show they recruited in calendar 2021 at scale. Net recruitments reached ~330k employees. This headcount increase is exceptional.

We are entering the calendar Q1 2022 financial announcement season and will be tracking net headcount growth. An early indication came from Accenture for its Q2 FY23 results (ending February 28, 2022): the company increased its headcount by 24k (vs. 28k in Q2 FY21). Hiring remains sustained.

Despite the talent shortage, vendors have the ability to scale up and wage inflation is likely to remain under control. Meanwhile, vendors continue to accelerate their use of automation and cognitive technologies to drive further efficiencies and reduce, to some extent, their dependency on recruitment.

The Russian war in Ukraine brings great uncertainty

And then of course are the uncertainty factors, and this year they center on Russia, notably the duration of Russian’s invasion of Ukraine. If the war lasts longer or expands, the IMF predictions will again lower GDP growth.

We expect two scenarios form a long-lasting conflict.

  • Most enterprises set up their annual budgets in the September/October period: if the war continues (or concerns remain about new Russian conflicts in Central Europe), many large enterprises will seek to reduce their IT budgets. A significant freeze in Q4 2022 spending (and for full-year 2023) is still possible. 2022 may be a year of high spending increase in Q1-Q3 with a sharp decline in Q4. In parallel, enterprises will increase their cybersecurity funding and will delay their digital projects, should the war continue or expand
  • IT services spending will shift. While it will decline for enterprises, it will increase among central governments, around defense IT, data analytics and AI, and cybersecurity.

This year will be bumpy.

 

(1): Accenture, TCS, Cognizant, Capgemini, Infosys, Wipro, HCL Tech, and Tech Mahindra.

]]>
<![CDATA[TMMi Foundation Accelerates Development & Refreshes Model]]>

 

We recently talked to Erik van Veenendaal, the head of the TMMi Foundation, which promotes the TMMi QA process improvement methodologies.

TMMi Has Become A Widely-Accepted Methodology for Test Process Improvement

Founded in 2005, the TMMi Foundation is a not-for-profit organization focused on improving corporate test processes. It launched its TMMi methodologies at a time when organizations were beginning to formally structure their QA units and introduce best practices to increase productivity and quality in testing.

The Foundation decided not to address each tester's training and certification needs; it has an alliance for this with ISTQB, which remains the worldwide reference for QA training.

TMMi quickly became one of the two best-known testing process improvement methodologies. Its sphere of influence has gone beyond the number of certified organizations (250 globally). Many organizations have downloaded TMMi methodologies or purchased the books without formal certification. TMMi thus has gained an influence over QA that exceeds its client base.

Despite its share of mind success, the TMMi Foundation has faced challenges. One is the fast adoption of Agile methods; another is internal to the TMMi Foundation managing growth.

TMMi Foundations Updates its Methodologies & Books

The TMMi Foundation updated its methodologies and books to Agile. With the adoption of Agile, many organizations moved away from a process approach to transforming their QA. The TMMi Foundation continues to educate clients about the benefits of bringing a structured QA approach to agile development. Also, it launched an Agile version of its process methodology in 2015.

The Foundation is now developing a unified Agile and waterfall method. And the new methodology planned for 2024 will go beyond merging Agile and waterfall, with TMMi looking at including best practices and roadmaps around automation and AI.

Measuring TMMi's Benefits

Beyond refreshing its books and methodologies, the TMMi Foundation started to measure the impact of deploying its methodologies among certified organizations. The Foundation worked with the Universities of Belfast and Innsbruck, sending its questionnaire to organizations in its client database. The response rate of 64% provided a good level of accuracy.

The survey's findings show the effectiveness of TMMi. Approximately 90% of respondents expressed their satisfaction. Nearly three quarters (73%) of respondents reported that TMMi drove software quality improvement. However, TMMi does not lead to QA project reduction.

The survey also sheds some light on the TMMi corporate user population. Financial services, the largest spender on QA globally, is also the primary user (37%). Second is QA services/IT services vendors (30%). The remaining 33% span industries. Beyond improving the test process, QA organizations also use TMMi to demonstrate their capabilities, internally or to third parties, for regulatory compliance. Client organizations use, therefore, certification to showcase their QA transformation too.

Defining Clear Roles While Pushing its Service Ecosystem

In its expansion effort, the TMMi Foundation has also redefined roles and relationships with the TMMi ecosystem of partners. The Foundation plays a central role in methodologies and syllabi (for training). It is also the certification entity for client organizations undergoing TMMi assessment (through a sample approach). The TMMi Foundation also provides accreditations ifor training and assessment service providers and certifies individuals, e.g., as TMMi (Lead) Assessor.

The Foundation believes that partners will play a crucial role in such expansion, starting with local partners, i.e., the 'chapters'. These chapters drive TMMi's localization and marketing. They address testers at the individual level, and ensure training such as TMMi Professional training and certification (for testers who want to learn about TMMi methodologies) is available locally. The chapters also make sure testers at the corporate level who conduct consulting (Test Process Improver) or assess QA organizations (Accredited Assessor, Accredited Lead Assessor) are trained. They also advise QA consultancies that want to become training or assessment partners.

Currently, the Foundation has 23 chapters operating in 51 countries. Its local partners have a widespread presence across the continents. The TMMi Foundation realizes it still needs to strengthen its geographical footprint. It will announce a partner in Germany soon.

To sustain its local partner expansion, the Foundation shares half of its certification and accreditation fees back to the local chapters, intending to grow their marketing initiatives. Beyond recruiting new chapters, the TMMi Foundation wants to increase its activity level in each geography where it is present. The growth potential is significant.

The Need for Structured QA Remains

We find that TMMi's renewed expansion and international effort come at an opportune time. Agile is driving functional testing and beyond as part of continuous testing. Organizations are only transitioning and require help and consulting services for this journey.

AI is the next paradigm shift. AI-based analytics provide the Foundation with better-informed QA decisions and more focused testing. AI-based automation will drive the self-generation of test scripts. With technology evolving so far, QA organizations will need to resume a disciplined approach to QA while coping with Agile's decentralized QA needs.

]]>
<![CDATA[TCS Challenges Foundation of Functional Testing with 'One Automation Ecosystem']]>

 

The software testing/QA industry relies on three steps for functional testing:

  • Defining test requirements (what do I need to test?)
  • Test case creation (creating detailed instructions)
  • Test scripts (using these test cases to develop scripts operated by a functional test execution tool, e.g., Micro Focus QTP, Tricentis Tosca, and Selenium Grids).

Broadly, these three steps have remained the foundation of how functional testing operates. This is true even for agile projects, for which organizations are accelerating their automation efforts. However, these three steps have their limitations, mainly in terms of time and effort to create and maintain test scripts.

BDD, MBT, and record-and-playback automate test case & script creation

Of the various approaches to challenge this three-tiered foundation, Behavior-Driven Development (BDD) has been the most widely adopted. BDD relies on creating a standardized English test case language, Gherkin. Because it is standardized, BDD helps to automatically create scripts immensely. Yet, the adoption of BDD to date has not been as spectacular as expected.

Model-Based Testing (MBT) had a promising value proposition. It aimed to create business process flows/diagrams representing a business transaction on an application. Once defined, the business process flows are standard and can automatically be transcribed into test cases or scripts. However, MBT’s adoption was limited, possibly because MBT relies on adding another level of test artifacts, which in turn need to be maintained. However, MBT has had some success for applications such as COTS, with standard business processes. Industries such as communication services providers and financial services have also found MBT helpful.

And then, there is AI. AI has helped modernize record-and-playback tools. These tools mark down all the steps performed by a user when completing a transaction on an application. They then repeat the transaction and execute it in a functional execution tool. However, records are rigid and will fail when developers make minor changes such as a field name or location adjustment. AI helps deal with such minor changes and has improved the effectiveness of record-and-playback tools. The adoption of such tools is not widespread, but their value proposition is enticing.

With OAE, TCS brings it all together

We recently talked with TCS about its new automation initiative, TCS CX Assurance – One Automation Ecosystem (OAE). With OAE, TCS has aggregated its next-gen record-and-playback (TCS’ ScriptBot), MBT, and testing framework capabilities into one central tool. OAE brings together several approaches for automating the creation of test artifacts.

The beauty of OAE is that three tools are integrated and interoperable: a change in one immediately impacts the other two. OAE engineers can change views between the three tools and verify/edit conditions or edit the business process flow/test case/test script. For instance, a test engineer may modify recently recorded test cases and add new conditions in the test framework view. The tool interoperability also means that different personas can use OAE: test engineers, of course, and business analysis for creating business processes and power-users for recording transactions. This is a step toward test script creation democratization, one of the QA industry’s priorities to decrease costs and spread tool usage.

There is another benefit: with the three tools, OAE focuses on test artifact creation before the test script level, at the business process, or test case level. TCS can then use these artifacts to create the test scripts in its technology of choice, e.g., Selenium for web applications, Appium for mobile apps, a TCS IP for APIs, and Micro Focus for ERPs, mainframe and client-server applications. The approach minimizes the level of test script maintenance and pushes it back earlier in the automation process. TCS highlights that the conversion of test cases into scripts is instantaneous: it has not witnessed any performance issues in the conversion.

OAE also helps test transactions involving several applications running on different technologies. Typically, a transaction may start on a mobile or a web application/website and include testing APIs (for shipment) and even mainframe (payments). In short, OAE makes end-to-end testing much more accessible.

OAE requires the same discipline as for any testing framework. For instance, users still need to componentize test artifacts. An example is application login, which OAE users must set up as a test component shared across all tests. Also, to help with the discipline, OAE uses NLP: users creating a test artifact will be notified by the system when their artifact in creation already exists.

OAE integrates with other TCS IPs and benefits from some of these. One example is UX testing, where TCS can include accessibility and compatibility testing scripts in its functional ones. Another UX testing example is usability testing, which is the pixel-by-pixel and AI-based comparison of web pages to identify browser rendering differences.

Looking ahead, TCS has several development priorities for OAE, including accessibility on mobile app, integrating with functional automation tools such as Test Café/Cypress.io as an alternative to Selenium. TCS will also use its IP, SmartQE AI Studio, to collect application data during the SDLC and assess its quality. AI remains a priority.

OAE is a new IP, and TCS recently started promoting it among clients. NelsonHall welcomes TCS CX Assurance – TCS’ One Automation Ecosystem initiative for automating the creation of test cases and scripts. This is the future of functional testing, and it is AI-based. TCS is at the vanguard here.

]]>
<![CDATA[Expleo Accelerates Application Security Testing with AI & On-Demand Digital Offering]]>

 

DevSecOps Emerging

Application security testing has been part of functional testing for many years without being a significant investment topic. Organizations have typically favored functional testing automation while moving to agile/continuous testing; they have considered application security testing as an afterthought.

With the increased emphasis on cybersecurity, application security has become part of DevOps to create DevSecOps. DevSecOps promotes the democratization of application security testing. It also brings a shift-left focus, conducting application security at the development level rather than after functional testing.

Application security as part of DevOps and continuous testing requires automation. And this is where the challenge lies. Application security testing currently requires as much human expertise as software tool usage. Most testing services providers and their clients limit themselves to running scan tools such as source composition analysis (SCA) software and vulnerability detection software such as static and dynamic application security testing (SAST and DAST) tools.

However, running vulnerability detection software is not enough: these tools require going through the output and separating defects from false positives. Processing the tool output is time-consuming, tedious, and requires high application security expertise. Expect this analysis to slow down the continuous testing process.

Expleo Uses AI to Accelerate Vulnerability Analysis…

We recently talked to Expleo to understand how it is conducting and promoting application security testing within the context of continuous testing. The company is pushing application security test automation, and it has its own Xesa and Intelligent Vulnerability Assessment and Penetration Testing (iVAPT) IPs supporting this effort.

With Xesa, Expleo has pre-integrated several tools for integrating SAST and DAST (Portswigger BurpSuite) as part of continuous testing. Xesa also includes open-source ZAP Proxy for tool orchestration, and Defect Dojo (security defect management).

However, Expleo’s value-add relies on its automated defect analysis. iVAPT uses AI models to categorize defects by nature and severity, helping security experts shorten their analysis time. It uses ANN to process vulnerabilities based on past defect history. Manual testers will then verify the false positives allocation. This is the first test in the application security automation journey.

…And its On-Demand Digital Model for Shortening Provisioning and Delivery

Expleo has deployed its on-demand digital model and offering for application security to complement its automated vulnerability capabilities, still aiming to shorten time-to-market. The company relies on a shared delivery model and its X-Platform.

The company promotes a shared delivery center model for quickly ramping up its application security experts. Experts provide security across the application lifecycle, from the requirement level (e.g., security requirement reviews), to the design phase (threat modeling and design review), development and testing (SCA), and production (DAST and pen-testing).

The company highlights that it can mobilize experts through its shared service centers within 48 hours. Expleo has ~200 application security testers globally across multiple locations: in India, France, Ireland, the U.K., Germany, and soon Egypt and Romania. Expleo relies on its preferred tools, mostly open-source software, to provide the service and shorten delivery time.

Expleo recently launched its X-Platform. On the X-Platform, clients define their requirements, order their services, and follow the project’s progress and KPIs. X -Platform goes beyond service selection and includes project technology support, monitoring and analytics/reporting.

AI Will Play a Significant Role in DevSecOps

This is not the first time we have seen QA offerings that combine shared delivery, reliance on a service catalog to promote standard services, and a portal. Despite their value proposition, such offerings have had niche success.

In our view, such offerings have the potential for short-term activities such as threat modeling, pen-testing, and design review that regularly require services for up to three weeks. In these instances, the business case for clients to have a dedicated team can be difficult.

We see Expleo addressing the need for speed in continuous testing/DevSecOps from several angles. This is excellent news. AI, in particular, has the potential to bring many use cases. We think false positive identification is the first step in an AI journey to create intelligence out of vulnerability scanning.

]]>
<![CDATA[Atos Issues Profit Warning: Analysis]]>

 

Dominique Raviart gives his reaction to today's announcement by Atos, who issued its second profit warning in seven months.

]]>
<![CDATA[IT Services in 2022: Entering A New Growth Phase]]>

 

Here’s how NelsonHall expects the IT services market to shape up in the year ahead.

IT spending will be sustained in 2022

Organizations quickly resumed their IT service spending in 2021: after a decline of over 3.4% in 2020, IT spending rebounded in 2021, reaching 6.9% year-on-year growth (of course, against a soft compare). We are predicting growth of 5.5% in 2022.

IT service spending is cyclical. However, its link to GDP growth has changed. During the period 2015-2019, IT services spending grew by 100 bps slower than real GDP growth. We currently expect IT services will grow 100 bps faster than real GDP growth in the next three years. NelsonHall believes that the industry has entered an acceleration phase for two principal reasons:

  • The industry is exiting a 20-year period where offshoring, IT outsourcing, and then migration of cloud infrastructures impacted spending, despite volume increases. The effect of these is now waning
  • Digital continues to expand. Organizations will continue to invest in front-office and ERP SaaS applications, data & AI, UX/design thinking, RPA and automation, and IoT. This vast technology ecosystem also drives demand for cloud-based hosting and cybersecurity services.

Sustainability will become a significant part of digital.  While current spending is tiny, it is expanding quickly beyond an initial internal focus – to attract personnel and investors – to an external service. Carbon emission assessments are a starting point. We are already seeing some activity in areas such as plastics reduction and product sustainability in the Nordics, going beyond cost savings to environmental neutrality.

M&As: a lot of tuck-ins and…

The IT services industry quickly resumed its M&A activity in 2o21. Accenture continues to be the most active acquirer, with $4.2bn investment in its FY21 spread across 46 transactions, and it has similar ambitions for FY22. Accenture favors tuck-in acquisitions, though some also bring specialist scale.

Deloitte, Cognizant, Infosys, IBM Consulting (the former GBS), and Atos also followed an active tuck-in strategy, albeit at much lower volumes. NTT DATA had a tranquil year, reflecting, we think, its past acquisition integration and coordination work. Capgemini completed a few M&As, primarily targeting the Australian market, but remained focused on the integration of Altran in a volatile ER&D market.

Fujitsu was, as always, quiet on the M&A front, while TCS demonstrated, again, it does not need M&As to produce one of the best organic growths in the industry.

… Significant infrastructure divestments

We will also remember 2021 as a year vendors looked to divest sizeable IT infrastructure services businesses. IBM spun off Kyndryl, creating a $19bn giant. Kyndryl has guided the markets that it expects to resume growth by 2025: the new company will require four years to complete its transformation despite its financial means.

While IBM span off its entire IT infrastructure business, Atos has taken a more selective route, divesting some of its data center estate, related services, and its UCC/Unify product business. In total, Atos will divest €1.8bn in revenues from its IT infrastructure portfolio.

Globally, vendors are reconsidering their IT infrastructure service portfolio. TietoEVRY, the Nordic giant, wants to ‘partner’ in this space; the company has not yet indicated if the partnership will be a straight divestment or take a different form.

Meanwhile, DXC has reaffirmed that its IT infrastructure services business is strategic and has kept digital workplace services internal after the successful disposal of non-core assets and a significant net debt reduction.

Looking ahead, while cloud infrastructure continues to replace on-premise hosting, significant transformational opportunities remain around UCC and ITSM, automation, data & AI, and user experience.

IT services and ER&D overlap around digital technologies & platform engineering

While major IT services vendors reconsidered their IT infrastructure presence, they rushed into ER&D services. Both IT and ER&D services increasingly overlap in agile development, software product development, data & analytics, IoT and digital manufacturing, security, UX/design thinking, and PLM.

After the 2020 acquisition by Capgemini of Altran, we saw a few significant transactions. Accenture acquired a German onshore vendor, Umlaut, bringing 4.2k personnel. Cognizant also purchased a German specialist, ESG Mobility, bringing 1k automotive engineering specialists.

There is more to this overlap. Internet vendors, such as the FANGs and B2C online operations require web applications of high quality that can process millions of transactions. Vendors with a software product development strength such as EPAM and Cognizant Softvision have successfully established themselves in this digital platform engineering space.

We think the future lies in digital platform engineering, which fits the offshore model well, and draws on the strengths of both the IT services (agile development) and ER&D (software product quality) industries. This is an attractive market, and several vendors are ready to invest. Hitachi, for instance, did not hesitate to spend an impressive $9.6bn for GlobalLogic, a Californian vendor with FY21 revenues of $921m. We expect more IT service vendors to expand into digital platform engineering services in 2022.

]]>
<![CDATA[Atos: Using AI for Salesforce Progression Testing]]>

 

Since its creation, the software testing services industry has focused on regression testing, i.e., ensuring that previously developed software still runs after a new build. Regression testing is the core of all QA activities and has been widely adopted as part of multi-year managed services contracts. The financial services industry, with its extensive application estates updated by one or two releases per year, has been the largest spender.

For many years, the testing industry has left out progression testing, i.e., testing new applications and new features (rather than enhanced ones, as found in multi-year contracts). There is a reason for this, as new-build projects are short-term in nature: they do not easily accommodate the longer-term view and costs of building automation over time.

With the widespread adoption of agile, the situation has changed to some extent: agile projects, focusing on iteration and speed, have required functional automation to support accelerated development.

COTS functional automation suffers from lack of time

Then there is systems integration/COTS testing. COTS testing has largely remained outside of test automation. Organizations are challenged by a lack of time and budget to drive functional automation for their ERP/COTS projects.

Technology plays a more significant role: several specialized ISVs have emerged, including Worksoft, Panaya, and Provar (Salesforce). Their tools have focused on handling the technology specificities of these COTS.

In addition, testing services providers have complemented specialist tools with repositories of test cases aligned by a transaction. The more advanced services providers have also used model-based testing (MBT) for modeling transactions. However, the test case repository approach has its flaws, as clients will need to customize test cases. Acceptance of MBT has been somewhat limited, as it requires creating another layer of testing artifacts.

AI helps to redefine COTS functional testing

The real gamechanger for COTS testing has come from AI. For instance, Atos released its Syntbots xBRID to address Salesforce, SAP, and Pega projects.

xBRID is a next-gen record-and-playback tool. While testers will complete a transaction in Salesforce xBRID will capture the activities performed by the user. It will generate test scripts automatically and avoid the human-intensive scripting phase.

xBRID is also a testing framework whose execution relies on the Eclipse Integrated Development Environment (IDE). Testers will thus need to apply the usual discipline and componentize test cases, e.g., log in or application launch, to provide common sub-activities across transactions. xBRID works for web applications and with mobile apps (Appium). It also integrates with BDD test frameworks.

Perhaps more important, xBRID helps with test script maintenance. Atos will identify all the objects in a UI/screen and create libraries of such objects with their location and other characteristics. xBRID will scan each UI/screen during each sprint, identify changes, and update objects. As a result, test scripts will handle UI changes such as field name or position change and go through execution. This is an important step: test scripts can be fragile, and their maintenance has in the past been heavily human labor-intensive.

Handling complexities of SaaS applications and Salesforce

Atos continues to develop xBRID. Each COTS has its specificities; for instance, Salesforce has two UIs, Classic and Lightning, that complicate test automation. Also, Salesforce, with its three annual releases, brings continuous changes in HTML, CSS, and APIs.

A feature of SaaS applications and Salesforce is that they force clients into several upgrades per year. With on-premise applications, organizations can decide when to update and upgrade, but with Salesforce and SaaS applications, clients need to test whenever the ISV deploys a new release. Having these regular releases provides a case for investing in functional automation. Tools like xBRID will help.

Atos estimates that with xBRID, it can save up to 90% in testing time. What appears to be a massive reduction is because xBRID replaces mostly manual test activities. It is essential that the industry increasingly targets automated test script generation, something Atos calls “Automation of Automation”. Automated test script creation and maintenance is a paradigm shift for the industry.

]]>
<![CDATA[Qualitest Prepares for Further Inorganic Growth & Sector Specialization]]>

 

We recently spoke with Qualitest, the world’s largest QA pure-play. The company is in investment mode to accelerate its growth, backed by its majority owner, PE firm Bridgepoint. The company has added Cognizant leaders to its executive team (CEO, CMO, and India MD positions) with the intention of reaching $1bn in revenues in the next five years.

In support of this drive to accelerate growth, Qualitest has moved from a decentralized, country-led business to an integrated organization, and has embarked on several initiatives focusing on process standardization and automation, sales, and HR.

With its sales function, Qualitest is introducing an account management approach, land and expand strategies, and team-based selling, involving delivery teams in its bids. The company has maintained its focus on multi-year managed testing deals and has expanded its GTM target, building on its strengths in Israel, U.S., and the U.K, to include South America, Continental Europe, the larger Middle East, and India. It now targets five broad sectors: technology, BFSI, healthcare & life science, telecom, retail, media, & entertainment. The introduction of a systematic vertical sales approach is a significant change from a country-led GTM approach.

With its HR function, Qualitest has taken a comprehensive look across the employee lifecycle (from recruitment to upskilling, internal mobility, and succession planning) and matching the needs of projects. The program is vast, with Qualitest focusing initially on analytics to measure its HR effectiveness and then deploying intelligent automation.

The transformation of Qualitest also includes its value proposition. It has reshaped its portfolio significantly. Functional automation with agile/continuous testing has been a priority, along with digital and application migration to the cloud. Data and analytics, enterprise COTS are also priorities, along with AI. Its December 2019 acquisition of Israeli start-up AlgoTrace helped kickstart its AI offerings, focusing initially on AI-based analytics. Since then, the company has expanded its AI analytics and automation portfolio, with chatbot and data models as the new frontier. Qualitest has also rolled out several internal AI use cases in its sales and business support organizations. Examples include next best offer/action, fraud detection, and task allocation. The portfolio transformation continues, with AI and continuous testing as priorities.

Bridgepoint taking a majority stake in October 2019 has helped Qualitest accelerate its inorganic growth. The company has acquired four firms in 2021 so far: QA InfoTech (QAIT) in India, Olenick in the U.S., Comply in Israel, and an unnamed QA specialist in Germany. The first three acquisitions reinforced Qualitest’s presence in its core markets. The German specialist brings a footprint in a new geography in Continental Europe.

QAIT doubled the presence of Qualitest in India to a NelsonHall-estimated 2k FTEs, representing ~45% of its headcount. In terms of delivery, Qualitest’s value proposition was much more onshore-centric than its competitors. QAIT significantly changes the delivery profile, increasing its scale in India and giving it more recruitment visibility in Bangalore. Qualitest now plans to expand to Chennai and Hyderabad.

QAIT also somewhat expands the capabilities of Qualitest outside of testing services. The company has been active in agile software development, notably for front-end applications. While Qualitest’s primary focus is on QA, the company has also expanded to RPA. With clients awarding bundled development and test deals, Qualitest will gain from these development skills.

Chicago-headquartered Olenick Associates brings in 250 experts and a U.S. mid-west presence that complements Qualitest’s existing east and west coast footprint. Olenick brings in a client base in the financial services, legal, and utility sectors, with specific expertise in the electricity industry. The company provides performance testing across front-office applications (web and mobile apps, IVR, and text messaging), an offering that has increased in popularity after the Derecho storm in 2020. Qualitest has also gained through Olenick's capabilities around project management, DevOps.

Comply is a smaller acquisition, with 83 personnel. The company operates in the regulatory compliance space for the pharmaceutical and medical device industries, which are enjoying more vigorous growth than many other sectors. Comply works beyond QA and has a specialized software product, Skyline, for process analysis. With Comply, Qualitest gains further specialized and vertical capabilities. It will need to continue to invest in developing the acquired business.

Finally, the German pure-play brings an onshore presence with a client base in telecom and insurance. The company is sizeable with 250 employees and opens up a new territory for Qualitest. This is the first step for Qualitest into Continental Europe. No doubt Qualitest will deploy its model in the country and leverage its expertise in telecom and insurance in its significant geographies. The journey continues.

]]>
<![CDATA[Infosys Brings Managed Services to Digital Manufacturing]]>

 

The world of OT has been consulting-centric in recent years. Although their focus on digital manufacturing and IoT opportunities has intensified, enterprises have typically engaged their IT services partners mostly for use case identification, PoCs and pilot projects. Large-scale build projects have been uncommon.

But the pandemic has put digital manufacturing firmly at the top of the agenda and enterprises are now looking at optimizing their plant operations through remote work and increased automation. Cybersecurity concerns have also accelerated their focus on digital manufacturing. Manufacturing plants have heterogenous OT and IT systems; they rely on diverse equipment and ICT with different communication protocols and limited security capability. This OT/IT diversity in the shop floor makes monitoring and security difficult.

Several leading IT services providers, among them Infosys, advocate a comprehensive managed service approach to plant OT/IT and have developed comprehensive managed service offerings which include cybersecurity solutions. Managed service offerings start with the typical IoT use case of equipment monitoring, leveraging dashboards that allow the visualization of the health of one or multiple equipment assets, alerts and alarms for operators and maintenance field services.

In a development on its equipment monitoring offerings, Infosys has recently launched an OTSM offering, as part of its Cobalt Enterprise Service Management Café’s cloud blueprints, using ServiceNow’s OTSM tool and other technologies. The company highlights that clients do not have a clear view of equipment support responsibility; and operators on the plant floor do not know who to turn to when dealing with incidents related to equipment, HMIs, sensors, or networks.

Infosys is looking to bring to the OT world the structured approach common to service desk organizations. It articulates its OTSM offering around:

  • Asset discovery: identifying assets on the shop floor, from hardware to firmware version, including obsolete assets whose replacement needs to be budgeted for, or out of support software products. Infosys is also mapping plant asset dependency, and impact on other equipment, to build contingency plans. It conducts this phase primarily remotely, with personnel manually going through control systems’ lines of code and also looking at study drawings. Automation is next, with Infosys working on an automated asset discovery, initially looking at IT and OT connected to networks
  • Support governance and industrialization. Infosys is deploying the structured service desk approach to IT/OT to increase support coverage, bringing comprehensive visibility to support experts through dashboards, assigning tickets automatically, and building up a knowledge base to recommend resolutions.

Infosys’ offering will evolve both internally and in using ServiceNow functionality. Internally, Infosys is adding predictive maintenance use cases to complement monitoring ones. Externally, ServiceNow is working on integrating security software into its OTSM capabilities. With such functionality in the pipeline and many more coming longer-term, the service desk organization will evolve into a command & control center with automation at its core.

To achieve this vision of an automated support organization, Infosys will need to avoid the pitfalls of IT service desks and bring both flexibility and reactivity to operations. There are important benefits to be derived from the consumer experience now promoted in an advanced IT service desk, one that includes self-service, omni-channel communications, and knowledge bases. Client organizations should benefit from the democratization of digital manufacturing use cases such as remote monitoring, maintenance, and track-and-trace, to the shop-floor level, with empowered operators.

Client adoption of this OTSM offering will of course depend on the business case. Infosys is promoting this offering to its existing ServiceNow users. In particular, it is targeting support organizations that are structured around the ServiceNow tool and have the relevant processes and ServiceNow best practices. Nonetheless, OTSM’s pricing is based on the number of assets monitored, and will, at least initially, demand an investment by clients. To overcome this initial invest phase in the modernization of OT environments, Infosys is promoting transformational outsourcing involving multi-year contract agreements where OT modernization is funded by savings on operations management, a concept familiar to most CIOs.

]]>
<![CDATA[Capgemini’s ADM Aims for No/Low Application Maintenance Fees Thanks to QE]]>

 

Capgemini is aiming for low or no maintenance and support fees as part of its ADM offering. Depending on its level of responsibility for development activities, the company commits to reducing maintenance and support activities and is, therefore, making a bold statement unheard of in the IT industry.

To achieve such high aspirations, the company believes Sogeti's application testing – or quality engineering (QE) – plays a central role. Such a ‘quality gate’ is hardly new, but its role has evolved with the adoption of agile development methodologies. Agile, with its frequent releases to production, has accelerated the demand for functional test automation. Clients are spending more on automation and targeting in-sprint automation, where the features of a new release are already functionally automated, limiting the level of manual testing activity.

Capgemini highlights it has significantly invested around continuous testing and AI.

With continuous testing, Capgemini has integrated the DevOps tools with test execution software. The company continues to expand the scope of such continuous testing platforms to include support activities such as test data management with synthetic data, test environment provision, and service virtualization. Capgemini has gone beyond functional testing to non-functional (with application security playing an increased role) and static code analysis tools. The expansion continues.

Capgemini’s Sogeti also brings its investments in AI. Currently, most AI use cases focus on ‘better’ testing, with test defect prediction and test case optimization as quick wins. NelsonHall sees an increase in the number of AI use cases quarter after quarter, e.g., matching agile user stories with test cases. We think the creativity of firms like Capgemini to identify better ways of testing is limitless, provided clients have enough quality data internally.

AI use cases in testing expanded a couple of years ago from better testing to test automation. Recent use cases enable users to automate the test script creation phase and sometimes the test case stage. A particularly promising technology is next-gen record-and-playback AI-based tools. Capgemini’s tool will record the transaction and translate it into a test script. It will also scan the application under test, identify changes in objects, and update scripts accordingly. This is the beginning of automated test script maintenance, the QE industry’s most significant challenge.

Unsurprisingly, Capgemini’s QE automation approach has several requirements. For example, the company targets multi-year mid-sized to large deals, whose size will help Capgemini recoup its test automation investments. The company also looks for build-test-run contracts to control QE and build activities, whether application development or systems integration, e.g., SAP programs.

Capgemini aims to bring digital and cloud capabilities to its application development activities. The company targets application resilience, scalability, and security, with application migration to the cloud as a central element. Again, QE plays a crucial role in testing these attributes.

NelsonHall believes that Capgemini has made a bold move with its low/no maintenance fees value proposition. This offering comes at the right time. With the pandemic, clients have reignited large application services deals, with offshoring and automation as fundamental principles. Clients have cost savings on the agenda if only to leave more budget for digital projects.

]]>
<![CDATA[Navigating AI-Based Quality Assurance Automation with Infosys]]>

 

We recently talked to Infosys Validation Solutions (IVS), Infosys’ quality assurance unit, and discussed its continued investment in AI to automate testing and validate chatbots and AI models.

AI-Based Analytics from Test Case and Defect Data

The primary AI use cases in QA are around analytics: QA and development produce a vast amount of data that can be used for guiding QA activities. For instance, enterprises have vast amounts of test cases that often overlap or are duplicates. AI, through NLP, will go through test cases, identify keywords and highlight those test cases that are highly similar and probably redundant. This activity is called test optimization and can help remove between 3 and 5% of test cases. This may not seem a lot, but large enterprises have very significant repositories of test cases (Infosys has several clients with a hundred thousand test cases). Also, test cases are the basis for test scripts, which test execution software uses. More importantly, these test cases and test scripts need to be maintained, often manually. Reducing the number of test cases, therefore, has significant cost implications.

The analysis of test cases and defects brings many other use cases. The analysis of past test defects and their correlation with code changes is also helpful. Infosys can predict where to test based on the code changes in a new software release.

AI Brings Quick Wins that Are Precious for Agile Projects

There are many other data analysis quick-win opportunities. Infosys continues to invest in better testing. An example of a recent IP and service is test coverage. For websites and web applications, Infosys relies on URLs to identify transaction paths that need to be tested and compares them with the test cases. Another example is Infosys, for a U.S. bank, going through execution anomalies from the test execution tool and putting them into categories, providing an early step in root cause analysis. A rising use case is detecting test cases based on the comparison of user stories within agile and existing test cases.

We think the potential for AI-based analytics and resulting automation is without limits. NelsonHall expects a surge in such AI-based analytics and NLP, which will bring an incremental automation step.

Starting to Automate Human Tasks Outside of Test Execution

RPA also has a role to play in QA incremental automation steps. Outside of test script execution, functional testing still involves manual tasks. Infosys has developed a repository of ~500 testing-specific RPA bots to automate them; an example is a bot for setting up alerts on test execution monitoring dashboards, and another is loading test cases to test management tools such as JIRA.

With the predominance of agile projects, RPA can also be precious for highly repeatable tasks. However, RPA raises another issue: the maintainability of RPA scripts and how frequently they need to be updated. We expect Infosys to share its experience in this important matter.

Automation Step Changes Now in Sight

AI is also expanding its use cases from incremental automation to significant step changes. An example is Infosys using object recognition to detect changes in a release code and automatically update the relevant test scripts. In other words, Infosys will identify if an application release has a screen change such as a field or button changing place and will update the script accordingly.

There is more to come, we think, with web crawlers and next-gen record and playback testing tools. So far, client adoption is only just emerging, but this space is inspiring. Potentially, QA vendors could remove the scripting phase through automated creation or update of test scripts.

Chatbots Are Increasingly Complicated to Test

QA departments are moving out of their comfort zone with AI systems to test chatbots and AI models with AI systems.

In principle, chatbots are deterministic systems and rely on the pass-or-fail approach that QA tools use. Ask a chatbot simple questions such as the time or opening hours of a store. The response is straightforward and is either right or wrong.

However, the complexity of chatbots has increased. Voice plays a role and drives a lot of utterance training and test activity to solve language, accents, and domain-specific jargon challenges. Also, chatbots are increasingly integrated with hyper-scalers and rely on APIs for integration with back-end systems. Also, Infosys points to the increasing integration of chatbot functionality within AR/VR. This integration is bringing another layer of QA complexity and performance discussions. Infosys is taking a systematic approach to chatbot testing and has built several accelerators around voice utterances.

Testing of AI Models Is the Next Step Change Through Synthetic Data

With AI models, QA is moving to another world of complexity. AI models can be non-deterministic, i.e., not knowing the answer to a specific query; for example, identifying fake insurance claims for an insurance firm.

The traditional approach of QA, i.e., check the answer is correct or not, needs reinvention. Infosys is approaching the AI-model QA from several angles. For training and testing purposes, data plays an essential role in the accuracy of data science models. Infosys is creating synthetic data for training models, taking patterns from production data. With this approach, it is solving the challenge of the lack of sufficient data for training the AI model.

Another approach that Infosys is taking is a statistical method. It provides a series of statistical measures to data scientists, who can then decide on the accuracy of the data model.

AI model testing is still a work-in-progress. For instance, training data bias remains a challenge. Also, with QA meeting AI and data science, test engineers are clearly out of their expertise zone, and Infosys heavily invests in its UI and training to make its tools more accessible. The company points to further IP, such as using computer vision to check the quality of scanned documents.

There is much more to come:  the potential benefits of AI are limitless.

]]>
<![CDATA[IBM GBS Expands Salesforce Capabilities in Europe with Waeg]]>

 

IBM recently announced its acquisition of Salesforce Platinum Consulting partner, Waeg. The company, whose name means “wave” in old English, expands IBM’s Salesforce service presence in Europe and is IBM GBS’ second Salesforce services acquisition in 2021, following 7Summits in the U.S. It is also GBS’ fifth acquisition since IBM announced the intended spin-off of its managed infrastructure services business last October, highlighting how IBM Services is shifting its portfolio to digital offerings.

Waeg has, we estimate, around 160 employees. It has two primary strengths: specialized skills and its delivery network.

The company has a background in strategy consulting, business process re-engineering, and B2B Commerce Cloud (the former CloudCraze products, acquired by  Salesforce in 2018). Over time, Waeg expanded its B2B Commerce niche to B2B marketing automation (the Pardot products), Service Cloud, and ERP/back-office integration. The company has been investing in its MuleSoft capabilities, following the API-based product integration strategy of Salesforce. Waeg highlights that it has developed connectors for B2B Commerce with product management systems, such as handling promotions and coupons and logistics firms’ systems, to shipment tracking.

A particularly distinctive aspect of Waeg is its delivery network, which comprises a small local consulting presence combined with nearshore delivery centers within the EU. While the U.S. and the U.K. have adopted global delivery for traditional IT services and digital services, Continental European firms have preferred a more onshore approach, especially for digital projects such as Salesforce.

Waeg brings in a delivery organization that is primarily based in Warsaw, Lisbon, and Dublin. Meanwhile, Waeg continues to deploy its sales office network onshore in Amsterdam, Brussels, Copenhagen, Lyon, and Paris. Waeg is also building some onshore project management and business analyst presence to interface with clients with the same culture and language.

Waeg highlights that the pandemic and the adoption of WfH have deeply influenced European firms. Client demand for B2B Commerce Cloud has accelerated, while at the same time, the appetite for Service Cloud has remained very solid. Indeed, Salesforce’s earnings over the last year have shown high client traction for Commerce and Marketing Cloud along with a more traditional product such as Service Cloud (used in contact centers).

And, with WfH adoption, Waeg sees that nearshore delivery for digital projects, still in the EU, is now more acceptable to Continental European organizations for data privacy reasons. As such, Waeg provides IBM GBS with a foundation for scaling its Salesforce delivery presence rapidly.

We believe that Waeg will help IBM target new Salesforce opportunities in the manufacturing and life sciences sectors in Europe. The company brings a client base in manufacturing and CPG, OTC distribution, and animal health products. Significant clients include Novartis, Baxter, MSD, Biomérieux, Moet Hennessy, Friesland Campina, Barry Callebaut, and Dawn Foods.

Waeg is also expanding its Salesforce capabilities to supplier relationship management, where it sees significant client interest in the wake of COVID-19 and interest in sourcing locally and identifying new suppliers.

With Waeg, IBM GBS is expanding from its  Salesforce strength in the U.S. and now has a decent presence in growth markets in Europe. APAC will be IBM GBS’ next priority, though there may be further inorganic growth in key European markets.

As well as geographic expansion, IBM will further expand its portfolio specializations: field services, quote-to-cash, and Vlocity/Salesforce Industries are likely candidates. Given Salesforce’s growth and aggressive acquisition strategy, IBM GBS has plenty of options for further development.

]]>
<![CDATA[Whishworks Expands from MuleSoft Heritage to Whole Salesforce Service Ecosystem for Coforge]]>

 

Coforge, the former NIIT Technologies, recently briefed NelsonHall about its Salesforce capabilities. In 2019, the company acquired Whishworks, which became the foundation of its Salesforce activities. Whishworks, one of MuleSoft’s top five strategic partners globally, services clients across sectors, with BFSI being its most significant target market.

London-headquartered Whishworks also has an office in the U.S., in Princeton, NJ. Its delivery model is India-centric, its primary delivery centers being in Hyderabad and Noida. It is currently experiencing high growth, enjoying revenue growth of 30% in its FY21.

Specializing its MuleSoft Portfolio

Since 2019, Coforge has grouped all its MuleSoft and Salesforce capabilities under Whishworks, which now has 430 Salesforce practitioners, including 300 MuleSoft ones. Whishworks offers a wide range of services, from technology consulting to managed services, and is also a MuleSoft VAR in the U.K. and India.

Whishworks is working on developing a specialized portfolio of services. Two examples of this are:

  • Anypoint Platform 3.8 to 4.4 migration. MuleSoft is ending its support for Anypoint 3.9 by the end of 2021, leaving many of its clients with a mandatory migration. Whishworks has developed a fixed price offering for the migration, with pricing based on the number of APIs: it estimates the cost for a client with 50 APIs is around $100k. Whishworks highlights that some of its large clients have up to 1k APIs on different MuleSoft versions
  • The use of accelerators. Whishworks has four connectors that are available on Anypoint Exchange, the equivalent of Salesforce’s AppExchange. Whishworks’ connectors provide access to the APIs of the applications. An example of this is extracting data from a web application to a mobile device: the connector reduces the amount of custom code required to connect the two applications. Whishworks’ most popular connector is its Microsoft Azure Storage connector, which the company offers free of charge to clients.

Delivery quality remains a key focus, and Whishworks is relying on several approaches. Whishworks uses a centralized technical design authority team, ensuring that delivery teams apply best practices and get their sign-off. Whishworks want to avoid an API development team bringing in their development personal style by using standardized approaches.

MuleSoft is now the Foundation for Salesforce’s Customer 360

Further growth is on the agenda for Whishworks, initially with MuleSoft. The company highlights that MuleSoft aligns with Salesforce’s professional services approach, i.e., focusing on software products and leaving services opportunities to its SI partners. Whishworks is, in Europe, one of the two preferred SI partners for MuleSoft’s Commercial Business Unit clients. It is looking to expand its MuleSoft expertise to the U.S., where the service opportunity is immense.

Whishworks is also looking to expand to the entire Salesforce product ecosystem, from its technical MuleSoft niche to functional products. The Salesforce strategy will help here. Salesforce has made MuleSoft’s Anypoint Platform the official software tool for integrating its vast and quickly expanding acquisition-led product portfolio. Anypoint Platform is more than the technical glue of Salesforce’s applications. It has become a topic relevant to business, with MuleSoft’s API-based integration technology at the core of Salesforce’s Customer 360 value proposition. With Customer 360, Salesforce promotes a comprehensive customer profile through consumer data centralization and analytics.

Along with Customer 360, Whishworks also adds skills around the various Salesforce products, initially focusing on Sales Cloud, Service Cloud, Health Cloud, and Financial Services Cloud. The company highlights that these clouds rely heavily on MulSeoft for interacting with third-party applications. Also, Whishworks has already developed several vertical solutions, such as claims management for insurance firms and a financial services sector cloud migration tool and service.

Whishworks will be adding other vertical solutions to its portfolio: we expect the firm will ultimately address the whole client base of the larger Coforge.

And we also anticipate Coforge will bring business consulting capabilities to help drive discussions with clients around their digital transformation initiatives. More than ever, this consulting-led approach is required to make a Salesforce project more than a traditional enterprise application project.

]]>
<![CDATA[Atos Reignites its Salesforce Capabilities]]>

 

In November 2020, Atos unveiled its strategy for OneCloud, which focuses on key partnerships with the hyperscalers, and also includes three ISVs: SAP, ServiceNow, and Salesforce. We recently talked to Atos' Salesforce practice to discuss its growth plans and recent acquisitions.

Atos has been a long-standing SAP partner and recently reignited its Salesforce practice, making two acquisitions in Q4 2020 with Eagle Creek Software Services and Edifixio, both Salesforce specialists. Together, they bring around 600 Salesforce consultants, mostly based in the U.S. and France.

Eagle Creek brings specialization in the U.S. around Commerce Cloud, Field Services & Vlocity

Eagle Creek has its headquarters in Minneapolis, MN. The company has around 250 employees and serves clients in the telecoms, manufacturing, financial services, health & life science, and public sectors.

Eagle Creek initially provided Siebel Systems services, turning to Salesforce in 2016 when Salesforce acquired Demandware (now Commerce Cloud). The company has maintained its Commerce Cloud specialization, expanding its capabilities to .NET and Java development capabilities to enhance the UI of Commerce Cloud.

In parallel, the company developed capabilities around Field Services Lightning. Eagle Creek believes that the potential for field services automation is untapped. With Salesforce further investing in its field service product portfolio with the 2019 acquisitions of Click Software (labor scheduling), and MapAnything (driving directions on a mobile), Eagle Creek sees a preference by clients for Salesforce products over those of GE/ServiceMax or Oracle. In 2017, Eagle Creek became a Vlocity partner in addressing utilities and CSPs, becoming one of the top three service partners in North America.

Eagle Creek has an onshore-only delivery approach, with its technology centers located in North and South Dakota, close to its Minneapolis headquarters. These centers are used for training as well as delivery. Eagle Creek also benefits from the loyalty of its employees in the Dakotas, with attrition at ~5%.

Edifixio: Sales, Marketing, Community & SAP integration

French company Edifixio brings a different set of capabilities to Atos. Edifixio has a background in application migration to the cloud. The company also built B2B portals and marketplaces, integrating them with back-end systems (e.g., SAP). Ten years ago, the company expanded to Salesforce services, initially focusing on CRM/Sales Cloud.

Currently, Edifixio has ~80 Salesforce consultants in Paris and Grenoble and ~300 certifications. It has mostly capabilities around Sales and Marketing, along with Community Cloud and Heroku (for mobile enablement). The company is active on the application and data integration side and has developed several related accelerators and IP.

Edifxio provides an AWS-hosted SAP integration product as a managed service. Another IP is an AppExchange data quality tool that Edifixio developed for detecting data duplicates and improving its clients' data quality. Finally, Edifixio brings a DevOps IP, complementing Salesforce DX.

Atos: bold growth ambitions & portfolio verticalization

An immediate priority for Atos is consolidating its portfolio of services and accelerators and making these consistent across geographies. The company wants to push its application and data integration capabilities and make Salesforce's products interoperable, targeting Commerce and Field Services integration. Synergies with the rest of Atos in integration and strategy and business process consulting will help here.

In the mid-term, Atos has bold growth ambitions. With Edifixio and Eagle Creek, the company has 600 consultants in its Salesforce practice, and is targeting 1,500 consultants within two years. To fuel this growth, Atos is retraining internally as much as possible. M&A activity is also highly likely. The company just finalized, in February 2021, the acquisition of Profit4SF. The Utrecht, Netherlands-based Profit4SF is small firm, with 30 employees. More importantly, it brings precious Marketing Cloud capabilities.

Atos also intends to verticalize its service portfolio. An immediate priority is Vlocity (now Salesforce Industries) around telecoms, media, and utilities. Unsurprisingly, given its client base, Atos is also targeting the manufacturing sector, mostly in the U.S., Germany, and France.

We think that Atos' portfolio verticalization is an effective approach, helping Salesforce in its effort. While Salesforce has launched several industry clouds in the past years, it still needs to rely on its service partners to complement its horizontal capabilities. Two numbers indicate the priorities of Salesforce: it will spend $27.7bn on acquiring Slack, while it spent $1.33bn for Vlocity!

]]>
<![CDATA[IBM GBS Adds Salesforce Specialization with 7Summits Acquisition]]>

 

We recently talked to IBM GBS regarding its acquisition of 7Summits. After years of limited GBS M&A activity, IBM has been increasing its investment level in this business. With the planned divestment of most of its GTS business, GBS is core to IBM Services. Since October 2020, GBS has made five transactions across cloud, payments, SAP, and Salesforce.

The 7Summits acquisition, which fits into IBM iX, is the first Salesforce services acquisition since its Bluewolf transaction in 2016. With Bluewolf, IBM acquired a tier-one Salesforce partner and gained further scale. Bluewolf had 500 employees and strengthened IBM's Salesforce practice in the U.S. 7Summits also enhances the Salesforce practice’s presence in the U.S.: it is headquartered in Milwaukee, WI, and brings an onshore delivery presence in 30 states, mostly in Chicago, Indianapolis, Atlanta, Austin, NYC, Minneapolis, and San Francisco. We estimate its headcount to be ~240.

7Summits Brings A Specialization in Experience Cloud and Lightning Experience

Unlike Bluewolf, 7Summits is not about acquiring for scale (IBM already has scale, with a NelsonHall estimated 4,000 employees, excluding MuleSoft and Tableau personnel). What 7Summits brings to IBM is a portfolio specialization.

7Summits has a background in Experience Cloud (formerly Community Cloud), Salesforce's portal product. To date, the company remains heavily involved in Experience Cloud and has developed ~70 accelerators, the majority of which are based on Experience Cloud. The company has ~10 Lightning Bolts (i.e., industry templates) on AppExchange, focused on partners, clients, employees, and, in the case of Higher Education, students.

Also, within its IP portfolio, 7Summits has an important IP called Migration Factory, which enables Community Migrations. Community Migration is a set of tools and services for transitioning from SharePoint, Connections, or Jive. With Community Migration, 7Summmits systematically inventories existing content, maps to Salesforce UX constructs, and focuses on contextual data migration.

Its IP also includes around thirteen enterprise applications (point solutions, such as News, Events, and Job Board) and about 30 Salesforce Lightning components (e.g., Image Gallery and Leader Board).

7Summits also created a prototyping IP. The IP relies on Salesforce's Envision for creating UX/UI designs that are compatible with Salesforce's design requirements.  

Five years after its launch in 2015, Lightning Experience continues to drive activity at 7Summmits. 7Summits has developed a Classic-to-Lightning UI migration IP and looks at providing a roadmap for the migration, focusing on the UI and custom code and libraries. 

Beyond Experience Cloud: Integration, Consulting, and UX

Experience-led business solutions require capabilities beyond Experience Cloud. 7Summits uses Experience Cloud to bring together objects from across multiple Salesforce clouds and data and content from other systems to enable digital transformation. 7Summits estimates that each Experience-led implementation requires business consulting (25% of the services provided), UX/UI design services (~10%), technology and implementation services (~50%), with integration as an essential, and project management (~15%). Thus, the company has developed its capabilities around business consulting, experience design, and technology services. Looking ahead, IBM's capabilities in application and data integration, workflows, and the UX capabilities of iX will help scale 7Summits' capabilities.

7Summits recently verticalized its capabilities, starting with the software and high-tech sector, addressing client's needs like iterative product development, customer-led product roadmaps, and field-centric code sharing. 7Summits focuses on activities such as partner onboarding, connecting clients, and maintaining products.

7Summits has found similar partnership management needs in the manufacturing sector. The company also has experience in the higher education sector, with Harvard University, around student onboarding. The company is now expanding to the healthcare payer and provider sector.

The Road Ahead: Multi-Clouds, Agile, and Industry Solutions

7Summits is gradually expanding from its Experience Cloud specialty to become a Service Cloud and helping clients in their multi-cloud journey. Again, the more diverse Salesforce portfolio of IBM will help here. Meanwhile, IBM is expanding its Salesforce portfolio beyond the core Sales and Service Clouds. IBM hints it will conduct additional M&As to strengthen its portfolio. Also, NelsonHall expects a geographical delivery expansion to UKI, the EU, LA, and APAC.

IBM's Salesforce ambitions go beyond multi-cloud and strengthening its capabilities around the different Salesforce products. A priority is accompanying the client across the program lifecycle, from consulting through to application management, through cross-selling GBS' strengths around AI and analytics, automation, DevOps, and application management services. With many enterprise clients selecting Salesforce's products as a front office foundation, IBM highlights its agile development capabilities, using its Garage delivery model and its integration expertise.

Salesforce continues its rapid acquisition strategy, which brings additional products to integration. IBM is accordingly creating templates and reference architectures to pre-integrate different Salesforce products. Salesforce is accelerating its verticalization efforts, identifying white spaces in its vertical solutions, notably through last year's acquisition of major ISV partner Vlocity. Given Salesforce's fast-moving product portfolio, major SI partners such as IBM will be relying on the strength of their partnership with Salesforce to know where to invest next in its industry solutions.

]]>
<![CDATA[passbrains Accelerates Development Thanks to Acquisition by msg]]>

 

We recently talked to passbrains, the crowdtesting pure-play with dual headquarters in Hamburg and Zurich that has just been acquired by German IT service and VAR vendor, msg. The acquisition is timely for passbrains. Its founder died in the summer of 2020 at a critical time, as the firm was developing the 2.0 release of its crowdtesting platform and diversifying its client base.

msg will help passbrains accelerate the development of its 2.0 platform

The name of msg may not be familiar outside Germany, although the company is sizeable: it generated revenues of €1.03bn in 2019 and has around 8,500 employees. msg is headquartered in Munich and has a dual activity as VAR and IT service vendor.

Its profile is unusual in that it operates as a federation of IT companies in 28 countries, providing its members with operational flexibility while providing them with shared services. For instance, msg’s internal development teams based in Germany will take over passbrains’ platform 2.0 development activities.

The ongoing development of platforms is critical for the success of crowdtesting vendors: the crowdtesting industry heavily relies on its proprietary platforms for managing projects, selecting and inviting crowdtesters, and identifying log and QC-ing defects, especially for their agile testing needs. Crowdtesting platforms have become a significant barrier to entry; we think this partly explains why more IT service vendors have not entered crowdtesting and crowdsourcing, except through acquisitions.

psassbrains will also benefit from msg’s ideation software product, which it will incorporate in its crowdtesting platform. This is an important module that msg uses to generate and rate ideas, given passbrains’ positioning on UX testing.

msg brings test automation expertise

The msg acquisition also solves one strategic challenge for passbrains: many competitors have aligned their crowdtesting capabilities around agile projects, putting less emphasis on exploratory testing and UX testing. They intend to transition clients from manual activities to test automation. This makes sense in the context of agile projects to bring functional automation. However, the challenge for crowdtesting pure-plays is to invest in test automation and compete with IT service vendors that have developed very significant test automation capabilities, IP, and accelerators.

Currently, passbrains has maintained a balanced portfolio, unlike competitors, with UX testing still representing ~60% of its revenues. msg’s expertise will help passbrains accelerate in agile testing. msg has a testing unit with ~300 FTEs located in Germany. It brings some test automation scale across functional (with a specialization around Tricentis’ Tosca) and non-functional.

msg brings cross-selling opportunities to passbrains

In the short-term, a priority for passbrains is joint GTM with msg, addressing testing opportunities with a full range of testing services. msg brings in a client base in SAP in the insurance, public sector, automotive, and healthcare sectors in the German market and will help passbrains expand from its telecom client base. passbrains will spend the next six months educating msg’s sales force about its crowdtesting capabilities.

passbrains’ mid-term priority is to deploy further AI use cases. The company is implementing msg.ProfileMap to match projects and skills requirements and identify the right crowdtesters.

In the longer-term, we think msg and passbrains can further expand into AI-powered testing. AI is dramatically transforming how functional automation operates and is a real paradigm shift. We believe there is a window of opportunity for passbrains/msg there.

]]>