NelsonHall: BPS Market Development blog feed
https://research.nelson-hall.com//sourcing-expertise/bps-market-development/?avpage-views=blog

The BPS Market Development program provides a comprehensive overview of BPS markets, with a focus on market developments and market forecasting by industry, service line, and geography. The program provides timely identification of changes in market opportunity and service delivery mechanisms, and helps organizations understand, adopt, and optimize the next generation of business process models.

<![CDATA[Conduent Partners with Microsoft to Underpin Client GenAI Innovation Initiatives]]>

 

Conduent has partnered with Microsoft to use Microsoft Azure OpenAI to underpin its GenAI innovation initiatives with clients.

Its GenAI journey includes:

  • Selecting use cases focused on improving quality, throughput, and cycle times
  • Piloting use cases in healthcare claims adjudication, fraud detection, and customer service enhancement
  • Subsequently moving to MVPs and industrializing use cases.

Use Case Selection Criteria Focused on Improving Quality, Throughput, and Cycle Times

Conduent recognizes that GenAI is an expensive technology and that its adoption will typically incur costs in changing existing processes and technology stacks. This makes it difficult to build GenAI business cases on already optimized operations based solely on cost reduction. Hence, Conduent is focusing on “innovative additive opportunities” to make the business case work.

The outcomes that Conduent is targeting from GenAI initiatives are:

  • Improved quality, reducing error rates across standardized workflows
  • Increased throughput of business process transactions
  • Faster cycle times by consolidating value chain steps.

At the same time, Conduent’s client relationships tend to involve relatively deep end-to-end service provision across a range of processes rather than single-process support. These combinations of services are tailored for specific clients rather than being standalone commoditized services.

Accordingly, Conduent’s document management services and CX services are generally supplied as part of a wider capability rather than as standalone services. Within this pattern, most of its solutions include elements of:

  • Document management, including summarization & analysis, and extracting and contextualizing text within images
  • User interaction/call center with multiple clients across various sectors, including enhancing agent assist, virtual agent, and call center agent assessment
  • Search & analytics.

Conduent regards these three areas as core competencies. All of its GenAI use cases for the immediate future will fall into one of them and will be expected to deliver improved quality, throughput, and cycle times.

Initial GenAI PoCs Focus on Healthcare Claims Adjudication, Fraud Detection, and Customer Service Enhancement

Conduent has announced three GenAI pilot areas covering healthcare claims adjudication, state government program fraud detection, and customer service enhancement.

Conduent is a major provider of healthcare claims adjudication services. Here, it is working on a PoC with several healthcare clients to apply GenAI to reduce the error rate in data extraction and achieve faster cycle times in claims adjudication. GenAI is being used within document management to summarize highly unstructured documents, such as appeals documents and medical records, using its contextualization capabilities, including image-to-text. The technologies used are Azure AI Document Intelligence and Azure OpenAI Service.
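By way of illustration, the sketch below shows how a document-summarization step of this kind could be wired together using the two Azure services named above. It is a minimal, assumed example: the endpoints, keys, deployment name, and prompt are placeholders rather than details of Conduent's solution.

```python
# Illustrative sketch only: pairs Azure AI Document Intelligence (text and image-to-text
# extraction) with Azure OpenAI Service (summarization), as described above.
# The azure-ai-formrecognizer SDK targets the Document Intelligence service; the newer
# azure-ai-documentintelligence package could equally be used.
import os

from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential
from openai import AzureOpenAI

doc_client = DocumentAnalysisClient(
    endpoint=os.environ["DOC_INTEL_ENDPOINT"],           # placeholder configuration
    credential=AzureKeyCredential(os.environ["DOC_INTEL_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)

def summarize_claim_document(path: str) -> str:
    """Extract text from a scanned appeals document or medical record, then summarize it."""
    with open(path, "rb") as f:
        poller = doc_client.begin_analyze_document("prebuilt-read", document=f)
    extracted_text = poller.result().content  # OCR'd text, including text within images

    response = llm.chat.completions.create(
        model="gpt-4o",  # assumed deployment name
        messages=[
            {"role": "system", "content": "Summarize this claims document for an adjudicator."},
            {"role": "user", "content": extracted_text},
        ],
    )
    return response.choices[0].message.content
```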

Secondly, Conduent is working on a fraud detection PoC to support social programs in the U.S. state government sector. This PoC uses GenAI for search & analytics across multiple structured and unstructured data sets to increase the volume and speed of fraud detection. The technologies used here are Azure Data Factory and Azure OpenAI Service.

Finally, Conduent is using GenAI to enhance the use of agent assist and virtual agents by training virtual agents on existing data so that they can be deployed much faster.

Moving to MVPs and Industrializing GenAI Use Cases

Conduent initially undertook PoCs in the above areas to get both Conduent and the client comfortable with the results of applying GenAI and prove the use case. The PoC process is iterative and very granular, and Conduent perceives that organizations need to be extremely prescriptive to get the right results. This typically means defining the specific inputs and outputs of the use case very tightly, including defining what GenAI should do in the absence of individual inputs.

Conduent is now moving towards MVP and building client business cases in several of these pilots, including in some pilots in the document management space. Since Conduent has taken a horizontal approach to its use case selection, many of these have the potential to scale across multiple industry segments and clients.

In addition, Conduent provides BPaaS services to many of its clients, so it is looking to embed proven GenAI use cases in its proprietary technology platforms.

Existing use cases are business unit-sponsored but curated centrally, with the center providing enablement and cross-pollination across business units. However, the intention is to incubate future GenAI capability within individual business units.

]]>
<![CDATA[Capgemini Launches ‘One Operations’ to Support CPG Enterprises in Driving Revenue Growth]]>

 

Capgemini has launched a new digital transformation service, One Operations, with the specific goal of driving client revenue growth.

One Operations: Key Principles

Some of One Operations’ principles, such as introducing benchmark-driven best practice operations models, taking an end-to-end approach to operations across silos, and using co-invested innovation funds, are relatively well established in the industry. What is new is building on these principles with an overriding focus on delivering revenue growth. The business case for a One Operations assignment centers on facilitating the client’s revenue growth, taking a B2B2C approach oriented to the end customer, and emphasizing the delivery of insights that enable client personnel to make earlier, customer-focused decisions.

Capgemini’s One Operations account teams involve consulting and operations working together, with Capgemini Invent contributing design and consulting and the operational RUN organization provided by Capgemini’s Business Services global business line.

Implementing a One Operations philosophy across the client organization and Capgemini is achieved through joint targets, which reduce vendor/client friction, and a continuously replenished co-invested innovation fund of ~10–15% of Capgemini revenues used to fund digital transformation.

One Operations is very industry-focused, and Capgemini is initially targeting selected clients within the CPG sector, looking to assist them in growing within an individual country or small group of countries by localizing within their global initiatives. The key to this approach is demonstrating to clients that Capgemini understands and can support both the ‘grow’ and ‘run’ elements of their businesses, and having an outcome-based conversation. Capgemini is typically looking to enable enterprises to achieve 4X growth by connecting the sales organization to the supply chain.

Assignments commence with working sessions brainstorming the possibilities with key decision-makers. The One Operations client team is jointly led by a full-time executive from Capgemini Invent and an executive from Capgemini’s Business Services. The Capgemini Invent executive remains part of the One Operations client team until go-live. The appropriate business sector expertise is drawn more widely from across the Capgemini group.

One Operations assignments typically have three phases:

  • Deployment planning (3–6 months) to understand the processes and associated costs and create the business case
  • Deployment (6–15 months) to create the ‘day one’ operating model
  • Sustain, involving go-live and continuous improvement.

At this stage, Capgemini has two live One Operations assignments with further discussions taking place with clients.

Using End-to-End Process Integration to Speed Up Growth-Oriented Insights

Capgemini’s One Operations has three key design principles:

  • Re-inventing the organization by embedding a growth mindset, reducing business operations complexity, and enabling an AI-augmented workforce to focus on their customers and higher-value services
  • Increasing the level of end-to-end integration by improving data accuracy and incorporating AI to achieve ‘touchless forecasting & planning’, enabling better decisions and faster innovation. ‘Frictionless’ end-to-end integration is used to support more connected decisions and planning across the value chain
  • Transforming at speed and scale.

These transformations involve:

  • Shaping the strategic transformation agenda through defining the target operating model based on peer benchmarks and using standardized operating model design, assets, and accelerators
  • Using a digital-first framework incorporating One Operations pre-configured digital process evaluation and digital twins
  • Deployment of D-GEM technology accelerators, including AI-augmented workforce solutions and Capgemini IP such as Tran$4orm, ranging from platforms to microtools
  • Augmented operations using Capgemini Business Services.

Changing the mindset within the enterprise involves freeing personnel from tactical transactional activities and providing relevant information supporting their new goals.

Capgemini aims to achieve the growth mindset in client enterprises by enabling an integrated end-to-end view from sales to delivery, facilitating teams with digital tools for process execution and growth-oriented data insights. Within this growth focus, Capgemini offers an omnichannel model to drive sales, augmented teams to enable better customer interactions, predictive technology to identify the next best customer actions, and data orchestration to reduce customer friction.

One Operations also enables touchless planning to improve forecast accuracy, increase the order fill rate, reduce time spent planning promotions, and accelerate cash collections to reduce DSO. Improving promotions accuracy and product availability are also key to revenue growth within CPG and retail environments.

Shortening Forecasting Process & Enhancing Quality of Promotional Decisions: Keys to Growth in CPG

The overriding aim within One Operations is to free enterprise employees to focus on their customers and business growth. In one example, Capgemini is looking to assist an enterprise in increasing its sales within one geography from ~$1bn to $4bn.

The organization needed to free up its operational energies to focus on growth and create an insight-driven consumer-first mindset. However, the organization faced the following issues:

  • 70% of its planning effort was spent analyzing past performance, and ~100 touches were required to deliver a monthly forecast
  • Order processing efficiency was below the industry average
  • Approx. 30% of its trucks were leaving the warehouse half-empty
  • Launching products was taking longer than expected.

Capgemini took a multidisciplinary approach end-to-end across plan-to-cash. One key to growth is the provision of timely information. Capgemini is aiming to improve the transparency of business decisions. For example, the company has rationalized the coding of PoS data so that it can be directly interfaced with forecasting, shortening the forecasting process from weeks to days and enhancing the quality of promotional decisions.

Capgemini also implemented One Operations, leveraging D-GEM to develop a best-in-class operating model resulting in a €150m increase in revenue, 15% increase in forecasting accuracy, 50% decrease in time spent on setting up marketing promotions, and a 20% increase in order fulfillment rate.

]]>
<![CDATA[The Role of Organizational Change Management in Digital Transformation]]>

 

Digital transformation and the associated adoption of Intelligent Process Automation (IPA) remain at an all-time high. This is to be encouraged, and enterprises are now reinventing their services and delivery at a record pace. Consequently, enterprise operations and service delivery are increasingly becoming hybrid, with delivery handled by tightly integrated combinations of personnel and automations.

However, the danger with these types of transformation is the omnipresent risk in intelligent process automation projects of putting the technology first, regarding people as secondary considerations, and alienating the workforce through reactive communication and training programs. As many major IT projects have discovered over the decades, the failure to adopt professional organizational change management procedures can lead to staff demotivation, poor system adoption, and significantly impaired ROI.

The greater the organizational transformation, the greater the need for professional organizational change management. This requires high workforce-centricity and taking a structured approach to employee change management.

In the light of this trend, NelsonHall's John Willmott interviewed Capgemini's Marek Sowa on the company’s approach to organizational change management.

JW: Marek, what do you see as the difference between organizational change management and employee communication?

MS: Employee communication tends to be seen as communicating a top-down "solution" to employees, whereas organizational change management is all about empowering employees and making them part of the solution at an individual level.

JW: What are the best practices for successful organizational change management?

MS: Capgemini has identified three best practices for successful organizational change management, namely integrated OCM, active and visible sponsorship, and developing a tailored case for change:

  • Integrated OCM – OCM will be most effective when integrated with project management and involved in the project right from the planning/defining phase. It is critical that OCM is regarded as an integral component of organizational transformation and not as a communications vehicle to be bolted on to the end of the roll-out.
  • Active and visible sponsorship – C-level executives should become program sponsors and provide leadership in creating a new but safe environment for employees to become familiar with new tools and learn different practices. Throughout the project, leaders should make it a top priority to prove their commitment to the transformation process, reward risk-taking, and incorporate new behaviors into the organization's day-to-day operations.
  • Tailored case for change – The new solution should be made desirable and relevant for employees by presenting the change vision, outlining the organization's goals, and illustrating how the solution will help employees achieve them. It is critical that the case for change is aspirational, using evidence based on real data and a compelling vision, and that employees are made to feel part of the solution rather than threatened by technological change.

JW: So how should organizations make this approach relevant at the workgroup and individual level?

MS: A key step in achieving the goals of organizational change management is identifying and understanding all the units and personnel in the organization that will be impacted both directly and indirectly by the transformation. Each stakeholder or stakeholder group will likely be in a different place in terms of perspective, concerns, and willingness to accept new ways of working. It is critical to involve each group in shaping and driving the transformation. One useful concept in OCM for achieving this is WIIFM (What's In It For Me), with WIIFM identified at a granular level for each stakeholder group.

Much of the benefit and expected ROI is tied to people accepting and taking ownership of the new approach and changing their existing ways of working. Successfully deployed OCM motivates personnel by empowering employees across the organization to continually improve and refine the new solution, stimulating revenue growth and securing ROI. People need to be aware both of how the new solution is changing their work and of their own role in driving it; this, in turn, makes the organization a "powerhouse" for continuous innovation.

How an enterprise embeds change across its various silos is very important. In fact, in the context of AI, automation is not only about adopting new tools and software but mostly about changing the way the enterprise's personnel think, operate, and do business.

JW: How do you overcome employees' natural fear of new technology?

MS: To generate enthusiasm within the organization while avoiding making the vision seem unattainable or scary, enterprises need to frame transformations incorporating, for example, AI as evolutions of something employees are already doing: not merely "the next logical step", but a reinvention of the whole process from both the business and experience perspectives. They need to retain the familiarity that gives people comfort and confidence while reassuring them that the new tool or solution adds to their existing capability, allowing them to fulfill their true potential – something that is not automatable.

]]>
<![CDATA[WNS Launches Quote-to-Sustain to Reinvent Order-to-Cash]]>

 

The pandemic has changed organizations’ attitudes towards the need for change, greatly increasing their emphasis on adopting new digital process models and digital transformation. Partly this is driven by the need to enhance their transactional efficiency and effectiveness rapidly, but at least equally importantly, it has brought a much greater requirement for real-time information and analytics to drive the business. All these pressures are keenly felt within the finance department.

WNS introduces Quote-to-Sustain

In response, WNS has looked to reinvent order-to-cash in the form of Quote-to-Sustain. Some of the issues that WNS is aiming to address with its Quote-to-Sustain offering include:

  • Improving billing timeliness and accuracy by improving the integration and consistency of data between quotations, digital contracts, order management & fulfillment, and billing systems
  • Maximizing revenue by releasing credit holds for good customers
  • Protecting the enterprise with more dynamic credit control than periodic credit review
  • Improving collections by cleansing master data in real-time.

In addition to delivering enhanced end-to-end visibility, stakeholder experience, and analytics, WNS has also reimagined its Quote-to-Sustain service to deliver greater variability in F&A process costs as business volumes fluctuate and become more unpredictable as a result of the pandemic. Its new Quote-to-Sustain offering bundles technology and services and allows clients to “pay by the drink”.

Specifically, the goal is for clients to remain cash neutral and, subject to some volume commitment, to pay only for transactions, with a decrease in costs emerging from year two onwards. WNS funds all change management.

Quote-to-Sustain modules

The Quote-to-Sustain module structure is:

  • Quote-to-Order: consisting of unified master data, cognitive credit, digital contracts, and smart orders
  • Bill-to-Cash: consisting of integrated billing, intelligent collections, predictive dispute management, touchless cash applications, and predictive deduction management
  • Report-to-Sustain: consisting of digital dashboards, botified queries, analytics-as-a-service, and revenue assurance.

Each of these modules takes the form of a system of engagement sitting on top of the client’s existing ERP systems and systems of record.

In this respect, the use of unified master data is important in bringing together the various commercial and financial elements from multiple databases to ensure accuracy, for example, in billing the right person and identifying the appropriate person for each type of query. The unified master data aims to be a single source of the truth using data authentication from external sources and providing an element of real-time data cleansing.
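As a purely illustrative sketch of the unified master data concept (not WNS's implementation), the snippet below merges customer records from several source systems into a single golden record per customer; the field names, matching key, and precedence rule are all assumptions.

```python
# Hypothetical sketch of unifying customer master data from multiple source systems
# into one golden record per customer, as described above. Field names, the matching
# key, and the "newest value wins" rule are illustrative assumptions only.
from collections import defaultdict

def unify_master_data(records: list[dict]) -> dict[str, dict]:
    """Merge records sharing a tax/registration ID, preferring the most recent value per field."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec["tax_id"].strip().upper()].append(rec)   # simple key normalization

    golden = {}
    for tax_id, recs in grouped.items():
        recs.sort(key=lambda r: r["last_updated"])            # oldest first, newest wins
        merged = {}
        for rec in recs:
            merged.update({k: v for k, v in rec.items() if v not in (None, "")})
        golden[tax_id] = merged
    return golden

# Example: the billing contact from the CRM overrides a stale entry from the ERP
erp = {"tax_id": "gb123", "billing_contact": "old@acme.com", "last_updated": "2020-01-10"}
crm = {"tax_id": "GB123", "billing_contact": "ap@acme.com", "last_updated": "2021-03-02"}
print(unify_master_data([erp, crm])["GB123"]["billing_contact"])   # -> ap@acme.com
```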

WNS’ cognitive credit offering aims to take credit management beyond the periodic review of credit bureau reports and base its recommendations for credit eligibility on its own analyses of financial ratios, customer behavior (including any changes in payment pattern), and news triggers from external sources. WNS believes this approach to be particularly effective in addressing credit management within small businesses. The service incorporates technology from HighRadius and Emagia.

WNS’ digital contracts and smart orders modules utilize its Skense and Agilius platforms to combine and analyze data from various sources and integrate quotations, orders, contracts, and billing to reduce the errors that typically arise between disparate sources of information.

WNS’ Revenue Assurance and Analytics modules include some industry-specific modules to monitor and minimize revenue dilution, using analytics for improved collections and to reduce revenue losses arising from upstream process errors.

Quote-to-Sustain adoption plan

WNS has always approached F&A from a sector-specific viewpoint. Having developed all the modules within Quote-to-Sustain, WNS is now in the process of integrating this capability with its industry-specific processes in line with client demand. Powered by an exclusive partnership with EvoluteIQ, WNS’ domain-led hyperautomation platform suite is designed to accelerate the adoption of process automation and drive enterprise-wide digital transformation. The sectors WNS will focus on are the ones where it has already developed industry-specific expertise and IP and include airlines, travel agencies, trucking, shipping & logistics, insurance, telecom, media, CPG, manufacturing, and utilities.

Nonetheless, the initial clients of WNS Quote-to-Sustain have typically started by purchasing a single module such as cognitive credit & collections, and WNS expects a typical sequence of deployment to be cognitive credit & collections, followed by unified master data, followed by revenue assurance.

WNS has already applied its cognitive credit, touchless cash applications, botified queries, and analytics-as-a-service modules of Quote-to-Sustain for a media client. This company’s cash application process was automated but only achieved a 75% auto-match rate due to delays in the receipt of remittance advice notes. WNS deployed touchless cash apps via EIPP (electronic invoice presentment and payments) to achieve an auto-match rate of 88% and introduced intelligent chatbots and predictive disputes management to reduce the time for resolution significantly. The chatbots resolve most disputes without human intervention, with all trade promotion issues resolved through chatbots.
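To illustrate the auto-matching idea behind touchless cash application, the generic sketch below pairs incoming payments with open invoices on reference and amount and routes the residue for manual review; it is not the WNS or EIPP logic, and the field names and tolerance are assumptions.

```python
# Generic illustration of cash-application auto-matching: pair incoming payments with
# open invoices on invoice reference and amount; anything unmatched goes to a clerk.
# A simplified sketch, not the WNS/EIPP implementation.

def auto_match(payments: list[dict], open_invoices: dict[str, float], tolerance: float = 0.01):
    matched, exceptions = [], []
    for pay in payments:
        ref = pay.get("remittance_ref")
        if ref in open_invoices and abs(open_invoices[ref] - pay["amount"]) <= tolerance:
            matched.append((ref, pay["amount"]))
            del open_invoices[ref]          # close the invoice
        else:
            exceptions.append(pay)          # manual review, e.g. missing remittance advice
    return matched, exceptions

payments = [{"remittance_ref": "INV-1001", "amount": 2500.00},
            {"remittance_ref": None, "amount": 740.50}]
matched, exceptions = auto_match(payments, {"INV-1001": 2500.00, "INV-1002": 740.50})
auto_match_rate = len(matched) / (len(matched) + len(exceptions))   # 0.5 in this toy example
```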

Overall, the media company has achieved a potential $38m uplift in free cash flow by optimizing payments from late-paying customers and an 11% reduction in bad debts by improving late-stage collection.

In addition to this modular approach being taken with mid-sized organizations, WNS is also targeting start-ups, where the company is in discussion with some organizations for the entire suite of end-to-end services.

Conclusion

Many existing F&A operations have incorporated best-of-breed point solutions and subsequently applied RPA in support of point automations. However, these organizations are often still using disparate data sources and have not fully reimagined their F&A processes into an integrated framework using a single source of the truth and analytics for improved operational and business intelligence. WNS’ Quote-to-Sustain offering aims to provide this reimagined finance model and help organizations become more agile and analytical in their approach to order-to-cash.

]]>
<![CDATA[Infosys McCamish Life & Annuities Platforms Combining Depth of Policy Coverage & Breadth of Customer Experience]]>

 

For some time, life & annuities carriers have suffered from a multitude of legacy platforms, with each implemented to handle a particular style of product that was either not handled by its predecessor or was added through the acquisition of a set of blocks from another carrier. The resulting stable of platforms has always been expensive to maintain. In recent years, this has been compounded by the increasing importance of digital customer experience and the ability to launch new products quickly.

The pandemic has further emphasized these needs, with consumers increasingly moving online, the need for new types of insurance products, and the vast majority of companies increasing their digital transformation emphasis. While most of Infosys McCamish’s current pipeline is driven by mergers & acquisitions and by platform rationalization and optimization to modernize legacy environments, it has also onboarded new clients to provide end-to-end services for open blocks, becoming a viable partner for organizations looking to expand their new business pipeline.

Infosys McCamish Focuses on Client Interaction

Indeed, while life companies need to launch new products at speed, insurance product functionality is now increasingly taken as table stakes. Life & annuities producers are now much more focused on client interaction functionality. This includes the omni-channel experience and the ability to deliver a zero-touch digital engagement, incorporating, for example, machine learning to deliver straight-through processing and next-best actions.

In line with these requirements, Infosys McCamish has taken a 3-tiered approach:

  • For policy owners & agents: aiming for zero-touch, omnichannel, and responsive design via portals, e-delivery, e-sign, smart video, SMS notify, and chatbots
  • For operations: aiming for ‘one & done’ via the use of workflow, dashboards, content management, and document management
  • Core administration functionality: aiming for seamless integration via APIs together with an extensive product library.

Infosys McCamish’s preference is to convert policies from client legacy platforms to its own VPAS platform. Its conversion accelerator assesses data cleanliness and produces balance and control reports before moving the policy data to VPAS. Not all data is moved to VPAS, with data beyond the reinstatement period being moved to a separate source data repository. Infosys McCamish aims to have 13-24 months of re-processable data on its platform, converting all the history as it was processed on the original platform so that, in the future, it is possible to view exactly what happened on the prior platform.

VPAS supports a wide range of life & annuity products, including term life, traditional life, universal life, deferred annuities, immediate annuities, and flexible spending accounts, and Infosys McCamish estimates that on mapping a carrier’s current products with the current configurations in VPAS, there is typically around 97% full compatibility. VPAS currently supports ~40m policies across 22 distinct product families.

However, where necessary or where conversion for some policy types is impossible, it can also wrap its customer experience tools around legacy insurance platforms to provide a common and digital customer experience. Infosys McCamish platforms make extensive use of an API library that supports synchronous and asynchronous communication between Infosys McCamish systems and customer systems.

Incorporating “Smart Video” into the Customer Experience

Infosys McCamish has enhanced its customer experience to enable policy/contract owners to go beyond viewing policies online and transact in real-time, further introducing:

  • Mobile App and chatbot functionality
  • Smart video, which uses APIs to extract data and present it to the customer in video form
  • Wearables to support wellness products.

Customers can view their billing and premium information and obtain policy quotes online, with personalized smart video used to enhance the customer experience. They can also initiate policy surrenders online. Depending on carrier policy, surrenders to a certain value are handled automatically, with higher value surrenders being passed to a senior person for verification. Similarly, if a customer is seeking to extend their coverage online, the request is routed by the workflow to an underwriter or senior manager. DocuSign is used to facilitate the use of e-signatures rather than paper documents. All correspondence can be viewed online by customers, with AI-enabled web chat used to support customer queries.
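The value-based routing described above can be pictured as a simple rule; the hypothetical sketch below illustrates the pattern, with the threshold and role names invented, since the actual limits are set by carrier policy.

```python
# Hypothetical routing rule for online policy surrenders: below a carrier-defined
# threshold the request is processed automatically; above it, the workflow routes the
# case to a senior reviewer; coverage extensions go to an underwriter. Values and role
# names are illustrative only.

AUTO_APPROVE_LIMIT = 10_000.00   # assumed carrier-configured value

def route_surrender(surrender_value: float, extends_coverage: bool = False) -> str:
    if extends_coverage:
        return "underwriter"          # coverage increases always need underwriting review
    if surrender_value <= AUTO_APPROVE_LIMIT:
        return "auto_process"         # straight-through, no human touch
    return "senior_reviewer"          # higher-value surrenders need verification

print(route_surrender(4_500))         # -> auto_process
print(route_surrender(55_000))        # -> senior_reviewer
print(route_surrender(2_000, True))   # -> underwriter
```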

Digital adoption depends on carrier policy and is running at ~25%, with customers being prompted to use digital in all correspondence. Single-touch and no-touch processing account for ~75% of transactions.

Workflow & Dashboards Guiding Agents to Reduce Time to Onboard

Infosys McCamish has integrated BPM, workflow, and low-code development to support the back-office and call center service layers, providing operations with inbuilt automation to achieve higher levels of straight-through processing and fewer opportunities for manual errors. It incorporates business rules so that data is only keyed once, with, for example, relevant customer updates applied to one policy type being applied across all of a customer's policies.

The VPAS customer service work desk is built on Pega, with the workflow configured for contact center and back-office services and supporting the customer and agent self-service portals.

The agent dashboard is dynamic with the view shown based on the agent role, and the call center dashboard provides drill-downs on service requests by type, SLA performance details such as average handling times, and the full audit trails of each transaction.

The workflow also guides the call center agent through the steps in a transaction, provides scripting, and uses AI to recommend additional actions when communicating with a customer. This improves the quality of each interaction and significantly reduces the time taken to train new agents.

The above is supported by experience enablers underpinned by the data warehouse, which is updated in real-time as changes are made in the policy administration system. The data warehouse is accessed via APIs by Infosys Nia analytics or third-party tools such as PowerBI or Tableau.

Product Configuration Based on Cloning Existing Products

New products are typically created within the product management module by cloning an existing product or template and business rules; for example, customizing to add or remove certain features or coverages, rather than by creating new product features and functionality.

VPAS new business supports digital new business, including E-App and underwriting case management, and integrations with other new business platforms such as iPipeline and FireLight.

Agent Management & Compensation Increasingly Bundled with Product Administration

In addition to the VPAS life and annuity product administration system, Infosys McCamish’s life & annuity platforms also include PMACS, a producer management & compensation system, supporting agent onboarding, licensing, and commission management.

Infosys McCamish is experiencing a greater requirement for end-to-end capability, with PMACS increasingly being bundled with VPAS. The emphasis within PMACS has moved beyond commission management, where the system shows the agent how each commission was calculated, to agent onboarding, licensing, and appointments, allowing agents to view their pipelines and their client policy portfolios.

PMACS has also moved beyond supporting life & annuities and group & critical illness to support property & casualty.

Summary

Infosys McCamish is increasingly looking to assist life & annuities carriers in the adoption of modern digital platforms, and its VPAS ecosystem emphasizes:

  • Digitalization, with a high emphasis on separation of systems of engagement from systems of record
  • Componentization, to facilitate low impact enhancements and speed to market
  • Self-service for both customers and agents, in support of zero-touch processing and multi-channel access
  • Integrated BPM and workflow and low-code development in support of the back-office and call center service layers
  • Use of CI/CD to identify the impact that changes in one component will have on other software components.

John Willmott and Rachael Stormonth

]]>
<![CDATA[Capgemini's CIAP 2.0 Assists Enterprises in Rapid & Cost-Effective Scaling of Automation Initiatives]]>

 

Capgemini has just launched version 2 of the Capgemini Intelligent Automation Platform (CIAP) to offer organizations an enterprise-wide, AI-enabled approach to their automation initiatives across IT and business operations. In particular, CIAP offers:

  • Reduced TCO and increased resilience through use of shared third-party components
  • Support for AIOps and DevSecOps
  • A strong focus on problem elimination and functional health checks.

Reduced TCO & increased ability to scale through use of a common automation platform

A common problem with automation initiatives is their distributed nature across the enterprise, with multiple purchasing points and a diverse set of tools and governance, reducing overall ROI and the enterprise's ability to scale automation at speed.

Capgemini aims to address these issues through CIAP, a multi-tenanted cloud-based automation solution that can be used to deliver "automation on tap." It consists of an orchestration and governance platform and the UiPath intelligent automation platform. Each enterprise has a multi-tenanted orchestrator providing a framework for invoking APIs and client scripts together with dedicated bot libraries and a segregated instance of UiPath Studio. A central source of dashboards and analytics is built into the front-end command tower.

While UiPath is provided as an integral part of CIAP, CIAP also provides APIs to integrate other Intelligent Automation platforms with the CIAP orchestration platform, enabling enterprises to continue to optimize the value of their existing use cases.

The central orchestration feature within CIAP removes the need for a series of point solutions, allowing automations to be more end-to-end in scope and removing the need for integration by the client organization. For example, within CIAP, event monitoring can trigger ticket creation, which in turn can automatically trigger a remediation solution.
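The chaining described here (event to ticket to remediation) follows a common orchestration pattern; the sketch below is a generic illustration of that pattern, not CIAP's internals, and all names are invented.

```python
# Generic illustration of the orchestration pattern described above: a monitoring event
# creates a ticket, which automatically triggers a remediation runbook if one is
# registered. Not CIAP code; event and function names are illustrative.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Orchestrator:
    runbooks: dict[str, Callable[[dict], bool]] = field(default_factory=dict)
    tickets: list[dict] = field(default_factory=list)

    def on_event(self, event: dict) -> dict:
        ticket = {"id": len(self.tickets) + 1, "event": event, "status": "open"}
        self.tickets.append(ticket)                  # event monitoring -> ticket creation
        runbook = self.runbooks.get(event["type"])
        if runbook and runbook(event):               # ticket -> automated remediation
            ticket["status"] = "resolved_automatically"
        return ticket

def restart_service(event: dict) -> bool:
    print(f"restarting {event['service']} ...")
    return True

orch = Orchestrator(runbooks={"service_down": restart_service})
print(orch.on_event({"type": "service_down", "service": "pricing-api"}))
```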

Another benefit of this shared component approach is reducing TCO by improved sharing of licenses. The client no longer has to duplicate tool purchasing and dedicate components to individual automations; the platform and its toolset can be shared across each of infrastructure, applications, and business services departments within the enterprise.

CIAP is offered on a fixed-price subscription-based model based on "typical" usage levels, with additional charges only applicable where client volumes necessitate additional third-party execution licenses or storage beyond those already incorporated in the package.

Support for AIOps & DevSecOps

CIAP began life focused on application services, and the platform provides support for AIOps and DevSecOps as well as business services.

In particular, CIAP incorporates AIOps, using the client's application infrastructure logs for reactive and predictive resolutions. For reactive resolutions, the AIOps capability can identify the dependent infrastructure components and applications, identify the root cause, and apply any available automation.

CIAP also ingests logs and alerts and uses algorithms to correlate them, so that the resolver group only needs to address a smaller number of independent scenarios rather than each alert individually. The platform can also incorporate the enterprise's known error databases so that if an automated resolution does not exist, the platform can still recommend the most appropriate knowledge objects for use in resolution.
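A minimal sketch of the correlation idea follows, assuming a simple host-plus-time-window grouping key and an invented known-error database; Capgemini's actual algorithms are not disclosed here.

```python
# Illustrative sketch of alert correlation: group raw alerts sharing a host and time
# window into one scenario, and attach a known-error suggestion where no automated fix
# exists. The grouping key, window, and KEDB entries are assumptions.
from collections import defaultdict

KNOWN_ERRORS = {"disk_full": "KB-0042: archive logs, extend volume"}   # assumed KEDB entry

def correlate(alerts: list[dict], window_s: int = 300) -> list[dict]:
    buckets = defaultdict(list)
    for a in alerts:
        buckets[(a["host"], a["timestamp"] // window_s)].append(a)     # same host, same 5-min slot
    scenarios = []
    for (host, _), grouped in buckets.items():
        root = grouped[0]["type"]                                       # naive root-cause pick
        scenarios.append({
            "host": host,
            "alerts": len(grouped),
            "suspected_cause": root,
            "suggested_fix": KNOWN_ERRORS.get(root, "route to resolver group"),
        })
    return scenarios

alerts = [{"host": "db01", "timestamp": 1000, "type": "disk_full"},
          {"host": "db01", "timestamp": 1100, "type": "slow_query"}]
print(correlate(alerts))   # two raw alerts collapse into one scenario for db01
```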

Future enhancements include increased emphasis on proactive capacity planning, including proactive simulation of the impact of change in an estate and enhancing the platform's ability to predict a greater range of possible incidents in advance. Capgemini is also enhancing the range of development enablers within the platform to establish CIAP as a DevSecOps platform, supporting the life cycle from design capture through unit and regression testing, all the way to release within the platform, initially starting with the Java and .NET stacks.

A strong focus on problem elimination & functional health checks

Capgemini perceives that repetitive task automation is now well understood by organizations, and the emphasis is increasingly on using AI-based solutions to analyze data patterns and then trigger appropriate actions.

Accordingly, to extend the scope of automation beyond RPA, CIAP provides built-in problem management capability, with the platform using machine learning to analyze historical tickets to identify the causes of recurring problems and, in many cases, initiate remediation automatically. CIAP then aims to reduce the level of manual remediation on an ongoing basis by recommending emerging automation opportunities.

In addition to bots addressing incident and problem management, the platform also has a major emphasis within its bot store on sector-specific bots providing functional health checks for sectors including energy & utilities, manufacturing, financial services, telecoms, life sciences, and retail & CPG. One example in retail is where prices are copied from a central system to store PoS systems daily. However, unreported errors during this process, such as network downtime, can result in some items remaining incorrectly priced in a store PoS system. In response to this issue, Capgemini has developed a bot that compares the pricing between upstream and downstream systems at the end of each batch pricing update, alerting business users, and triggering remediation where discrepancies are identified. Finally, the bot checks that remediation was successful and updates the incident management tool to close the ticket.
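In the spirit of the retail example above, the sketch below shows a generic price-reconciliation check: compare central and store PoS prices after a batch update, alert on discrepancies, push a fix, and re-verify. It is an illustration only, not Capgemini's bot.

```python
# Illustrative price-reconciliation check: compare central (upstream) prices with store
# PoS (downstream) prices after a batch update, report discrepancies, and re-verify
# after remediation. All data is invented.

def find_price_mismatches(central: dict[str, float], store_pos: dict[str, float]) -> dict[str, tuple]:
    return {sku: (central[sku], store_pos.get(sku))
            for sku in central
            if store_pos.get(sku) != central[sku]}

central_prices = {"SKU-1": 4.99, "SKU-2": 12.49, "SKU-3": 0.99}
store_prices   = {"SKU-1": 4.99, "SKU-2": 11.99, "SKU-3": 0.99}    # SKU-2 missed the update

mismatches = find_price_mismatches(central_prices, store_prices)
if mismatches:
    print("alert business users:", mismatches)                                  # e.g. {'SKU-2': (12.49, 11.99)}
    store_prices.update({sku: new for sku, (new, _) in mismatches.items()})     # remediation push
    assert not find_price_mismatches(central_prices, store_prices)              # verify, then close the ticket
```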

Similarly, Capgemini has developed a validation script for the utilities sector, which identifies possible discrepancies in meter readings leading to revenue leakage and customer dissatisfaction. For the manufacturing sector, Capgemini has developed a bot that identifies orders that have gone on credit hold, and bots to assist manufacturers in shop floor capacity planning by analyzing equipment maintenance logs and manufacturing cycle times.

CIAP has ~200 bots currently built into the platform library.

A final advantage of platforms such as CIAP, beyond their libraries and cost benefits, is operational resilience: they provide orchestrated mechanisms for plugging in the latest technologies in a controlled and cost-effective manner while unplugging or phasing out previous generations of technology, which further enhances time to value. This is increasingly important to enterprises as their automation estates grow to take on widespread and strategic operational roles.

]]>
<![CDATA[Infosys BPM’s $1bn Milestone & Future Trajectory]]>

Infosys BPM has reached its 18th birthday, in many cultures the age of maturity, achieving a major milestone of $1bn in annual revenues. Infosys BPM today is a very different company from its birth in 2002 when it was set up as a JV in India with Citibank, and there have been some significant developments in the last couple of years.

We recently caught up with Infosys BPM CEO Anantha Radhakrishnan, and while he is intensely conscious that it is not a time for celebrations when a pandemic is raging, there is a clear confidence about Infosys BPM’s future trajectory. While not quite celebrating, there is a quiet pride about what the company has been doing to help fight the spread of the infection in the state of Karnataka.

Delivering effective COVID-19 programs

What has been achieved in Karnataka (population 64m+) in a contact, inform and track program in a very short time is quietly remarkable (it certainly appears so to me; my own country, a nation with a similarly sized population, has yet to introduce any such program). Karnataka is vulnerable to the virus coming into the state via international travellers flying into Bangalore and Mangalore airports. Its state government turned to Infosys BPM to help launch and manage a program to respond to this specific threat, as well as a second, broader program focusing on citizens across the state.

In the first program, Infosys BPM designed and managed a program to reach out digitally to all travellers coming into Karnataka from March 1 onwards, capture their relevant health data via an app, monitor their health for 14 days, provide a help number should they develop any COVID-19 type symptoms, and also advise on quarantine procedures.

In the second larger program, citizens have been encouraged via an extensive multimedia outreach program to log relevant health and non-health information on an app or helpline number. On the basis of the data they provide, they are given advice on appropriate action to take, with help being arranged in exceptional circumstances. The system integrates with the state’s own hospital and ambulance systems. And in the event of any infection hotspots, it can be used to send localized messages to citizens living in a particular cell phone tower or village. Infosys BPM helped define the outreach strategy, designed the ‘Apthamitra’ app (‘close friend’ in the local language), and assembled a consortium of nine BPS companies that have operations in the state to operate the inbound and outreach program. This activity illustrates the maturity of Infosys BPM in its ability to design and launch a major program from scratch.

In terms of transforming its service delivery operations during the pandemic, Infosys BPM has equipped all its centers outside India to reach 95% WFH enablement. China has returned to a hybrid model with about 70% office-based employees, 30% WFH. The India BPM operations are 75% WFH enabled (the 25% including personnel not yet in production). Radha referred to having received 300 emails from clients expressing their appreciation of Infosys going above and beyond to maintain service delivery.

$1bn milestone & beyond

Against the backdrop of COVID-19, Infosys got to the end of its fiscal year achieving its $1bn revenue target. This has been done through a combination of market-leading organic growth (nearly 17% CC growth in its FY20) and two interesting JVs set up in 2019, in both of which Infosys has a majority stake.

In Japan, Infosys has an 81% stake in Hitachi Procurement Service Co., Ltd. (HIPUS), which handles indirect materials purchasing functions for some Hitachi Group businesses in Japan. Also part of the JV are Panasonic and staffing company Pasona. Normally with JVs such as this, the primary focus is to commercialize and expand the operation. The size of the unit operated by Infosys is already significant and the immediate focus is slightly different. The initial priorities include:

  • Transforming the operations by bringing in modern thinking about procurement processing, including the application of RPA, AI and analytics to streamline operations and improve the UX of buyers
  • Expanding its coverage within the Hitachi Group, as well as offering indirect procurement services to Japanese-owned corporations, serving both their domestic and international needs.

The CPOs of Hitachi and of Panasonic are on the board of HIPUS, which will help in driving both of these priorities.

And in Europe, Infosys has a 75% stake in Stater, a mortgage administration services provider headquartered in the Netherlands; ABN AMRO, its largest client, retains a minority stake. As with HIPUS, there is an emphasis on digital transformation of the service (in this case, transformation of the whole mortgage and loan experience by leveraging dynamic workflow, API layers, RPA, analytics and AI), and of course Infosys will also continue to enhance Stater’s mortgage platform. In addition to developing a next-gen mortgage offering, the opportunities for growth in the JV lie in:

  • Further expansion of its service offerings, for example beyond those around mortgages to adjacent unsecured loan types, also in expanding its activities in supplementary services such as risk models for fraud prevention, leveraging Infosys’ analytics capabilities
  • Expansion of the client base. An obvious opportunity is expansion in Germany (though this remains a market where home ownership is relatively uncommon): for example, Deutsche Bank, with whom Infosys has a strong relationship, is a relatively small account for Stater.

These JVs are a significant expansion of Infosys BPM, one in back-office enterprise services, the other in an industry-specific offering. Infosys BPM has reached a point where its revenue mix is 60% from enterprise services and 40% from industry-specific services. Radhakrishnan’s ambition is for Infosys BPM to reach a roughly 50/50 split, with at least 30% of this being platform- or quasi-platform-based. In the U.S., Infosys BPM has a longstanding insurance platform business with McCamish, and it is also providing mortgage services to clients including one of the largest U.S. regional banks.

As well as these JVs, Infosys last year acquired a 1,400-person contact center in Northern Ireland that has clients in the telecoms, social media, healthcare, ed-tech, and fintech sectors. Its largest client is BT, also an Infosys client: here, BT benefits from the investment that Infosys BPM can make to transform this onshore service by applying digital technologies.

The year ahead for Infosys BPM

So, what might we expect from Infosys BPM over the next year? Every crisis presents its own opportunities, and while its scale is larger than anything we have seen in our lives, COVID-19 is no different. For example, there are going to be lots of BPS captives up for sale as enterprises look to raise cash, but interested vendors should be careful to select those that will enhance their capabilities. Infosys BPM is proactively pitching for opportunities for targeted captives of clients in the wider Infosys Group (there is plenty of scope: Radhakrishnan points out that Infosys Group has around 1,200 clients, mostly large enterprises, of whom just 200 are also Infosys BPM clients). It is possible, therefore, that Infosys BPM might be completing more structured deals in 2020, where it will look to modernize, simplify, and digitally transform the operation, possibly extending its capabilities into more sectors and/or expanding its capabilities in certain back-office areas.

As it reaches its $1bn revenue milestone, the mood at Infosys BPM is more appreciative than jubilant, but there is a quiet confidence and sense of purpose as it looks to take advantage of new opportunities.

]]>
<![CDATA[Moving to an Autonomous Supply Chain: Q&A with Capgemini’s Joerg Junghanns – Part 2]]>

Read Part 1 here.

 

Q&A Part 2

JW: What are the main supply chain flows that supply chain executives should look to address?

JJ: Traditionally, there are three main supply chain flows that benefit from automation:

  • Physical flow (the flow of goods, e.g., from a DC to a retailer; the most visible and tangible flow) – some flows are more obvious than others, such as parcels delivered to your door or raw materials arriving at a plant. To address these issues, the industry is getting ready (or is ready) to adopt drones, automated trucking, and automated guided vehicles (AGVs). But to achieve true end-to-end physical delivery, major infrastructure and regulatory changes are yet to happen to fully unleash the potential of physical automation in this field. In the short term, however, let’s not forget the critical paper flow associated with these flows of goods, such as a courier sending Bills of Lading to a given port on time for customs clearance and vessel departure, a procedure that often leads to unexpected delays
  • Financial flow (flow of money) – here the industry is adopting new technologies to palliate common issues, e.g., interbanking communication in support of letters of credit
  • Information flow (the flow of information connecting systems and stakeholders alike and ensuring that relevant data is shared, ideally in real-time, between, e.g., a supplier, a manufacturer, and its end customers) – this is the information you share via email/spreadsheets or through a platform connecting you with your ecosystem partners. This flow is also a perfect candidate for automation, starting with a platform to break silos or, for smaller transformations, with tactical RPA deployments. More ambitious firms will also want to look into blockchain solutions to, for instance, transparently access information about their suppliers and ensure that they are compliant (directly connecting to the blockchain containing information provided by a certification institution such as ISO). While the need for drones and automated trucking/shipping is largely contingent on infrastructure changes, regulations, and incremental discoveries, the financial and information flows have reached a degree of maturity at scale that has already been generating significant quantifiable benefits for years.

JW: Can you give me examples of where Capgemini has deployed elements of an autonomous supply chain?

JJ: Capgemini has developed capabilities to help our clients not only design but also run their services following best-practice methodologies blending optimal competencies, location mix, and processes powered by intelligent automation, analytics, and world-renowned platforms. We have helped clients transform their processes, and we have run them from our centers of excellence/delivery centers to maximize productivity.

Two examples spring to mind:

Touchless planning for an international FMCG company:

Our client had maxed out their forecasting capabilities using standard ERP-embedded forecasting modules. Capgemini leveraged our Demand Planning framework powered by intelligent automation and combined it with best-in-class machine learning platforms to increase the client’s forecasting accuracy and lower planning costs by over 25%; this company is now moving to a touchless planning function.
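For readers unfamiliar with how such gains are quantified, forecast accuracy is commonly reported as one minus the mean absolute percentage error (MAPE); the minimal sketch below illustrates the metric with invented numbers and is not Capgemini's Demand Planning framework.

```python
# Minimal illustration of the forecast-accuracy metric (1 - MAPE) typically used to
# quantify planning improvements. All figures are invented for illustration.

def forecast_accuracy(actuals: list[float], forecasts: list[float]) -> float:
    """Return 1 minus the mean absolute percentage error, as a fraction of 1."""
    ape = [abs(a - f) / a for a, f in zip(actuals, forecasts) if a]
    return 1 - sum(ape) / len(ape)

actual_demand = [1200, 950, 1100, 1300]
erp_forecast  = [1000, 1100, 900, 1500]    # baseline statistical forecast
ml_forecast   = [1150, 980, 1080, 1270]    # ML-assisted forecast

print(f"baseline accuracy: {forecast_accuracy(actual_demand, erp_forecast):.0%}")
print(f"ML accuracy:       {forecast_accuracy(actual_demand, ml_forecast):.0%}")
```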

Automated order validation and delivery note for an international chemical manufacturing company:

Our client was running fulfillment operations internally at a high operating cost and low productivity. Capgemini transformed the client’s operations and created a lean team in a cost-effective nearshore location. On top of this, we leveraged intelligent automation to create a touchless purchase/sales order to delivery note creation flow, checking that all required information is correct, and either raising exceptions or passing on the data further down the process to trigger the delivery of required goods.

JW: What are the key success factors for enterprises starting the journey to autonomous supply chains?

JJ: Moving to an autonomous supply chain is a major business and digital transformation, not a standalone technology play, and so corporate culture is highly important in terms of the enterprise being prepared to embrace significant change and disruption and to operate in an agile and dynamic manner.

To ensure business value, you also need a consistent and holistic methodology such as Capgemini’s Digital Global Enterprise Model, which combines Six Sigma-based optimization approaches with a five senses-driven automation model, a framework for the deployment of intelligent automation and analytics technology.

Also, a lot depends on the quality of the supply chain data. Enterprises need to get the data right and master their supply chain data because you can’t drive autonomy if the data is not readily available, up-to-date in real-time, consistent, and complete. Supply chain and logistics is not so much about moving physical goods; it's been about moving information for decades. A bit of automation here and there will not make your supply chain touchless and autonomous. It requires integration and consolidation first before you can aim for autonomy.

JW: And how should enterprises start to undertake the journey to autonomous supply chains?

JJ: The first step is to build the right level of skill and expertise within the supply chain personnel. Scaling too fast without considering the human factor will result in a massive mess and a dip in supply chain performance. Also, it is important to set a culture of continuous improvement and constant innovation, for example, by leveraging a digitally augmented workforce.

Secondly, the right approach is to make elements of the supply chain touchless. Autonomy will happen as a staged approach, not as a big bang. It’s a journey. Focus on high-impact areas first, enable quick wins, and start with prototyping. So, supply chain executives should identify those pockets of excellence that are close to being ready, or which can be made ready, to be made touchless, and where you can drive supply chain autonomy.

One approach to identifying the most appropriate initiatives is to plot them against two axes: the y-axis being the effort to get there and the x-axis being the impact that can be achieved. This will help identify pockets of value that can be addressed relatively quickly, harvesting some quick wins first. As you progress down this journey, further technologies may mature that allow you to address the last pieces of the puzzle and get to an extensively autonomous supply chain.
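A trivial way to operationalize the effort/impact plot described above is to score each initiative and surface the low-effort, high-impact quick wins first; the sketch below uses invented initiatives and thresholds purely for illustration.

```python
# Simple illustration of effort/impact prioritization: score each candidate initiative
# and pick the low-effort, high-impact "quick wins" first. Names and scores are invented.

initiatives = [
    {"name": "touchless order validation", "effort": 2, "impact": 8},
    {"name": "automated carrier booking",  "effort": 6, "impact": 7},
    {"name": "drone last-mile delivery",   "effort": 9, "impact": 6},
]

def quick_wins(items, max_effort=4, min_impact=6):
    return [i["name"] for i in items if i["effort"] <= max_effort and i["impact"] >= min_impact]

def ranked(items):
    return sorted(items, key=lambda i: i["impact"] - i["effort"], reverse=True)

print(quick_wins(initiatives))                   # -> ['touchless order validation']
print([i["name"] for i in ranked(initiatives)])  # highest net value first
```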

JW: Which technologies should supply chain executives be considering to underpin their autonomous supply chains in the future?

JJ: Beyond fundamental technologies such as RPA, machine learning has considerable potential to help, for example, in demand planning to increase accuracy, and in fulfillment to connect interaction and decision-making.

Technologies now exist that can, for example, recognize and interpret the text in an email, automatically respond, and send all the information required for order processing: populating orders automatically, validating the order against inventory, and prioritizing delivery according to corporate rules – all without human intervention. This can potentially be extended further with automated carrier bookings against rules. Of course, this largely applies to the “happy flows” at the moment, but there are also proven practices to increase the proportion of “happy orders”.
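A minimal sketch of such a “happy flow” follows, assuming a toy email format, inventory table, and priority rule; it is illustrative only and not a description of any specific product.

```python
# Sketch of the "happy flow" described above: parse an order request from an email,
# validate it against inventory, and prioritize delivery by a corporate rule.
# The regex, rules, and data are illustrative assumptions.
import re

INVENTORY = {"WIDGET-A": 500, "WIDGET-B": 0}
PRIORITY_CUSTOMERS = {"ACME"}

def process_order_email(body: str, customer: str) -> dict:
    m = re.search(r"(\d+)\s*x\s*([A-Z]+-[A-Z])", body)        # e.g. "250 x WIDGET-A"
    if not m:
        return {"status": "exception", "reason": "could not parse order"}
    qty, sku = int(m.group(1)), m.group(2)
    if INVENTORY.get(sku, 0) < qty:
        return {"status": "exception", "reason": f"insufficient stock for {sku}"}
    INVENTORY[sku] -= qty
    return {"status": "auto_confirmed", "sku": sku, "qty": qty,
            "priority": "express" if customer in PRIORITY_CUSTOMERS else "standard"}

print(process_order_email("Please ship 250 x WIDGET-A to our plant.", "ACME"))
print(process_order_email("Need 10 x WIDGET-B asap", "Globex"))   # falls out of the happy flow
```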

The level of autonomy in supply chain fulfillment can also be increased by using analytics to monitor supply chain fulfillment and predict potential exceptions and problems, then either automating mitigation or proposing next-best actions to supply chain decision-makers.

This is only the beginning, as AI and blockchain still have a long way to go to reach their potential. Companies that harness their power now and are prepared to scale will be the ones coming out on top.

JW: Thank you, Joerg. I’m sure our readers will find considerable food for thought here as they plan and undertake their journeys to autonomous supply chains.

 

Read Part 1 here.

]]>
<![CDATA[Moving to an Autonomous Supply Chain: Q&A with Capgemini’s Joerg Junghanns – Part 1]]>

 

Introduction

Supply chain management is an area currently facing considerable pressure and is a key target for transformation. NelsonHall research shows that less than a third of supply chain executives in major enterprises are highly satisfied with, for example, their demand forecasting accuracy and their logistics planning and optimization, and that the majority perceive there to be considerable scope to reduce the levels of manual touchpoints and hand-offs within their supply chain processes as they look to move to more autonomous supply chains.

Accordingly, NelsonHall research shows that 86% of supply chain executives consider the transformation of their supply chains over the next two years to be highly important. This typically involves a redesign of the supply chain to maximize available data sources to deliver more efficient workflow and goods handling, improving connectivity within the supply chain to enable more real-time decision-making, and improving the competitive edge with better decision-making tools, analytics, and data sources supporting optimized storage and transport services.

Key supply chain transformation characteristics critical for driving supply chain autonomy that are sought by the majority of supply chain executives include supply chain standardization, end-to-end visibility of supply chain performance, ability to predict, sense, and adjust in real-time, and closed-loop adaptive planning across functions.

At the KPI level, there are particularly high expectations of improved demand forecasting accuracy and logistics planning and optimization, leading to higher levels of fulfillment reliability, and of enhanced risk identification, leading to operational cost and working capital reduction.

So, overall, supply chain executives are typically seeking a reduction in supply chain costs, more effective supply chain processes and organization, and improved service levels.

 

Q&A Part 1

JW: Joerg, to what extent do you see existing supply chains under pressure?

JJ: From a manufacturer looking for increased supply chain resilience and lower costs to a B2C end consumer obsessed with speed, visibility, and aftersales services, supply chains are now under great pressure to transform and adapt themselves to remain competitive in an increasingly demanding and volatile environment.

Supply chain pressure results from increasing levels of supply chain complexity, higher customer expectations, a more volatile environment (e.g., trade wars, Brexit), difficulty in managing costs, and lack of visibility. In particular, global trade has been in a constant state of exception since 2009, creating a need to increase supply chain resilience via increased agility and flexibility. In sectors such as fast-moving consumer goods and even automotive, hyper-personalization can mean a lot size of one, from procurement all the way through production and fulfillment. At the same time, supply chains are no longer simple “chains” but have talent, financial, and physical flows all intertwined in a DNA-like spiral, resulting in a (supply chain) ecosystem of high complexity. All this is often compounded by the low level of transparency caused by manual processes. In response, enterprises need to start the journey to autonomous supply chains. However, many supply chains are still not digitized, so there’s a lot of homework to be done before introducing digitalization and autonomous supply chains.

JW: What do you understand by the term “autonomous supply chain”?

JJ: The end game in an “autonomous supply chain” is a supply chain that operates without human intervention. Just imagine a parcel reaching your home without any human intervention having been needed to fulfill your order. How much of this is fiction and how much reality?

Well, while some of this certainly depends on major investments and regulatory changes – in areas such as drones delivering parcels over your neighborhood, or automated trucks crisscrossing the country with nobody behind the steering wheel – major steps in lowering costs and improving customer satisfaction can already be taken using current technologies. Recent surveys show that only a quarter of supply chain leaders perceive that they have reached a satisfactory automation level, leveraging the most innovative end-to-end solutions currently available.

JW: What benefits can companies expect from the implementation of an “autonomous supply chain”?

JJ: Our observations and experience link autonomous supply chains to:

  • Lower costs – it is no surprise that supply chain automation already helps to lower costs (and will do so even more in the future), combining FTE savings and lower exception handling costs with productivity and quality gains
  • Improved customer satisfaction – as a customer you may ask, why should I care that the processes leading to the delivery of my products are “no touch” and required hardly any human intervention? Well, you will when your products are delivered faster and your experience from order to delivery is transparent and seamless, with no tedious phone calls to locate your product(s) and no complaints about delivery or invoicing errors!
  • Increased revenue – as companies process more, faster, with fewer handling and processing errors along the way, they create added value for their customers and benefit from capacity gains that eventually affect their top line, particularly when operational savings are passed on as lower delivery/product prices, thus allowing for a healthy combination of margin and revenue increase.

We have seen that automation can do far more than simply cut costs and that there are many ways to implement automation at scale without relying on infrastructure/regulation changes (e.g., drones) – for example, by leveraging a digitally augmented workforce. Companies have been launching proofs of concept (POCs) but often struggle to reap the true benefits due to talent shortages, siloed processes, and a lack of a long-term holistic vision.

JW: What hurdles do organizations need to overcome to achieve an autonomous supply chain?

JJ: We have observed that companies often face the following hurdles when trying to create a more autonomous supply chain:

  • Lack of visibility and transparency – due to 1) outdated process flows, and 2) siloed information systems often requiring email-based information exchange (non-standardized spreadsheets and flat files sent back and forth)
  • Lack of agility (impacting the overall resilience of the supply chain) – the inability to execute on insights due to slow information velocity and rigidity in processes, which are often focused on functions as opposed to value-added processes cutting across the organization
  • Lack of the right talent – difficulty in finding talent in a very competitive industry with new technologies making typical supply chain profiles less relevant and new digital profiles often costly to train and hard to retain
  • Lack of centralization and consolidation – leading to high costs, poor productivity, and disjointed technology landscapes, often unable to scale across the organization due to a lack of a holistic transformation approach and proper governance.

One thing that many companies have in common is a lack of ability to deploy automation solutions at scale, cost-effectively. Too often, these projects remain at a POC stage and are parked until a new POC (often technology-driven) comes along and yet again fails to scale properly due to high costs, lack of resources, and lack of strategic vision tied to business outcomes.

 

In Part 2 of the interview, Joerg Junghanns discusses the supply chain flows that benefit from automation, describes client case examples, and highlights the success factors, adoption approach, and key technologies behind autonomous supply chains.

]]>
<![CDATA[IPsoft Looks to Reduce Time to Value While Increasing Return on AI]]>

 

NelsonHall recently attended the IPsoft Digital Workforce Summit in New York and its analyst events in NY and London. For organizations unfamiliar with IPsoft, the company has around 2,300 employees, approximately 70% of these based in the U.S. and 20% in Europe. Europe is responsible for approximately 30% of the IPsoft client base, with clients relatively evenly distributed over the six regions: U.K., Spain & Iberia, France, Benelux, Nordics, and Central Europe.

The company began life with the development of autonomics for ITSM in the form of IPcenter, and in 2014 launched the first version of its Amelia conversational agent. In 2018, the company launched 1Desk, effectively combining its cognitive and autonomic capabilities.

The events outlined IPsoft’s positioning and plans for the future, with the company:

  • Investing strongly in Amelia to enhance its contextual understanding and maintain its differentiation from “chatbots”
  • Launching “Co-pilot” to remove the currently strong demarcation between automated and agent interactions
  • Building use cases and a partner program to boost adoption and sales
  • Positioning 1Desk and its associated industry solutions as end-to-end intelligent automation solutions, and the key to the industry and the future of IPsoft.

Enhancing Contextual Understanding to Maintain Amelia’s Differentiation from Chatbots

Amelia has often suffered from being seen at first glance as "just another chatbot". Nonetheless, IPsoft continues to position Amelia as “your digital companion for a better customer service” and to invest heavily to maintain Amelia’s lead in functionality as a cognitive agent. Here, IPsoft is looking to differentiate by stressing Amelia’s contextual awareness and ability to switch contexts within a conversation, thereby “offering the capability to have a natural conversation with an AI platform that really understands you.”

Amelia goes through six pathways in sequence within a conversation to understand each utterance, and the pathway with the highest probability wins (a simplified sketch follows the list below). The pathways are:

  • Intent model
  • Semantic FAQ
  • AIML
  • Social talk
  • Acknowledge
  • Don’t know.
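
A minimal sketch of this “highest probability wins” selection is shown below; the scoring functions are crude placeholders and this is not IPsoft's actual implementation:

```python
# Minimal sketch of pathway selection: each pathway scores the utterance,
# and the highest-probability pathway handles it. Scores are placeholders.
def score_intent(utterance):       return 0.82 if "reset" in utterance else 0.10
def score_semantic_faq(utterance): return 0.40
def score_aiml(utterance):         return 0.20
def score_social_talk(utterance):  return 0.65 if "hello" in utterance else 0.05
def score_acknowledge(utterance):  return 0.15
def score_dont_know(utterance):    return 0.30  # fallback floor

PATHWAYS = {
    "intent_model": score_intent,
    "semantic_faq": score_semantic_faq,
    "aiml": score_aiml,
    "social_talk": score_social_talk,
    "acknowledge": score_acknowledge,
    "dont_know": score_dont_know,
}

def select_pathway(utterance: str) -> str:
    scores = {name: fn(utterance) for name, fn in PATHWAYS.items()}
    return max(scores, key=scores.get)

print(select_pathway("please reset my password"))  # -> intent_model
print(select_pathway("hello there"))               # -> social_talk
```

In a production system the probabilities would, of course, come from trained models rather than keyword checks, with the “don’t know” pathway acting as the fallback.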

The platform also separates “entities” from “intents”, capturing both of these using Natural Language Understanding. Both intent and entity recognition are specific to the language used, though IPsoft is now simplifying implementation further by making processes language-independent and removing the need for the client to implement channel-specific syntax.

A key element in supporting more natural conversations is the use of stochastic business process networks, which means that Amelia can identify the required information as it is provided by the user, rather than having to ask for and accept items of information in a particular sequence as would be the case in a traditional chatbot implementation.
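
One way to picture this order-independent slot filling is sketched below – the slot names, toy regex extractor, and dialogue turns are hypothetical, and a real system (including Amelia's stochastic business process networks) would use NLU rather than regexes:

```python
# Minimal sketch of order-independent slot filling: required entities are
# captured whenever the user happens to provide them, and the dialogue only
# asks for what is still missing. Entity extraction is stubbed out with regexes.
import re

REQUIRED_SLOTS = {"origin", "destination", "travel_date"}

def extract_entities(utterance: str) -> dict:
    """Toy extractor: real systems would use NLU, not regexes."""
    found = {}
    if m := re.search(r"from (\w+)", utterance):
        found["origin"] = m.group(1)
    if m := re.search(r"to (\w+)", utterance):
        found["destination"] = m.group(1)
    if m := re.search(r"on (\d{4}-\d{2}-\d{2})", utterance):
        found["travel_date"] = m.group(1)
    return found

slots = {}
for turn in ["I want a flight on 2019-06-01", "from London to Paris please"]:
    slots.update(extract_entities(turn))
    missing = REQUIRED_SLOTS - slots.keys()
    print(f"after '{turn}': missing {missing or 'nothing - proceed'}")
```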

Context switching is also supported within a single conversation, with users able to switch between domains, e.g. from IT support to HR support and back again in a single conversation, subject to the rules on context switching defined by the organization.

Indeed, IPsoft has always had a strong academic and R&D focus and is currently further enhancing and differentiating Amelia through:

  • Leveraging ELMo with the aim of achieving intent accuracy of >95% while using only half of the data required in other Deep Neural Net models
  • Using NLG to support Elaborate Question Asking (EQA) and Clarifying Question & Answer (CQA) to enable Amelia to follow up dynamically without the need to build business rules.

The company is also looking to incorporate sentiment analysis within voice. While IPsoft regards basic speech-to-text and text-to-speech as commodity technologies, the company is looking to capture sentiment analysis from voice, differentiate through use of SLM/SRGS technology, and improve Amelia’s emotional intelligence by capturing aspects of mood and personality.

Launching Co-pilot to Remove the Demarcation Between Automated Handling and Agent Handling

Traditionally, interactions have either been handled by Amelia or by an agent if Amelia failed to identify the intent or detected issues in the conversation. However, IPsoft is now looking to remove this strong demarcation between chats handled solely by Amelia and chats handled solely by (or handed off in their entirety to) agents. The company has just launched “Co-pilot”, positioned as a platform to allow hybrid levels of automation and collaboration between Amelia, agents, supervisors, and coaches. The platform is currently in beta mode with a major telco and a bank.

The idea is to train Amelia on everything that an agent does to make hand-offs warmer and to increase Amelia’s ability to automate partially, and ultimately handle, edge cases rather than just pass these through to an agent in their original form. Amelia will learn by observing agent interactions when escalations occur and through reinforcement learning via annotations during chat.

When Amelia escalates to an agent using Co-pilot, it will no longer just pass conversation details but will now also offer suggested responses for the agent to select. These responses are automatically generated by crowdsourcing every utterance that every agent has created and then picking those that apply to the particular context, with digital coaches editing the language and content of the preferred responses as necessary.
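
A simple way to picture this kind of context-matched response suggestion is sketched below, using token overlap in place of the NLP a production system would use; the example queries and responses are invented and this is not IPsoft's implementation:

```python
# Minimal sketch: suggest agent responses by similarity between the current
# query and previously observed agent utterances. Uses simple token overlap
# (Jaccard) in place of real natural language similarity; data is made up.
past_responses = [
    ("how do I reset my password", "You can reset it at the self-service portal."),
    ("my invoice amount looks wrong", "Let me check that invoice for you now."),
    ("when will my order arrive", "Your order is due within 3 working days."),
]

def jaccard(a: str, b: str) -> float:
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def suggest(query: str, top_n: int = 2):
    ranked = sorted(past_responses, key=lambda p: jaccard(query, p[0]), reverse=True)
    return [resp for _, resp in ranked[:top_n]]

print(suggest("I think my invoice is wrong"))
```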

In the short term, this assists the agent by providing context and potential responses to queries and, in the longer term as this process repeats over queries of the same type, Amelia then learns the correct answers, and ultimately this becomes a new Amelia skill.

Co-pilot is still at an early stage with lots of developments to come and, during 2019, the Co-pilot functionality will be enhanced to recommend responses based on natural language similarity, enable modification of responses by the agent prior to sending, and enable agents to trigger partial automated conversations.

This increased co-working between humans and digital chat agents is key to the future of Amelia since it starts to position Amelia as an integral part of the future contact center journey rather than as a standalone automation tool.

Building Use Cases & Partner Program to Reduce Time to Value

Traditionally, Amelia has been a great cognitive chat technology, but a relatively heavy-duty technology seeking a use case rather than an easily implemented, general-purpose tool like the majority of RPA products.

In response, IPsoft is treading the same path as the majority of automation vendors and is looking to encourage organizations (well at least mid-sized organizations) to hire a “digital worker” rather than build their own. The company estimates that its digital marketplace “1Store” already contains 672 digital workers, which incorporate back-office automation in addition to the Amelia conversational AI interface. For example, for HR, 1Store offers “digital workers” with the following “skills”: absence manager, benefits manager, development manager, onboarding specialist, performance record manager, recruiting specialist, talent management specialist, time & attendance manager, travel & expense manager, and workforce manager.

At the same time, IPsoft is looking to increase the proportion of sales and service through channel partners. Product sales currently make up 56% of IPsoft revenue, with 44% from services. However, the company is looking to steer this ratio further towards product, targeting 60% per annum growth in product sales and increasing the proportion of personnel in product-related positions (currently approx. two-thirds), partly by reskilling existing services personnel.

IPsoft has been late to implement its partner strategy relative to other automation software vendors, attributing this early caution in part to the complexity of early implementations of Amelia. Early partners for IPcenter included IBM and NTT DATA, who embedded IPsoft products directly within their own outsourcing services and were supported with “special release overlays” by IPsoft to ensure lack of disruption during product and service upgrades. This type of embedded solution partnership is now increasingly likely to expand to the major CX services vendors as these contact center outsourcers look to assist their clients in their automation strategies.

So, while direct sales still far outweigh partner sales, IPsoft is now recruiting a partner/channel sales team with a view to reversing this pattern over the next few years. IPsoft has now established a partner program targeting alliance and advisory partners (early partners here included major consultancies such as Deloitte and PwC), as well as implementation, solution, OEM, and education partners.

1Desk-based End-to-End Automation is the Future for IPsoft

IPsoft has about 600 clients, including approx. 160 standalone Amelia clients, and about a dozen deployments of 1Desk. However, 1Desk is the fastest-growing part of the IPsoft business with 176 enterprises in the pipeline for 1Desk implementations, and IPsoft increasingly regards the various 1Desk solutions as its future.

IPsoft is positioning 1Desk by increasingly talking about ROAI (the return on AI) and suggesting that organizations can achieve 35% ROAI (rather than the current 6%) if they adopt integrated end-to-end automation and bypass intermediary systems such as ticketing systems.

Accordingly, IPsoft is now offering end-to-end intelligent automation capability by combining the Amelia cognitive agent with “an autonomic backbone” courtesy of IPsoft’s IPcenter heritage and with its own RPA technology (1RPA) to form 1Desk.

1Desk, in its initial form, is largely aimed at internal SSC functions including ITSM, HR, and F&A. However, over the next year, it will increasingly be tailored to provide solutions for specific industries. The intent is to enable about 70% of the solution to be implemented “out of the box”, with vanilla implementations taking weeks rather than many months and with completely new skills taking approx. three months to deploy.

The initial industry solution from IPsoft is 1Bank. As the name implies, 1Bank has been developed as a conversational banking agent for retail banking and contains preformed solutions/skills covering the account representative, e.g. for support with payments & bills; the mortgage processor; the credit card processor; and the personal banker, to answer questions about products, services, and accounts.

1Bank will be followed during 2019 by solutions for healthcare, telecoms, and travel.

]]>
<![CDATA[Blue Prism Offers A Lever for Culture Change to Mature Enterprises]]> Blue Prism adopted the theme “Connected RPA – Powering the Connected Entrepreneur Enterprise” at its recent Blue Prism World conferences, the key components of connected-RPA being the Blue Prism connected-RPA platform, Blue Prism Digital Exchange, Blue Prism Skills, and Blue Prism Communities:

 

Components of Blue Prism's connected-RPA

 

Blue Prism is positioning itself by offering mature companies the promise of closing the gap with digital disruptors, both technically and culturally. The cultural aspect is important, with Blue Prism technology positioned as a lever to help organizations attract and inspire their workforce, give digitally-savvy entrepreneurial employees the technology to close the “digital entrepreneur gap”, and also close the gap between senior executives and the workforce.

Within this vision, the Blue Prism roadmap is based around helping organizations to:

  • Automate more – here, Blue Prism is introducing intelligent automation skills, ML-based process discovery, and DX
  • Automate better – with more expansive and scalable automations
  • Automate together – by learning from the mistakes and achievements of others.

Introducing intelligent document processing capability

When analyzing the interactions on its Digital Exchange (DX), Blue Prism unsurprisingly found that the single biggest use, accounting for 60% of the items downloaded from DX, was related to unstructured document processing.

Accordingly, Blue Prism has just announced a beta intelligent document processing program, Decipher. Decipher is positioned as an easy on-ramp to document processing and is a document processing workflow that can be used to ingest & classify unstructured documents. It can be used “out-of-the-box” without the need to purchase additional licenses or products, and organizations can also incorporate their own document capture technologies, such as Abbyy, or document capture services companies within the Decipher framework.

Decipher will clean documents to ensure that they are ready for processing, apply machine learning to classify the documents, and then extract the data. Finally, it will apply a confidence score to the validity of the data extracted and pass the results to a business user where necessary, incorporating human-in-the-loop assisted learning.
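
A minimal sketch of such a confidence-gated, human-in-the-loop document flow might look like the following – the functions, threshold, and data are placeholders, not Blue Prism's Decipher APIs:

```python
# Minimal sketch of a Decipher-style flow: clean, classify, extract, then
# route low-confidence results to a human for validation. All functions are
# simple placeholders, not Blue Prism APIs.
CONFIDENCE_THRESHOLD = 0.85

def clean(document):
    return document.strip()

def classify(document):
    # A real implementation would call a trained ML classifier here.
    return ("invoice", 0.92) if "invoice" in document.lower() else ("unknown", 0.40)

def extract_fields(document):
    return {"total": "1,250.00", "currency": "GBP"}  # placeholder extraction

def process(document):
    doc = clean(document)
    doc_type, confidence = classify(doc)
    result = {"type": doc_type, "confidence": confidence}
    if confidence < CONFIDENCE_THRESHOLD:
        # Human-in-the-loop: the validated answer can feed assisted learning.
        result["route"] = "human_review"
    else:
        result.update(extract_fields(doc))
        result["route"] = "straight_through"
    return result

print(process("  INVOICE #1001 ... total due 1,250.00 GBP  "))
print(process("  Handwritten note, unclear contents  "))
```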

Accordingly, Decipher is viewed by Blue Prism as a first step in the increasingly important move beyond rule-based RPA to introduce machine learning-based human-in-the-loop capability. Not surprisingly, Blue Prism recognizes that, as machine learning becomes more important, people will need to be brought into the loop much more than at present to validate “low-confidence” decisions and to provide assisted learning to the machine learning.

Decipher is starting with invoice processing and will then expand to handle other document types.

Improving control of assets within Digital Exchange (DX)

The Digital Exchange (DX) is another vital component in Blue Prism’s vision of connected-RPA.

Enhancements planned for DX include making it easier for organizations to collaborate and share knowledge, and facilitating greater security and control of assets by enabling an organization to control the assets available to itself. Assets will be able to be marked as private, effectively providing an enterprise-specific version of the Blue Prism Digital Exchange. Within DX, there will also be a “skills” drag-and-drop toolbar so that users, and not just partners, will be able to publish skills.

Blue Prism, like Automation Anywhere, is also looking to bring an e-commerce flavor to its DX: developers will be able to create skills and then sell them. Initially, Blue Prism will build some artifacts itself. Others will be offered free of charge by partners in the short term, with a view to enabling partners to monetize their assets in the near term.

Re-aligning architecture & introducing AI-related skills

Blue Prism has been working closely with cloud vendors to re-align its architecture, and in particular to rework its UI to appeal to a broader range of users and make Blue Prism more accessible to business users.

Blue Prism is also improving its underlying architecture to make it more scalable as well as more cloud-friendly. There will be a new, more native and automated means of controlling bots via a browser interface available on mobiles and tablets that will show the health of the environment in terms of meeting SLAs, and provide notifications showing where interventions are required. Blue Prism views this as a key step in moving towards provision of a fully autonomous digital workforce that manages itself.

Data gateways (available on April 30, 2019 in v6.5) are also being introduced to make Blue Prism more flexible in its use of generated data. Organizations will be able to take data from the Blue Prism platform and feed it into machine learning models, reporting tools, etc.

However, Blue Prism will continue to use commodity AI and is looking to expand the universe of technologies available to organizations and bring them into the Blue Prism platform without the necessity for lots of coding. This is being done via continuing to expand the number of Blue Prism partners and by introducing the concept of Blue Prism skills.

At Blue Prism World, the company announced five new partners:

  • Bizagi, for process documentation and modeling, connecting with both on-premise and cloud-based RPA
  • Hitachi ID Systems, for enhanced identity and access management
  • RPA Supervisor, an added layer of monitoring & control
  • Systran, providing digital workers with translation into 50 languages
  • Winshuttle, for facilitating transfer of data with SAP.

At the same time, the company announced six AI-related skills:

  • Knowledge & insight
  • Learning
  • Visual perception: OCR technologies and computer vision
  • Problem-solving
  • Collaboration: human interaction and human-in-the-loop
  • Planning & sequencing.

Going forward

Blue Prism recognizes that while the majority of users presenting at its conferences may still be focused on introducing rule-based processes (and on a show of hands, a surprisingly high proportion of attendees were only just starting their RPA journeys), the company now needs to take major strides in making automation scalable, and in more directly embracing machine learning and analytics.

The company has been slightly slow to move in this direction, but launched Blue Prism labs last year to look at the future of the digital worker, and the labs are working on addressing the need for:

  • More advanced process analytics and process discovery
  • More inventive and comprehensive use of machine learning (though the company will principally continue to partner for specialized use cases)
  • Introduction of real-time analytics directly into business processes.
]]>
<![CDATA[Automation Anywhere Monetizes Bot Store to Provide ‘Value as a 2-Way Street’]]> Automation Anywhere’s current Bot Store contains ~500 bots and has received ~40K downloads. In January 2019, these bots were complemented by Digital Workers, with bots being task-centric and Digital Workers being persona- and skill-centric.

 

 

So far, downloads from the Bot Store have been free-of-charge, but Automation Anywhere perceives that this approach potentially limits the value achievable from the Bot Store. Accordingly, the company is now introducing monetization to provide value back to developers contributing bots and Digital Workers to the Bot Store, and to increase the value that clients can receive. In effect, Automation Anywhere is looking to provide value as a two-way street.

The timing for introducing monetization to the Bot Store will be as follows:

  • April 16, 2019: announcement and start of sales process validation with a small number of bots and bot bundles priced within the Bot Store. Examples of “bot bundles” include a set of bots for handling common email operations in Outlook, or bots for handling common Excel operations
  • May 2019: Availability of best practice guides for developers containing guidelines on how to write bots that are modular and easy to onboard. Start of developer sign-up
  • Early summer 2019: customer launch through the direct sales channel. At this stage, bots and Digital Workers will only be available through the formal direct sales quotation process rather than via credit card purchases
  • Late summer 2019: launch of “consumer model” and Bot Store credit card payments.

Pricing, initially in US$ only, will be per bot or Digital Worker, with a 70:30 revenue split between the developer and Automation Anywhere; Automation Anywhere will handle the billing and pay the developer monthly. Buyers will have a limited free trial period, initially 30 days but under review, and IP protection is being introduced so that buyers will not have access to the source code. The original developer will retain responsibility for building, supporting, maintaining, and updating their bots and Digital Workers. Automation Anywhere is developing some Digital Workers itself in order to seed the Bot Store with examples, but has no desire to develop Digital Workers itself in the medium term and may, once the concept is well-proven, hand over/license the Digital Workers it has developed to third-party developers.
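
As a simple arithmetic illustration of how the split works (the bot price and download volume below are hypothetical; only the 70:30 developer/Automation Anywhere ratio and monthly payout come from the announcement):

```python
# Illustrative arithmetic only: the bot price and download count are
# hypothetical; only the 70:30 developer/Automation Anywhere split is
# taken from the announcement.
annual_price_per_bot = 1_000      # hypothetical US$ price set by the developer
paid_downloads_in_month = 20      # hypothetical

gross = annual_price_per_bot * paid_downloads_in_month
developer_share = gross * 0.70    # paid out monthly by Automation Anywhere
platform_share = gross * 0.30

print(f"Gross: ${gross:,.0f}  Developer: ${developer_share:,.0f}  AA: ${platform_share:,.0f}")
```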

Automation Anywhere clearly expects that a number of smaller systems integrators will switch their primary business model from professional services to a product model, building bots for the Bot Store, and is offering developers the promise of a recurring revenue stream and global distribution, not only through the Bot Store but ultimately also through Automation Anywhere and its partners. Although payment will be monthly, developers will receive real-time transaction reporting to assist them in their financial management. For professional services firms retaining a strong professional services focus but used to operating on a project basis, Automation Anywhere perceives that licensing and updating Digital Workers within this model could provide both a supplementary revenue stream and, possibly more importantly, a means to maintain an ongoing relationship with the client organization.

In addition to systems integrators, Automation Anywhere is targeting ISVs who, like Workday, can use the Bot Store and Automation Anywhere to facilitate deployment and operation of their software by introducing Digital Workers that go way beyond simple connectors. Although the primary motivation of these firms is likely to be to reduce the time to value for their own products, Automation Anywhere expects ISVs to be cognizant of the cost of adoption and to price their Digital Workers at levels that will provide both a reduced cost of adoption to the client and a worthwhile revenue stream to the ISV. Pricing of Digital Workers in the range $800 to as high as $12k-$15K per annum has been mentioned.

So far, inter-enterprise bot libraries have largely been about providing basic building blocks that are commonly used across a wide range of processes. The individual bots have typically required little or no maintenance and have been disposable in nature. Automation Anywhere is now looking to transform the concept of bot libraries into that of bot marketplaces, delivering a much higher, and longer-lived, value-add and putting bots on a similar footing to temporary staff with updateable skills.

The company is also aiming to gain a lead in the development of such bots and, preferably, Digital Workers by providing third parties with the financial incentive to develop for its own, rather than a rival, platform.

]]>
<![CDATA[Automation Anywhere Looking to 'Deliver the Digital Workforce for Everyone']]> Automation Anywhere, as with the RPA market in general, continues to grow rapidly. The company estimates that it now has 1,600 enterprise clients, encompassing 3,800 unique business entities across 90 countries with ~10,000 processes deployed. At end 2018, the company had 1,400 employees, and it expects to have 3,000 employees by end 2019.

The company was initially slow to go to market in Europe relative to Blue Prism and UiPath, but estimates it has more than tripled its number of customers in Europe in the past 12 months.

NelsonHall attended the recent Automation Anywhere conference in Europe, where the theme of the event was “Delivering Digital Workforce for Everyone” with the following sub-themes:

  • Automate Everything
  • Adopted by Everyone
  • Available Everywhere.

Automate Everything

Automation Anywhere is positioning as “the only multi-product vendor”, though it is debatable whether this is entirely true and also whether it is desirable to position the various components of intelligent automation as separate products.

Nonetheless, Automation Anywhere is clearly correct in stating that, “work begins with data (structured and unstructured) – then comes analysis to get insight – then decisions are made (rule-based or cognitive) – which leads to actions – and then the cycle repeats”.

Accordingly, “an Intelligent RPA platform is a requirement. AI cannot be an afterthought. It has to be part of current processes” and so Automation Anywhere comes to the following conclusion:

Intelligent digital workforce = RPA (attended + unattended) + AI + Analytics

Translated into the Automation Anywhere product range, this becomes:

 

 

Adopted by Everyone

Automation Anywhere clearly sees the current RPA market as a land grab and is working hard to scale adoption fast, both within existing clients and to new clients, and for each role within the organization.

The company has traditionally focused on the enterprise market with organizations such as AT&T, ANZ, and Bank of Columbia using 1,000s of bots. For these companies, transformation is just beginning as they now look to move beyond traditional RPA, and Automation Anywhere is working to include AI and analytics to meet their needs. However, Automation Anywhere is now targeting all sizes of organization and sees much of its future growth coming from the mid-market (“automation has to work for all sizes of organization”) and so is looking to facilitate adoption here by introducing a cloud version and a Bot Store.

The company sees reduced “time to value” as key to scaling adoption. In addition to a Bot Store of preconfigured bots, the company has now introduced the concept of downloadable “Digital Workers” designed around personas, e.g. Digital SAP Accounts Payable Clerk. Automation Anywhere had 14 Digital Workers available from its Bot Store as at mid-March 2019. These go beyond traditional preconfigured bots and include pretrained cognitive capability that can process unstructured data relevant to the specific process, e.g. accounts payable.

In addition, Automation Anywhere believes that to automate at the enterprise-wide level you have to onboard your workforce very fast, so that you can involve more of the workforce sooner. Accordingly, the company is providing role-based in-product learning and interfaces.

To enable the various types of user to ramp up quickly, the coming version of Automation Anywhere will provide a customizable user interface to support the differing requirements of the business, IT, and developers, providing unique views for each. For example:

  • The business user interface can be set up with a customized tutorial on how to build a simple bot using a Visio-like graphical interface, with the advanced functionality hidden when business users start using the tool. Alternatively, the business user can use the recorder to create a visual representation of what needs to be done, including documenting cycle times and savings information, etc., then passing this requirement to a developer
  • Advanced developers, on the other hand, can be set up with advanced functionality including, for example, the ability to embed their own code in, say, Python
  • An IT user can learn about and manage user management, including roles and privileges, and license management.

The Automation Anywhere University remains key to adoption for all types of user. Overall, Automation Anywhere estimates that it has trained ~100K personnel. The Automation Anywhere University has:

  • An association with 200 educational institutions
  • 26 training partners
  • 9 role-based learning tracks
  • 120 certified trainers
  • Availability in 4 course languages.

An increased emphasis on channel sales is also an important element in increasing adoption, with Automation Anywhere looking to increase the proportion of sales through partners from 50% to 70%. The direct sales organization consists of 13 field operating units broken down into pods, and this sales force will be encouraged to leverage partners with a “customer first/partner preferred” approach.

Partner categories include:

  • BPOs with embedded use of Automation Anywhere, and Automation Anywhere is now introducing tools that will facilitate support for managed service offerings
  • Global alliance partners (major consultancies and systems integrators)
  • The broader integrator community/local SIs
  • A distributor channel. Automation Anywhere is currently opening up a volume channel and has appointed distributors including TechData and ECS
  • Private Equity. Automation Anywhere has set up a PE practice to go after the more deterministic PEs who are very prescriptive with their portfolio companies.

In addition, Automation Anywhere is now starting to target ISVs. The company has a significant partnership with Workday to help the ISV automate implementation and reduce implementation times by, for example, assisting in data migration, and the company is hoping that this model can be implemented widely across ISVs.

Automation Anywhere is also working on a partner enablement platform, again seen as a requisite for achieving scale, incorporating training, community+, etc. together with a demand generation platform.

Customer success is also key to scaling. Here, Automation Anywhere claims a current NPS of 67 and a goal to exceed the NPS of 72 achieved by Apple. With that in mind, Automation Anywhere has created a customer success team of 250 personnel, expected to grow to 600+ as the team tries to stay ahead of customer acquisition in its hiring. All functions within Automation Anywhere get their feedback solely through this channel, and all feedback to clients is through this channel. In addition, the sole aim of this organization is to increase the adoptability of the product and the organization’s NPS. The customer success team does not get involved in up-selling, cross-selling, or deal closure.

Available Everywhere

“Available Everywhere” encompasses both a technological and a geographic perspective. From a hosting perspective, Automation Anywhere is now available on cloud or on-premise, with the company clearly favoring cloud where its clients are willing to adopt this technology. In particular, the company sees cloud hosting as key to facilitating its move from the enterprise to increasingly address mid-market organizations.

At the same time, Automation Anywhere has “taken installation away” with the platform, whether on-premise or on cloud, now able to be accessed via a browser. The complete cloud version “Intelligent Automation Cloud” is aimed at allowing organizations to start their RPA journey in ~4 minutes, while considerably reducing TCO.

 

 

In terms of languages, the user interface is now available in eight languages (including French, German, Japanese, Spanish, Chinese, and Korean) and will adjust automatically to the location selected by the user. At the same time, the platform can process documents in 190 languages.

Automation Anywhere also provides a mobile application for bot management.

Summary

In summary, Automation Anywhere regards the keys to winning a dominant market share in the growth phase of the RPA market as being about simultaneously facilitating rapid adoption in its traditional large enterprise market and moving to the mid-market and SMEs at speed.

The company is facilitating ongoing RPA scaling in large enterprises by recognizing the differing requirements of business users, IT, and developers, and establishing separate UIs to increase their acceptance of the platform while increasingly supporting their need to incorporate machine learning and analytics as their use cases become more sophisticated. For the smaller organization, Automation Anywhere has facilitated adoption by introducing free trials, a cloud version to minimize any infrastructure hurdles, and a Bot Store to reduce development time and time to value.

]]>
<![CDATA[The Move to B2B Platforms: Q&A with Manuel Sevilla, Capgemini CDO]]>

 

Platforms have been increasingly important in B2C digital transformation in recent years and have been used to disintermediate and create a whole raft of well-known consumer business opportunities. B2B platforms have been less evident during this period outside the obvious ecosystems built up in the IT arena by the major cloud and software companies. However, with blockchain now emerging to complement the increasing power of cognitive and automation technologies, the B2B platform is now once again on the agenda of major corporations.

One IT services vendor assisting corporations in establishing B2B platforms to reimagine certain of their business processes is Capgemini, where B2B platform development is a major initiative alongside smart automation. In this interview, NelsonHall CEO John Willmott talks with Manuel Sevilla, Capgemini’s Chief Digital Officer, about the company’s B2B platform initiatives.

 

JW: Manuel, welcome. As Chief Digital Officer of Capgemini, what do you regard as your main goals in 2019?

MS: I have two main goals:

  • First, automation. We’re looking to automate all our clients’ businesses in a smart way, transforming their services using combinations of RPA, AI, and use of APIs to move their processes to smart automation
  • Second, to build B2B platforms that enable customers to explore new business models. I see this as a key development in the industry over the next few years, fueled by the need for third-party involvement in establishing peer-to-peer blockchain-based B2B platforms.

JW: What do you see as the keys to success in building a B2B platform?

MS: The investment required to establish a B2B platform is significant by nature and has to be viewed as long-term. This investment is required across the following three areas:

  • Obviously, building the platform requires a significant investment since, in a B2B environment, the platform must have the ability to scale and have a sufficient number of functionalities to provide enough value to the customers
  • Governance is critical to provide mechanisms for establishing direction and priorities in both the short and long-term
  • Building the ecosystem is absolutely critical for widespread platform adoption and maintaining the platform’s longevity.

JW: How do the ecosystem requirements differ for a B2B platform as opposed to a B2C platform?

MS: B2B and B2C are very different. In B2C environments, a partial solution is often sufficient for consumers to start using it. In B2B, corporates will not use a partial platform. For example, for corporates to input their private data, the platform has to be fully secured. Also, it is important to bring a service that delivers enough value either by simplifying and reducing process costs or by providing access to new markets, or both. For example, a B2B supply chain platform with a single auto manufacturer will undoubtedly fail. The big components suppliers will only join a platform that provides access to a range of auto manufacturers, not a separate platform for each manufacturer.

Building the ecosystem is perhaps the most difficult task when creating a B2B platform. The value of Capgemini is that the company is neutral and can take the lead in driving the initiatives to make the platform happen. Capgemini recognizes humbly that for a platform to scale, it needs not only a diverse range of partners but also that Capgemini cannot be the only provider; it is critical to involve Capgemini’s partners and competitors.

JW: How does governance differ for a B2B platform?

MS: In a fast-moving B2B environment, defining the governance has to proceed alongside building the ecosystem, and it is essential to have processes in place for taking decisions regarding the platform roadmap in both the short and long-term.

B2B platform governance is not the usual two-way client/vendor governance; it is much more complex. For a B2B platform, you need to have a clear definition of who is a member and how members take decisions. It then needs enough large corporates as founder members to drive initial functionalities and to ensure that the platform will bring value and will be able to scale. Once the platform has critical mass, then the governance mechanism needs to adapt itself to support the future scaling of the platform, often with an accompanying dilution of the influence of the founder members.

The governance for a B2B platform often involves creating a separate legal entity, which can be a consortium, a foundation, or even multiple legal entities.

JW: Can you give me an example of where Capgemini is currently developing a B2B platform?

MS: Capgemini is currently developing four B2B platforms, including one with the R3 consortium to build a B2B platform called KYC Trust that aims to solve the corporate KYC problem between corporates and banks. Capgemini started work on KYC Trust in early 2016 and it is expected to go into scaled production in the next 12-24 months.

JW: What is the corporate KYC problem and how is Capgemini addressing this?

MS: Corporate KYC starts with the data collection process, with, at present, each bank typically asking the corporate several hundred questions. As each bank typically asks its own unique questions, this creates a substantial workload for the corporate across banks. Typically, it takes a month to collect the information for each bank. Then, once a bank has collected the information on the corporate, it needs to check it, which means paying third-parties to validate the data. The bank then typically uses an algorithm to score the acceptability of the corporate as a customer. This process needs to be repeated regularly. Also, the corporate typically has to wait, say, 30 days for its account to be opened.

To simplify and speed up this process, Capgemini is now building the KYC Trust B2B platform. This platform incorporates a standard KYC taxonomy to remove redundancy from, and standardize, data requests and submission, and each corporate will store the documents required for KYC in its own nodes on the platform. Based on the requests received from banks, a corporate can then decide which documents will be shown to whom and when. All these transactions will be traceable in blockchain so that the usage of each document can be tracked in terms of which bank accessed it and when.
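
A minimal sketch of the kind of permissioned, auditable document access described here is shown below; the data structures are hypothetical, and a plain list stands in for the blockchain audit trail – this is not Capgemini's or R3's actual implementation:

```python
# Minimal sketch: a corporate grants banks access to specific KYC documents,
# and every access request is recorded in an append-only log (standing in for
# the blockchain audit trail). Names and structures are hypothetical.
from datetime import datetime, timezone

permissions = {("AnnualReport2018.pdf", "Bank A"), ("OwnershipChart.pdf", "Bank A")}
access_log = []  # append-only record of who requested what, and when

def request_document(document: str, bank: str) -> str:
    allowed = (document, bank) in permissions
    access_log.append({
        "document": document,
        "bank": bank,
        "granted": allowed,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return f"{bank} {'received' if allowed else 'was refused'} {document}"

print(request_document("AnnualReport2018.pdf", "Bank A"))
print(request_document("AnnualReport2018.pdf", "Bank B"))
print(f"{len(access_log)} access events recorded for the corporate to review")
```

The append-only log is what would let the corporate see which bank accessed which document and when, as described above.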

The advantage for a bank in onboarding a new corporate using this platform is that a significant proportion of the information required from a corporate will already exist, having already been supplied to another bank. The benefits to corporates include reducing the effort in submitting information and in being able to identify which information has been used by which bank and when, where, and how.

This will speed up the KYC process and simplify data collection operations. It will also simplify how corporates manage their own data such as shareholder information and information on new beneficial owners.

JW: How does governance work in the case of KYC Trust?

MS: A foundation will be established in support of the governance of KYC Trust. The governance has two main elements:

  • Establishing the basic rules, in particular, defining how a node can be operated and specifying the applications that can be run on top of the platform to create questionnaires and how the platform will integrate with banks’ own KYC platforms
  • Providing the means for corporates to submit information, enabling the mixing of data from multiple countries while respecting local regulations. This includes splitting the information submission between the various legal entities of each corporation with data potentially only hosted locally for each legal entity.

Key principles of the foundation are respect for openness and interoperability, since there cannot be a single B2B platform that meets all the business needs. In order to build scale, it is important to encourage interoperability with other B2B platforms, such as (in this case) the Global Legal Entity Identifier Foundation (GLEIF), to maximize the usefulness and adoption of the platform.

JW: How generally applicable is the approach that Capgemini has taken to developing KYC Trust?

MS: There are a lot of commonalities. Sharing of documents in support of certification & commitments is the first step in many business processes. This lends itself to a common solution that can be applied across processes and industries. Capgemini is building a structure that would allow platforms to be built in support of a wide range of B2B processes. For example, the structure used within KYC Trust could be used to support various processes within supply chain management. Starting with sourcing, it could be used to ensure, for example, that no children are being employed in a factory by asking the factory to submit a document certified by an NGO to this effect every six months. Further along the supply chain, it could also be used, for example, to support the correct use of clinical products sold by pharmaceutical companies.

And across all four B2B platforms currently being developed by Capgemini, the company is incorporating interoperability, openness, and a taxonomy as standard features.

JW: Thank you Manuel, and good luck. The emergence of B2B platforms will be a key development over the next few years as organizations seek to reimagine and digitalize their supply chains, and I look forward to hearing more about these B2B platform initiatives as they mature.

]]>
<![CDATA[BearingPoint Looks to Evolve Advisory Model Under New Managing Partner]]>

 

NelsonHall recently attended BearingPoint’s analyst event in Lisbon. As it starts its second decade with a new Managing Partner (Kiumars ‘Kiu’ Hamidian, only the second in the company’s history), the strategy that has served BearingPoint well in its first ten years is now evolving in ways that reflect significant developments in the nature of the consulting market.

In its first decade as a company since the 2009 MBO, BearingPoint has been something of a success story in the European management and IT consulting market, achieving sustained topline growth supported by geographic expansion, and steady improvement of its EBIT margin. 2017 revenues were up 13% to €712m, with growth in all geographies and service lines, and the firm is well on its way to achieving its targeted €1bn revenues by 2020.

Key elements of strategy

Elements of BearingPoint’s strategy in recent years that remain key pillars going forward include:

  • The ‘One Firm’ mindset, with a common set of offerings and consistency of delivery methodologies across geographies
  • The focus on clients headquartered in Europe, achieving a ‘global reach’ to be able to support them in projects outside Europe through an alliance ecosystem (West Monroe Partners in the U.S., ABeam Consulting in Asia, Grupo ASSA in LATAM)
  • The business model, comprising:
    • Strategy, made up of four service lines: digital & strategy, finance & regulatory, operations, IT advisory
    • Solutions: the Solutions unit, launched in 2015, has three product lines: IP in regulatory technology, in particular fintech (e.g. its Abacus suite); advanced analytics; and digital platform solutions for the CSP and entertainment sectors (based on Infonova R6, now offered on AWS)
    • Ventures, a more recent capability; e.g. an investment in Norwegian insure-tech start-up Tribe in April 2017. Also includes employee ventures, typically coming from its ‘Be an Innovator’ initiative, and client ventures, emanating from consulting projects with start-ups
  • Selective acquisitions, for example in 2017 of retail supply-chain specialist LCP Consulting in the U.K., and an automotive consulting unit in Italy
  • An increasing emphasis in recent years on innovation, e.g. the introduction of the ‘Be an Innovator’ process and of shark tank events.

Forward-looking priorities

While BearingPoint’s next five-year plan has yet to be finalized, Hamidian outlined four priorities in the following dimensions:

  • Markets
  • Portfolio
  • People
  • Culture.

Markets

BearingPoint is looking to build up capabilities in several European countries, including the U.K. (where the practice is relatively small, focusing on sectors such as financial services) and the Netherlands. In terms of headcount, BearingPoint remains very focused on Germany and France, and has product units in Austria (ex-Infonova) and Switzerland (Abacus): the ambition is to have a minimum of 300 people in each of the major European markets. Outside Europe, BearingPoint is also looking to work with its partners to expand its presence in the U.S. and China, and in Singapore, where it has a joint hub with ABeam Consulting focusing on IP-based reg-tech projects.

Portfolio

There is a very clear drive to shift from the classic process redesign work of traditional consultancy services and focus much more strongly with clients on projects that leverage IP assets, and are more transformational in nature (for example, looking at new business models). The role of the Solutions unit is critical in this. Since January, the unit has had its own P&L and regional managers, encouraging, inter alia, entrepreneurialism in both product development and GTM.

In addition to some well-established assets around reg-tech (for which it is best known), the unit has also developed IP such as its Factory Navigator, which simulates production and logistics processes; LOG 360 vehicle emissions calculation, built on SAP HANA; and Active Manager, used for coaching and training front-line managers, e.g. in call centers, to be more active/effective. All are SaaS-based offerings. One of the clients presenting to whom we spoke is a very strong advocate of Active Manager, having implemented it at a major telco and subsequently introduced it in his next role in a different sector.

Expect to see further developments in the portfolio, including industry-specific solutions. But the strategic element lies in the intersection between Solutions and Consulting – the aim is for consulting projects, and also managed services, increasingly to have embedded IP.

As well as its own IP, BearingPoint is increasingly looking to position around its ability to orchestrate an ecosystem of technology partner alliances: having started with Salesforce (now a Platinum partner), the emphasis has expanded to RPA, AI, and emerging technologies such as blockchain. The last two years have seen a large increase in the number of technology partnerships, and more are to be expected.

The role of the Ventures unit is also important here. While BearingPoint also refers to employee ventures, most coming from its ‘Be an Innovator’ initiative, and to client ventures, emanating from consulting projects with start-ups, the primary focus is on market ventures. It is working with incubators such as LeVillage in Paris and weXelerate in Vienna (see our 2017 blog here) and hosting events like the BearingPoint Insurance Dialog in Cologne that offer speed dating opportunities for early stage start-ups. A recent investment was in Insignary, a South Korean startup with a binary level open source software (OSS) security and compliance scanning solution, BearingPoint’s first investment in an Asian start-up. BearingPoint is leveraging Insignary’s Clarity solution to offer a managed SAST (static apps security testing) binary scanning service in Europe.

The expansion of IP-based services is a key element of BearingPoint’s Digital & Strategy (D&S) offering, which we note has new leadership.

People

BearingPoint’s new Managing Partner has spoken repeatedly about his desire for the firm to provide a very positive employee experience, an important element in both the recruitment and retention of younger talent. Other priorities he has expressed include increasing the firm’s diversity, of generation as well as of gender (one target is 20% female Partners by 2020), and talent development. We do not know the age or experience profile of BearingPoint personnel, but we do detect a desire to have a workforce that is perhaps more balanced in terms of age and experience, and a slight shift away from a traditional consultancy profile.

We also note an evolution in leadership style, with a stronger emphasis on transparency and communication: several personnel mentioned in conversation that Hamidian encourages colleagues to email him and is responsive when they do.

Culture

As part of its ambition to change the nature of much of its consulting work beyond operating model improvement to projects that have more radical transformation in mind, BearingPoint is looking (like many consulting and IT services firms) to nurture a culture where entrepreneurialism and innovation are encouraged (for example through initiatives such as shark tank events), and overall to become a more agile organization.

Hamidian is also looking to develop partners’ management and team leadership skills through initiatives such as new partner training programs.

Summary

In its first decade since the MBO, BearingPoint has succeeded in putting in place a strong foundation as an integrated European consulting firm that can claim, through its strategic partnerships, to have a more global reach. The next five years will be marked, not by global expansion, but by an evolution in positioning, with an increasing emphasis on services that leverage its own and partners’ IP to assist clients in their digital transformation, potentially also boosting margins. Expect to see more partnership announcements around IP-based offerings; shortly after the event, for example, BearingPoint announced that its regtech product unit and IBM are partnering to offer a BPO service around regulatory reporting to smaller institutions in the DACH region.

Expect also to see an increase in tuck-in acquisitions of small firms operating in its target geographies (including the U.K.) that bring in industry domain and/or specialist capabilities. Again, shortly after the event, BearingPoint announced its acquisition of Inpuls, which brings in capabilities in data governance and analytics and also doubles its headcount in Belgium.

As a final note, there were several aspects of the analyst day that stood out from other vendor events we have attended recently:

  • The total absence of PowerPoint presentations, with a heavy focus instead on clients telling their stories and describing how BearingPoint has supported them
  • The level of female representation (roughly 50% of the speakers). An all-too-common experience is that the only female speakers at analyst and advisory events are those from clients. Large organizations in Europe and the U.S. are increasingly demanding a level of female representation from suppliers bidding for work in certain areas of professional services, and for a variety of reasons, lack of gender diversity in the talent mix will increasingly be an impediment in IT and consulting services. The level of female representation was doubtless a deliberate move; gender diversity is clearly a high priority.
]]>
<![CDATA[Atos to Leverage New Aegon Contract to Challenge for U.K. Life & Pensions Closed Books]]>

 

In 2016, Atos was awarded a 13-year life & pensions BPO contract by Aegon, taking over from the incumbent Serco and involving the transfer of ~300 people in a center in Lytham St Annes.

The services provided by Atos within this contract include managing end-to-end operations, from initial underwriting through to claims processing, for Aegon's individual protection offering, which comprises life assurance, critical illness, disability, and income protection products (and for which Aegon has 500k customers).

Alongside this deal, Aegon was separately evaluating the options for its closed book life & pensions activity and subsequently went to market to outsource its U.K. closed book business covering 1.4m customers across a range of group and individual policy types. The result was an additional 15-year deal with Atos, signed recently.

Three elements were important factors in the award of this new contract to Atos:

  • Transfer of the existing Aegon personnel
  • Ability to replatform the policies
  • Implementation of customer-centric operational excellence.

Leveraging Edinburgh-Based Delivery to Offer Onshore L&P BPS Service

The transfer of the existing Aegon personnel and maintaining their presence in Edinburgh was of high importance to Aegon, the union, and the Scottish government. The circa 800 transferred personnel will continue to be housed at the existing site when transfer takes place in summer 2019, with Atos sub-leasing part of Aegon’s premises. Retaining this onshore delivery model works for Atos since this is the company’s first closed book life contract and it is looking to win additional deals in this space over the next few years (going to market with an onshore rather than offshore-based proposition).

Partnering with Sapiens to Offer Platform-Based Service

While (unlike some other providers of L&P BPS services) Atos does not own its own life platform, the company does recognize that platform-based services are the future of closed book L&P BPS. Accordingly, the company has partnered with Sapiens, and the Sapiens insurance platform will be used as a common platform and integrated with Pega BPM across both Aegon’s protection and closed book policies.

Atos has undertaken to transfer all of the closed block policies from Aegon’s two existing insurance platforms to Sapiens, and these will be transferred over the 24-month period following service transfer. The new Sapiens-based system will be hosted and maintained by Atos.

Aiming for Customer-Centric Operational Excellence

The third consideration is a commitment by Atos to implement customer-centric operational excellence. While Aegon had already begun to measure customer NPS and assess ways of improving the service, Atos has now begun to extend to this contract the customer journey mapping techniques deployed in its Lytham center to identify customer effort and pain points. Use of the Sapiens platform will enable customer self-service and omni-channel service, while this and further automation will be used to facilitate the role of the agent and increase the number of policies handled per FTE.

The contract is priced using the fairly traditional pricing mechanisms of a transition and conversion charge (£130m over a 3-year period) followed by a price per policy, with Atos aiming for efficiency savings of up to £30m per annum across the policy book.
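To make the commercial mechanics concrete, here is a minimal sketch of how such a pricing structure translates into an annual charge. The per-policy rate is not disclosed, so the figure used below is purely hypothetical.

```python
# Illustrative sketch of the pricing mechanics described above.
# The per-policy rate is NOT disclosed; the value below is a placeholder.

TRANSITION_CHARGE = 130_000_000        # £130m transition & conversion charge
TRANSITION_YEARS = 3                   # spread over a 3-year period
POLICYHOLDERS = 1_400_000              # 1.4m closed book customers
HYPOTHETICAL_PRICE_PER_POLICY = 25.0   # £ per policy p.a. -- assumed, not from the contract

def annual_charge(year: int) -> float:
    """Approximate charge in a given contract year (1-indexed)."""
    transition = TRANSITION_CHARGE / TRANSITION_YEARS if year <= TRANSITION_YEARS else 0.0
    return transition + POLICYHOLDERS * HYPOTHETICAL_PRICE_PER_POLICY

for yr in (1, 4):
    print(f"Year {yr}: ~£{annual_charge(yr) / 1e6:.1f}m")
```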

Atos perceives that this service will become the foundation for a growing closed block L&P BPS business, with Atos challenging the incumbents such as TCS Diligenta, Capita, and HCL. Edinburgh will become Atos’ center of excellence for closed book L&P BPS, with Atos looking to differentiate from existing service providers by offering an onshore-based alternative with the digital platform and infrastructure developed as part of the Aegon relationship, offered on a multi-client basis. Accordingly, Atos will be increasingly targeting life & pensions companies, both first-time outsourcers and those with contracts coming up for renewal, as it seeks to build its U.K. closed book L&P BPS business.

]]>
<![CDATA[Capgemini: “In Shape and On the Move”]]>

 

We have delayed the publication of this event note until after Capgemini’s Capital Markets Day this week, when Capgemini confirmed its mid-term ambitions of 5-7% organic growth and an operating margin of 12.5-13%.

“In Shape and On the Move” were among the first few words of COO Thierry Delaporte’s closing address at Capgemini’s global analyst and advisor meet in NYC, and this claim nicely sums up what is happening at Capgemini in 2018. Let’s take a closer look.

Examining the claim

In shape

The company is in relatively good shape. In 2017 Capgemini achieved a recovery in its North American operations and delivered organic topline growth of 3.6%, an adjusted operating margin of 11.7% (its third consecutive year at over 10%) and FCF of €1,080m. Guidance for 2018 includes >7.5% CC growth, of which ~1.8% inorganic, and an adjusted operating margin of 12.5-13%. This is solid execution, especially for a European headquartered IT services major.

On the move

Having celebrated its 50th anniversary last October, Capgemini is also very evidently on the move: it has been expanding its capabilities in Digital and Cloud, including making several acquisitions around experience design, as well as making progress in renovating some of its traditional services (e.g. around agile development). It is also getting to grips with long-term challenges in presenting a unified face to the client and in gaining access to CXOs other than the CIO to position for opportunities in supporting clients with their digital transformation (where any growth in IT spend is coming from). The reorganization taking place at Capgemini today, impacting both go-to-market and portfolio management, is perhaps the most radical and most significant in its history. Certainly, it is one that is required for Capgemini to be able to position on its slogan of ‘A Leader for Leaders’.

Overview of organizational changes

In attending the event, we were keen to check whether the new group organization that had been publicly announced a few days before is as fundamental a reengineering, particularly around portfolio, as it sounds (or perhaps just an application of lipstick). After many conversations with Capgemini folk, we came away convinced. So, here’s a brief overview of the organizational changes.

Firstly, around go-to-market

Historically, Capgemini’s decentralized structure meant, on occasion, a lack of coordination at the account level outside the very largest strategic accounts (which have been covered for some years via dedicated account managers or country boards); the group has been seeking to address this for some time. Back in 2016, it looked as if Capgemini might start replicating the success of its Financial Services SBU – which has expanded from application services to selling the full Capgemini portfolio – to other verticals. And with the development of the portfolio around Digital Manufacturing, it looked as if this might be the case with manufacturing, rather than, as we expected in 2016, retail.

What Capgemini has opted for is more realistic for the group. There is more sector relevance, without a wholescale group-wide verticalization. FS remains a purely vertical SBU. Elsewhere, each major geographical SBU (North America & APAC; EMEA) is now aligned by sector in the go-to-market. This means the taxonomy of sector offerings is now globally standardized; also, Capgemini’s model has moved from parallel P&L structures to a more unified GTM at the account level. For smaller accounts, the GTM is by vertical within a service line.

So, there is both increased account centricity and some increased sectorial focus.

Secondly, around portfolio management

This is perhaps the more remarkable aspect of Capgemini’s restructuring. There are seven current priorities across the portfolio, which Capgemini classifies in three groups:

  • ‘Rejuvenating core IT’ (still a major part of the business; Capgemini claims ~45% of its business is in Digital and Cloud):
    • Next gen AMS
    • Digital core (S/4 HANA, ERP to cloud, intelligent process automation in BPO)
  • ‘Reinforcing high growth offers’ (with an increasing sectorial dimension around some of these):
    • Digital CX
    • Cloud
    • Cyber (will be boosted in Q4 by the Leidos commercial sector acquisition)
  • ‘The New’:
    • Digital manufacturing
    • AI & analytics.

We assume emerging technologies such as blockchain and AR/VR are either being subsumed within areas such as Digital Manufacturing or will in time appear as separate priorities within ‘The New’.

Capgemini is changing the way it is working. There are now five global business lines, with effect from July 1:

  • Capgemini Invent, comprised of Capgemini Consulting and a series of recent acquisitions: LiquidHub, Fahrenheit 212, Idean, Adaptive Lab (also Backelite, acquired back in 2011)
  • Engineering & Manufacturing Services, comprised of different units, including Sogeti in France and the U.S., IGATE’s engineering services unit, and Digital Manufacturing Services
  • Business Services
  • Cloud & Infrastructure Services
  • Insights & Data.

We would expect the service line reporting to change to reflect this in 2019.

The first two of the five global business lines are brand new practices.

The two geographical application services capabilities held within APPS.1 and APPS.2 remain as local practices – clearly this was too big a pill to swallow currently.          

Across all of these, Capgemini is, unsurprisingly, looking to inject an innovation agenda, e.g. injecting automation/AI tools and analytics into traditional IT services and BPO, and building scale and reusable solutions in the newer areas.

One minor distinction is that there is not a central unit with responsibility for developing AI models: the approach, that AI is infused everywhere, prevents the potential siloed approach to developing use cases across service lines that we have noted with some other service providers.

The big change: Capgemini Invent

The priority in terms of portfolio development is with Capgemini Invent, launched externally in September, and covering consulting, transformation, and invention activities. It has six practices:

  • Innovation and strategy, led by Fahrenheit 212, focusing on new products and services and business models
  • Customer engagement, focusing on CX to handle complexity (e.g. channels) and IT modernization. It competes with digital agencies
  • Future of technology, using emerging technologies such as AI, robotics, and blockchain
  • Insight-driven enterprise, data analytics and AI
  • Operations transformation, with a focus on industrial operations
  • People and organization.

If we are to believe what we have been told, Capgemini Consulting ceases to exist.

Speed of integration of LiquidHub shows momentum

Where Fahrenheit 212 may have helped change the mindset of the group, LiquidHub has added scale and is the heart of Capgemini Invent in the U.S. The intended level of integration of the various units that make up Capgemini Invent is evident in the immediate retirement of the LiquidHub brand name. This is a significant difference from what we see happening in some other major service providers, where their Digital practices include acquired entities that have retained their brand names – thereby distinguishing them from the core IT services practices. There are a number of advantages if Capgemini Consulting – and the other acquired assets that make up the practice – all operate under the Capgemini ‘Invent’ brand. For example, the Invent brand could be helpful in attracting younger talent.

Conclusion

The creation of Capgemini Invent (and the concomitant retirement of Capgemini Consulting) is a bold move by Capgemini in helping it position much more strongly around business innovation. The new organization structure should also be instrumental in driving change across the group.

We were expecting to see some tuck-in acquisitions in Europe to help build a full set of Capgemini Invent capabilities across geographies, and indeed Capgemini has just announced its acquisitions of June 21 in France and of Doing in Italy; we expect there will be others, perhaps in Germany or the Nordics.

There is some progress in terms of sectoral dimension, and the fact that there is a practice in Capgemini Invent focusing on industrial operations is significant (it is not just looking at digital marketing and UX). But we think Capgemini has some way to go in certain key target sectors, and we expect sector-specific offerings to feature more prominently in the next few years.

The joint COO structure is unusual, but as well as providing a clear indication of CEO succession planning it also provides a clear dual focus for corporate developments, for example with Thierry Delaporte driving the sector plays.

During the event, Capgemini cited an example of a client where the relationship has evolved from being a volume partner (a large application maintenance contract), to a value partner (Capgemini is now their main digital partner). This illustrates neatly the ambition of the group.

]]>
<![CDATA[Wipro Harmony Simplifies GBS Harmonization & Automation]]>

 

Wipro has a long history of building operations platforms in support of shared services and has been evolving its Base))) platform for 10 years. The platform started life as a business process management (i.e. workflow) platform and includes Base))) Core, a BPM platform in use within ~25 Wipro operations floors, and Base))) Prism, providing operational metrics.

These elements of Base))) are now complemented by Base))) Harmony, a SaaS-based process capture and documentation platform.

So why is this important? Essentially, Harmony is appropriate where major organizations are looking to stringently capture and document their processes across multiple SSCs to further harmonize or automate these processes. It is particularly suitable for use where:

  • Organizations are looking to drive the journey to GBS and consolidate their SSCs into a smaller number of centers
  • Multinationals are active acquirers and need to be able to standardize and integrate SSCs within acquired companies into their GBS operations
  • BPS contracts are coming up for renewal
  • Organizations are looking to shorten the time-consuming RPA assessment lifecycle.

Supporting Process Harmonization for SSC Consolidation & Acquisition

Harmony is most appropriate for multinational organizations with multiple SSCs looking to consolidate these further. It has been used in support of standardized process documentation, and library and version control, by major organizations in the manufacturing, telecoms and healthcare sectors.

In recent years, multinationals have typically been on a journey moving away from federated SSCs, each with their own highly customized processes, to a GBS model with more standardized processes. However, relatively few organizations have completed this journey and, typically, scope remains for further process standardization and consolidation. This situation is often exacerbated by a constant stream of acquisitions and the need to integrate the operations of acquired companies. Many multinationals are active acquirers and need to be able to standardize and integrate SSCs within acquired companies into their GBS operations as quickly and painlessly as possible.

Process documentation is a key element in this process standardization and consolidation. However, process documentation in the form of SOPs can often be a manual and time-consuming process suffering from a lack of governance and change & version control.

Harmony is a standalone SaaS platform for knowledge & process capture and harmonization that aims to address this issue. It supports process capture at the activity level, enabling process steps to be captured diagrammatically along with supporting detailed documentation, including attachments, video, and audio.

The documentation is highly codified, capturing the “why, what, who, & when” for each process in a structured form along with the possible actions for each process step, e.g. allocate, approve, or calculate, using the taxonomy developed in the MIT Process Handbook.

From a review perspective, Harmony also provides a view from the perspectives of data, roles, and systems for each process step, so that, for example, it is easy to identify which data, roles or systems are involved in each step. Similarly, the user can click on, say, a specific role to see which steps that role participates in. This assists in checking for process integrity, e.g. checking that a role entering data cannot also be an approver.
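To make the structure of this codified capture more tangible, here is a minimal sketch (purely illustrative, not Wipro’s actual Harmony data model) of process steps recorded with their action type, roles, data, and systems, together with the kind of role-based query and integrity check described above.

```python
from dataclasses import dataclass, field

# Illustrative only: a minimal stand-in for the kind of codified process
# capture described above, not Wipro's actual Harmony data model.

@dataclass
class ProcessStep:
    name: str
    action: str                     # e.g. "allocate", "approve", "calculate"
    roles: set = field(default_factory=set)
    data: set = field(default_factory=set)
    systems: set = field(default_factory=set)

steps = [
    ProcessStep("Enter invoice", "allocate", {"AP clerk"}, {"invoice"}, {"ERP"}),
    ProcessStep("Approve invoice", "approve", {"AP manager"}, {"invoice"}, {"ERP"}),
]

def steps_for_role(role: str) -> list:
    """Which steps does a given role participate in?"""
    return [s.name for s in steps if role in s.roles]

def segregation_of_duties_ok() -> bool:
    """Integrity check: a role that enters data must not also approve it."""
    entering = {r for s in steps if s.action == "allocate" for r in s.roles}
    approving = {r for s in steps if s.action == "approve" for r in s.roles}
    return not (entering & approving)

print(steps_for_role("AP clerk"))   # ['Enter invoice']
print(segregation_of_duties_ok())   # True
```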

Wipro estimates that documentation of a complex process with ~300 pages of SOP takes 2-3 weeks, with documentation of a simple process such as receiving an invoice or onboarding an employee taking 2-3 days, and initial training in Harmony typically taking a couple of days.

Harmony also supports process change governance, notifying stakeholders when any process modifications are made.

Reports available include:

  • SOPs, including process flows, and screenshots, which can be used for training purposes
  • SLAs
  • Project plans
  • Role summaries
  • Gap analysis, covering aspects such as SLAs and scope for automation.

Support for process harmonization and adoption of reference or “golden processes” are also key aspects of Harmony functionality. For example, it enables the equivalent processes in various countries or regions to be compared with each other, or with a reference process, identifying the process differences between regions. The initial reference process can then be updated as part of this review, adding best practices from country or regional activities.

Harmony also plays a role in best practice adoption, including within its process libraries a range of golden processes, principally in finance & accounting and human resources, which can be used to speed up process capture or to establish a reference process.
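As a simple illustration of the comparison step described above (not Harmony’s actual implementation), regional process variants can be diffed against a reference or “golden” process by comparing their step lists:

```python
# Illustrative sketch of comparing regional process variants against a
# reference ("golden") process; not Wipro Harmony's actual implementation.

golden = ["receive invoice", "validate PO match", "approve", "post to ERP"]

regional = {
    "Germany": ["receive invoice", "validate PO match", "local tax check", "approve", "post to ERP"],
    "Brazil":  ["receive invoice", "approve", "post to ERP"],
}

for region, steps in regional.items():
    extra = [s for s in steps if s not in golden]     # local additions to review
    missing = [s for s in golden if s not in steps]   # golden steps not performed
    print(f"{region}: extra={extra}, missing={missing}")
```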

Facilitating Value Extraction from BPS Contract Renewals

Even where there has been a lack of innovation within in-force F&A BPS contracts, the lack of robust process documentation across all centers can be a major inhibitor to changing suppliers. Organizations often stick with their incumbent because they are aware of the time and effort the incumbent required to acquire process understanding, and they are wary of the length and difficulty of transferring that process knowledge to a new supplier.

Harmony can potentially assist organizations facing this dilemma in running more competitive sourcing exercises and increasing the level of business value achieved on contract renewal by baselining process maturity, identifying automation potential, and by providing a mechanism for training new associates more assuredly.

Harmony provides a single version of each SOP online. As well as maintaining a single version of the truth, this assists organizations in training associates (with a new associate able to select just the appropriate section of a large SOP, relating to their specific activity, online).

Shortening the RPA Process Assessment Lifecycle

As organizations increasingly seek to automate processes, a key element in Harmony is its “botmap” module. Two of the challenges faced by organizations in adopting automation are the need for manual process knowledge capture and the discrepancies that often arise between out-of-date SOPs and associate practice. This typically leads to a 4-week process capture and documentation period at the front-end of the typically 12-week RPA assessment and implementation lifecycle.

Harmony can potentially assist in shortening, and reducing the cost of, these automation initiatives by eliminating much of the first 4 weeks of this 12-week RPA assessment and implementation lifecycle. It does this by recommending process steps with a high potential for automation. These recommendations are based on an algorithm that takes into account parameters such as the nature of the process step, the sequence of activities, the number of FTEs, the systems used, and the repeatability of the process. The resulting process recommendations assist the RPA business analyst in identifying the most appropriate areas for automation while also providing them with an up-to-date, more reliable, source of process documentation.
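The botmap algorithm itself is proprietary; the sketch below is only a hedged illustration of the general idea of a weighted score over the parameters listed above, with weights and a threshold that are assumptions rather than Wipro’s.

```python
# Illustrative weighted-scoring sketch of the general approach described
# above; the actual botmap algorithm and its weights are Wipro's own.

def automation_score(step: dict) -> float:
    """Score a process step's automation potential on a 0-1 scale."""
    weights = {                    # hypothetical weights
        "rule_based": 0.30,        # nature of the step: rule-based vs judgment
        "repeatability": 0.25,     # how repeatable the step is
        "fte_load": 0.20,          # relative FTE effort on the step
        "system_stability": 0.15,  # stable, scriptable systems used
        "sequence_fit": 0.10,      # fits a predictable activity sequence
    }
    return sum(weights[k] * step.get(k, 0.0) for k in weights)

step = {
    "rule_based": 1.0, "repeatability": 0.8, "fte_load": 0.6,
    "system_stability": 0.9, "sequence_fit": 0.7,
}

score = automation_score(step)
print(f"score={score:.2f} -> {'recommend for RPA' if score >= 0.6 else 'review manually'}")
```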

]]>
<![CDATA[Genpact Acquires Barkawi Management Consultants, Targets 25%+ Growth in Supply Chain Management]]>

 

SCM is one of Genpact’s “invest-to-grow” service lines, where the company is looking to make disproportionate investments and scale up the business: in this case, to become one of the top two global supply chain transformation services vendors. In its “invest-to-grow” businesses, Genpact is looking to achieve at least twice the level of revenue growth achieved by Genpact overall and to do this by investing in complementary competencies rather than scale.

Genpact identified Barkawi Management Consultants, part of the Barkawi Group, as a potential target by working alongside the company (from now on referred to as Barkawi) within its client base. Discussions began in late 2017, with the deal expected to close this month, August 2018, once the regulatory processes are complete.

The acquisition of Barkawi provides a strong platform for Genpact to deepen its supply chain consulting practice, achieve a revenue balance in SCM between transformation consulting and managed services, strengthen its relationships and expertise in key supply chain technologies, and strengthen its presence in Europe.

Deepening Supply Chain Consulting Capability

In the area of SCM, Genpact had existing capability in planning & inventory optimization & demand analytics and a couple of large managed services contracts. However, the company had limited front-end consulting capability, with just 30 supply chain management consultants. Although Genpact was organically adding SCM consultants, this relative lack of front-end expertise was limiting its ability to handle a significant number of concurrent prospect conversations. The acquisition of Barkawi brings 180 SCM consultants to Genpact, enabling the company to have not only a greater number of simultaneous client and prospect interactions but also to have deeper and more end-to-end conversations across more SCM transformation dimensions (including operating model transformation, technology transformation, digital transformation, and customer-oriented transformation).

Prior to the acquisition, Barkawi had ~200 consultants, with the bulk of these (~180) in the U.S. (principally in a center in Atlanta) and Europe (principally in a center in Munich). These are the operations being acquired by Genpact. The remaining Barkawi personnel were based in the Middle-East and China, which are not markets where Genpact actively generates business, and these personnel will not be transferring to Genpact.

Barkawi principally employs two types of consultant:

  • Management/process consultants active in supply chain and aftermarket services
  • Digital/technology consultants where the larger part of the practice consisted of assessment/implementation/optimization projects around partner technologies such as Kinaxis and Anaplan.

The U.S. business was slightly larger than the European business and employed a majority of personnel active as technology consultants, while the European business employed a majority of its personnel in management/process consulting.

Achieving a Balance between Transformation Consulting & Managed Services

Barkawi will be combined with Genpact’s consultants into a single SCM consulting service line, giving a broadly balanced mix across management/process consulting and technology consulting. This global service line will be headed by Mike Landry, previously head of Barkawi Management Consultants’ U.S. entity, and will be organized into supply chain consulting, aftermarket consulting, and technology, with these horizontals matrixed against the following verticals: consumer products, life sciences, industrial machinery, and product manufacturing.

Genpact is aiming to achieve a rough balance between the Genpact specialisms of consumer products and life sciences and the Barkawi specialism in industrial manufacturing. Similarly, Genpact is aiming for a roughly equal revenue split between consulting and managed services, with the CPG sector having a higher proportion of managed services contracts.

Strengthening Supply Chain Technology Relationships

Another advantage of the Barkawi acquisition is that it brings Genpact strong existing relationships with, and expertise in, supply chain planning platform companies Kinaxis and Anaplan. Barkawi is one of the leading partners of Kinaxis, and the company’s partnership with Anaplan on supply chain complements Genpact’s existing partnership with Anaplan for EPM.

Strengthening European Presence

In terms of its client base, Genpact estimates that the majority of Barkawi’s clients in the U.S. (where it was typically selling ~$200K technology consulting projects), are prospects for a wider range of Genpact supply chain transformation services. In addition, Barkawi had a strong management/process consulting presence in major manufacturers in Germany, which Genpact will seek to build on.

In addition, while the bulk of Barkawi’s European personnel are in Germany, Genpact will look to extend this capability by growing its team in both Munich and across Europe to address supply chain consulting in the wider European market. Genpact perceives there to be major consulting opportunities within the leading manufacturing companies, assisting them in implementing and optimizing technology, working with data, and creating optimization models. This applies particularly to companies with a strong element of aftermarket services, where these companies need to optimize their aftermarket models and address aftermarket fulfilment, warranty management, and forecasting.

 

Overall, Genpact is still looking to grow the supply chain management consulting team further and will continue to recruit to support these growth initiatives.

 

]]>
<![CDATA[Bold Move by Altran, Makes Largest Ever Acquisition in the ER&D Industry with Aricent]]>

 

ER&D vendor Altran yesterday announced the acquisition of India-centric, Silicon Valley-headquartered Aricent. Altran will pay $2.0bn for Aricent, funded through a capital increase of €0.75bn, with the remaining ~€1bn in debt.

In the year ending June 30, 2017, Aricent had revenues of $687m and an EBITDA margin of 27.9%.

This is a significant acquisition for Altran, strengthening its position as the largest ER&D vendor globally. The combined company will have revenues of ~$3.1bn, and 44k employees, including ~12k in India. Altran, which already was the largest ER&D vendor in Europe, is now also a major player in North America, with revenues not far from those of HCL Technologies.

With this move, Altran will achieve several of its strategic ambitions for 2020: reaching €3bn in revenues, an adjusted EBIT margin of 13%, and a global delivery network of over 10k personnel.

As well as adding scale in North America, Aricent also strengthens Altran’s portfolio, or fills gaps, in several key areas:

  • Strengthening its capabilities in the telecom industry and in semi-conductors
  • Bringing in expertise in software product development, internet technology development, and UX design (through the Frog subsidiary).

This is a highly complementary and strategic transaction. So why hasn’t the market’s initial response been warmer? Altran’s share price fell by 6% on the day of the announcement.

High Debt Level and Capital Increase

Much of the negative reaction relates to the capital increase (€0.7bn) vis-à-vis Altran’s market cap (~€2.4bn), and its increased debt (€1.3bn after the capital increase). Altran’s CEO has stressed that the target is to reduce its debt level from a leverage of 3.25 down to 2.5 within two years. The debt level will be high, but this comes in a context of historically low interest rates.

What about Germany?

One likely impact of acquiring Aricent is that Altran is now unlikely to reach another objective, of deriving €500m in revenues by 2020 from Germany; this was also a key priority, along with the U.S. We estimate that in 2017, Altran will have revenues of €270m in its Germany/Austria business unit, half-way to its objective.

Does this matter? Well, yes; Germany is by far the largest ER&D service market in Europe, with half of its spending done by the automotive sector. While the German automotive ER&D market has been a difficult one in the past two years, resulting from changes in legislation, the reduction by Volkswagen Group (the largest spender in Europe) of its R&D spending, and price pressure, it remains a strategic country to be in for ER&D services vendors.

How Healthy is Aricent?

Altran management has not provided an indication of recent organic growth at Aricent, highlighting instead the rapid change in Aricent’s portfolio, which until the early 2010s had been very telecom-centric and had suffered from anemic spending in the sector.

Aricent has been through a reinvention, closing small geographies, and expanding its client base, from telecom equipment manufacturers to telecom service providers, and then semi-conductors (in chip design, through the $180m SmartPlay acquisition), and to automotive, relying initially on its network/connectivity expertise. In 2015, Aricent also expanded to software product/internet technology development.

Despite this reinvention, Aricent still derives 54% of its revenues from the telecoms sector. Semi-conductor/industrial has become significant (19% of revenues), as has Frog/UX design (16%), while product/technology development (11%) is a rising segment.

We note that Aricent has an IP partnership with an unnamed ISV: while IBM has not been announced as a partner, the structure of the deal is similar to the HCL Tech and Tech Mahindra IP partnerships, with Aricent acquiring $250m in software assets over four years, and already having a flow of revenues of ~$50m in LTM. We assume this flow represents software development fees from IBM.

Overall, a Bold Move

In short, this is a highly complementary and strategic acquisition. It does leave Altran exposed to financial stress, should market conditions deteriorate. And indeed there is some uncertainty in the automotive sector in the U.S., and in France around the key PSA account. But Aricent is not a major player in the U.S. automotive market.

Well done Altran for this incredibly bold move!


 

]]>
<![CDATA[IBM Services - Returning to the Limelight at IBM]]>

This month, NelsonHall attended the IBM Services analyst/adviser events in New York and Paris, and we noted a distinct evolution in emphasis, with a much closer alignment between IBM's two services divisions. In recent years, IBM’s Global Business Services (GBS) and Global Technology Services (GTS) businesses have been somewhat in the shadows of its Cognitive Solutions segment, although together they represent around 60% of total IBM revenues. However, 2018 looks to be a year when these two divisions, under the umbrella branding of IBM Services, really come back into the IBM limelight, benefiting from several factors coming to fruition, including:

  • An increase in Watson use cases, providing real differentiation around cognitive-based offerings across GTS and GBS portfolios
  • Investments in GBS to shift its practices around digital, cognitive, automation, and cloud
  • GTS transforming to what it calls a ‘services integrator’, leveraging IBM assets
  • A much closer alignment between GBS and GTS.

In this, the first of several blogs on IBM, we will look at some of these overall developments.

GBS aligning around three ‘growth platforms’

While GBS has been focusing on IBM’s strategic imperatives for years now, the influence of Mark Foster, who has headed GBS since September 2016, is very evident. Under his leadership, GBS has aligned and focused its capabilities around three ‘growth platforms’ centered on digital, cognitive and cloud:

  • Digital strategy and iX: integrated strategy and design capabilities, extending to road map creation. iX is already at scale: IBM now has 36 iX studios globally, the latest one opening in Washington DC last month. A newer area of focus is building a larger digital strategy practice
  • Cognitive process transformation: combining GBS’ business and process change advisory (including RPA design and build), BPO, and analytics capabilities into an integrated play. GBS is one of the few organizations with significant capabilities both in helping clients transform their processes and in running those operations. Within BPO, there is a very clear focus on higher-value services enabled by AI (embedding Watson), and also blockchain
  • Cloud application innovation: here, the ongoing emphasis is on reinventing strategic partnerships with the likes of SAP, Apple, Salesforce, and Workday for next gen enterprise application services, also leveraging automation/Watson to bring innovation into application maintenance and cloud application migration services.

Followers of Accenture will recognize that ‘growth platforms’ was a term used by Accenture for many years, then quietly dropped, so I was surprised to see the phrase now being used at GBS. Having coined the term when he was at Accenture, Foster told me, he feels no hesitation in continuing to use it as an apt description of the ambitions for GBS. Foster unveiled the growth platforms at the IBM Investor Day back in March; since then, GBS has aligned its personnel around these three areas. This year, he has brought in a slew of external hires to scale or revitalize certain parts of the portfolio, including digital strategy consulting, automation advisory, and SAP services. Expect to see a further crystallization of core GBS offerings next year.

What else are we likely to see at GBS in 2018? Here are a few suggestions…

  • Will there be some niche acquisition activity bringing in digital strategy consulting capabilities?
  • Clearly, in the cognitive process transformation growth platform, work will continue on the infusion of Watson capabilities in different use cases
  • Perhaps closer interaction between the Cognitive Process Transformation unit and the blockchain practice set up in 2016 and headed by Bridget van Kralingen (who moved there from heading GBS), for example in developing more blockchain-based use cases (which eliminate the need to query whether a transaction, say in supply chain processing, has taken place)
  • Possibly a greater emphasis on industry-specific offerings within GBS, again boosted by closer integration with the Industry Solutions practice. Will we also see some inorganic growth?
  • As GBS’ business mix continues to transform (and the proportion of its revenues coming from more price-sensitive traditional IT services declines further), GBS may at last return to topline growth
  • Similarly, margin expansion, both from revenue growth and as increasing efficiencies in delivery and project management begin to outweigh the level of investments in the growth platforms and sales.

GTS: cloud and cognitive now embedded in all offerings, emphasis on ‘services integrator’

GTS, the larger and more profitable division, has also been going through a quiet transformation, also powered by IBM cloud and cognitive assets, with it positioning as a ‘services integrator’ to capture opportunities around large managed hybrid cloud and multivendor tech support services.

Its Technical Support Services unit (a large business, with a current revenue run-rate of $7.2bn) continues to target larger opportunities around multi-vendor support services in complex environments, inside and outside the datacenter, with (no surprises) embedded cognitive and automation components in the service delivery. Our next blog will look at solutions such as Augmented Remote Assist, which is already rolled out to most of its field force.

IBM’s huge infrastructure services business (revenue run rate $22.7bn) is beginning to recover from the market decline in – and its shift away from – traditional services, with cloud and cognitive now embedded in all offerings. IBM has been talking about the application of analytics and automation in GTS for over three years; cognitive is now also a reality, notably with the ‘IBM Services Platform with Watson’. Announced in July, this cloud-based IT operations platform, designed for hybrid cloud environments, can identify or predict potential problems, self-heal, and provide visibility via role-based dashboards. There are four key elements:

  • IBM’s data lake, containing 30+ years of systems operations data from thousands of engagements, plus other structured and unstructured data, for use in incident analysis
  • A broad set of automated capabilities for environment build, system hygiene, and dynamic automation (IPCenter), to support the design, management and optimization of IT environments
  • IBM Watson providing insights and recommendations for future automations
  • Client insights dashboards, powered by the Watson Insight Engine.
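As a conceptual illustration only (it does not use or represent IBM’s actual platform APIs), the predict-and-self-heal pattern these four elements describe can be sketched as a simple operations loop:

```python
from typing import Callable, Dict, Optional

# Conceptual sketch of a predict-and-self-heal operations loop of the kind
# described above; it does not use or represent IBM's actual platform APIs.

def predict_incident(metrics: Dict[str, float]) -> Optional[str]:
    """Toy anomaly check standing in for insight drawn from historical ops data."""
    if metrics.get("disk_used_pct", 0) > 90:
        return "disk_nearly_full"
    return None

RUNBOOKS: Dict[str, Callable[[], None]] = {  # hypothetical automated remediations
    "disk_nearly_full": lambda: print("Running automated log cleanup..."),
}

def operations_loop(metrics: Dict[str, float]) -> None:
    incident = predict_incident(metrics)
    if incident is None:
        return                                   # nothing predicted, nothing to do
    action = RUNBOOKS.get(incident)
    if action:
        action()                                 # dynamic automation / self-heal
    else:
        print(f"Escalating {incident} to the ops dashboard")  # human visibility

operations_loop({"disk_used_pct": 93.0})
```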

A big emphasis is on the platform’s full lifecycle management capabilities and on its flexibility to compose modular services from IBM and third-party providers. As such, it is another key asset in IBM’s armory (in addition to the nearly 60 IBM cloud data centers globally across 19 countries) in its positioning for large managed hybrid cloud services deals.

What are we likely to see at GTS in 2018?

  • TSS starts to deliver revenue growth, driven by expansion in multivendor support services, plus margin expansion and improved service from more widespread use of predictive analytics, solutions like Augmented Remote Assist, and the use of blockchain (for sharing operational data)
  • IT infrastructure services also start to return to topline growth, driven by managed hybrid cloud
  • An expansion in AI/automation advisory services capabilities, to target the enterprise transformation agenda
  • Further developments in the use of Watson in support of infrastructure and endpoint security. We blogged in July about IBM's work to date in training Watson in the 'language' of cybersecurity (see here).

Closer alignment between GBS and GTS

A few years ago, the migration of IBM’s BPO business from the GTS division to GBS appeared to be a lengthy and difficult process. In contrast, this year has seen more branding around a single ‘IBM Services’ capability. Noticeable at both the New York and Paris analyst events was the number of sessions, both plenary and roundtables, which were co-hosted by people from GTS and GBS. The positioning now centers on their combined capabilities, powered by IBM assets, to support clients in addressing five ‘core needs’, which IBM classifies as:

  • Finding growth in a low-growth, digitally disrupted world
  • Innovation and data leverage as a basis for new growth
  • Taking out structural costs, for competitiveness and to fund investment in growth
  • Winning the war for talent, to access intelligence and automation
  • Transforming enterprise processes and systems, to enable growth and competitiveness.

But how far does this alignment go in terms of the sales organization? Historically, collaboration has tended to be client-specific, for example in large multi-tower outsourcing pursuits. This is now changing. Apparently, in Europe alone this year IBM has invested over $7m in retraining on the new portfolios, with $30m earmarked for next year. There have also been appropriate changes to incentivization metrics and to internal processes.

In short, 2017 appears to have been a foundation year in the transformation of GBS and GTS; the prospects for both in 2018 look better than they have done for some years, with Watson, at last, really beginning to make an impact in the delivery of core and next-gen IT services.

 

In the next blog in this series, we will look at examples of where IBM is deploying cognitive and automation assets across its services portfolio.

]]>
<![CDATA[TCS’ New Service Line Structure, Business 4.0 Emphasis: Both Very Positive; Collaboration Challenge in Harnessing ignio to Optimum Benefit]]>

NelsonHall recently attended a TCS analyst event in Boston, the theme of which was Business 4.0: Intelligent, Agile, Automated, and on the Cloud. A few months ago, soon after Rajesh Gopinathan took over as CEO, TCS undertook its first major service line revamp for many years (we provided details of this in our Quarterly Update on TCS – see here). As such we were keen to learn more about how the reorganization is progressing and what this means in terms of investment priorities and any changes in market proposition. While the rationale for the new service line structure is convincing, we were left with some questions about whether TCS is in danger of creating new silos, silos moreover which have the potential to leave its large Cognitive Business Operations unit looking like a legacy business.

The opening keynote focused on what Business 4.0 enables, namely the ability to:

  • Achieve mass customization (availability of data to target every interaction within a segment of one)
  • Create exponential (business) value
  • Leverage ecosystems (an increasing differentiator)
  • Embrace risk.

As a generic positioning statement, this is an attractive cross-industry proposition by TCS, one that is a business-centric evolution from the notion that TCS has been promoting over the last year or two of the Digital Enterprise. It is positive in tone: the ‘disruption’ word doesn’t appear; also, one messaging statement is about ‘harnessing abundance’.

Major regroup and revamp of service portfolio

The importance of the new service line structure should not be underestimated: we were reminded that this is the first time TCS has undertaken such a restructure for 15 years.

The ambitions behind the revamp, as described, include wanting, inter alia:

  • For all service lines to be business outcome focused, and their offerings to address issues of board-level significance
  • To be able to deliver seamless service integration (full services play)
  • For the new set of offerings to address CXO priorities (full stakeholder)
  • To be able to offer new engagement types and non-linear pricing models (new models)
  • To go to market offering a combination of domain and digital capabilities (contextual thought leadership).

The overall emphasis is on evolving from an old model of ‘Consult/Build/Operate’ to a ‘Broker/Integrate/Orchestrate’ model.

To summarize, TCS’ new Business and Technology Services (BTS) organization comprises three groups:

  • Digital Transformation Services (DTS), which has new standalone practices for areas such as IoT, Cyber, Analytics & Insights. Applications services are now broken down into smaller units such as EAS, Cloud Applications, Micro-services and APIfication
  • Cognitive Business Operations (CBO), which includes BPS, infrastructure services, and applications support services (the former ‘run the business’ services)
  • Consulting and Systems Integration (C&SI).

Simultaneously, other established service practices that have reached scale (including some industry-specific BPS businesses and the Engineering Services unit) have been carved out and merged into the Industry Solution unit structure, enabling these vertical units to have a more integrated portfolio.

When TCS makes a big play in a new area, it invariably succeeds: its BPS and IT infrastructure services businesses, for example, have grown in not that many years to contribute nearly 28% ($5bn) of TCS’ total revenues in FY17 – and this has been achieved through organic growth.

The priorities now are clearly with:

  • The new digital practices. The importance being attached to these is reflected in the appointments of some very experienced execs to lead them; for example, Dinanath Kholkar, formerly head of TCS’ large BPS business, is now heading the much smaller Analytics & Insights practice. Investments will focus on these new practices (this is unlikely to include any significant M&A activity: unlike most of its peers, TCS has succeeded very well so far on essentially organic growth, and the messaging is that this will not change)
  • The ignio subsidiary. What will happen to ignio ultimately is not yet clear: one ambition is that ignio will be used by third parties – indeed, in discussions with ignio head Harrick Vin we heard of an organization that has asked two other systems integrators to deploy ignio. There is, of course, the possibility that ignio might eventually be spun off, though we do not see this as likely in the foreseeable future.

TCS has, to date, been highly successful in cross- and up-selling into major accounts: its ability to ‘penetrate and radiate’ is reflected in the ongoing expansion in the number of high-value accounts. And the company has been promoting its full services play for many years now.

Nevertheless, following discussions with several execs, we are left with some questions as to the potential effectiveness of the new service line structure in facilitating the development of new digital-led offerings, and this is why...

  • A new Enterprise Intelligent Automation (EAI) unit, sitting within C&SI, offers deployment of third party RPA and AI tools, as well, of course, as ignio. These are customized solutions for client-specific environments
  • Meanwhile, ignio, a standalone company, is essentially going alone to develop new use cases for ignio, for example in working out new process models within supply chain management
  • And then there is the Cognitive Business Operations unit. While the focus is on enriching the offerings with Intelligent, Agile, Automation, and Cloud, it does not appear to be spearheading TCS’ development of the kinds of new digital process models that are the hallmark of next generation managed services.   

A key challenge for TCS is harnessing both EAI and ignio to be able to develop and go to market with innovative and replicable new digital process models that will be delivered by CBO. This means a level of collaboration that is difficult to achieve across organizational boundaries. The risks include duplication of efforts and tardiness of innovation. I must emphasize at this point that this is not a challenge that is unique to TCS; we see it also in some other very large IT services/BPS providers that, like TCS, have an extensive portfolio of service offerings and their own cognitive platforms. Finding ways to enable collaboration across organizational boundaries in the development of new cognitive-based offerings that benefit managed services businesses will be critical to future differentiation.

Two postscripts

  • Marketing: TCS has a new group CMO, and we were told that marketing will get a big boost in terms of promoting TCS’ services strategy
  • Innovation: one highlight of the event was CTO Ananth Krishnan providing an update on Research & Innovation at TCS. This merits a separate blog.
]]>
<![CDATA[NIIT Tech’s PACE Framework: Enabling Personalization to Differentiate Travel & Hospitality Companies]]>

 

As part of NelsonHall’s series of reports focusing on the business priorities and digital initiatives of IT services buyers, we found that buyers in the transportation industry have a significant focus on improving the customer experience. Eighty-five percent of transportation companies are pursuing an operational objective of increasing average revenue per customer to a high extent, while 75% are pursuing enhancing customer experience to a high extent.

These companies are primarily looking to achieve these objectives through the adoption of digital technologies, with 82% of transportation companies believing that improving customer experience and customer satisfaction is a highly important benefit of digital. This focus also correlates with the most pursued areas for digital initiatives: customer service and online commerce & CRM, each cited as being pursued to a high extent by more than 60% of transportation companies.

To address these needs, NIIT Technologies has introduced a new framework that enables travel and hospitality companies to improve customer service and drive revenues through greater tailoring of services to individual customer needs. With years of stagnant revenues and the rise of low-cost competitors, NIIT Tech sees an opportunity for travel and hospitality companies to use personalization to change what can be a commoditized product (a plane seat or hotel room). NIIT Tech is positioned to understand and address these needs, as NelsonHall estimates ~55% of its digital transformation consulting revenues are associated with travel and hospitality clients.

NIIT Tech’s PACE framework (Personalization for enhanced Ancillary revenue and Customer Experience) uses an assessment of the maturity of current operations to plot current capabilities, determine the target level of maturity, and map the process for achieving it. PACE considers capabilities across two dimensions: personalization maturity and engineering proficiency:

  • Personalization maturity spans from a 1:many model to a 1:1 model. It analyzes three elements to assess the use of data to tailor user experiences: system of insight, integration and channel enablement, and system of engagement
  • Engineering proficiency uses NIIT Tech’s DONE framework to assess the internal technical delivery capabilities to plot companies from ‘predictable’ (with eight mature disciplines) to ‘lean startup’ (with 23 mature disciplines). Disciplines measured span areas including business value, solution, continuous delivery, agile practices, and support.

Delivered through a 4-week assessment project, the outputs are plotted to illustrate where a company’s maturity falls within one of four quadrants: naivete, contender, thinker, nirvana.
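A minimal sketch of how the two assessment dimensions might map to these four quadrants is shown below; the normalization and 0.5 thresholds are assumptions for illustration, not NIIT Tech’s actual scoring.

```python
# Illustrative mapping of the two PACE dimensions to the four quadrants;
# the thresholds and axis-to-quadrant mapping are assumptions, not NIIT Tech's scoring.

def pace_quadrant(personalization_maturity: float, engineering_proficiency: float) -> str:
    """Both inputs normalized to 0-1; 0.5 is a hypothetical midpoint."""
    high_p = personalization_maturity >= 0.5
    high_e = engineering_proficiency >= 0.5
    if high_p and high_e:
        return "nirvana"
    if high_p:
        return "thinker"      # assumption: personalization-led, engineering lags
    if high_e:
        return "contender"    # assumption: engineering-led, personalization lags
    return "naivete"

print(pace_quadrant(0.3, 0.7))  # 'contender' under these assumptions
```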

Based on the current positioning and the company’s objectives, a project plan is then developed for the company. These projects are planned to address the three layers that comprise the customer experience mentioned above:

  • Systems of insight: the systems that capture customer data
  • Integration and channel enablement: the systems that enable the use of captured data through the dissemination of the data to inform business operations
  • System of engagement: the customer-facing user experience, informed by captured data fed through the integrated environment.

These projects span all three layers as necessary. Though the entire project can require an 18 to 24-month commitment, NIIT Tech develops a project plan that includes delivering quick wins to demonstrate value while the more challenging, lower direct-ROI effort of building the foundational data capture and integration elements necessary to feed a tailored customer experience is completed.

PACE projects are supported by NIIT Tech’s ecosystem of partners, to align with the specific needs and preferences of clients, including UiPath, Appian, Sitecore, Salesforce, Adobe, Pega, Oracle, Tableau and others.

NIIT Tech’s experience to date has shown most travel and hospitality companies are low on the maturity cycle. But to achieve internal objectives, projects have had different types of focus, examples being:

  • Transforming a website for a large airline to better use customer data
  • Micro-services implementation to enable greater integration and channel enablement for a loyalty program
  • Managed service for business intelligence services for an Asia-Pacific airline
  • Implementation of a data lake for a large airline to expand analytics capability
  • Transforming an e-commerce platform for an Asia-Pacific airline. This included both process and technology changes, with a significant focus on increased use of automation, that enabled the time to market for changes to fall from three weeks to less than three days. With further transformation, this is expected to fall to under four hours.

In travel and hospitality industries increasingly locked in a self-defeating cycle of lower and lower prices, personalization of experiences offers an opportunity to use digital to differentiate from competitors. NIIT Tech’s PACE offering has been developed to help these clients meet or exceed their customers’ needs.

]]>
<![CDATA[CSC to DXC Technology – Plus Ca Change…]]>

 

DXC Technology recently held an analyst day in the U.K., its first since being ‘born’ on April 1. It is still early days for this young organization and the integration continues (though there was little mention of this at the event). In its recent investor calls, DXC Technology has emphasized actions to strip out costs, apply financial discipline, and invest in next gen offerings around digital, industry solutions, and BPS.

Focus on Smaller Transactions…

It is evident that CEO Mike Lawrie is continuing at DXC the strategy he adopted at CSC. DXC has a different positioning than its legacy might suggest, having been formed from three of the four largest U.S.-headquartered IT outsourcing firms (CSC, EDS, HPE ES). DXC does not consider itself to be a large IT outsourcing specialist, focusing instead on small to mid-sized contracts, cross-selling, and winning new logos, rather than being large transaction-based. This is the journey that CSC had been following in recent years.

To accompany this commercial emphasis on smaller transactions, DXC has transitioned its sales structure towards more technical sales. To this effect, DXC has adopted a two-in-a-box structure, with account managers complemented by a technical sales force aligned by service offerings. DXC is investing in online training for its account managers, targeting annual certifications, focusing on mastering the service portfolio, and also in taking a more consulting-led approach, taking a “client challenge”-centric view.

… and on Next Gen Services

Another priority for Lawrie during his tenure as CEO of CSC was building the portfolio of “next gen” offerings, and the transition towards more packaged and more repeatable services. This journey clearly continues with DXC.

DXC has reduced its service portfolio and number of offerings to under 100. In the past few months, DXC has unveiled several packaged offerings, including its series of 40 QuickStarts. DXC wants to maintain this level of discipline about its service portfolio: as the CEO of DXC in the UKI region pointed out, "if we need to create new offerings within each service line, we will have to carefully review the existing ones, and decommission a few".

Along with the streamlining and refresh of its service portfolio, DXC continues to work on increasing global delivery, and making it more industrialized, including applying RPA, AI and analytics. In its Workplace and Mobility unit, DXC is taking a systematic approach in identifying where its differentiation lies for each of the offerings, and is filing patents.

The next gen focus is taking a continuous improvement approach with the portfolio: once it has provided a similar solution/service to five to ten clients, the drive is to move to an industrialized delivery and packaged offering, making use of Indian delivery and its CoE approach.

Acquiring for Digital Capabilities

As part of its portfolio refresh, DXC has added two fast-growing service lines, Analytics and Security, to the much larger Cloud, Platforms and ITO (CIP), Workplace and Mobility (WM), and Application Services (AS) units. The challenge with these units appears to be finding resources to meet market demand. Both Analytics and Security have a scale of 4k-5k personnel (NelsonHall estimate). NelsonHall will be publishing an analysis of DXC Analytics soon.

M&A is clearly part of the strategy, again continuing what CSC had been doing to jump-start its Enterprise Cloud Applications business, largely around Microsoft Dynamics (UXC Eclipse, eBECS, Tribridge) and ServiceNow (UXC, Aspediens, Fruition, and, last week, Logicalis SMC).

It is clear that DXC is a very pragmatic firm. Do not expect big announcements about 'state of the art' offerings or new business models. DXC is focusing on continuous improvement and on applying proven techniques to increase shareholder value, rather than seeking to reinvent the model. The approach Lawrie took at CSC continues at DXC. The latest example: this week, DXC announced the divestiture of its U.S. government services arm, a move that replicates the spin-off by CSC of its NPS organization two years ago. Rachael Stormonth published an analysis of this transaction here.

]]>
<![CDATA[HCL's 3-Lever Approach to Business Process Automation: Risk & Control Analysis; Lean & Six Sigma; Cognitive Automation]]> HCL has undertaken ~200 use cases spanning finance & accounting, contact center, and product support, as well as cross-industry customer onboarding and claims processing, using products including Automation Anywhere, Blue Prism, UiPath, WorkFusion, and HCL’s proprietary AI tool Exacto.

This blog summarizes NelsonHall’s analysis of HCL's approach to Business Process Automation covering HCL’s 3-lever approach, its Integrated Process Discovery Technique, its AI-based information extraction tool Exacto, the company’s offerings for intelligent product support, and its use of its Toscana BPMS to drive retail banking digital transformation.

3-Lever Approach Combining Risk & Control Analysis, Lean & Six Sigma, and Cognitive Automation

 

 

  • The 3-lever approach forms HCL’s basis for any “strategic automation intervention in business processes”. The automation is done using third-party RPA technologies together with a number of proprietary HCL tools, including Exacto, a Computer Vision and Machine Learning-based information extraction tool, and iAutomate for run book automation

  • HCL starts by conducting a 3-lever automation study and then creates comprehensive to-be process maps. As part of this 3-lever study, HCL also conducts complexity analysis to create the RPA and AI roadmap for organizations using its process discovery toolkit. For example, HCL has reviewed the entire process repositories of several major banks and classified their business processes into four quadrants based on scale and level of standardization

  • When generating the “to be” process map, HCL’s Integrated Process Discovery Technique places a high emphasis on ensuring appropriate levels of compliance for the automated processes and on avoiding the automation of process steps that can be eliminated

  • Business process orchestration is handled by HCL’s proprietary orchestration platform, Toscana©. Toscana© supports collaboration, analytics, case management, and process discovery, and incorporates a content manager, a business rules management system, a process simulator, a process modeler, process execution engines, and an integrated social media monitoring & management offering.

Training Exacto AI-based Information Extraction Tool for Document Triage within Trade Processing, Healthcare, Contract Processing, and Invoice Processing

  • HCL’s proprietary AI-enabled machine learning solution, Exacto, is used to automatically extract and interpret information from a variety of information sources. It also has natural language and image-based automated knowledge extraction capabilities

  • HCL has partnered with a leading U.S. university to develop its own AI algorithms for intelligent data extraction and interpretation to solve industry-level problems, including specialist algorithms in support of trade processing, contract management, healthcare document triage, KYC, and invoice processing

  • Trade processing is one of the major areas of focus for HCL. Within capital markets trade capture, HCL has developed an AI/ML solution, Exacto | Trade, able to capture inputs from incoming fax-based transaction instructions for trade classes such as Derivatives, FX, and Margins with an accuracy of over 99% (a simplified extraction sketch follows this list).
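To make the document-capture step concrete, here is a deliberately simplified sketch of pulling a few fields out of OCR’d fax text with regular expressions. The field names, patterns, and sample text are illustrative assumptions only; Exacto’s actual computer vision and ML pipeline is far more sophisticated than this.

```python
# Illustrative only: extract a few trade fields from OCR'd fax text using
# simple patterns. Field names and regexes are assumptions, not Exacto's logic.
import re

FIELD_PATTERNS = {
    "trade_type": r"\b(FX|DERIVATIVE|MARGIN)\b",
    "notional":   r"NOTIONAL[:\s]+([\d,\.]+)",
    "value_date": r"VALUE DATE[:\s]+(\d{2}/\d{2}/\d{4})",
}

def extract_trade_fields(ocr_text: str) -> dict:
    """Pull candidate field values from OCR'd text; missing fields would go to manual review."""
    fields = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = re.search(pattern, ocr_text, flags=re.IGNORECASE)
        fields[name] = match.group(match.lastindex or 0) if match else None
    return fields

sample = "FX SPOT INSTRUCTION  NOTIONAL: 1,000,000  VALUE DATE: 12/07/2017"
print(extract_trade_fields(sample))
# -> {'trade_type': 'FX', 'notional': '1,000,000', 'value_date': '12/07/2017'}
```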

Combining Watson-based Cognitive Agent with Run Book Automation to Provide “Intelligent Product Support”

  • HCL has developed a cognitive solution for Intelligent Product Support based on the cognitive agent LUCY, iAutomate for run book automation, and Smart Analytics with MyXalytics for dashboards and predictive analytics. LUCY is currently being used in support of IT services by major CPG, pharmaceuticals, and high-tech firms, and in support of customer service for a major bank and a telecoms operator

  • HCL’s iAutomate tool is used for run book automation, and HCL has already automated 1,500+ run books. The tool uses NLP, ML, pattern matching, and text processing to recommend the “best matched” run book for a given ticket description (see the illustrative sketch after this list). HCL estimates that it currently achieves “match rates” of around 87%-88%

  • HCL estimates that it can automate 20%-25% of L1 and L2 transactions and has begun automating internal IT infrastructure help-desks.
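As an aside on how ticket-to-run-book matching can work in principle, below is a minimal sketch using TF-IDF text similarity. This is purely illustrative, with made-up run-book names; it is not HCL’s iAutomate implementation, which combines NLP, ML, pattern matching, and text processing.

```python
# Illustrative sketch: recommend the run book whose description is most
# similar to the incoming ticket text. Run-book names and texts are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

run_books = {
    "restart_app_service": "application service down restart middleware node",
    "clear_disk_space":    "disk full clear temp files archive logs",
    "reset_password":      "user locked out reset account password",
}

def best_match(ticket_description: str):
    """Return the run book with the highest cosine similarity to the ticket text."""
    names = list(run_books)
    corpus = [run_books[n] for n in names] + [ticket_description]
    tfidf = TfidfVectorizer().fit_transform(corpus)
    scores = cosine_similarity(tfidf[-1], tfidf[:-1]).flatten()
    best = scores.argmax()
    return names[best], float(scores[best])

print(best_match("server disk is almost full, please clean up old log files"))
# -> ('clear_disk_space', <similarity score>)
```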

Positioning its Toscana Platform to Drive Digital Transformation in Retail Banking

  • HCL is driving digital transformation through this approach and has created predefined domain-specific templates in areas including retail banking, commercial lending, mortgages, and supply chain management. Within account opening for a bank, HCL has achieved an ~80% reduction in AHT and a 40% reduction in headcount

  • In terms of bank automation, HCL has, for one major bank, reduced the absolute number of FTEs associated with card services by 48%, equivalent to a 63% reduction once the accompanying increase in workload is taken into account (a worked example follows this list). Elsewhere, for another bank, HCL has undertaken a digital transformation including implementation of Toscana©, resulting in a 46% reduction in FTEs, the implementation of a single view of the customer, an 80% reduction in cycle time, and a reduction in the “rejection rate” from 12% to 4%.
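As a worked example of how an absolute FTE reduction translates into a larger effective reduction when workload grows, the sketch below back-solves the workload growth implied by the 48% and 63% figures quoted above. The baseline headcount is a hypothetical round number, and the implied workload growth is derived from the two percentages rather than a disclosed figure.

```python
# Worked example only: 48% and 63% come from the engagement cited above; the
# baseline headcount and implied workload growth are illustrative assumptions.
baseline_ftes = 100.0
post_automation_ftes = baseline_ftes * (1 - 0.48)    # 48% absolute reduction -> 52 FTEs

effective_reduction = 0.63   # reported reduction per unit of workload
# FTEs per unit of workload fell to (1 - 0.63) of the baseline level, so:
#   post_automation_ftes / (baseline_ftes * workload_growth) = 1 - effective_reduction
workload_growth = post_automation_ftes / (baseline_ftes * (1 - effective_reduction))

print(f"Implied workload growth: {workload_growth:.2f}x "
      f"(~{(workload_growth - 1) * 100:.0f}% more volume handled by 48% fewer people)")
# -> roughly 1.41x
```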

]]>
<![CDATA[Discussion with Carole Murphy of Capgemini: Application of Intelligent Automation to the Finance Function]]>

Capgemini's Carole Murphy

 

Capgemini has recently redefined its framework for Intelligent Automation, taking an approach based on ‘five senses’. I caught up with Carole Murphy, Capgemini’s Head of BPO Business Transformation Services, to identify how their new approach to Intelligent Automation is being applied to the finance & accounting function.

 

JW: What is Capgemini’s approach to Intelligent Automation and how does it apply to finance & accounting?

CM: Capgemini is using a ‘five senses’ model to help explain Intelligent Automation to our clients and to act as a design framework in developing new solutions. The ‘five senses’ are:

  • Monitor (watch): in F&A, for example, monitoring of KPI dashboards, daily cash in monitoring, customer dispute monitoring, etc.
  • Interact (listen/talk): interacting with users, customers, and vendors, using a range of channels including virtual agents
  • Service (act): automated execution, for example using RPA for cash reconciliation, automated creation of sales invoices, and exceptions handling
  • Analyze (think): providing business analytics in support of P&L and cash flow and root cause analysis in support of process effectiveness and efficiency
  • Manage knowledge (remember): a knowledge base covering SOPs, collection strategies, bank reconciliation rules etc.

 

Capgemini's 'Five Senses' IA model

 

JW: Why is this approach important?

CM: It changes the fundamental nature of finance operations from reactive to proactive. Historically, within BPS contracts, the vast majority of the process has been in ‘act’ mode, supplemented by a certain amount of analytics. The introduction of Intelligent Automation enables us to move to a much more rounded and proactive approach. For example, a finance organization can now identify what is missing before it starts to cause problems for the organization. In accounts payable, for example, Intelligent Automation enables an organization to anticipate utility bills, and proactively identify any missing or unsent invoices before, say, a key facility is switched off. Similarly, in support of R2R, transactions can be monitored throughout the month and the organization can anticipate their impact on P&L well in advance of the formal month-end close. In summary, Intelligent Automation enables ongoing monitoring and analysis rather than periodic analysis and can incorporate real-time alerts, such as to identify a missing invoice and find out why it wasn’t raised. So, a much more informed and proactive approach to operations.
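To illustrate the ‘monitor’ sense in accounts payable, here is a minimal sketch of a rule that flags expected but missing supplier invoices before they cause a problem. The data model, suppliers, and grace period are hypothetical assumptions for illustration, not Capgemini’s implementation.

```python
# Minimal sketch of an accounts payable "monitor": flag expected recurring
# invoices that have not arrived within a grace period. All data is made up.
from datetime import date, timedelta

expected_invoices = [
    ("Utility Co", 5),      # (supplier, expected day of month)
    ("Facilities Ltd", 10),
]
received = {("Utility Co", date(2017, 6, 5))}   # invoices already posted this month

def missing_invoices(today: date, grace_days: int = 3):
    """Return suppliers whose expected invoice is overdue by more than grace_days."""
    alerts = []
    for supplier, day in expected_invoices:
        expected_date = today.replace(day=day)
        already_received = any(s == supplier and d.month == today.month for s, d in received)
        if not already_received and today > expected_date + timedelta(days=grace_days):
            alerts.append((supplier, expected_date))
    return alerts

print(missing_invoices(date(2017, 6, 20)))
# -> flags Facilities Ltd's overdue June invoice so it can be chased proactively
```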

JW: And what impact does IA have on the roles of the people within the finance function?

CM: Intelligent Automation lifts both the role of the finance function within the organization and those of the individuals within the finance function. One of the benefits of Intelligent Automation is that it improves how we share and deploy rules and knowledge throughout the organization – making compliance more accessible and enabling colleagues to understand how to make good financial decisions. Complex queries can still be escalated but simple questions can be captured and resolved by the ‘knowbot’.  This can change how we train and continually develop our people and how we interact across the organization.  

JW: So how should organizations approach implementing Intelligent Automation within their finance & accounting functions?

CM: One of the exciting elements of the new technology is that it is designed for users to be able to implement more quickly and easily – it’s all about being agile. Alongside implementing point solutions, the value will ultimately be how to combine the senses to bring the best out of people and technology. 

We can re-think and re-imagine how we work. Traditionally, we have organized work in sequential process steps – using the 5 senses and intelligent automation we can reconfigure processes, technology and human intervention in a much more inter-connected manner with constant interaction taking place between the ‘five senses’ discussed earlier.

This means that it’s important to reimagine traditional finance & accounting processes and fundamentally change the level of ambition for the finance function. So, for example, the finance function can now start to have much more impact on the top line, such as by avoiding leakage due to missing orders and missing payments. Here, Intelligent Automation can monitor all transactions and identify any that appear to be missing. Similarly, it’s possible to implement ‘fraud bots’ to identify, for example, duplicate invoices or payments to give much greater levels of insight and control than available traditionally.

JW: What does this involve?

CM: There are lots of ways to start engaging with the technology – we see a ‘virtuous cycle’ that follows the following steps:

  • Refresh: benchmarking of the finance function
  • Reimagine: workshops involving design thinking, innovation lab, and Intelligent Automation framework to identify the ‘art-of-the-possible’
  • Reengineer: incorporating eSOAR methodology and Digital GEM
  • Roll-out: here it’s very important to take a very agile development approach, typically using preconfigured prototypes, templates, and tested platforms
  • Run: including 24/7 monitoring.

Reimagining F&A processes in the light of Intelligent Automation, rather than automating existing processes largely ‘as-is’, is especially important. In particular, it’s important to eliminate, rather than automate, any ‘unnecessary’ process steps. Here, Capgemini’s eSOAR approach is particularly important and covers:

  • Eliminate: removing wasteful or unnecessary activities
  • Standardize processes
  • Optimize ERPs, workflow, and existing IT landscape
  • Automate: using best-of-breed tools
  • Robotics: robotizing repetitive and rule-based transactions.

Finally, it’s critical not to be scared of the new technology and possibilities. The return on investment is incredible and the initial returns can be used to fund downstream transformation. At the same time, the cost, and timescale, of failure is relatively low. So, it’s important for finance organizations to start applying Intelligent Automation to get first-mover advantage rather than just watch and wait.

JW: Thank you very much, Carole. That certainly ties in with current NelsonHall thinking and will really help our readership. NelsonHall is increasingly being asked by our clients what constitutes next generation services and what is the art-of-the-possible in terms of new digital process models. And finance and accounting is at the forefront of these developments. Certainly, design thinking is an important element in assisting organizations to rethink both accounting processes and how the finance function can make a greater contribution to the wider enterprise – and the ‘five senses’ approach helps to demystify Intelligent Automation by clarifying the roles of the various technologies such as RPA for process execution, analytics for root cause analysis, and knowledge bases for process knowledge.

]]>
<![CDATA[Genpact Acquires TandemSeven, Adding Human-Centered Design IP & Consulting to Digital Transformation Capabilities]]>

Genpact announced another acquisition today, that of TandemSeven, a design thinking (DT)-led CX/UX innovation consultancy based in Boston, with offices also in New York and Chicago. This is Genpact’s fourth acquisition in the U.S. this year, all four clearly supporting the company's drive to radically evolve its portfolio from traditional BPS to a coherent set of offerings designed to help enterprises in their operational digital transformation. What distinguishes TandemSeven from the other recent acquisitions is that the primary capability it brings in is neither software nor domain-specific BPS, but consulting & methodology.

Quick Overview of TandemSeven

Firstly, let’s take a quick look at TandemSeven, then how its capabilities will align with Genpact’s ‘Lean Digital’ positioning:

  • Size: the firm has 65 associates, located between Boston and New York
  • Client base: large U.S. enterprises, with many engagements in BFS sectors, particularly capital markets. Client references include Legg Mason, Risk Management Solutions (RMS), and APAX Partners. It has also worked in B2C sectors such as utilities, air travel, and automotive
  • The work: TandemSeven’s focus areas cover B2E and B2B as well as B2C process areas, with quite a lot of its engagements having looked at support functions. For example, for Legg Mason, TandemSeven designed a new global Intranet for its employees; and in an engagement with an investment banking firm, the focus was on the reimagination of the UX of staff in the middle office supporting the traders in the front office. This aligns with Genpact’s focus on transforming enterprises’ middle and back office activities
  • The IP: TandemSeven’s consultants are supported by a platform for standardizing human-centered design. Its ‘UX360’ platform is a key differentiator relative to other design agencies: it formalizes the collection of research alongside customer journey mapping & modeling, and integrates these with task modeling and alignment with developers in an agile environment. It consists of a set of tools for persona modeling, customer journey mapping, and task modeling, together with a research repository for knowledge management that creates a system of record for each project and provides linkages with agile development platforms (a simplified data-model sketch follows this list).
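For readers who think in data structures, the sketch below illustrates the kind of linked records such a platform implies: personas, journey steps, supporting research evidence, and references to agile backlog items. The field names and sample values are hypothetical and are not UX360’s actual schema.

```python
# Hypothetical data model illustrating how personas, journey steps, research
# evidence, and agile backlog references can be linked into a system of record.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ResearchNote:
    source: str          # e.g. interview, usability test
    finding: str

@dataclass
class Persona:
    name: str
    goals: List[str]

@dataclass
class JourneyStep:
    persona: Persona
    task: str
    pain_points: List[str]
    evidence: List[ResearchNote] = field(default_factory=list)   # the "system of record"
    backlog_refs: List[str] = field(default_factory=list)        # linkage to agile tooling

analyst = Persona("Middle-office analyst", ["reconcile trades before cut-off"])
step = JourneyStep(
    persona=analyst,
    task="Chase missing trade confirmations",
    pain_points=["manual email follow-up", "no single view of status"],
    evidence=[ResearchNote("interview", "analysts re-key data across three systems")],
    backlog_refs=["JIRA-1234"],   # hypothetical ticket reference
)
print(step.task, step.backlog_refs)
```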

Genpact’s ‘Lean Digital’ framework combines the operations view of Lean processing, DT, and digital tools to fundamentally re-architect business processes. TandemSeven provides a key building block here by enhancing Genpact’s consulting capabilities and frameworks in human-centered design, an essential element in digitalizing processes across the organization.   

TandemSeven Becomes Part of and Enhances Genpact’s Digital Solutions Unit

Genpact’s Board is fully aware that TandemSeven will be culturally different from Genpact’s BPS core operations and is accordingly adopting a careful approach to its integration. The firm’s head of consulting and head of sales will report to the head of Genpact’s Digital Solutions practice, who is also based on the U.S. East Coast; the practice will also act as a kind of family group for TandemSeven within Genpact’s larger Digital unit, which has ~1k people.

Integration plans:

  • In the first instance, TandemSeven’s UX360 tool will be leveraged by ~80 Genpact DT workshop facilitators. It will provide a very strong framework and mechanism for incorporating human-centered design into the interfaces associated with redesign of industry-specific and back-office processes
  • Secondly, some TandemSeven staff will act as coaches to Genpact employees, and help create protocols for DT sessions. Genpact has been providing extensive employee training on DT for over two years; TandemSeven, and UX360, will help sharpen up the methodology
  • Thirdly, some TandemSeven staff will become involved, within multi-functional groups, in Genpact sales pursuits. As with any consultancy acquisition, a key success factor will be in the extent to which the acquirer leverages the new capabilities across its broader portfolio and client base.

First Stage of a Buy and Build Strategy

Rather than following with similar, possibly smaller, acquisitions in other target geographies, Genpact’s intention is to transplant a few resources from TandemSeven into the U.K. and Australia and then hire locally to build regional units.

Genpact Building Digital Hubs in Boston, NY

TandemSeven’s model to date has been to work primarily on client sites; it does not bring in a studio environment for Genpact to leverage. However, Genpact will complement this on-site capability by creating UX studios in Boston, where it has inherited space from the Rage acquisition, and also NY, where it will redesign some existing office space. And of course, OnSource is also based near Boston.

In terms of cross-fertilization with West Coast capabilities, there is also some interaction with the ‘Innovation by Design’ software engineering capabilities based in Genpact’s Palo Alto Lean Digital innovation center.

TandemSeven Strategically Important in Incorporating Human-Centered Design in Genpact Digital

This is a strategic acquisition for Genpact in its journey to evolve from traditional BPS, where it has been an eminent pureplay, to become a wider digital operations transformation partner for organizations. The TandemSeven acquisition complements Genpact’s existing DT and operations transformation capability by providing a distinctive methodology and nicely visual tool for applying human-centered design in support of digitalized middle and back-office functional areas.

]]>
<![CDATA[WNS: Thriving in a Climate of Industry Disruption]]>

 

NelsonHall recently attended WNS analyst/adviser events in New York and the U.K., where the theme was “NEXT”, with the sub-text of assisting organizations in “thriving in a state of constant disruption”.

While the trend to digitalization of business processes using technologies such as robotics, cloud, artificial intelligence, and big data is causing concern within some quarters of the BPS industry, WNS is thriving in this new environment. And while digitalization is reducing revenues from established footprints, it is certainly creating major new opportunities within client organizations.

Complementing Traditional Domain Virtues…

So, how is WNS doing this? Well, the company is positioning itself on driving transformation through its domain knowledge, process expertise, technology and automation, analytics, global delivery capability, and client centricity.

However, while the company is becoming increasingly strong in the development of new digital process models in support of its target industry sectors, WNS’ key differentiators continue to lie in its domain knowledge and client centricity: both its go-to-market personnel and its delivery personnel are fully aligned by industry, resulting in a depth of process knowledge in a domain context. In addition, WNS has not fought shy of developing tier-2 delivery capability in support of its global delivery model, though as automation takes hold the company is now likely to consolidate into existing locations rather than add new tier-2 cities.

WNS has also been ahead of the curve in building up analytics capability, with analytics accounting for 20% of WNS’ revenues, an increasingly important component of new digital process models. Accordingly, its 2017 analyst events focused extensively on analytics capabilities and WNS TRAC.

….with Platforms, Robotics, Analytics, and Cloud

WNS TRAC (Technology, Robotics, Analytics, and Cloud) is described by WNS as “a consolidated suite of next generation, all-encompassing BPM (for which read BPS rather than workflow) enablement technology solutions”. I’m not sure this is entirely the best phrasing, but in NelsonHall-speak it largely equates to “New Digital Process Models”. However, unlike some BPS firms that offer new digital process models in either client-operated or in BPS form, WNS intends to offer TRAC as part of its BPS engagements only.

TRAC encompasses each of WNS’ industry platforms, such as Verifare for the travel sector, increasingly complementing these with robotics, AI, and analytics. WNS has developed both “industry TRAC solutions” (18 solutions covering the travel, insurance, healthcare, shipping & logistics, utilities, and CPG & retail sectors), and “cross-industry TRAC solutions”, covering CFO (F&A), CPO, standalone robotics & digital automation, and CIS (interaction analytics, speech & text analytics, and omni-channel solutions).

However, while WNS has developed strong platform capability, particularly in areas such as the travel sector, there remains work to be done in fully incorporating cognitive technologies such as machine learning (where WNS is now beginning to partner with MIT Media Lab) to fully build out many of its nascent new digital process models. For example, while WNS has a number of sourcing-related platforms plus spend analytics capability within its “CPO TRAC”, the company has yet to fully incorporate the levels of NLP (in support of supplier and contract management) and cognitive technologies (in support of virtual procurement agents) that are beginning to emerge as part of new digital S2P process models.

Looking to Productize its Analytics Services

WNS continues to win major contracts in analytics, citing recent examples in the pharmaceuticals, FMCG, retail, and insurance sectors. While the company has assisted major organizations in establishing analytics CoEs, and offers an end-to-end analytics capability (from data aggregation, through processing, to visualization and consumption), WNS is increasingly looking to productize its analytics services, often incorporating its Brandttitude and SocioSeer platforms in support of specific use cases such as media spend guidance, price & promotion optimization, and shelf space optimization.

Further Investment in Digital Components Planned

Going forward, WNS will continue to focus on its key industry domains, looking to become “an integral partner to organizations in their digital adoption journey”, and incorporating new TRAC solutions in support of next-gen domain capability. While WNS will aim to develop its own core technology stack, the company will increasingly invest in proprietary tools and platforms and undertake acquisitions in key areas such as digital, RPA, AI, and smart meters.

John Willmott and Rachael Stormonth

As well as including WNS in relevant BPS and RPA/AI research areas, NelsonHall also covers WNS in the NelsonHall Quarterly Update Program - for details contact [email protected]

]]>
<![CDATA[Genpact Cora: A Unifying Framework to Accelerate Industrialization of New Digital Process Models]]>

 

Many of the pureplay BPS vendors have been moving beyond individual, often client-specific implementations of RPA and AI and building new digital process models to present their next generation visions within their areas of domain expertise. So, for example, models of this type are emerging strongly in the BFSI space and in horizontals such as source-to-pay.

A key feature of these new digital process models is that they are based on a design thinking-centric approach to the future and heavily utilize new technologies, with the result that the “to-be” process embodied within the new digital process model typically bears little relation to the current “as-is” process whether within a BPS service or within a shared service/retained entity.

These new digital process models are based on a number of principles, emphasizing straight-through processing, increased customer-centricity and proactivity, use of both internal and external information sources, and in-built process learning. They typically encompass a range of technologies, including cloud systems of engagement, RPA, NLP, machine learning and AI, computer vision, and predictive and prescriptive analytics, held together by BPM/workflow and command & control software.

However, while organizations are driving hard towards identifying new digital process models and next generation processes, there are a relatively limited number of examples of these in production right now, their implementations use differing technologies and frameworks, and the rate of change in the individual underlying technology components is potentially very high. Similarly, organizations currently focusing strongly on adoption of, say, RPA in the short-term realize that their future emphasis will be more cognitive and that they need a framework that facilitates this change in emphasis without a fundamental change in framework and supporting infrastructure.

Aiming for a Unifying Framework for New Digital Process Models

In response to these challenges, and in an attempt to demonstrate longevity of next generation digital process models, Genpact has launched a platform called “Genpact Cora” to act as a unifying framework and provide a solid interconnect layer for its new digital process models.

Genpact Cora is organized into three components:

  • Digital Core: dynamic workflow (based on the PNMsoft acquisition), cloud-based systems of engagement, blockchain, mobility & ambient computing, and RPA
  • Data Analytics: advanced visualization, Big Data, data engineering, IoT
  • AI: conversational AI, computational linguistics, computer vision, machine learning & data science AI.

One of the aims of this platform is to provide a framework into which technologies and individual products can be swapped in and out as technologies change, without threatening the viability of the overall process or the command and control environment, or necessitating a change of framework. Accordingly, the Genpact Cora architecture also encompasses an application programming interface (API) design and an open architecture.
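A minimal sketch of this swap-in/swap-out principle is shown below, assuming a simple pluggable-component interface. It is illustrative only, not Genpact’s actual architecture or code: the point is that the process logic depends on a stable interface, so the underlying extraction or automation product can change without touching the orchestration.

```python
# Illustrative plug-in pattern: the process logic codes against a stable
# interface, so individual technology components can be swapped over time.
from abc import ABC, abstractmethod

class DocumentExtractor(ABC):
    """Stable interface the process orchestration layer codes against."""
    @abstractmethod
    def extract(self, document: bytes) -> dict: ...

class VendorAExtractor(DocumentExtractor):
    def extract(self, document: bytes) -> dict:
        # placeholder for a call to today's extraction product
        return {"invoice_number": "INV-001"}

class VendorBExtractor(DocumentExtractor):
    def extract(self, document: bytes) -> dict:
        # a newer product can be swapped in later without changing process logic
        return {"invoice_number": "INV-001", "confidence": 0.97}

def process_invoice(doc: bytes, extractor: DocumentExtractor) -> dict:
    """Process logic stays unchanged whichever extractor is plugged in."""
    fields = extractor.extract(doc)
    # ...validation, posting, and exception handling would follow here...
    return fields

print(process_invoice(b"...", VendorBExtractor()))
```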

Genpact is then building its new digital process models in the form of “products” on top of this platform. Genpact new digital process model “products” powered by Cora currently support a number of processes, including wealth management, commercial lending, and order management.

However, in the many process areas where these “products” are not yet formed, Genpact will typically take a consulting approach, initially building client-specific digital transformations. Then, as the number of assignments in any specific process area gains critical mass, Genpact is aiming to use the resulting cumulative knowledge to build a more standardized new digital process model “product” with largely built-in business rules that just require configuring for future clients. And each of these “products” (or new digital process models) will be built on top of the Genpact Cora platform.

Launching “Digital Solutions” Service in Support of Retained Operations

Another trend started by the desire for digital process transformation and the initial application of RPA is that organizations are keen to apply new digital process models not just to outsourced services but to their shared services and retained organizations. However, there is currently a severe shortage of expertise and capability to meet this need. Accordingly, Genpact intends to offer its Genpact Cora platform not just within BPS assignments but also in support of transformation within client retained services. Here, Genpact is launching a new “Digital Solutions” service that implements new digital process models on behalf of the client shared services and retained organizations and complements its “Intelligent Operations” BPS capability. In this way, Genpact is aiming to industrialize and speed up the adoption of new digital process models across the organization by providing a consistent and modular platform, and ultimately products, for next generation process adoption.

]]>
<![CDATA[My BIG Opportunity to Help Sourcing Buyers]]>

 

I am excited to be embarking on a new role at NelsonHall, having enjoyed serving as a Customer Management Services industry analyst for almost five years. As the new advisor for NelsonHall’s Buyer Intelligence Group (BIG), I get the opportunity to work more closely with our buy-side clients and broader BIG community, helping them to engage around hot industry topics and to get the most from NelsonHall’s extensive buy-side resources.

BIG is a community of sourcing executives from buy-side organizations across various industries, and I have observed its impressive growth in recent months. It is a forum for sharing sourcing experiences and addressing key industry questions with peers, as well as accessing knowledge from NelsonHall analysts. BIG is a community for buyers only, where they can engage with peers in privacy.  

There are several membership levels, the foundation level being free. Benefits of foundation level membership include access to:  

  • An online community where buyers can join special interest sub-groups, use the members’ discussion forum to pose questions/comments and share responses
  • Regular free webinars to learn from peers, NelsonHall analysts, and other industry experts (see below)
  • NelsonHall’s (vendor) Evaluation and Assessment (NEAT) tool, which provides comparative vendor evaluations across a wide range of business process services (BPS) and IT services (ITS) markets to help members get started with vendor shortlisting for their sourcing initiatives
  • Highly informative blog articles from NelsonHall’s industry analysts working across BPS and ITS markets.

The rich content available to members of BIG includes access to regular free webinars, which we run once or twice a month. Recent webinars include a series on RPA & AI adoption, the most recent one drawing together the lessons learned from organizations that have successfully implemented RPA & AI projects yielding impressive results. We have also run webinars for members on areas including Human Resources Outsourcing, Customer Management Services, and Cloud Adoption. The next BIG webinar is on Thursday 29th June, on ‘Key Trends & Future Shape of Payroll Outsourcing’ with NelsonHall’s Pete Tiliakos.

As part of my new role, I am regularly reaching out to service buyers to understand their sourcing information needs and to match this with content in the BIG community, including the webinars. I would be happy to hear from sourcing decision-makers who are interested in learning more about BIG and how we can help address their sourcing information requirements. I can be reached at [email protected].

Reminder: The next BIG webinar is on Thursday 29th June, on ‘Key Trends & Future Shape of Payroll Outsourcing’ with NelsonHall’s Pete Tiliakos.

]]>
<![CDATA[Immediate Takeaways from Infosys Confluence 2017 (vlog)]]> In this video, Rachael Stormonth, NelsonHall’s EVP Research, reports from the Infosys Confluence 2017 event in San Francisco.

 

]]>
<![CDATA[NelsonHall Takes Transformational Approach to the Role of the CFO]]> The nature of finance & accounting is changing rapidly with the advent of new technologies. RPA has already shown its potential to reduce transactional F&A costs by 20% while improving quality of service, and the application of cognitive technologies over the next few years will multiply this impact several times over. And, with the advent of machine learning, processes will increasingly be knowledge-based and self-learning.

And, at the same time that basic accounting processes are being automated, so is financial reporting and analytics. Here, natural language generation coupled with analytics is leading to automatic reporting and interpretation of results while predictive and prescriptive analytics are increasingly identifying appropriate company behavior.

Finally, as the operational accounting and reporting processes become automated, the outsourcing vendors are increasingly moving upstream into financial planning and analysis.

So, the nature of finance & accounting is undergoing dramatic transformation. But can the same be said for the role of the CFO? So far this seems to be relatively unchanged, and we believe the time is right for a corresponding transformation.

Hence, NelsonHall tasked its HR department to identify a new CFO for the modern age. In the spirit of design thinking and achieving 10X impact, we thought, “Why not change the role, so that involvement with the CFO, rather than increasing the stress of all concerned (as has often been our experience due to the usual requests for budget cuts and increased performance), actually lowered the stress of all concerned and enhanced the mental health of the organization?”. This would be a truly transformational outcome.

So we set out with a charter to change the role of the CFO from stress-inducing to stress-reducing and to measure the falls in blood pressure of personnel after encounters with the CFO. And while we don’t yet have definitive quantitative results, I think we can confidently assert that this approach is working in the initial pilots.

We decided to look beyond the CFO stereotype of someone with traditional finance skills and a laser focus on analysis, reporting and control. In fact (and this might be a useful tip for executive recruitment agencies), we used the latest thinking in talent acquisition and “consumerized” our hiring process. The result was a generation Z hire (born after 2000) who displays none of the uptight characteristics normally associated with a CFO. We believe he’s a real cool cat.

His background is unknown (background checking was something of an issue), but then why adopt traditional hiring techniques when you are seeking to be transformational? Having said that, he is street-wise and knows what he wants out of his career and life in general.

In terms of daily routine, he turns up for work at about 07.30 in the morning. He commences his duties in corporate stress reduction by welcoming each employee with a purr, and following a breakfast (we presume his second breakfast) he purrs even more. Purring actually has healing properties – it makes the human heart-rate slow down, it lowers blood pressure and stress, and it boosts the immune system, enabling humans to better cope with the day-to-day tasks. So, a truly transformational impact in the role of the CFO.

We thought you might be interested to find out more about our CFO (Chief Feline Officer) Leo’s typical day in the office, so we have outlined this below as an example to all organizations considering taking this approach:

 

07.30: Leo arrives at work, bright eyed and bushy tailed, looking forward to his second breakfast.

 

07.31: “Let me in please……I want my second breakfast.” He’s keen.

 

07.35: “That’s better……I feel I can start the day how I mean to go on.”

 

07.45: “Now what shall I do?......That’s a nice photograph of me……I’m quite handsome, aren’t I?” Who said CFOs were posers?

 

08:00: Zzzzzz

 

11.30: “Is it lunch time yet?......Oh well, I might go back to sleep!”

 

14.00: Zzzzzzzzzzzzzzzz

 

15:00: “Not impressed with the task list for today……anyway, we’ve gone digital, so I don’t need this paper, but it’s great to sit on!  Maybe I’ll get some more sleep.”

 

17:30: Zzzzzzzzzzzzzzzzzzzzzz

 

18:30: Time to leave, but this guy is a workaholic. “The only way you're going to get me out of the door is to feed me some more cat biscuits!”

 

We hope that this brief case study shows you how you might adapt the role of the CFO within your own organization to achieve a truly transformational impact on corporate well-being.

]]>
<![CDATA[BearingPoint: Innovating and Partnering to Reach €1bn Revenue]]> NelsonHall recently attended BearingPoint’s 2017 analyst summit in Le Village by CA, Paris. A key takeaway from the event was BearingPoint’s ability to partner with its clients and its partner network to drive innovation.

Innovation Hubs

Crédit Agricole, one of its largest clients in France, set up Le Village by CA in 2014 as an innovation hub to nurture startups, and not just in fintech. BearingPoint has contributed to the establishment and operation of the innovation village since CA Group’s initial reflections on the project. Using a network of partners including the likes of BearingPoint, Microsoft, HPE, IBM, Orange, and Philips, the village hosts ~100 startups with solutions inside and outside the financial services space.

The center is in prime real-estate in Paris, surrounded by the headquarters of many of France’s largest enterprises. This proximity and the partnerships allow the start-ups to gain unprecedented access to these organizations. In some cases, through the coaching provided by the partners and this access, start-ups have saved six months in sales development.

Each month, start-ups pitch to the partners for the ability to enter the village, with pitches lasting a tightly defined 13 minutes: 5 minutes of pitch, 5 for Q&A, and 3 for decision making. Successful start-ups receive the ability to rent space in the village at half price, plus support & mentoring through the expertise of CA and the partners for up to two years. BearingPoint highlights the importance of this mentoring: ‘incubating is not enough, now you need coaching’. BearingPoint’s contribution includes start-up mentoring and support programs, and offering participation in think tanks and other events. Through this coaching the center aims for ~50-60% start-up success, with BearingPoint and other partners offering co-creation of solutions.

Since its inception CA has established 8 such villages, supporting 237 startups across France, and plans to open 30 more villages in 2017.

Global Alliance Network

BearingPoint’s global alliance network includes ABeam Consulting, Grupo ASSA, and West Monroe Partners. Each alliance partner adds geographic coverage; ABeam Consulting in APAC, primarily Japan; BearingPoint in EMEA; Grupo ASSA in Latin America; West Monroe Partners in North America. The global alliance network has 10.2k FTEs across 35 countries with revenues of ~$1.5bn (of which BearingPoint $655m).

The global alliance network promotes ‘working as one group’, and is co-developing centers as a group effort. It intends to open a mobile innovation center, starting in Vienna hosting ~10 start-ups, and thereafter in APAC.

Each alliance member owns the relationships with clients headquartered in its geography, leveraging the global alliance capabilities. The first cross-alliance contract is Continental, where BearingPoint owns the relationship, with resources from West Monroe, Grupo ASSA, and ABeam providing SAP integration services across their respective geographies.

The global alliance network will help BearingPoint serve its clients with global operations – its target 2020 revenue for RoW is €65m.

Operating Model

BearingPoint’s operating model is designed to support innovation across its Consulting, Solutions and Ventures organizational structure.

BearingPoint runs shark-tanks twice yearly out of innovation hubs such as Le Village by CA to choose innovative companies to invest in for its Ventures unit.

With this operating model, in 2016 BearingPoint generated 420 new ideas from its consultants, drawing on their experiences in client engagements, created 11 accelerators that have generated €47m in new sales, and spun off its first venture: blockchain technology for financial institutions, acquired by Digital Asset Holdings.

This model has been a success; global alliance partner ABeam is working on implementing a similar incubator platform in 2017/18.

BearingPoint’s 2020 ambitions include €1bn in revenues and a 15% EBIT margin. Three key elements of BearingPoint’s growth strategy are:

  • Demonstrating its global reach through its global alliance network
  • The solution business adding €100m to its current €139m revenue
  • M&A to fill gaps in the portfolio and to support growth in strategic areas. This week, for example, BearingPoint announced the acquisition of LCP Consulting to strengthen its presence in the retail sector.
]]>
<![CDATA[Wipro & Automation Anywhere: Extending Beyond Rule-Based RPA into New Digital Business Process Models]]>

 

Wipro began partnering with Automation Anywhere in 2014. Here I examine the partnership, looking at examples of RPA deployments across joint clients, at how the momentum behind the partnership has continued to strengthen, and at how the partners are now going beyond rule-based RPA to build new digital business process models.

Partnership Already has 44 Joint Clients

Wipro initially selected Automation Anywhere based on the flexibility and speed of deployment of the product and the company’s flexibility in terms of support. The two companies also have a joint go-to-market, targeting named accounts with whom Wipro already has an existing relationship, plus key target accounts for both companies. 

To date, Wipro has worked with 44 clients on automation initiatives using the Automation Anywhere platform, representing ~70% of its RPA client base. Of these, 17 are organizations where Wipro was already providing BPS services, and 27 are clients where Wipro has assisted in-house operations or provided support in applying RPA to processes already outsourced to another vendor.

In terms of geographies, Wipro’s partnership with Automation Anywhere is currently strongest in the Americas and Australia. However, Automation Anywhere has recently been investing in a European presence, including the establishment of a services and support team in the U.K., and the two companies are now focusing on breaking into the major non-English-speaking markets in Continental Europe.

So let’s look at a few examples of deployments.

For an Australian telco, where Wipro is one of three vendors supporting the order management lifecycle, Wipro had ~330 FTEs supporting order entry to order provisioning. Wipro first applied RPA to these process areas, deploying 45 bots, replacing 143 FTEs. The next stage looked across the order management lifecycle. Since the three BPS vendors were handling different parts of the lifecycle, an error or missing information at one stage would result in the transaction being rejected further downstream. In order to eliminate, as far as possible, exceptions from the downstream BPS vendors, Wipro implemented "checker" bots which carry out validation checks on each transaction before it is released to the next stage in the process, sending failed transactions and the reasons for failure back to the processing bots for reprocessing and, where appropriate, the collection of missing information. This reduced the number of kick-backs by 73%.
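The “checker bot” pattern described above can be illustrated with a minimal sketch: validate each order against a set of rules before releasing it downstream, and return failed transactions with the reasons for failure so they can be reprocessed. The rules and field names below are illustrative assumptions, not the actual telco implementation.

```python
# Illustrative "checker bot": validate orders before downstream hand-off and
# route failures back with reasons. Fields and rules are made-up examples.
REQUIRED_FIELDS = ["customer_id", "service_type", "address", "contract_term"]

def check_order(order: dict) -> list:
    """Return a list of validation failures (an empty list means release downstream)."""
    failures = [f"missing {f}" for f in REQUIRED_FIELDS if not order.get(f)]
    if order.get("contract_term") not in (12, 24, 36, None):
        failures.append("invalid contract_term")
    return failures

def route(order: dict):
    failures = check_order(order)
    if failures:
        # back to the processing bots, with the reasons for failure
        return ("reprocess", failures)
    return ("release_downstream", [])

print(route({"customer_id": "C1", "service_type": "fibre", "address": "", "contract_term": 12}))
# -> ('reprocess', ['missing address'])
```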

Other clients where Wipro has used Automation Anywhere in RPA implementations include:

  • A U.S. bank, automating bank account statement reconciliation (~94% time-saving), the bounce-back process (~60% time-saving), and account activation and day 2 check (~50% time-saving)
  • A U.S.-based clothing manufacturer, automating journal processing: auto-selecting the template, auto data entry into the document, auto emails with offer, and auto data entry into ERP. Led to a 38% FTE reduction
  • A steel manufacturing company, automating invoice processing. Led to 50% FTE reduction
  • A European network equipment provider: order management (supporting CDR creation, invoice, OD creation, order entry, & order entry changes), achieving a 41% productivity improvement; also procurement across P2P and MDM processes, a 40% productivity improvement
  • A U.K. based telco, automating order management; achieved a £1.4m cost reduction and an 80% reduction in wait time
  • A multi-national medical devices company: automating 10 processes within P2P; replaced 61 FTEs and produced a ~13% productivity benefit.

Using The Partnership to Enhance Speed-to-Benefit within Rule-Based Processes

The momentum behind the partnership has continued to strengthen, with Wipro achieving a number of significant wins in conjunction with Automation Anywhere over the past three months, including a contract which will result in the deployment of in excess of 100 bots within a single process area over the next 6 months. In the last quarter, as organizations begin to scale their RPA roll-outs, Wipro has seen average RPA deal sizes increase from 25-40 bots to 75-100 bots.

Key targets for Wipro and Automation Anywhere are banking, global media & telecoms, F&A, and increasingly healthcare, and Wipro has recently been involved in discussions with new organizations across the manufacturing, retail, and consumer finance sectors in areas such as F&A, order management, and industry-specific processing.

Out of its team of ~450 technical and functional RPA FTEs (~600 FTEs if we include cognitive), Wipro has ~200 FTEs dedicated to Automation Anywhere implementations. This concentration of expertise is assisting Wipro in enhancing speed-to-benefit for clients, particularly in areas where Wipro has conducted multiple assignments, for example in:

  • Banking: payments, new account opening, and account maintenance
  • Insurance: accounts reconciliation and policy servicing
  • Capital markets: payable charges, reporting trading, verifications, trade settlements, static data maintenance.

Overall, Wipro has ~400 curated and non-curated bots in its library. This has assisted in halving implementation cycle times in these areas, to around four weeks.

Wipro also perceives that the ease of deployment and debugging of Automation Anywhere, facilitated by the structuring of its platform into separate orchestration and task execution bots, is another factor that has helped enhance speed-to-benefit.

Wipro’s creation of a sizeable team of Automation Anywhere specialists means it has the bandwidth to respond rapidly to new opportunities and to initiate new projects within 1-2 weeks.

Speed of support to architecture queries is another important factor both in architecting in the right way and in speed-to-market. Around a third (~100) of Automation Anywhere personnel are within its support and services organization, providing 24X7 support by phone and email, and ensuring a two-day resolution time. This is of particular importance to Wipro in support of its multi-geography RPA projects.

Extending the Partnership: Tightening Integration between Automation Anywhere & Wipro Platforms to Build New Digital Business Process Models

In addition to standard rule-based RPA deployments of Automation Anywhere, Wipro is also increasingly:

  • Using Automation Anywhere bots to handle unstructured data (currently deployed with ~25% of clients), and to provide process analytics
  • Integrating Automation Anywhere with Wipro platforms such as Base))), its BPM process interaction design and execution platform, via the API functionality within Automation Anywhere’s metabots. To date, Wipro has eight clients where it is using Automation Anywhere bots combined with its Base))) platform, in processes such as A/P, order management, A/R, and GL.

In an ongoing development of the partnership, Wipro will use Automation Anywhere cognitive bots to complement Wipro HOLMES, using Automation Anywhere for rapid deployments, probably linked to OCR, and HOLMES to support more demanding cognitive requirements using a range of customized statistical techniques for more complicated extraction and understanding of data and for predictive analytics.
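A hypothetical sketch of this division of labour is shown below: route rule-based extractions through an RPA/OCR path and escalate low-confidence or unstructured items to a cognitive service. The function names are placeholders for illustration only, not real Automation Anywhere or HOLMES APIs.

```python
# Hypothetical routing sketch: rule-based extraction first, escalate harder
# cases to a cognitive layer. Function names are placeholders, not product APIs.
def run_rpa_bot(document: str):
    """Placeholder for an OCR + rule-based extraction bot; returns fields and a confidence score."""
    return {"amount": "1,200.00"}, 0.62

def run_cognitive_service(document: str):
    """Placeholder for a cognitive/ML extraction service handling harder cases."""
    return {"amount": "1,200.00", "counterparty": "ACME Corp"}

def extract(document: str, confidence_threshold: float = 0.85):
    fields, confidence = run_rpa_bot(document)
    if confidence < confidence_threshold:
        # escalate the low-confidence case to the cognitive layer
        return run_cognitive_service(document)
    return fields

print(extract("scanned invoice text ..."))
# -> falls through to the cognitive path because confidence is below the threshold
```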

Accordingly, Wipro is strengthening its partnership with Automation Anywhere both to deliver tighter execution of rule-based RPA implementations and as a key platform component in the creation of future digital business process models.

]]>
<![CDATA[Unisys’ Evolution Continues with Security, Industry-Specific IP & Analytics at the Core]]>

 

NelsonHall recently attended a Unisys analyst and advisor day in London. The tagline for the UnisysNOW event was Securing Your Tomorrow, with security taking a central role in the firm’s corporate transformation, along with a vertical-led go-to-market strategy, and analytics. Here I take a look at Unisys’ transformation to date, and at its future strategic direction.

Unisys financial performance

Unisys has undergone a number of turnaround programs over the last ten years; its last CEO refreshed the leadership and sales teams, delayered the organization, simplified the portfolio, and reduced the expense structure. There has been a major reduction in the global field-force, with NelsonHall estimating this has almost halved over recent years to ~4k personnel.

The current CEO Peter Altabef, who joined in January 2015, has reorganized Unisys into a vertical-led go-to-market organization and introduced a stronger focus on industry IP, supported by globally integrated delivery teams. In April 2015, the company introduced a further cost reduction plan to drive a more competitive cost structure and rebalance its global skill set. This achieved $205m in annualized cost savings exiting 2016, above its original plan of $200m.

Unisys’ revenues in 2016 were $2.82bn, a constant currency decline of 4.4%, at the upper end of guidance, and non-GAAP operating margin was 7.7%. However, the Services business (~85% of overall revenues) has recorded revenue decline for the last three years. The ongoing focus is to improve the margins of the Services business, and of the overall business, through operational efficiencies and improvements to working capital and free cash flow.

However, margin expansion at the corporate level is dependent on Technology revenue and ClearPath licenses, which tend to be lumpy on a quarterly basis. Unisys also has a major challenge over the next few years with its underfunded U.S. defined benefit pension plan.

Leadership and salesforce refresh

Altabef has built an entirely new senior leadership team, with hires from the likes of Accenture, IBM, Capgemini and Dell Services, bringing in outside experience and helping evolve the Unisys culture, and over a quarter of the company’s client execs have been replaced. There is a stronger focus on proactive proposals.

Looking to demonstrate its innovation capabilities with clients, Unisys has launched a dedicated center in its Virginia HQ for exec briefings about new services. Of the first six exec briefings, four were with clients in Australia.

Consultative-led sales approach

Unisys is also looking to significantly ramp-up its consulting and consultative-led sales engagements (advisory & project work). It has launched a new sales enhancement training program focusing on advisory and consulting skills across the firm, and has also hired 54 domain experts.

The aim is to drive outcome-based client conversations, focusing on business issues, with a drive to increase overall client satisfaction.

It has also introduced a new compensation plan, where a third of account executive compensation is now tied directly to client NPS scores.

Vertical-led go-to-market strategy, leveraging IP to target market opportunities

Unisys has aligned its sales executives to four verticals: Public Sector, Travel & Transportation, Life Sciences & Healthcare, and Financial Services. These are supported by 36 vertical-specific solutions and services such as Digital Investigator (an upgraded version of U-LEAF, Unisys Law Enforcement Application Framework) in the public sector, Pharma Track & Trace in life sciences, and Elevate by Unisys in financial services.

Unisys has deployments of Digital Investigator across EMEA, the Caribbean, and Australia. Unisys is likely to see further expansion of its Law Enforcement and Border Protection capabilities across EMEA, in particular the U.K. (post-Brexit); there may also be opportunities in the U.S.

Elevate by Unisys, its omni-channel digital banking platform, has built-in biometrics and is offered as-a-Service (via public or private cloud) or as an on-premise solution. Launched in Q1 2017, it is being targeted at tier 2 banks in EMEA first and then in APAC and LATAM.

Reinforced by security

As highlighted by the tagline, a key focus of Unisys continues to be security, in particular the Unisys Stealth assets. Altabef highlighted that “(we) don’t enter every conversation with Stealth, (we) enter with the offering and what security makes sense in the context of the offering”.

Unisys has evolved Stealth from dedicated on-premise software, and it is now available on AWS and Azure. Stealth functionality now includes Stealth(mobile), Stealth(cloud), Stealth(identity), and Stealth(analytics). The recently launched Stealth(aware) automates the implementation of Stealth. Stealth revenues are still relatively small (NelsonHall estimated revenue of ~$12m in 2016), but the product now has a dedicated sales team.

Unisys also highlighted its plans for ClearPath Forward, including the ability to deploy ClearPath Forward in hybrid cloud environments by 2020.

Improving the Services business

Unisys is looking to increase its use of automation and AI across EUC, infrastructure managed services, and application services. Initiatives currently underway in support of this include:

  • Within C&IS, Unisys is expanding automation with toolsets including VantagePoint and ServiceNow, driving more self-help, and also the use of Tech Cafes. Unisys cited an example within infrastructure support, where it currently deals with ~16k tickets per month, and is looking to automate 75-80% of these tickets
  • In application services, Unisys is looking to grow its transformational consulting, next-generation ADM, and DevOps capabilities. It has made a number of recent hires (agile transformation leads). As well as increasing automation, including the use of virtual agents and chatbots in service delivery, it is also leveraging analytics within application maintenance.

Unisys is also seeking to further leverage its advanced data analytics capabilities, which are currently focused on its main client, the Department of Homeland Security, where it claims to have ~300 personnel running >700k predictive models daily to identify threats. Unisys plans to ramp up its capabilities in advanced data analytics, and is hiring a number of data scientists and consultants.

Summary

Unisys continues on its corporate transformation, and has selected security, industry-specific IP, and analytics as focus areas for growth. The transition to a consultative and advisory-led sales culture, and the ramping up of domain experts across industries, will take time.

Altabef highlighted EMEA (~27% of 2016 revenues) as a region of focus, and this has been evidenced with a number of recent senior hires in the EMEA region. Restructuring continues in EMEA: the divestment last year of its Italian SAP practice helped to improve EMEA operating margin.

The early signs are that Unisys continues to improve its overall financial stability.

In the near to mid-term, improving margins of the Services business will continue to be a priority for the company.

NelsonHall has recently published an updated Key Vendor Assessment on Unisys, providing a comprehensive and objective analysis of Unisys’ IT and business process services offerings, capabilities, and market and financial strength. NelsonHall also produces Quarterly Updates on Unisys in its Quarterly Vendor Update program. For details, contact [email protected].

]]>
<![CDATA[NelsonHall’s Blogging Year: A Selection From 2016]]> NelsonHall analysts are regular bloggers, and while you might be familiar with a number of them, you might not be aware of the full range of topics that NelsonHall analysts blog about. We thought it was an opportune time to look back and pick out just a few of the many blog articles produced last year from different corners of NelsonHall research to give readers a flavour of the scope of our coverage.

 

 

We continue to keep abreast of unfolding developments in RPA and cognitive intelligence. In October and November, John Willmott wrote a sequence of three handy blogs on RPA Operating Model Guidelines:

Turning to Andy Efstathiou and some of his musings on FinTech and RPA developments in the Banking sector:

Regarding developments in Customer Management Services:

Fiona Cox and Panos Filippides have been keeping an eye on BPS in the Insurance sector. Two of their blogs looked at imminent vendor M&A activity:

Blogs in the HR Outsourcing domain have included innovation in RPO, and in employee engagement, learning at the beginning of the employee life cycle, talent advisory and analytics services, employer branding, improving the candidate experience, benefits administration and global benefits coverage, cloud-based HR BPS, and more! Here’s a couple on payroll services, so often an overlooked topic, that you might have missed:

Dominique Raviart continues to keep a close eye on developments in Software Testing Services. For example:

Dominique also keeps abreast of unfolding developments in the IT Services vendor landscape. For example, in November he wrote about Dell Services: the Glue for "One NTT DATA" In North America.

Staying with IT Services, David McIntire:

Meanwhile, Mike Smart has been blogging about IoT. Here are two of his earlier ones:

And Rachael Stormonth continues to consider the significance of unfolding developments in the larger and more interesting IT Services and BPS vendors:

That’s just a small sample of the wide-ranging themes and hot topics covered by NelsonHall blog articles in our trademark fact-based, highly insightful style.

Keep up with the latest blogs from these and other NelsonHall analysts throughout 2017 here, and sign up to receive blog and other alerts by topic area, or update your preferences, here.

]]>
<![CDATA[RPA Operating Model Guidelines, Part 3: From Pilot to Production & Beyond – The Keys to Successful RPA Deployment]]>

As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).

This is the third and final blog in a series presenting key guidelines for organizations embarking on an RPA project, covering project preparation, implementation, support, and management. Here I take a look at the stages of deployment, from pilot development, through design & build, to production, maintenance, and support.

Piloting & deployment – it’s all about the business

When developing pilots, it’s important to recognize that the organization is addressing a business problem and not just applying a technology. Accordingly, organizations should consider how they can make a process better and achieve service delivery innovation, and not just service delivery automation, before they proceed. One framework that can be used in analyzing business processes is the ‘eliminate/simplify/standardize/automate’ approach.

While organizations will probably want to start with some simple and relatively modest RPA pilots to gain quick wins and acceptance of RPA within the organization (and we would recommend that they do so), it is important as the use of RPA matures to consider redesigning and standardizing processes to achieve maximum benefit. So begin with simple manual processes for quick wins, followed by more extensive mapping and reengineering of processes. Indeed, one approach often taken by organizations is to insert robotics and then use the metrics available from robotics to better understand how to reengineer processes downstream.

For early pilots, pick processes where the business unit is willing to take a ‘test & learn’ approach, and live with any need to refine the initial application of RPA. Some level of experimentation and calculated risk taking is OK – it helps the developers to improve their understanding of what can and cannot be achieved from the application of RPA. Also, quality increases over time, so in the medium term, organizations should increasingly consider batch automation rather than in-line automation, and think about tool suites and not just RPA.

Communication remains important throughout, and the organization should be extremely transparent about any pilots taking place. RPA does require a strong emphasis on, and appetite for, management of change. In terms of effectiveness of communication and clarifying the nature of RPA pilots and deployments, proof-of-concept videos generally work a lot better than the written or spoken word.

Bot testing is also important, and organizations have found that bot testing is different from waterfall UAT. Ideally, bots should be tested using a copy of the production environment.

Access to applications is potentially a major hurdle, with organizations needing to establish virtual employees as a new category of employee and give the appropriate virtual user ID access to all applications that require a user ID. The IT function must be extensively involved at this stage to agree access to applications and data. In particular, they may be concerned about the manner of storage of passwords. What’s more, IT personnel are likely to know about the vagaries of the IT landscape that are unknown to operations personnel!

Reporting, contingency & change management key to RPA production

At the production stage, it is important to implement an RPA reporting tool to:

  • Monitor how the bots are performing
  • Provide an executive dashboard with one version of the truth
  • Ensure high license utilization.

There is also a need for contingency planning to cover situations where something goes wrong and work is not allocated to bots. Contingency plans may include co-locating a bot support person or team with operations personnel.

The organization also needs to decide which part of the organization will be responsible for bot scheduling. This can either be overseen by the IT department or, more likely, the operations team can take responsibility for scheduling both personnel and bots. Overall bot monitoring, on the other hand, will probably be carried out centrally.

It remains common practice, though not universal, for RPA software vendors to charge on the basis of the number of bot licenses. Accordingly, since an individual bot license can be used in support of any of the processes automated by the organization, organizations may wish to centralize an element of their bot scheduling to optimize bot license utilization.
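
To illustrate the arithmetic behind that scheduling decision, here is a minimal sketch (our own illustration, with hypothetical process names and figures, not any vendor's tool) of how pooled bot license utilization might be estimated:

```python
# Minimal sketch (our illustration): estimating pooled bot license utilization.
# All process names and figures are hypothetical.

scheduled_hours = {          # bot-hours of work scheduled per day, per process
    "claims_indexing": 30,
    "invoice_matching": 22,
    "order_entry": 11,
}

licenses = 4                 # bot licenses purchased
hours_per_license = 24       # a licensed bot can in principle run around the clock

total_demand = sum(scheduled_hours.values())
capacity = licenses * hours_per_license
utilization = total_demand / capacity

print(f"Demand: {total_demand} bot-hours, capacity: {capacity} bot-hours")
print(f"License utilization: {utilization:.0%}")
# Centralized scheduling lets any license serve any process, so utilization is
# measured across the pool rather than per process, which is the argument for
# centralizing an element of bot scheduling.
```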

At the production stage, liaison with application owners is very important to proactively identify changes in functionality that may impact bot operation, so that these can be addressed in advance. Maintenance is often centralized as part of the automation CoE.

Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December

NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.

Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.

Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on RPA, with your buy-side peers sharing their RPA experiences. To find out more, contact Matthaus Davies.  

This is the final blog in a three-part series. See also:

Part 1: How to Lay the Foundations for a Successful RPA Project

Part 2: How to Identify High-Impact RPA Opportunities

]]>
<![CDATA[Dell Services: Complementing FTEs with Proprietary AFTE Technology]]> This is the fourth in a series of blogs on vendors’ RPA initiatives in the insurance sector.

We now turn our attention to Dell Services, which has adopted an automation focus across its life and healthcare insurance BPS processes.

Focusing on healthcare payer & provider and life insurance process automation

In 2016, life insurance accounted for around 30% of Dell Services’ overall BPS revenues and healthcare payer for approximately 35%, with healthcare provider making up the balance. Dell Services takes a platform-led approach to its BPS:

  1. It has its own LifeSys platform for life insurance, on to which it migrates a client’s book of business and provides administration services in its own environment; or

  2. It partners with a third-party supplier for platform capability and tailors it to fit the needs of the book of business, from which it can then provide services, e.g. Dell Services uses partner ikaSystems for its healthcare payer platform needs, on top of which it layers its Dell Business Process Management Suite (DBPMS) tools. The tools include:
    • An enterprise dashboard: including KPI tracking and trend analysis for SLA metrics
    • Client extranet: including an issues log
    • Queue management: including skill-set based routing and priority allocation.

Automation Ideation led by BPS delivery teams

Unlike other providers, which tend to be led by their clients with respect to automation, the process at Dell Services starts with an internal ‘ideas generation’ stage. Ideas are put forward either through Dell’s ‘LEAP’ (Listen, Engage, Act, Progress) portal, where agents can log ideas together with their perceived benefits (and are rewarded if their ideas are selected), or via the Business Process Improvement (BPI) team, which carries out a ‘click study’ to identify ways in which a process could be re-engineered or automated. In line with its peers, an internal concern about increasing automation was the inevitable change in job composition; for this reason, the LEAP portal is considered particularly important in ensuring employees are involved and engaged in driving the initiative forward. In addition, supervisors carry an annual 5%-15% AFTE target. Once an idea has been selected, a feasibility study takes place before the idea is tested and bots are deployed by the central AFTE automation team. Bot management is then passed to the operations team, while the bots are monitored through the central bot command center.

Balancing AFTEs with FTEs

In line with the market, Dell Services has concentrated its efforts on applying automation to high volume processes, which account for ~30% to 35% of its overall book of business. To achieve this, it is targeting the introduction of ~300-400 AFTEs year on year, though this is not a static number since clients are on-boarded throughout the year. The overall aim is to achieve around 6% productivity improvement per annum.

Although Dell Services does use third-party RPA platforms, it has developed its own “AFTE” platform incorporated within the Dell Business Process Management Platform. AFTE bots rather than third-party bots are typically deployed where the Dell BPMS platform is already being used or is to be used.

High volume processes (in which AFTEs are being used to varying extents) within each of Dell Services’ insurance services include:

  • Life insurance:
    • Data entry and indexing: freeing up FTEs to carry out other activities such as policy holder services where less work is typically carried out by AFTEs – though this is something that Dell is looking to change and where Dell is investing in automation initiatives
    • Policy issuing: currently, the work is handled 50% by FTEs and 50% by AFTEs, with Dell seeking greater tool maturity before it is able to drive greater automation here
    • Premium accounting
  • Healthcare payer:
    • In-bound calls: FTE-led
    • Adjustments: FTE-led
    • Adjudication: 50% AFTE, 50% FTE
    • Claim processing: FTE-led
    • Member enrolment: FTE-led
    • Provider maintenance: 50% AFTE, 50% FTE
  • Healthcare provider:
    • File download: exclusively AFTE
    • Medical coding: 50% AFTE, 50% FTE
    • Change entry: FTE-led
    • Payment posting: AFTE-led
    • Credit balancing: 50% AFTE, 50% FTE
    • Accounts receivables: FTE-led

A simple example illustrating the quantifiable benefits achieved through automation is the work to automate call center operations at one of Dell’s life insurance clients. Prior to the introduction of automation, call center agents were required to use a number of screens to capture customer information, which often resulted in comparatively low accuracy and high handling times. The system was not user-friendly, baseline training typically took around 10 weeks, and ultimately SLAs were being missed. To address this, Dell condensed the numerous screens into one screen and introduced rule-based processes so that no manual calculations were required to complete the form, where previously up to six manual calculations were required. As a result, AHT fell from 471 seconds to 374 seconds, and training took ~7 weeks as opposed to 10. The quality of data capture increased from 88% to 95%, and the average time taken to update notes fell from 110 seconds to 15 seconds, because the system was largely able to perform updates itself.
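
Taking the reported figures at face value, the relative improvements can be computed directly; the short calculation below is purely illustrative and assumes nothing beyond the numbers quoted above.

```python
# Illustrative calculation of the relative improvements from the figures above.
before_after = {
    "average handling time (seconds)": (471, 374),
    "baseline training (weeks)": (10, 7),
    "time to update notes (seconds)": (110, 15),
}

for metric, (before, after) in before_after.items():
    reduction = (before - after) / before
    print(f"{metric}: {before} -> {after} ({reduction:.0%} reduction)")

# Data capture quality is an increase, not a reduction:
quality_before, quality_after = 0.88, 0.95
print(f"data capture quality: {quality_before:.0%} -> {quality_after:.0%} "
      f"(+{(quality_after - quality_before) * 100:.0f} percentage points)")
```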

Plans to Implement Machine Learning within Dell BPM Platform

Over the last four years, Dell has extended its capabilities from simple script-based processing to the development of AFTEs, including an associated AFTE command center. Going forward, the intention is to incorporate a self-learning capability, implement technologies such as NLP and machine learning within the Dell BPMS platform, and secure end-to-end automation in the processes that are already largely being carried out by AFTEs, e.g. indexing.

]]>
<![CDATA[RPA Operating Model Guidelines, Part 2: How to Identify High-Impact RPA Opportunities]]>

 

As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).

This is the second in a series of blogs presenting key guidelines for organizations embarking on an RPA project, covering project preparation, implementation, support, and management. Here I take a look at how to assess and prioritize RPA opportunities prior to project deployment.

Prioritize opportunities for quick wins

An enterprise-level governance committee should be involved in the assessment and prioritization of RPA opportunities, and this committee needs to establish a formal framework for project/opportunity selection. For example, a simple but effective framework (a minimal scoring sketch follows the list below) is to evaluate opportunities based on their:

  • Potential business impact, including RoI and FTE savings
  • Level of difficulty (preferably low)
  • Sponsorship level (preferably high).
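
As noted above, here is a minimal scoring sketch of such a framework; the weights, ratings, and candidate opportunities are invented for illustration and are not prescriptive.

```python
# Hypothetical sketch of the selection framework described above.
# Each opportunity is rated 1-5 on impact, difficulty, and sponsorship;
# difficulty is inverted so that easier opportunities score higher.

WEIGHTS = {"impact": 0.5, "ease": 0.3, "sponsorship": 0.2}   # assumed weights

opportunities = {
    "invoice data entry":  {"impact": 4, "difficulty": 2, "sponsorship": 5},
    "claims adjudication": {"impact": 5, "difficulty": 4, "sponsorship": 3},
    "report distribution": {"impact": 2, "difficulty": 1, "sponsorship": 4},
}

def score(opportunity):
    ease = 6 - opportunity["difficulty"]    # invert 1-5 difficulty into ease
    return (WEIGHTS["impact"] * opportunity["impact"]
            + WEIGHTS["ease"] * ease
            + WEIGHTS["sponsorship"] * opportunity["sponsorship"])

ranked = sorted(opportunities.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, ratings in ranked:
    print(f"{name}: {score(ratings):.1f}")
```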

The business units should be involved in the generation of ideas for the application of RPA, and these ideas can be compiled in a collaboration system such as SharePoint prior to their review by global process owners and subsequent evaluation by the assessment committee. The aim is to select projects that have a high business impact and high sponsorship level but are relatively easy to implement. As is usual when undertaking new initiatives or using new technologies, aim to get some quick wins and start at the easy end of the project spectrum.

However, organizations also recognize that even those ideas and suggestions that have been rejected for RPA are useful in identifying process pain points, and one suggestion is to pass these ideas to the wider business improvement or reengineering group to investigate alternative approaches to process improvement.

Target stable processes

Other considerations that need to be taken into account include the level of stability of processes and their underlying applications. Clearly, basic RPA does not readily adapt to significant process change, and so, to avoid excessive levels of maintenance, organizations should only choose relatively stable processes based on a stable application infrastructure. Processes that are subject to high levels of change are not appropriate candidates for the application of RPA.

Equally, it is important that the RPA implementers have permission to access the required applications from the application owners, who can initially have major concerns about security, and that the RPA implementers understand any peculiarities of the applications and know about any upgrades or modifications planned.

The importance of IT involvement

It is important that the IT organization is involved, as their knowledge of the application operating infrastructure and any forthcoming changes to applications and infrastructure need to be taken into account at this stage. In particular, it is important to involve identity and access management teams in assessments.

Also, the IT department may well take the lead in establishing RPA security and infrastructure operations. Other key decisions that require strong involvement of the IT organization include:

  • Identity security
  • Ownership of bots
  • Ticketing & support
  • Selection of RPA reporting tool.

Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December

NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.

Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.

Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on sourcing topics, including the impact of RPA. The next RPA webinar will be held later this month: to find out more, contact Guy Saunders.  

In the third blog in the series, I will look at deploying an RPA project, from developing pilots, through design & build, to production, maintenance, and support.

]]>
<![CDATA[RPA Operating Model Guidelines, Part 1: Laying the Foundations for Successful RPA]]>

 

As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).

This is the first in a series of blogs presenting key guidelines for organizations embarking on RPA, covering establishing the RPA framework, RPA implementation, support, and management. First up, I take a look at how to prepare for an RPA initiative, including establishing the plans and frameworks needed to lay the foundations for a successful project.

Getting started – communication is key

Essential action items for organizations prior to embarking on their first RPA project are:

  • Preparing a communication plan
  • Establishing a governance framework
  • Establishing an RPA center of excellence
  • Establishing a framework for allocation of IDs to bots.

Communication is key to ensuring that use of RPA is accepted by both executives and staff alike, with stakeholder management critical. At the enterprise level, the RPA/automation steering committee may involve:

  • COOs of the businesses
  • Enterprise CIO.

Start with awareness training to get support from departments and C-level executives. Senior leader support is key to adoption. Videos demonstrating RPA are potentially much more effective than written papers at this stage. Important considerations to address with executives include:

  • How much control am I going to lose?
  • How will use of RPA impact my staff?
  • How/how much will my department be charged?

When communicating to staff, remember to:

  • Differentiate between value-added and non value-added activity
  • Communicate the intention to use RPA as a development opportunity for personnel. Stress that RPA will be used to facilitate growth, to do more with the same number of people, and give people developmental opportunities
  • Use the same group of people to prepare all communications, to ensure consistency of messaging.

Establish a central governance process

It is important to establish a strong central governance process to ensure standardization across the enterprise, and to ensure that the enterprise is prioritizing the right opportunities. It is also important that IT is informed of, and represented within, the governance process.

An example of a robotics and automation governance framework established by one organization was to form:

  • An enterprise robotics council, responsible for the scope and direction of the program, together with setting targets for efficiency and outcomes
  • A business unit governance council, responsible for prioritizing RPA projects across departments and business units
  • An RPA technical council, responsible for RPA design standards, best practice guidelines, and principles.

Avoid RPA silos – create a centre of excellence

RPA is a key strategic enabler, so use of RPA needs to be embedded in the organization rather than siloed. Accordingly, the organization should consider establishing an RPA center of excellence, encompassing:

  • A centralized RPA & tool technology evaluation group. It is important not to assume that a single RPA tool will be suitable for all purposes and also to recognize that ultimately a wider toolset will be required, encompassing not only RPA technology but also technologies in areas such as OCR, NLP, machine learning, etc.
  • Best practices and standards, such as naming conventions, to be applied in RPA across processes and business units
  • An automation lead for each tower, to manage the RPA project pipeline and priorities for that tower
  • IT liaison personnel.

Establish a bot ID framework

While establishing a framework for allocation of IDs to bots may seem trivial, it has proven not to be so for many organizations where, for example, including ‘virtual workers’ in the HR system has proved insurmountable. In some instances, organizations have resorted to basing bot IDs on the IDs of the bot developer as a short-term fix, but this approach is far from ideal in the long-term.
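
As a purely hypothetical sketch of what a lightweight bot ID framework might look like (it does not reflect any specific RPA product or HR system), dedicated virtual-worker identities can be issued and tracked separately from the developers who build the bots:

```python
# Hypothetical sketch: issuing dedicated IDs to bots ("virtual workers")
# rather than reusing a developer's credentials.
import itertools

class BotRegistry:
    """Issue and track dedicated IDs for virtual workers, separate from human user IDs."""

    def __init__(self, prefix="BOT"):
        self.prefix = prefix
        self._seq = itertools.count(1)
        self._bots = {}

    def register(self, process, owning_team):
        bot_id = f"{self.prefix}-{next(self._seq):04d}"
        self._bots[bot_id] = {"process": process, "owner": owning_team,
                              "applications": set()}
        return bot_id

    def grant_access(self, bot_id, application):
        # Application access is granted per bot ID, so it can be audited and
        # revoked without touching any developer's personal credentials.
        self._bots[bot_id]["applications"].add(application)

    def record(self, bot_id):
        return self._bots[bot_id]


registry = BotRegistry()
bot = registry.register(process="invoice data entry", owning_team="F&A operations")
registry.grant_access(bot, "ERP")
print(bot, "->", registry.record(bot))
```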

Organizations should also make centralized decisions about bot license procurement, and here the IT department which has experience in software selection and purchasing should be involved. In particular, the IT department may be able to play a substantial role in RPA software procurement/negotiation.

Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December

NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.

Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.

Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on sourcing topics, including the impact of RPA. The next RPA webinar will be held in November: to find out more, contact Matthaus Davies.  

 

In the second blog in this series, I will look at RPA need assessment and opportunity identification prior to project deployment.

 

]]>
<![CDATA[NelsonHall Launches Seven NEAT Vendor Evaluations in Summer 2016 as Support for Sourcing Buyers Continues Apace]]> A bumper summer for NEAT vendor evaluation projects across HRO, Insurance BPS, Banking BPS, CMS, and IT Services. Plus news of NelsonHall's Buyer Intelligence Group.

Many businesses experience a lull during holiday season, but for NelsonHall, summer 2016 has been non-stop, with the release of seven NelsonHall Vendor Evaluation & Assessment Tool (NEAT) projects in June, July, and August. NEATs are an essential strategic tool used by sourcing managers in assessing vendor capability, and help cut the time and cost of sourcing projects by 50%.

Three HRO NEATs were published (RPO, Multi-Process HRO, and Cloud-Based HR Services), along with NEATs for Life, Annuities & Pensions BPS, Retail Banking BPS, CMS in Retail & CPG, and Software Testing. In total, 120 vendor evaluations were made for these projects, based on in-depth interviews with the vendors and a selection of their clients. This resulted in a total of 26 NEAT evaluation graphs showing vendor capability overall and within specific areas of focus (e.g. digital capability, transformation, cost reduction, revenue increase, etc.).

NelsonHall's Speed-to-Source methodology

NEAT tools are a part of NelsonHall's "Speed-to-Source" methodology. The tool sits at the front-end of the vendor screening process and consists of a two-axis model: assessing vendors against their "ability to deliver immediate benefit" to clients and their "ability to meet future client requirements". Vendors are scored against a wide range of criteria, establishing a number of scenarios, each representing a different business situation or client business need.

Furthermore, NEAT tools enable buy-side organizations to input their own weightings and tailor the NEAT dataset to their specific requirements across all the vendor evaluation criteria. Using the interactive web-based tools, sourcing managers can configure the NEAT evaluations in accordance with their own priorities and business requirements for service offerings, delivery capability, customer presence, benefits achieved, and other criteria.
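
By way of illustration of the general arithmetic behind a buyer-weighted, two-axis evaluation of this kind, the sketch below uses invented criteria, weights, and scores; it is not NEAT data or the NEAT calculation itself.

```python
# Illustrative only: buyer-weighted scoring on two axes of the kind described above.
# Criteria, weights, and vendor scores are invented; they are not NEAT data.

criteria = {
    # criterion: (axis it contributes to, buyer-supplied weight)
    "delivery capability":   ("immediate benefit", 0.6),
    "benefits achieved":     ("immediate benefit", 0.4),
    "innovation roadmap":    ("future requirements", 0.7),
    "investment in digital": ("future requirements", 0.3),
}

vendor_scores = {
    "Vendor A": {"delivery capability": 4.2, "benefits achieved": 3.8,
                 "innovation roadmap": 3.5, "investment in digital": 4.0},
}

def axis_scores(scores):
    totals, weights = {}, {}
    for criterion, (axis, weight) in criteria.items():
        totals[axis] = totals.get(axis, 0.0) + weight * scores[criterion]
        weights[axis] = weights.get(axis, 0.0) + weight
    return {axis: totals[axis] / weights[axis] for axis in totals}

for vendor, scores in vendor_scores.items():
    print(vendor, axis_scores(scores))   # position on the two NEAT-style axes
```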

To find out more about NelsonHall's NEAT vendor evaluations, and to arrange a demonstration, contact Guy Saunders.

NelsonHall has also been busy building its Buyer Intelligence Group (BIG), an exclusive place for senior sourcing decision-makers to share best practice, promote thought leadership around hot industry topics, and seek expert advice. The community consists of experienced sourcing executives only, invited from leading buy-side organizations across industries. NelsonHall is currently presenting a series of webinars for community members on Robotic Process Automation, with our next event on September 20th. 

To find out more about NelsonHall's Buyer Intelligence Group, and to apply for membership, buy-side sourcing professionals can contact Matthaus Davies.

]]>
<![CDATA[Genpact Assists Client in Targeting 10x Process Improvement, Applying Design Thinking to Order Management]]>

Within its Lean Digital approach, Genpact is using digital and design thinking (DT) to assist organizations in identifying and addressing what is possible rather than just aiming to match current best-in-class, a concept now made passé by new market entrants.

At a recent event hosted at Genpact’s new center in Palo Alto, one client speaker described Genpact’s approach to DT. The company, a global consumer goods giant, had set up a separate unit within its large and mature GBS organization with a remit to identify major disruptions, with a big emphasis on “major”. It set a target of 10x improvement (rather than, say, 30%) to ensure thinking differently about activities, in order to achieve major changes in approach, not simply incremental improvements within existing process frameworks. The company already had mature best-of-breed processes and was being told by shared service consultants that the GBS organization merely needed to continue to apply more technology to existing order management processes. However, the company perceived a need to “do over” its processes to target fundamental and 10x improvements rather than continue to enhance the status quo.

The establishment of a separate entity within the GBS organization to target this level of improvement was important in order to put personnel into a psychological safety zone separated from the influence of existing operations experts, existing process perceived wisdom, and a tendency to be satisfied with incremental change. The unit then mapped out 160 processes and screened them for disruption potential, using two criteria to identify potential candidates:

  • Are relevant disruptive technologies available and sufficiently mature now? Technologies ruled out at this stage included IoT and virtual customer service agents (the latter because they felt a 1% error rate was unacceptable in a commercial process)
  • Does the company have the will to disrupt the process?

The exercise identified five initial areas for disruption with one of these being order management.

On order management, the company then sought external input from an organization that could contribute both subject matter expertise and DT capability. And Genpact, not an existing supplier to the GBS organization for order management, provided a team of 5-10 dedicated personnel supported by a supplementary team of ~30 personnel.

The team undertook an initial workshop of 2-3 days followed by a 6-8 week design thinking and envisioning journey. The key principles here were “to fall in love with the problem, not the solution”, with the client perceiving many DT consultancies as being too ready to lock-in to a (preferred) solution too early in the DT exercise, and to use creative inputs, not experts. In this case, personnel with experience in STP in capital markets were introduced in support of generating new thinking, and it was five weeks into the DT exercise before the client’s team was introduced to possible technologies.

This DT exercise identified two fundamental principles for changing the nature of order management:

  • “No orders/no borders”, questioning the idea of whether an order was really necessary, instead viewing order management as a data problem, one that involves identifying the timing of replenishment based on various signals including those from retailers
  • The concept of the order management agent as an ‘orchestrator’ rather than a ‘doer’, with algorithms being used for basic ‘information directing’ rather than agents.

This company identified the key criteria for selecting a design thinking partner to be a service provider that:

  • Has the courage (and insight) to disrupt themselves and destroy their own revenue
  • Will spend a long time on the problem and not force a favored solution.

Genpact claims to be ready to cannibalize its own revenue (as do, indeed, all BPS providers we have spoken to – the expected quid pro quo being that the client outsources other activities to them). However, in this example, the order management “agents” being disrupted consist of 200-300 in-house client FTEs and 400-500 FTEs from other BPS service providers, so there is no immediate threat to Genpact revenues.

The Real Impact of RPA/AI is Still Some Way Off

Clearly the application of digital, RPA and AI technologies is going to have a significant impact on the nature of BPS vendor revenues in future, and, of course, on commercial models. However, at present, the level of revenue disruption facing BPS vendors is being limited by:

  • Organizations typically seeking 6%-7% cost reduction per annum, rather than higher, truly disruptive, targets
  • Genpact’s own estimate that ~80% of clients currently reject disruptive value propositions.

Nonetheless, organizations are showing considerable interest in concepts such as Lean Digital. Genpact CEO ‘Tiger’ Tyagarajan says he has been involved in 79 CEO meetings (to discuss digital process transformation/disruptive propositions as a result of the company’s lean digital positioning) in 2016 compared to fewer than 10 CEO meetings in the previous 11 years.

Order Management an Activity Where Major Disruption Will Occur

Finally, this example (one of several that we have seen) illustrates that order management, which tends to have significant manual processing and to be client or industry-specific, is becoming a major target for the disruptive application of new digital technologies. 

**********************

See also Genpact Combining Design Thinking & Digital Technologies to Generate Digital Asset Utilities by Rachael Stormonth, published this week here.

]]>
<![CDATA[Genpact Combining Design Thinking & Digital Technologies to Generate Digital Asset Utilities]]>

 

Genpact recently hosted an advisor/analyst session at its new Innovation Center in Palo Alto. So why is a BPO specialist opening a center in Silicon Valley? Genpact is using the center as a hub for the development (and showcasing) of platform-based digital assets aimed at mimicking the industry disruptors by being based on standardized, simplified operating models and distributed technology that can be deployed at web scale as utilities. The center is also being used to house design thinking (DT) workshops, and as a co-innovation Lab.

Genpact views its role as bringing domain knowledge and understanding of process and then leveraging DT and digital technologies, and emphasizes it is prepared to destroy existing BPO revenues in the process. It recognizes that operating to traditional upper quartile and best-in-class standards is no longer adequate as organizations look to compete with new forms of digitally-based competition. The emphasis in Genpact’s digital strategy is to continue to focus on middle- and back-office processes but to reorganize these processes with greater emphasis on straight through processing (STP) and on real time insights. So, while Genpact continues to put process before technology, the company is focusing on the application of 12 key technologies, specifically cloud/SaaS, mobility, dynamic workflow, advanced visualization, RPA, machine learning, cognitive computing, NLP, NLG, IoT, data analytics, and autonomic computing.

In general, Genpact is partnering with, rather than acquiring, companies with these technologies, though it has acquired PNMsoft due to the critical importance of dynamic workflow technology as a backbone for new digital processes, and it also continues to invest in IT services companies with expertise in applying these technologies, hence its acquisition of Endeavour Technologies to strengthen its mobile capability. Key technology partners who attended the session are:

  • PNMsoft, an Israel-headquartered provider of workflow, BPM and case management (HotOperations) software and Microsoft Gold partner, recently acquired by Genpact. Though Genpact does not yet have experience of using PNMsoft solutions with a client, it will have been attracted by the ability of software such as PNMsoft Sequence to enable organizations to establish multiple versions of workflows for a particular process, e.g. loan origination, and judge the impact of moving work between each of these workflows in terms of process SLAs and cost. This approach also means organizations can test multiple workflows and change/optimize processes without service interruption
  • Automation Anywhere, combining cognitive, analytics, and RPA technology
  • Rage Frameworks, developing “intelligent machines” for a platform-based approach to knowledge work. Seven of Rage Frameworks’ current 16 “intelligent machine” platforms are being developed in conjunction with Genpact
  • Arria, a small U.K. based Natural Language Generation specialist.

Genpact is increasingly emphasizing its role in assisting organizations in creating new digital business models, and is looking to build digital assets based on combinations of the 12 technologies. Pilots and live examples demonstrated at the Innovation Center included:

  • Insurance policy servicing automation. Here the utility service on offer takes policy change feeds from a variety of channels, potentially including portals, emails, contact centers via use of voice-to-text technology, and uses NLP technology to identify appropriate data and update the policy administration system (PAS). Where exceptions occur, it can send details to an agent or request more information automatically, generating a NLG response to the originator. This utility is currently live at two Genpact insurance BPS clients
  • Using neural chat in support of opening corporate banking accounts. This incorporates use of OCR technology to extract data from images of documents taken via smartphone, using neural chat to present details of missing information to both the agent and the customer simultaneously for validation
  • Wind turbine predictive maintenance. In this example, live since Q3 2015, sensor data is being used to monitor a fleet of 5,000 wind turbines to identify the failure rate of parts, including the likelihood of part failure in a particular location and month (taking into account the age of parts, the location of the part, and the impact of the weather). This data is then combined with information from the ERP system to identify the potential cost of a part failure in terms of lost production and wider damage, together with the cost of holding individual parts, to optimize the number of parts held by individual location. It has enabled the client to move away from scheduled seasonal maintenance to a more predictive response to maintenance. The client has benefited from increased uptime (more revenues) and a significant reduction in maintenance costs. This approach is clearly applicable in areas such as cable network management and aircraft maintenance (a toy sketch of the underlying stocking arithmetic follows this list)
  • Use of NLG in support of management reporting to produce a commentary on the underlying data and charts, aiming to produce one version of the truth and focusing the interpretation of the data and graphs onto the most salient points, thereby avoiding individual managers coming to differing conclusions from the same data
  • Reimagination of the quotes process for an electrical distributor. In the current process, the distributor receives electrical blueprints from which it manually extracts SKU-based orders to enter into its order management system. This typically takes around three hours per blueprint. Genpact has developed a pilot incorporating computer vision to input the blueprint, NLP to identify table data within the image, and machine learning to enhance the identification and interpretation of table data, which can then be input into the order management system. In this instance, the supervised learning exercise in support of the machine learning involved ~12 personnel over a 6-week period. This example, which is not yet live, illustrates Genpact looking at areas of a client’s operations that could be completely digitized; its proposed solution eliminates all manual processing.
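
To make the stocking logic implied by the wind turbine example concrete, the toy calculation below (our own illustration, with invented probabilities and costs) weighs the expected cost of a part failure at a location against the cost of holding a spare there:

```python
# Toy illustration (invented figures) of the stocking decision implied by the
# wind turbine example: hold a spare locally only where the expected cost of a
# stock-out exceeds the cost of holding the part.

locations = {
    # location: (monthly failure probability for the part,
    #            cost of a failure with no spare on site: lost production + wider damage)
    "coastal_site": (0.08, 40_000),
    "inland_site":  (0.02, 25_000),
}

HOLDING_COST_PER_MONTH = 1_200   # assumed cost of holding one spare locally

for site, (p_fail, stockout_cost) in locations.items():
    expected_stockout_cost = p_fail * stockout_cost
    hold_spare = expected_stockout_cost > HOLDING_COST_PER_MONTH
    print(f"{site}: expected stock-out cost {expected_stockout_cost:,.0f}/month "
          f"-> {'hold spare locally' if hold_spare else 'rely on central stock'}")
```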

Some of these examples are not yet in production, but all are evidently transformational. Clearly, Genpact is at an early stage in its development of digital assets, and the technologies and the technology vendors with which it works will evolve considerably. But it is clearly investing in Lean Digital Innovation in earnest: with examples such as the reimagination of the quotes process for an electrical distributor, Genpact has come up with an offering that is very different from its legacy in BPO services: presumably the commercial model, which I was told has yet to be finalized, will be transaction-based.

Among the strengths of Genpact’s approach with Lean Digital are its:

  • Emphasis on using combinations of emerging technologies rather than on single technologies to create point-based digital assets
  • Domain knowledge and understanding of process (its heritage): the approach starts with a focus on transforming a business process rather than with the application of automation/AI
  • Evident understanding that emerging technology companies just want to try some pilots with real clients rather than spend time on contractual arrangements and joint marketing.

********************

See also Genpact Assists Client in Targeting 10x Process Improvement, Applying Design Thinking to Order Management by John Willmott, published this week here.

]]>
<![CDATA[Tech Mahindra Explains its Pininfarina Majority Stake Acquisition]]> NelsonHall recently had a discussion with the head of the Integrated Engineering Services (IES) unit of Tech Mahindra regarding its 2016 acquisition of Pininfarina (along with parent company Mahindra & Mahindra). Turin-based Pininfarina is an icon in the world of automotive design, having serviced the likes of Ferrari for 80 years. It is a company of relatively limited size (€87m in 2015 revenues and a headcount of 650).

The acquisition gave Pininfarina an enterprise value of €92m (including €50m in net debt). So why did Tech Mahindra want to expand into design and styling services? What are the synergies between style and design services and IES? Why did Tech Mahindra pay such a high price? And how will it turn around Pininfarina? From our conversations with Tech Mahindra, we gathered the following.

  1. This acquisition is about gaining further scale in automotive and strengthening relationships with automotive OEMs in Europe. Tech Mahindra had been looking for three years to make an acquisition in the key market of Germany. The acquisition is about being able to cross-sell IES to the client base of Pininfarina
  2. Automotive is a strategic sector for Tech Mahindra: manufacturing is its second largest vertical after telecom and it has 25 automotive clients. It derives one third of its manufacturing sector revenues from IES, with automotive accounting for a third of its IES revenues
  3. Pininfarina is likely to return to profitability: its lack of profitability in the past eight years is partly a reflection of its small series vehicle manufacturing, now discontinued, and the resulting spare parts business (predicted to end by 2020). In addition, Tech Mahindra wants to grow Pininfarina by taking it into new geographies, with China a priority, followed by India and eventually Russia and Brazil. Tech Mahindra will also help drive the client base expansion that Pininfarina had initiated into the train equipment and luxury boat industries (currently involving 100 personnel)
  4. Pininfarina provides more than style and design services. The company already derives ~60% of its service revenues from IES, servicing German clients (30% of service revenue) and, to a lower extent, Italian automotive OEMs (20%)
  5. Pininfarina’s style and design capabilities will remain local and onshore but are a niche area: Tech Mahindra estimates that style and design services account for 15% of spending on services related to new vehicle programs and 5% for existing program refreshes, with IES accounting for the remaining 85% and 95% respectively. Tech Mahindra will maintain local onshore teams but believes there is vast room for expansion of Indian delivery
  6. Within IES, Pininfarina brings in capabilities around vehicle body and power train, as well as interior design. This complements Tech Mahindra’s existing capabilities in the automotive sector, which include:
  • Vehicle Engineering and Vehicle Electronics
  • Connected Vehicle solutions and services around ADAS, Infotainment, etc.
  • Software engineering, embedded systems including infotainment
  • New IT-based offerings such as IT systems for car sharing schemes, and IoT (e.g. electric cars services, identifying parking spaces available).

Overall, Tech Mahindra was reassuring on the Pininfarina acquisition. Several question marks remain (the pricing, and the structure of the deal, with dual governance from Tech Mahindra and Mahindra & Mahindra). Yet the deal highlights Tech Mahindra’s continued expansion in the key automotive sector for its IES activities, adding design and styling services to its engineering capabilities, and a more comprehensive delivery model now extended to Italy and Germany.

]]>
<![CDATA[TCS Leapfrogging RPA & as-a-Service with Neural Automation & Services-as-Software]]> Much of the current buzz in the industry continues to be centered on RPA, a term currently largely synonymous with automation, and this technology clearly has lots of life left in it, for a few years at least. Outside service providers, where its adoption is rapidly becoming mature, RPA is still at the early growth stage in the wider market: while a number of financial services firms have already achieved large-scale roll-outs of RPA, others have yet to put their first bot into operation.

RPA is a great new technology and one that is yet to be widely deployed by most organizations. Nonetheless, RPA fills one very specific niche and remains essentially a band-aid for legacy processes. It is tremendous for executing on processes where each step is clearly defined, and for implementing continuous improvement in relatively static legacy process environments. However, RPA, as TCS highlights, does have the disadvantages that it fails to incorporate learning and can really only effectively be applied to processes that undergo little change over time. TCS also argues that RPA fails to scale and fails to deliver sustainable value.

These latter criticisms seem unfair in that RPA can be applied on a large scale, though frequently scale is achieved via numerous small implementations rather than one major implementation. Similarly, provided processes remain largely unchanged, the value from RPA is sustained. The real distinction is not scalability but the nature of the process environment in which the technology is being applied.

Accordingly, while RPA is great for continuous improvement within a static legacy process environment where processes are largely rule-based, it is less applicable for new business models within dynamic process environments where processes are extensively judgment-based. New technologies with built-in learning and adaptation are more applicable here. And this is where TCS is positioning Ignio.

TCS refers to Ignio as a “neural automation platform” and as a “Services-as-Software” platform, the latter arguably a much more accurate description of the impact of digital on organizations than the much-copied Accenture “as-a-Service” expression.

TCS summarizes Ignio as having the following capabilities:

  • “Sense”: ability to assimilate and mine diverse data sources, both internal and external, both structured and unstructured (via text mining techniques)
  • “Think”: ability to identify trends & patterns and make predictions and estimate risk
  • “Act”: execute context-aware autonomous actions. Here TCS could potentially have used one of the third-party RPA software products, but chose to go with its own software instead
  • “Learn”: improving its knowledge on a continuous basis and self-learning its context.
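
As a schematic illustration only (our own sketch, not TCS’s implementation), a sense-think-act-learn cycle of the kind described above might look like the following when applied to batch-job run times, one of the areas in which Ignio is used today:

```python
# Schematic sketch (ours, not TCS's code) of a sense-think-act-learn loop
# applied to batch-job run times. All figures are invented.

history = {"nightly_billing": [52, 55, 51, 58]}   # past run times in minutes

def sense(job):                        # assimilate operational data
    return history[job]

def think(runs, threshold=1.25):       # spot trends and predict risk
    baseline = sum(runs) / len(runs)
    return runs[-1] > threshold * baseline, baseline

def act(job, at_risk):                 # take a context-aware action
    if at_risk:
        print(f"{job}: predicted overrun, raising incident and re-sequencing batch")
    else:
        print(f"{job}: within expected range")

def learn(job, new_run):               # update knowledge continuously
    history[job].append(new_run)

runs = sense("nightly_billing")
at_risk, baseline = think(runs)
print(f"baseline {baseline:.0f} min, latest run {runs[-1]} min")
act("nightly_billing", at_risk)
learn("nightly_billing", 57)
```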

TCS Ignio, like IPsoft Amelia, began life as a tool for supporting IT infrastructure management, specifically datacenter operations. TCS Ignio was launched in May 2015 and is currently used by ten organizations, including Nationwide Building Society in the U.K. All ten are using Ignio in support of their IT operations, though the scope of its usage remains limited at present, with Ignio being used within Nationwide in support of batch performance and capacity management. Eventually the software is expected to be deployed to learn more widely about the IT environment and predict and resolve IT issues, and Ignio is already being used for patch and upgrade management by one major financial services institution.

Nonetheless, despite its relatively low level of adoption so far within IT operations, TCS is experiencing considerable wider interest in Ignio and feels it should strike while the iron is hot and take Ignio out into the wider business process environment immediately.

The implications are that the Ignio roll-out will be rapid (expect to see the first public example in the next quarter) and will take place domain by domain, as for RPA, with initial targeted areas likely to include purchase-to-pay and order-to-cash within F&A and order management-related processes within supply chain. In order to target each specific domain, TCS is pre-building “skills” which will be downloadable from the “Ignio store”. One of the initial implementations seems likely to be supporting a major retailer in resolving the downstream implications of delivery failures due to causes such as traffic accidents or weather-related incidents. Other potential supply chain-related applications cited for Ignio include:

  • Customer journey abandonment
  • The profiling, detection, and correction of check-out errors
  • Profiling, detecting, and correcting anomalies in supplier behavior
  • Detection of customer feedback trends and triggering corrective action
  • Profiling and predicting customer behavior.

Machine learning technologies are receiving considerable interest right now and TCS, like other vendors, recognizes that rapid automation is being driven faster than ever before by the desire for competitive survival and differentiation, and in response is adopting a “if it can be automated, it must be automated” stance. And the timescales for implementation of Ignio, cited at 4-6 weeks, are comparable to that for RPA. So Ignio, like RPA, is a relatively quick and inexpensive route to process improvement. And, unlike many cognitive applications, it is targeted strongly at industry-specific and back office processes and not just customer-facing ones.

Accordingly, while RPA will remain a key technology in the short-term for fixing relatively static legacy rule-based processes, next generation machine learning-based “Services-as-Software” platforms such as Ignio will increasingly be used for judgment-based processes and in support of new business models. And TCS, which a year ago was promoting RPA, is now leading with its Ignio neural automation-based “Services-as-Software” platform.

]]>
<![CDATA[WNS: Targeting Domain Superiority Across Personnel & Platforms]]> NelsonHall has recently attended both the WNS analyst session in New York and Infosys Confluence in San Francisco. It may seem trite to suggest that their chosen locations reflect the differing approaches to the market being taken by the two firms, but there are some parallels.

Sector domain focus

While Infosys is taking a horizontal approach to taking emerging technologies to the next level, WNS regards sector domain expertise as its key differentiator. Accordingly, while Infosys has moved delivery into a separate horizontal delivery organization, WNS continues to organize by vertical across both sales and delivery and looks to offer its employees careers as industry domain experts – it views its personnel as being sector experts and not just experts in a particular horizontal.

The differing philosophies of the two firms are also reflected in their approach to technology. WNS brought in a CTO nine months ago and now has a technology services organization. But where Infosys is building tools that are applicable cross-domain, WNS is building platforms and BPaaS solutions that address specific pain points within targeted sectors.

Location strategy

And while both firms are seeking (like all service providers) to achieve non-linear revenue growth, they have markedly different location strategies. Infosys is placing its bets on technology and largely leaving its delivery footprint unchanged, whereas WNS is increasingly taking its delivery network to tier 2 cities, not just in India but also within North America, Eastern Europe and South Africa, to continue to combine the cost benefits of labor arbitrage with those of technology. This dual approach should assist WNS in providing greater price-competitiveness and protection for its margins in the face of industry-wide pressure from clients for enhanced productivity improvement. Despite the industry-wide focus and investment on automation, the levels of roll-out of automation across the industry have typically been insufficient to outstrip pricing declines to generate non-linear revenue growth, and will remain so in the short-term, so location strategy still has a role to play.

At the same time, WNS is fighting hard for the best talent within India. For example, the company:

  • Is taking part in a marketing campaign in India to persuade leading graduates (and their families), who may have increasingly been thinking that the BPS industry was not for them, that they can build an exciting career in BPS as domain experts
  • Has announced a 2-year “MBA in Business Analytics” program in India developed in conjunction with NIIT. The program is aimed at mathematically strong graduates with 3-4 years of experience, who spend their first year in training, and their second year working on assignments with WNS. The program is delivering 120 personnel to WNS.

Industry domain credentials & technology strategy

Like a number of its competitors, WNS is increasingly focused on assisting organizations in adopting advanced digital business models that will offer them protection “not just from existing competitors but from competitors that don’t yet exist”.  In particular, WNS is strengthening its positioning both in verticals where it is well established, such as insurance and the travel sector, and also in newer target sectors such as utilities, shipping & logistics, and healthcare, aiming to differentiate both with domain-specific technology and with domain-specific people. The domain focus of its personnel is underpinned not just by its organizational structure but also, for example, by the adoption of domain universities.

Accordingly, WNS is investing in digital frameworks, AI models, and “assisting clients in achieving the art of the possible” but within a strongly domain-centric framework. WNS’ overall technology strategy is strongly focused on domain IP, and combining this domain IP with analytics and RPA. WNS sees analytics as key; it has won a number of recent engagements leading with analytics, and is embedding analytics into its horizontal and vertical solutions as well as offering analytics services on a standalone basis. It currently has ~2,500 FTEs deployed on research and analytics, of whom ~1,600 are engaged on “pure” analytics.

But the overriding theme for WNS within its target domains is a strong focus on domain-specific platforms and BPaaS offerings, specifically platforms that digitalize and alleviate the pain points left behind by the traditional industry solutions, and this approach is being particularly strongly applied by WNS in the travel and insurance sectors.

In the travel sector, WNS offers platforms, often combined with RPA and analytics, in support of:

  • Revenue recovery, through its Verifare fare audit platform
  • Disruption management, through its RePax platform
  • Proration, through its SmartPro platform.

It also offers a RPA-based solution in support of fulfilment, and Qbay in support of workflow management.

In addition, WNS is making bigger bets in the travel sector, investing in larger platform suites in the form of its commercial planning suite, including analytics in support of sales, code shares, revenue management, and loyalty. The emphasis is on reducing revenue leakage for travel companies, and in assisting them in balancing enhanced customer experience with their own profitability.

The degree of impact sought from these platforms is shown by the fact that WNS views its travel sector platforms as having ~$30m revenue potential within three years, though the bulk of this is still expected to come from the established Verifare revenue recovery platform.

WNS’ platforms and BPaaS offerings for the insurance sector include:

  • Claims eAdjudicator, an RPA-based solution built on Fusion for classifying incoming insurance claims into categories such as ‘no touch’, ‘light touch’, ‘high touch requiring attention of senior personnel’, and ‘potentially fraudulent’ by combining information from multiple sources. WNS estimates that this tool can deliver a 70% reduction in support FTEs (a purely hypothetical routing sketch follows this list)
  • Broker Connect, a mobile app supporting broker self-service.
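
As referenced above, the routing sketch below is purely hypothetical: it illustrates the kind of touch-based triage described, with invented rule thresholds, and is not WNS’s actual eAdjudicator logic.

```python
# Hypothetical sketch (not WNS's logic) of routing claims into the touch
# categories described above, using invented thresholds.

def triage(claim):
    """claim: dict with 'amount', 'data_complete', and 'fraud_score' (0-1)."""
    if claim["fraud_score"] > 0.8:
        return "potentially fraudulent"
    if claim["data_complete"] and claim["amount"] < 1_000:
        return "no touch"            # straight-through processing
    if claim["amount"] < 10_000:
        return "light touch"
    return "high touch"              # senior handler review

claims = [
    {"id": "C1", "amount": 450,    "data_complete": True,  "fraud_score": 0.05},
    {"id": "C2", "amount": 7_200,  "data_complete": False, "fraud_score": 0.10},
    {"id": "C3", "amount": 48_000, "data_complete": True,  "fraud_score": 0.92},
]

for claim in claims:
    print(claim["id"], "->", triage(claim))
```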

In addition, WNS offers two approaches to closed block policy servicing:

  • InsurAce, a desktop aggregation tool supporting unified processing across the range of legacy platforms
  • A BPaaS service based on the LIDP Titanium platform, which supports a wide range of life and annuity product types. This offering is also targeted at companies seeking to introduce new products, channels, or territories, and at spin-offs and start-ups.

The BPaaS service is underpinned by WNS’ ability to act as a TPA across all states in the U.S.

WNS’ vertical focus is not limited to traditional industry-specific processes. The company has also developed 10 industry-specific F&A services, with 50%+ industry-specific scope in F&A, with the domain-specific flavor principally concentrated within O2C.

In summary

Both Infosys and WNS are enhancing their technology and people capabilities with the aim of assisting organizations in implementing next-generation digital business models. However, while Infosys is taking the horizontal route of developing new tools with cross-domain applicability and encouraging staff development via design thinking, WNS’ approach is strongly vertical-centric, developing domain-specific platforms and personnel with strong vertical knowledge and loyalties. So, two different approaches and differing trajectories, but with the same goal and no single winning route.

]]>
<![CDATA[HCL Increasing BPaaS Emphasis to Target Debt Reduction & Smart Metering Realization in the Utilities Sector]]> The utilities sector is an important one for HCL. The company has worked with ~100 utilities and currently has 25 active utilities clients, with its utilities revenue primarily from the U.S., U.K., and Australia.

To focus further on this sector, HCL has enhanced its capability to offer end-to-end and BPaaS-based services through a number of partnerships and is now going to market targeting four principal pain points for utilities, namely:

  • Utility bills being a low payment priority for customers, leading to ongoing high levels of debt within the utilities sector
  • The need to introduce AMI and smart metering in the water and electricity industries
  • The increasing importance of enhancing customer engagement, particularly in moving to a multi-channel environment
  • Back-office processes that are misaligned either with the utility’s front office or with its roadmap for back-office transformation.

Debt Management

Starting with debt management, HCL has carried out consulting assignments around debt for a number of utilities, with the emphasis on assisting utilities in preventing their customers from falling into debt, e.g. by making sure that billing is correct. In addition to consulting services, HCL offers end-to-end debt collection services, and its current utilities debt management clients include:

  • Early stage to late stage debt collection for a U.K. water utility
  • Collections for an Australian utility, based on debt purchasing
  • Working with the largest electricity retailer in the U.K. to reduce its business debt by ~40%.

For the U.K. water utility, HCL:

  • Sends a soft reminder once the due date has been exceeded (a simple illustrative sketch of this kind of dunning logic follows the list)
  • Manages customer payment plans
  • Manages payment by charities on the customer’s behalf
  • Makes sure that the customer has a good understanding of their bill.
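As a minimal sketch only – not HCL’s actual collections logic – the early-stage dunning rules described above (a soft reminder once the due date is exceeded, escalating only if the account remains unpaid) might look something like this, with illustrative day thresholds and field names:

```python
# Hypothetical sketch of rules-based dunning of the kind described above:
# a soft reminder shortly after the due date, escalating only if the
# account remains unpaid. Day thresholds and actions are illustrative.

from datetime import date


def next_collection_action(due_date, paid, on_payment_plan, today=None):
    """Decide the next collections step for a single account."""
    today = today or date.today()
    if paid:
        return "no action"
    if on_payment_plan:
        return "monitor payment plan"

    days_overdue = (today - due_date).days
    if days_overdue <= 0:
        return "no action"                      # not yet due
    if days_overdue <= 14:
        return "send soft reminder"             # gentle nudge after due date
    if days_overdue <= 60:
        return "offer payment plan or charity support"
    return "escalate to late-stage collections"


if __name__ == "__main__":
    print(next_collection_action(date(2016, 5, 1), paid=False,
                                 on_payment_plan=False, today=date(2016, 5, 10)))
    # -> "send soft reminder"
```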

Building on this capability, HCL has partnered with a European-based collections software and field services company in order to offer collections as an end-to-end BPaaS service, and it will increasingly seek to take collections to market in this form.

Smart Metering

Here, HCL has carried out smart metering realization for four utilities in the U.S. and has consulted in this area in the U.K., helping a utility identify what types of customer to target and how to manage campaigns to increase adoption of smart meters.

HCL’s offering for smart metering consists of an end-to-end service from planning & forecasting through data cleaning & enhancement, customer contact planning & contact, engineer scheduling & co-ordination, smart meter installation, updating supplier records & processing meter exchange, to first bill production, exceptions management, and collections and bill shock management.

To enhance its end-to-end offering for smart meter introduction & management, HCL has partnered with:

  • A global field services company, to cover installation of smart meters and drive-by meter reading for utilities
  • A meter-to-cash automation software provider.

Enhancing Customer Engagement

In customer engagement, HCL offers multi-channel coverage across the customer lifecycle. In particular, HCL is assisting utilities in deflecting as much functionality as possible to web self-service, e.g. working with a U.S. utility to ensure that home moves and meter reads can be largely handled through the utility’s self-service portal, increasing self-service adoption by ~7%. In support of this customer engagement offering, HCL has partnered with:

  • A global document management company to provide inbound and outbound document management services including mail room services
  • Transera, for omni-channel contact center software.

In particular, the partner software assists HCL in identifying the root cause of customer calls so that the appropriate functionality can be provided on the website to deflect those calls to self-service. Overall, HCL aims to ensure that the customer receives a standardized and consistent response across channels, while deflecting to self-service whenever possible.
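A minimal sketch of the underlying idea – tallying tagged call reasons to see which contact types are the best candidates for self-service deflection – might look like the following; the reason codes and the "deflectable" set are assumptions, not Transera functionality:

```python
# Illustrative sketch only: tallying tagged call reasons to identify
# which contact types to target for web self-service deflection.
# The reason codes and the DEFLECTABLE set are assumptions.

from collections import Counter

DEFLECTABLE = {"home move", "submit meter read", "balance enquiry"}


def deflection_candidates(call_log, top_n=3):
    """Return the most frequent call reasons that could move to self-service."""
    counts = Counter(reason for reason in call_log if reason in DEFLECTABLE)
    return counts.most_common(top_n)


if __name__ == "__main__":
    calls = ["home move", "complaint", "submit meter read",
             "home move", "balance enquiry", "home move"]
    print(deflection_candidates(calls))
    # -> [('home move', 3), ('submit meter read', 1), ('balance enquiry', 1)]
```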

Aligning the Back-Office and Front-Office

HCL’s final theme is aligning the back-office with the front-office. For example, for a U.K. water utility, HCL has implemented a workflow that establishes specialist queues, so that tasks are automatically queued to the right person (e.g. a home move specialist or a billing exceptions specialist). For this utility, HCL guaranteed a reduction in bad debt of $8m and reduced the cost of debt collection by 40% over five years, while reducing the number of customer disputes by 50%.
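A minimal sketch of this kind of specialist-queue routing, with hypothetical task types and queue names rather than HCL’s actual workflow configuration, might look like this:

```python
# Minimal sketch of workflow routing to specialist queues of the kind
# described above. Task types and queue names are illustrative.

SPECIALIST_QUEUES = {
    "home_move": "home_move_specialists",
    "billing_exception": "billing_exception_specialists",
    "payment_dispute": "disputes_team",
}


def route_task(task: dict) -> str:
    """Place a task on the queue matching its type, or a general queue."""
    return SPECIALIST_QUEUES.get(task.get("type"), "general_queue")


if __name__ == "__main__":
    print(route_task({"type": "billing_exception", "account": "A123"}))
    # -> "billing_exception_specialists"
```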

Back-office offerings from HCL for the utilities sector include exceptions processing, new account processing, sales maximization investigations, provisioning & processing of optant meter readings, property searches, and handling business utility accounts.

‘Engage’ Service Delivery Model

In addition to targeting these specific pain points, HCL also offers ‘Engage’, an integrated service delivery model for utilities, suitable for smaller utilities and start-ups that require a complete utility solution covering both services and infrastructure, typically operating on a per-subscriber pricing model. ‘Engage’ includes:

  • SAP implementation, building on HCL’s extensive prior experience with SAP and SAP HANA in the utilities sector
  • Data center management
  • Business process services
  • R&D services.

HCL has delivered this complete service for a U.S. utility, using a BPaaS customer and billing platform built on SAP, SAP HANA, OpenText, and Genesys, with per customer account pricing.

How HCL Differentiates BPS for Utilities

HCL differentiates its BPS services for the utilities sector around six capabilities:

  • ‘Employees First’ culture
  • Its utilities domain expertise and experience in SAP and Oracle for Utilities
  • HCL Toscana automation engine, incorporating document management, workflow, RPA, & analytics
  • Analytics capabilities
  • ‘Utili-Best’ benchmarking framework
  • ‘Wow & Waste’ framework.

For example, HCL Toscana has delivered 20% cost savings for a U.K. water utility over five years by improving the accuracy and transparency of its workflow. HCL Toscana now incorporates RPA and, going forward, HCL perceives that RPA can be applied to the meter-to-cash cycle for utilities to assist in areas such as the following (a sketch of the first of these appears after the list):

  • Identifying the most accurate reading where multiple readings have been submitted for a customer, based on consumption trends, to generate a believable bill
  • Associating the correct meter with the correct customer where input errors have occurred
  • Correcting errors where consumption levels appear too high or too low
  • Checking that prepay customers are on the correct tariff.
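To illustrate the first of these, one simple (and purely hypothetical) rule is to pick the submitted reading whose implied consumption sits closest to the customer’s historical trend; the field names and the "closest to expected" rule are assumptions, not HCL Toscana’s actual logic:

```python
# Hypothetical sketch of the first automation idea above: where several
# readings have been submitted for the same period, pick the one whose
# implied consumption is closest to the customer's historical trend.

def most_plausible_reading(candidate_readings, previous_reading, expected_consumption):
    """Return the candidate reading implying consumption closest to trend."""
    def implied_consumption(reading):
        return reading - previous_reading

    return min(
        candidate_readings,
        key=lambda r: abs(implied_consumption(r) - expected_consumption),
    )


if __name__ == "__main__":
    # Previous reading 1,200 units; the customer typically uses ~100 per period.
    print(most_plausible_reading([1290, 1305, 1850],
                                 previous_reading=1200,
                                 expected_consumption=100))
    # -> 1305 (implies 105 units, closest to the 100-unit trend)
```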

Other possibilities for automation include:

  • Meter reading validation
  • Identifying which customers can’t pay and which won’t pay, and applying appropriate collection strategies (see the sketch after this list)
  • Support for customer service agents
  • Resolving discrepancies in data between meter serial numbers, customers, customer address, etc.
  • Assisting in maintaining data integrity in system migrations.
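As an illustration of the "can’t pay versus won’t pay" segmentation, a toy rules-based sketch might look like the following; the signals (benefits status, ignored reminders) are assumptions, and a production model would be far richer:

```python
# Illustrative sketch of "can't pay vs. won't pay" segmentation using
# hypothetical hardship and behaviour signals; not a real scoring model.

def collection_strategy(customer: dict) -> str:
    arrears = customer.get("arrears", 0)
    if arrears <= 0:
        return "no action"
    # "Can't pay": signs of financial hardship -> supportive treatment.
    if customer.get("on_benefits") or customer.get("recent_missed_essentials"):
        return "offer payment plan / charity referral"
    # "Won't pay": able to pay but a history of ignoring reminders.
    if customer.get("reminders_ignored", 0) >= 3:
        return "escalate to firm collections"
    return "standard reminder cycle"


if __name__ == "__main__":
    print(collection_strategy({"arrears": 240, "on_benefits": True}))
    # -> "offer payment plan / charity referral"
```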

In analytics, HCL carries out assignments for a number of utilities and has mapped out the utilities customer journey, identifying appropriate analytics for each of the lifecycle stages of ‘sale’, ‘consume’, ‘debt’, and ‘move’.

HCL has also responded to the demand for process maturity benchmarks by developing its ‘Utili-Best’ benchmarking framework. This modelling framework is based on client data from its consulting engagements, together with third-party data including that from OFWAT, OFGEM, FERC, and AER. The framework covers benchmarking for billing, metering, customer service, and debt levels.

The ‘Utili-Best’ benchmarking framework is further complemented by HCL’s ‘Wow & Waste’ framework. This is used to identify the ‘wow’ factors that utilities should aim for, matching wows to opportunities to identify potential sources of competitive advantage, and the ‘waste’ elements that they should aim to eliminate. Examples of ‘wow’ factors are ‘accurate bills sent to customers’ and ‘customer pays in full’. Examples of ‘waste’ factors are billing errors or late bills. These ‘wow’ and ‘waste’ factors can then be translated into KPIs such as the proportion of bills collected and NPS during ‘bill to collect’.
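As a minimal sketch of how ‘wow’ and ‘waste’ factors translate into KPIs, the two measures mentioned above – the proportion of bills collected in full and NPS during ‘bill to collect’ – could be computed along these lines, with purely illustrative input structures:

```python
# Minimal sketch of turning "wow"/"waste" factors into KPIs, e.g. the
# proportion of bills paid in full and an NPS score for bill-to-collect.
# Input structures are illustrative assumptions.

def proportion_collected(bills):
    """Share of bills paid in full (a 'customer pays in full' KPI)."""
    paid = sum(1 for b in bills if b.get("paid_in_full"))
    return paid / len(bills) if bills else 0.0


def nps(scores):
    """Net Promoter Score from 0-10 responses: % promoters - % detractors."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores) if scores else 0.0


if __name__ == "__main__":
    bills = [{"paid_in_full": True}, {"paid_in_full": True}, {"paid_in_full": False}]
    print(round(proportion_collected(bills), 2))  # -> 0.67
    print(nps([10, 9, 8, 3]))                     # -> 25.0
```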

]]>
<![CDATA[The Truth About Freemium Research]]> For over a decade, freemium has been the ubiquitous business model for fledgling internet firms and the developers of smartphone apps. Users sign up for free to enable basic features, and are then drawn into subscribing to various levels of premium functionality. More recently, the freemium model has been the subject of considerable attention in the B2B market research space, with some rather extravagant claims and unsound thinking being used to herald it. Let’s have a closer look.

Some of the self-styled new breed analyst firms, who hail from either a blogging or offshore background, would have you believe that freemium is a key differentiator. But freemium research is far from new, and is hardly ground-breaking. Have a look at almost any analyst firm’s website, and you’ll find free material. On the NelsonHall research portal you’ll find masses of high-quality analyst commentary, blog articles, webcasts, research summaries, and opt-in research newsletters (and for the buy-side, hundreds of NEAT vendor evaluation quadrants too) – all there for free. What matters is how good and impartial the free research is, and also what you find when you scrape beneath the surface of the free stuff.

One thing’s for sure. With over 40,000 global research reports across a large number of service line and industry-specific programs, NelsonHall has a wealth of high-impact research available, all of it based on original interviews with the buyers and vendors of outsourcing services around the world. And every piece of research tackles the tough questions and delivers insightful answers that help clients narrow the risk of decision-making in complex sourcing environments. Some of this insight we choose to give away. We always have done. We just don’t see this as a differentiator.

What is a differentiator is meticulous attention to detail, and a refusal to adopt anything other than the best methods of gathering and analyzing research data, and delivering insight to our clients. For example, we do not use on-line surveys, which are hopelessly inadequate for extensive data gathering in complex B2B markets. Nor do we believe that it’s possible to write outsourcing vendor profiles based on information entered into standard forms by the vendors themselves. That’s a sure-fire way of collecting lots of vendor marcomms and wishful thinking, but certainly no basis for accurate data gathering, let alone objective analysis.

And now for a reality check. Doing research the right way is not an inexpensive business, and has to be paid for one way or another. One way is to set a reasonable fee for all clients who wish to access premium research, and to grant them enterprise-wide access. Another way is to find a sponsor who’s prepared to pay a much larger fee for the privilege of first access to (or co-hosting of) the research. We take the former approach.

But here’s another very important way to look at the freemium issue. We challenge any organization to build a successful sourcing strategy based on free research alone. Googling for free information is a non-starter, and relying entirely on free analyst research doesn’t get you much further. We wouldn’t advise it, much as we wouldn’t advise web-based self-diagnosis of that nagging abdominal pain you’ve had for the last two months.

Some of the newer players point to their freemium model as evidence that they are changing the established order of the ‘legacy’ analyst firms. Really? Other than the fact that freemium is nothing new, the reality is that analyst firms founded on well-established principles, rigorous methodologies, and a refusal to dumb down what they do, are more important today than they ever were.

You may ask, does any of this really matter?

Well, not unless you happen to believe that substance is more important than form. In which case it matters rather a lot – your business could depend on it.

]]>
<![CDATA[RPA: What BPS Can Learn from Software Testing and DevOps]]> For professionals involved in BPS activities, it might be a surprise that software testing (ST) shares a lot of similarities with BPS. Like many forms of BPS, ST is labor intensive – the largest software testing vendors now have up to 30k career testers and 20k has become the new norm. Like BPS, ST has adopted process improvement in a big way. And like BPS, ST is looking at process automation as a way of reducing its labor intensity and eventually move into non-linear growth, improving accuracy and speed as well as efficiency.

It may also come as a surprise to BPS professionals that the software testing industry has already invested massively in task automation and created a large number of IPs and accelerators. This is such a significant trend in ST that one large IT service firm now says it has ~70,000 reusable testing artifacts.

So perhaps ST has experience to share with BPS? What can we learn?

  1. Automation starts small: the holy grail of the BPS industry may be the automation of a whole process, but before achieving full process automation, clients can benefit from limited-scope automation that still cuts time spent on repetitive tasks
  2. Process improvement is not just about implementing best practices; it extends to business process modeling, i.e. capturing business processes in diagrams. Those diagrams, maintained in dedicated modeling applications, are a starting point for automating processes, helping to identify pockets of manual activity and to prioritize the roll-out of IPs and accelerators (a minimal sketch of this idea follows the list)
  3. One area where the ST industry is investing massively is the dev-to-test-to-production process known as DevOps. DevOps poses many challenges to the ST community, with its reliance on many different software tools, whether COTS, IPs, or open source software. There is another element to this complexity: traditional alliances with major ISVs are no longer sufficient, and the ST industry has had to drive partnerships with much smaller ISVs than in the past, including tech start-ups – something unthinkable five years ago. Some of the innovative vendors have made bets on technology vendors and pre-integrated heterogeneous sets of tools and IPs. The good news is that clients seem ready to accept such pre-integrated DevOps platforms
  4. Most automation IPs and accelerators created by the ST industry are free-of-charge to their clients and are meant to drive service differentiation
  5. Client organizations do not appear overly concerned about potential technology lock-in. Credit to ST vendors that have avoided lock-in and systematically provided integration with the major COTS and open source tools in the market
  6. Machine learning, cognitive intelligence (CI), and even AI are also on the roadmap, but evidence of current usage is scant. There is some rebranding, as machine learning, of agent software that sits on a server or desktop and collects data over time to drive analytics dynamically.
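As a minimal sketch of point 2 above, under the assumption of a very simple process representation (not a real BPMN tool’s API), a captured process model can be queried to rank pockets of manual activity as automation candidates:

```python
# Minimal sketch, under stated assumptions, of using a captured process
# model to find "pockets of manual activity" worth automating first.
# The process representation is hypothetical, not a modeling tool's API.

process_model = [
    {"step": "receive defect report", "manual": False, "monthly_volume": 5000},
    {"step": "reproduce defect",      "manual": True,  "monthly_volume": 5000},
    {"step": "write regression test", "manual": True,  "monthly_volume": 3000},
    {"step": "run regression suite",  "manual": False, "monthly_volume": 3000},
]


def automation_candidates(model):
    """Rank manual steps by volume as candidates for IPs/accelerators."""
    manual_steps = [s for s in model if s["manual"]]
    return sorted(manual_steps, key=lambda s: s["monthly_volume"], reverse=True)


if __name__ == "__main__":
    for step in automation_candidates(process_model):
        print(step["step"], step["monthly_volume"])
    # reproduce defect 5000
    # write regression test 3000
```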

All in all, we think software testing provides a good number of insights, and a sense of direction, for the BPS industry.

Final observation: has the software testing industry, in spite of its investment in automation, become non-linear? Not yet – but this may change with mainstream adoption of DevOps.

]]>
<![CDATA[Robotics & BPaaS Must Evolve to Take BPO Automation to the Next Level]]> In my previous blog on automation in BPO, I argued that, despite all the hype, the current implementation of robotic process automation (RPA) is little more than a labor arbitrage play, enabling the business to run with increased efficiency using existing technology and offshoring frameworks. So where does automation in BPO go from here?

RPA is essentially the execution, by a bot mimicking human action, of repeatable, rule-based tasks that require little or no cognition, human expertise, or human intervention (though RPA can also be used to support agents within relatively hybrid tasks). The bot operates enterprise software and applications through existing user interfaces, based on pre-defined rules and inputs, and is best suited to relatively heavy-duty transaction processing.
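Conceptually, an RPA bot of this kind is little more than a rules-driven loop that replays the keystrokes and clicks a human would make. The sketch below is illustrative only: the ui adapter and its methods are hypothetical placeholders, not a real RPA product’s API.

```python
# Conceptual sketch only: an RPA bot as a rules-driven loop that mimics a
# human operating an existing user interface. The `ui` adapter and its
# methods are hypothetical placeholders, not a real RPA product's API.

class ConsoleUI:
    """Stand-in for a UI driver; a real bot would click/type into screens."""
    def open_screen(self, name):
        print(f"[open] {name}")
    def type_into(self, field, value):
        print(f"[type] {field} = {value}")
    def click(self, button):
        print(f"[click] {button}")


def process_invoice(ui, invoice):
    """Replay the steps a human would take, following pre-defined rules."""
    ui.open_screen("invoice_entry")
    ui.type_into("supplier_id", invoice["supplier_id"])
    ui.type_into("amount", invoice["amount"])

    # Pre-defined rule: anything above the approval limit goes to a human.
    if invoice["amount"] > 10_000:
        ui.click("refer_to_approver")
        return "referred"

    ui.click("post_invoice")
    return "posted"


def run_bot(ui, work_queue):
    """Heavy-duty, repeatable transaction processing: drain the queue."""
    return [process_invoice(ui, invoice) for invoice in work_queue]


if __name__ == "__main__":
    queue = [{"supplier_id": "S01", "amount": 450},
             {"supplier_id": "S02", "amount": 25_000}]
    print(run_bot(ConsoleUI(), queue))  # -> ['posted', 'referred']
```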

The next stage is to complement RPA with newer technologies such as AI, where judgment-based tasks are starting to be supported with cognitive platforms. Examples of cognitive technologies include adaptive learning, speech recognition, natural language processing, and pattern identification algorithms. While RPA has typically been in full swing for about two years and is currently reaching its peak of roll-out, cognitive technologies typically won’t reach wide-scale adoption for another few years. When they do, they promise to have the most impact not in data-centric transactional processing activities but around unstructured sales and customer service content and processes.

O.K., so what about BPaaS?

Typically, BPaaS consists of a platform hosted by the vendor, ideally on a one-to-many basis similar to SaaS, complemented by operations personnel. These BPaaS platforms have been around for some time in areas such as finance & accounting, in the form of systems of engagement surrounding core systems such as ERPs. In this context it is common for ERPs to be supplemented by specialist systems of engagement in support of processes such as order-to-cash and record-to-report. However, these implementations initially tended to be client-specific and one-to-one rather than one-to-many and true BPaaS.

Indeed, BPaaS remains a major trend within finance & accounting. While only start-ups and spin-offs seem likely to use BPaaS to support their full finance & accounting operations in the short-term, suppliers are increasingly spinning off individual towers such as accounts payable in BPaaS form.

However, where BPaaS is arguably coming into its own is in the form of systems of engagement (SoE) to tackle particular pain points, and a number of vendors are developing systems of engagement, embedded with analytics, to provide packaged, typically BPaaS, services. These systems of engagement and BPaaS services sit on top of systems of record such as core banking platforms or ERPs to address very specific pain points. Examples in the BFSI sector that are becoming increasingly common are BPaaS services around mortgage origination and KYC. Other areas currently being tackled by BPaaS include wealth management and triage around property & casualty underwriting.

In the same way that systems of engagement are currently required in the back-office to support ERPs, systems of engagement are starting to emerge in the front-office to provide a single view of the customer and to recommend “next best actions” both to agents and, increasingly, direct to consumers via digital channels. While automation in BPO in the form of BPaaS and systems of engagement is still largely centered in the back-office and beginning to be implemented in the middle-office, in future this approach and the more advanced forms of automation will be highly important in supporting sales and service in the front-office.
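As an illustration of the “next best action” idea – not any particular vendor’s engine – a system of engagement could expose a simple rules layer over the single customer view along these lines, with hypothetical customer fields:

```python
# Illustrative sketch of a system of engagement recommending a "next
# best action" to an agent (or directly to a consumer) from a single
# view of the customer. The rules and customer fields are assumptions.

def next_best_action(customer: dict) -> str:
    if customer.get("open_complaint"):
        return "resolve complaint before any sales conversation"
    if customer.get("contract_days_remaining", 999) < 60:
        return "offer renewal with loyalty discount"
    if customer.get("recent_product_views"):
        return f"discuss {customer['recent_product_views'][0]}"
    return "no action; routine service"


if __name__ == "__main__":
    print(next_best_action({"contract_days_remaining": 30}))
    # -> "offer renewal with loyalty discount"
```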

This change in BPaaS strategy from looking to replace core systems to providing systems of engagement surrounding the core systems offers a number of benefits, namely:

  • It directly tackles the subject of improving specific KPIs and sub-processes within the organization in a focused and manageable manner, one at a time
  • It offers the potential, when combined with analytics, to build an element of ongoing process improvement and learning directly into the sub-processes concerned
  • It enables a modern digital interface to be implemented on top of the systems of record in support of agents, customers, suppliers, and employees
  • It avoids the need to replace legacy systems on a wholesale basis and can be used to introduce transactional and gainshare pricing rather than licence and FTE-based pricing.

I started with the question “where does automation in BPO go from here?” In summary, BPO automation will only take a significant step forward when RPA becomes AI, and BPaaS emerges from the back-office to support sales and service in the front-office.

]]>