Conduent has partnered with Microsoft, using the Azure OpenAI Service to underpin its GenAI innovation initiatives with clients.
Its GenAI journey includes:
Use Case Selection Criteria Focused on Improving Quality, Throughput, and Cycle Times
Conduent recognizes that GenAI is an expensive technology and that its adoption will typically incur costs in changing existing processes and technology stacks. This makes it difficult to build GenAI business cases on already optimized operations based solely on cost reduction. Hence, Conduent is focusing on “innovative additive opportunities” to make the business case work.
The outcomes that Conduent is targeting from GenAI initiatives are:
At the same time, Conduent’s client relationships tend to involve relatively deep end-to-end service provision across a range of processes rather than single-process support. These combinations of services are tailored for specific clients rather than being standalone commoditized services.
Accordingly, Conduent’s document management services and CX services are generally supplied as part of a wider capability rather than as standalone services. Within this pattern, most of its solutions include elements of:
These three areas are regarded as core competencies by Conduent, and all its GenAI use cases for the immediate future will fall into one of them, each capable of delivering improvements in quality, throughput, and cycle times.
Initial GenAI PoCs Focus on Healthcare Claims Adjudication, Fraud Detection, and Customer Service Enhancement
Conduent has announced three GenAI pilot areas covering healthcare claims adjudication, state government program fraud detection, and customer service enhancement.
Conduent is a major provider of healthcare claims adjudication services. Here, it is working on a PoC with several healthcare clients to apply GenAI to reduce the error rate in data extraction and achieve faster cycle times in claims adjudication. GenAI is being used within document management to summarize highly unstructured documents, such as appeals documents and medical records, using its contextualization capabilities, including image-to-text. The technologies used are Azure AI Document Intelligence and Azure OpenAI Service.
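As a purely illustrative sketch of the summarization step (the endpoint, deployment name, and prompt below are hypothetical placeholders, not Conduent's implementation), a call to the Azure OpenAI Service from Python might look like this, assuming the document text has already been extracted upstream by Azure AI Document Intelligence:

```python
from openai import AzureOpenAI

# Hypothetical endpoint, key, and deployment name - placeholders only.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<api-key>",
    api_version="2024-02-01",
)

def summarize_appeal(extracted_text: str) -> str:
    """Summarize an appeals document whose text was extracted upstream
    (e.g., by a document-intelligence/OCR step, including image-to-text)."""
    response = client.chat.completions.create(
        model="<gpt-deployment-name>",  # the Azure deployment name, not the base model name
        messages=[
            {"role": "system",
             "content": "You summarize healthcare appeal documents for claims adjudicators. "
                        "List claim numbers, dates of service, and the grounds for appeal."},
            {"role": "user", "content": extracted_text},
        ],
        temperature=0,  # deterministic output keeps adjudication summaries repeatable
    )
    return response.choices[0].message.content
```

In a production pipeline, the resulting summary would feed the adjudication workflow rather than being returned directly to a user.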
Secondly, Conduent is working on a fraud detection PoC to support social programs in the U.S. state government sector. This PoC uses GenAI for search & analytics across multiple structured and unstructured data sets to increase the volume and speed of fraud detection. The technologies used here are Azure Data Factory and Azure OpenAI Service.
Finally, Conduent is using GenAI to enhance the use of agent assist and virtual agents by training virtual agents on existing data so that they can be deployed much faster.
Moving to MVPs and Industrializing GenAI Use Cases
Conduent initially undertook PoCs in the above areas to get both Conduent and the client comfortable with the results of applying GenAI and prove the use case. The PoC process is iterative and very granular, and Conduent perceives that organizations need to be extremely prescriptive to get the right results. This typically means defining the specific inputs and outputs of the use case very tightly, including defining what GenAI should do in the absence of individual inputs.
Conduent is now moving towards MVP and building client business cases in several of these pilots, including in some pilots in the document management space. Since Conduent has taken a horizontal approach to its use case selection, many of these have the potential to scale across multiple industry segments and clients.
In addition, Conduent provides BPaaS services to many of its clients, so it is looking to embed proven GenAI use cases in its proprietary technology platforms.
Existing use cases are business unit-sponsored but curated centrally, with the center providing enablement and cross-pollination across business units. However, the intention is to incubate future GenAI capability within individual business units.
Capgemini has launched a new digital transformation service, One Operations, with the specific goal of driving client revenue growth.
One Operations: Key Principles
Some of One Operations’ principles, such as introducing benchmark-driven best practice operations models, taking an end-to-end approach to operations across silos, and using co-invested innovation funds, are relatively well established in the industry. However, what is new is building on these principles to incorporate an overriding focus on delivering revenue growth. The business case for a One Operations assignment centers on facilitating the client’s revenue growth, taking a B2B2C approach oriented to the end customer and emphasizing the delivery of insights that enable client personnel to make earlier, customer-focused decisions.
Capgemini’s One Operations account teams involve consulting and operations working together, with Capgemini Invent contributing design and consulting and the operational RUN organization provided by Capgemini’s Business Services global business line.
Implementing a One Operations philosophy across the client organization and Capgemini relies on shared targets that reduce vendor/client friction and on co-invested innovation funds: assignments involve setting joint targets supported by a continuously replenished, co-invested innovation fund of ~10–15% of Capgemini revenues that is used to fund digital transformation.
One Operations is very industry-focused, and Capgemini is initially targeting selected clients within the CPG sector, looking to assist them in growing within an individual country or small group of countries by localizing within their global initiatives. The key to this approach is demonstrating to clients that Capgemini understands and can support both the 'grow' and 'run' elements of their businesses and having an outcome-based conversation. Capgemini is typically looking to enable enterprises to achieve 4X growth by connecting the sales organization to the supply chain.
Assignments commence with working sessions brainstorming the possibilities with key decision-makers. The One Operations client team is jointly led by a full-time executive from Capgemini Invent and an executive from Capgemini’s Business Services. The Capgemini Invent executive remains part of the One Operations client team until go-live. The appropriate business sector expertise is drawn more widely from across the Capgemini group.
One Operations assignments typically have three phases:
At this stage, Capgemini has two live One Operations assignments with further discussions taking place with clients.
Using End-to-End Process Integration to Speed Up Growth-Oriented Insights
Capgemini’s One Operations has three key design principles:
These transformations involve:
Changing the mindset within the enterprise involves freeing personnel from tactical transactional activities and providing relevant information supporting their new goals.
Capgemini aims to instill this growth mindset in client enterprises by enabling an integrated end-to-end view from sales to delivery and by equipping teams with digital tools for process execution and growth-oriented data insights. Within this growth focus, Capgemini offers an omnichannel model to drive sales, augmented teams to enable better customer interactions, predictive technology to identify the next best customer actions, and data orchestration to reduce customer friction.
One Operations also enables touchless planning to improve forecast accuracy, increase the order fill rate, reduce the time spent planning promotions, and accelerate cash collections to reduce DSO. Improved promotion accuracy and product availability are also key to revenue growth within CPG and retail environments.
Shortening Forecasting Process & Enhancing Quality of Promotional Decisions: Keys to Growth in CPG
The overriding aim within One Operations is to free enterprise employees to focus on their customers and business growth. In one example, Capgemini is looking to assist an enterprise in increasing its sales within one geography from ~$1bn to $4bn.
The organization needed to free up its operational energies to focus on growth and create an insight-driven consumer-first mindset. However, the organization faced the following issues:
Capgemini took a multidisciplinary, end-to-end approach across plan-to-cash. One key to growth is the timely provision of information, and Capgemini is aiming to improve the transparency of business decisions. For example, the company has rationalized the coding of PoS data so that it can be interfaced directly with forecasting, shortening the forecasting process from weeks to days and enhancing the quality of promotional decisions.
Capgemini also implemented One Operations, leveraging D-GEM to develop a best-in-class operating model resulting in a €150m increase in revenue, 15% increase in forecasting accuracy, 50% decrease in time spent on setting up marketing promotions, and a 20% increase in order fulfillment rate.
Digital transformation and the associated adoption of Intelligent Process Automation (IPA) remain at an all-time high. This is to be encouraged, and enterprises are now reinventing their services and delivery at a record pace. Consequently, enterprise operations and service delivery are increasingly becoming hybrid, with delivery handled by tightly integrated combinations of personnel and automations.
However, the danger with these types of transformation is the omnipresent risk in intelligent process automation projects of putting the technology first, regarding people as secondary considerations, and alienating the workforce through reactive communication and training programs. As many major IT projects have discovered over the decades, the failure to adopt professional organizational change management procedures can lead to staff demotivation, poor system adoption, and significantly impaired ROI.
The greater the organizational transformation, the greater the need for professional organizational change management. This requires high workforce-centricity and taking a structured approach to employee change management.
In light of this trend, NelsonHall's John Willmott interviewed Capgemini's Marek Sowa on the company’s approach to organizational change management.
JW: Marek, what do you see as the difference between organizational change management and employee communication?
MS: Employee communication tends to be seen as communicating a top-down "solution" to employees, whereas organizational change management is all about empowering employees and making them part of the solution at an individual level.
JW: What are the best practices for successful organizational change management?
MS: Capgemini has identified three best practices for successful organizational change management, namely integrated OCM, active and visible sponsorship, and developing a tailored case for change:
JW: So how should organizations make this approach relevant at the workgroup and individual level?
MS: A key step in achieving the goals of organizational change management is identifying and understanding all the units and personnel in the organization that will be impacted, both directly and indirectly, by the transformation. Each stakeholder or stakeholder group will likely find itself in a different place when it comes to perspective, concerns, and willingness to accept new ways of working. It is critical to involve each group in shaping and driving the transformation. One useful concept in OCM for achieving this is WIIFM (What's In It For Me), with WIIFM identified at a granular level for each stakeholder group.
Much of the benefit and expected ROI is tied to people accepting and taking ownership of the new approach and changing their existing ways of working. Successfully deployed OCM motivates personnel by empowering employees across the organization to continually improve and refine the new solution, stimulating revenue growth and securing ROI. People need both to be aware of how the new solution is changing their work and to be active in driving it – and thanks to that, they make the organization a "powerhouse" for continuous innovation.
How an enterprise embeds change across its various silos is very important. In fact, in the context of AI, automation is not only about adopting new tools and software but mostly about changing the way the enterprise's personnel think, operate, and do business.
JW: How do you overcome employees' natural fear of new technology?
MS: To generate enthusiasm within the organization while avoiding making the vision seem unattainable or scary, enterprises need to frame and sell transformations incorporating, for example, AI as evolutions of something the employees are already doing – not merely as "just the next logical step" but as reinventions of the whole process, from both the business and the experience perspective. They need to retain the familiarity that gives people comfort and confidence while reassuring them that the new tool or solution adds to their existing capability, allowing them to fulfill their true potential – something that is not automatable.
The pandemic has changed organizations’ attitudes towards the need for change, greatly increasing their emphasis on adopting new digital process models and digital transformation. This is partly driven by the need to rapidly enhance transactional efficiency and effectiveness but, at least equally importantly, by a much greater requirement for real-time information and analytics to drive the business. All these pressures are keenly felt within the finance department.
WNS introduces Quote-to-Sustain
In response, WNS has looked to reinvent order-to-cash in the form of Quote-to-Sustain. Some of the issues that WNS is aiming to address with its Quote-to-Sustain offering include:
In addition to delivering enhanced end-to-end visibility, stakeholder experience, and analytics, WNS has also reimagined its Quote-to-Sustain service to deliver greater variability in F&A process costs as business volumes fluctuate and become more unpredictable as a result of the pandemic. Its new Quote-to-Sustain offering bundles technology and services and allows clients to “pay by the drink”.
Specifically, the goal is for clients to remain cash neutral and, subject to some volume commitment, to pay only for transactions, with a decrease in costs emerging from year two onwards. WNS funds all change management.
Quote-to-Sustain modules
The Quote-to-Sustain module structure is:
Each of these modules takes the form of a system of engagement sitting on top of the client’s existing ERP systems and systems of record.
In this respect, the use of unified master data is important in bringing together the various commercial and financial elements from multiple databases to ensure accuracy, for example, in billing the right person and identifying the appropriate person for each type of query. The unified master data aims to be a single source of the truth using data authentication from external sources and providing an element of real-time data cleansing.
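A minimal sketch of the golden-record idea, assuming hypothetical field names and a simple precedence rule (this is not WNS' actual data model), might look like this in Python:

```python
from typing import Any

# Hypothetical source records for one customer, keyed by a shared customer ID.
billing  = {"customer_id": "C-1001", "name": "Acme Corp", "billing_contact": "j.doe@acme.com"}
crm      = {"customer_id": "C-1001", "name": "ACME Corporation", "query_contact": "support@acme.com"}
external = {"customer_id": "C-1001", "name": "Acme Corporation Ltd"}  # authenticated external source

def build_golden_record(*sources: dict[str, Any]) -> dict[str, Any]:
    """Merge attributes from multiple systems into one master record.
    Later sources win, so the externally authenticated source is applied last."""
    golden: dict[str, Any] = {}
    for source in sources:
        for field, value in source.items():
            if value:  # simple cleansing rule: ignore empty values
                golden[field] = value
    return golden

master = build_golden_record(crm, billing, external)
print(master["name"])             # "Acme Corporation Ltd" - the authenticated name wins
print(master["billing_contact"])  # used for billing the right person
print(master["query_contact"])    # used for routing each type of query appropriately
```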
WNS’ cognitive credit offering aims to take credit management beyond the periodic review of credit bureau reports and base its recommendations for credit eligibility on its own analyses of financial ratios, customer behavior (including any changes in payment pattern), and news triggers from external sources. WNS believes this approach to be particularly effective in addressing credit management within small businesses. The service incorporates technology from HighRadius and Emagia.
WNS’ digital contracts and smart orders modules utilize its Skense and Agilius platforms to combine and analyze data from various sources and integrate quotations, orders, contracts, and billing to reduce the errors that typically arise between disparate sources of information.
WNS’ Revenue Assurance and Analytics modules include industry-specific components to monitor and minimize revenue dilution, using analytics for improved collections and to reduce revenue losses arising from upstream process errors.
Quote-to-Sustain adoption plan
WNS has always approached F&A from a sector-specific viewpoint. Having developed all the modules within Quote-to-Sustain, WNS is now in the process of integrating this capability with its industry-specific processes in line with client demand. Powered by an exclusive partnership with EvoluteIQ, WNS’ domain-led hyperautomation platform suite is designed to accelerate the adoption of process automation and drive enterprise-wide digital transformation. WNS will focus on the sectors where it has already developed industry-specific expertise and IP, including airlines, travel agencies, trucking, shipping & logistics, insurance, telecom, media, CPG, manufacturing, and utilities.
Nonetheless, the initial clients of WNS Quote-to-Sustain have typically started by purchasing a single module such as cognitive credit & collections, and WNS expects a typical sequence of deployment to be cognitive credit & collections, followed by unified master data, followed by revenue assurance.
WNS has already applied its cognitive credit, touchless cash applications, botified queries, and analytics-as-a-service modules of Quote-to-Sustain for a media client. This company’s cash application process was automated but only achieved a 75% auto-match rate due to delays in the receipt of remittance advice notes. WNS deployed touchless cash apps via EIPP (electronic invoice presentment and payments) to achieve an auto-match rate of 88% and introduced intelligent chatbots and predictive disputes management to reduce the time for resolution significantly. The chatbots resolve most disputes without human intervention, with all trade promotion issues resolved through chatbots.
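To make the auto-match mechanic concrete, here is a deliberately simplified Python sketch; the matching rules are illustrative assumptions, not WNS' production logic. When remittance advice is late or missing, the fallback match on customer and exact amount is exactly the kind of gap that EIPP data helps close:

```python
from dataclasses import dataclass

@dataclass
class Invoice:
    number: str
    customer: str
    amount: float

@dataclass
class Receipt:
    customer: str
    amount: float
    remittance_ref: str | None  # invoice number quoted on remittance advice, if received

def auto_match(receipts: list[Receipt], invoices: list[Invoice]):
    """Match receipts to open invoices: first by remittance reference,
    then by customer + exact amount; anything else becomes an exception."""
    open_invoices = {inv.number: inv for inv in invoices}
    matched, exceptions = [], []
    for rcpt in receipts:
        inv = open_invoices.get(rcpt.remittance_ref) if rcpt.remittance_ref else None
        if inv is None:  # fall back to customer + amount when remittance advice is late or missing
            inv = next((i for i in open_invoices.values()
                        if i.customer == rcpt.customer and abs(i.amount - rcpt.amount) < 0.01),
                       None)
        if inv:
            matched.append((rcpt, inv))
            del open_invoices[inv.number]
        else:
            exceptions.append(rcpt)  # routed to a collector or a chatbot-led dispute flow
    return matched, exceptions
```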
Overall, the media company has achieved a potential $38m uplift in free cash flow by optimizing payments from late-paying customers and an 11% reduction in bad debts by improving late-stage collection.
In addition to this modular approach being taken with mid-sized organizations, WNS is also targeting start-ups, where the company is in discussion with some organizations for the entire suite of end-to-end services.
Conclusion
Many existing F&A operations have incorporated best-of-breed point solutions and subsequently applied RPA in support of point automations. However, these organizations are often still using disparate data sources and have not fully reimagined their F&A processes into an integrated framework using a single source of the truth and analytics for improved operational and business intelligence. WNS’ Quote-to-Sustain offering aims to provide this reimagined finance model and help organizations become more agile and analytical in their approach to order-to-cash.
For some time, life & annuities carriers have suffered from a multitude of legacy platforms, with each implemented to handle a particular style of product that was either not handled by its predecessor or was added through the acquisition of a set of blocks from another carrier. The resulting stable of platforms has always been expensive to maintain. In recent years, this has been compounded by the increasing importance of digital customer experience and the ability to launch new products quickly.
The pandemic has further emphasized these needs with consumers increasingly moving online, the need for new types of insurance products, and the vast majority of companies increasing their digital transformation emphasis. While most of Infosys McCamish’s current pipeline is driven by mergers & acquisitions and by platform rationalization and optimization to modernize legacy environments, it has also onboarded new clients to provide end-to-end services for open blocks, becoming a viable partner for organizations looking to expand their new business pipelines.
Infosys McCamish Focuses on Client Interaction
Indeed, while life companies need to launch new products at speed, insurance product functionality is now increasingly taken as table stakes. Life & annuities producers are now much more focused on client interaction functionality. This includes the omni-channel experience and the ability to deliver a zero-touch digital engagement, incorporating, for example, machine learning to deliver straight-through processing and next-best actions.
In line with these requirements, Infosys McCamish has taken a 3-tiered approach:
Infosys McCamish’s preference is to convert policies from client legacy platforms to its own VPAS platform. Its conversion accelerator assesses data cleanliness and produces balance and control reports before moving the policy data to VPAS. Not all data is moved to VPAS, with data beyond the reinstatement period being moved to a separate source data repository. Infosys McCamish aims to have 13-24 months of re-processable data on its platform, converting all the history as it was processed on the original platform so that, in the future, it is possible to view exactly what happened on the prior platform.
VPAS supports a wide range of life & annuity products, including term life, traditional life, universal life, deferred annuities, immediate annuities, and flexible spending accounts, and Infosys McCamish estimates that on mapping a carrier’s current products with the current configurations in VPAS, there is typically around 97% full compatibility. VPAS currently supports ~40m policies across 22 distinct product families.
However, where necessary or where conversion for some policy types is impossible, it can also wrap its customer experience tools around legacy insurance platforms to provide a common and digital customer experience. Infosys McCamish platforms make extensive use of an API library that supports synchronous and asynchronous communication between Infosys McCamish systems and customer systems.
Incorporating “Smart Video” into the Customer Experience
Infosys McCamish has enhanced its customer experience to enable policy/contract owners to go beyond viewing policies online and transact in real-time, further introducing:
Customers can view their billing and premium information and obtain policy quotes online, with personalized smart video used to enhance the customer experience. They can also initiate policy surrenders online. Depending on carrier policy, surrenders to a certain value are handled automatically, with higher value surrenders being passed to a senior person for verification. Similarly, if a customer is seeking to extend their coverage online, the request is routed by the workflow to an underwriter or senior manager. DocuSign is used to facilitate the use of e-signatures rather than paper documents. All correspondence can be viewed online by customers, with AI-enabled web chat used to support customer queries.
Digital adoption depends on carrier policy and is running at ~25%, with customers being prompted to use digital in all correspondence. Single-touch and no-touch processing account for ~75% of transactions.
Workflow & Dashboards Guiding Agents to Reduce Time to Onboard
Infosys McCamish has integrated BPM, workflow, and low-code development to support the back-office and call center service layers, providing operations with inbuilt automation that increases straight-through processing and reduces opportunities for manual error. It incorporates business rules so that data is keyed only once, with, for example, relevant customer updates applied to one policy type being applied across all of the customer's policies.
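The 'key data once' rule can be illustrated with a short, hypothetical Python sketch (the data model below is invented for illustration and is not VPAS code):

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    policy_id: str
    product_type: str          # e.g. "term_life", "deferred_annuity"
    mailing_address: str = ""

@dataclass
class CustomerRecord:
    customer_id: str
    policies: list[Policy] = field(default_factory=list)

def update_mailing_address(customer: CustomerRecord, new_address: str) -> None:
    """Business rule: an address keyed once against any policy is applied
    across every policy the customer holds, so data is never re-keyed."""
    for policy in customer.policies:
        policy.mailing_address = new_address

cust = CustomerRecord("CU-77", [Policy("P-1", "term_life"), Policy("P-2", "deferred_annuity")])
update_mailing_address(cust, "12 High Street, Springfield")
assert all(p.mailing_address == "12 High Street, Springfield" for p in cust.policies)
```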
The VPAS customer service work desk is built on Pega, with the workflow configured for contact center and back-office services and supporting the customer and agent self-service portals.
The agent dashboard is dynamic with the view shown based on the agent role, and the call center dashboard provides drill-downs on service requests by type, SLA performance details such as average handling times, and the full audit trails of each transaction.
The workflow also guides the call center agent through the steps in a transaction, provides scripting, and uses AI to recommend additional actions when communicating with a customer. This improves the quality of each interaction and significantly reduces the time taken to train new agents.
The above is supported by experience enablers underpinned by the data warehouse, which is updated in real-time as changes are made in the policy administration system. The data warehouse is accessed via APIs by Infosys Nia analytics or third-party tools such as PowerBI or Tableau.
Product Configuration Based on Cloning Existing Products
New products are typically created within the product management module by cloning an existing product or template and its business rules – for example, customizing it to add or remove certain features or coverages – rather than by creating new product features and functionality.
VPAS new business supports digital new business, including E-App and underwriting case management, and integrations with other new business platforms such as iPipeline and FireLight.
Agent Management & Compensation Increasingly Bundled with Product Administration
In addition to the VPAS life and annuity product administration system, Infosys McCamish’s life & annuity platforms also include PMACS, a producer management & compensation system, supporting agent onboarding, licensing, and commission management.
Infosys McCamish is experiencing a greater requirement for end-to-end capability, with PMACS increasingly being bundled with VPAS. The emphasis within PMACS has moved beyond commission management, where the system shows the agent how each commission was calculated, to agent onboarding, licensing, and appointments, allowing agents to view their pipelines and their client policy portfolios.
PMACS has also moved beyond supporting life & annuities and group & critical illness to support property & casualty.
Summary
Infosys McCamish is increasingly looking to assist life & annuities carriers in the adoption of modern digital platforms, and its VPAS ecosystem emphasizes:
John Willmott and Rachael Stormonth
Capgemini has just launched version 2 of the Capgemini Intelligent Automation Platform (CIAP) to assist organizations in taking an enterprise-wide, AI-enabled approach to their automation initiatives across IT and business operations. In particular, CIAP offers:
Reduced TCO & increased ability to scale through use of a common automation platform
A common problem with automation initiatives is their distributed nature across the enterprise, with multiple purchasing points and a diverse set of tools and governance, reducing overall ROI and the enterprise's ability to scale automation at speed.
Capgemini aims to address these issues through CIAP, a multi-tenanted cloud-based automation solution that can be used to deliver "automation on tap." It consists of an orchestration and governance platform and the UiPath intelligent automation platform. Each enterprise has a multi-tenanted orchestrator providing a framework for invoking APIs and client scripts together with dedicated bot libraries and a segregated instance of UiPath Studio. A central source of dashboards and analytics is built into the front-end command tower.
While UiPath is provided as an integral part of CIAP, CIAP also provides APIs to integrate other Intelligent Automation platforms with the CIAP orchestration platform, enabling enterprises to continue to optimize the value of their existing use cases.
The central orchestration feature within CIAP removes the need for a series of point solutions, allowing automations to be more end-to-end in scope and removing the need for integration by the client organization. For example, within CIAP, event monitoring can trigger ticket creation, which in turn can automatically trigger a remediation solution.
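A generic sketch of such an event-to-remediation chain is shown below; it uses plain Python stand-ins rather than CIAP's actual orchestration interfaces:

```python
# Hypothetical orchestration chain: an event triggers ticket creation, which
# triggers a remediation runbook if one is registered for that event type.
REMEDIATIONS = {
    "disk_full": lambda host: f"purged temp files on {host}",
    "service_down": lambda host: f"restarted service on {host}",
}

tickets: list[dict] = []

def on_event(event_type: str, host: str) -> dict:
    """Create a ticket for the event, then attempt automated remediation."""
    ticket = {"id": len(tickets) + 1, "event": event_type, "host": host, "status": "open"}
    tickets.append(ticket)
    runbook = REMEDIATIONS.get(event_type)
    if runbook:                       # a remediation exists: run it and close the ticket
        ticket["resolution"] = runbook(host)
        ticket["status"] = "closed"
    else:                             # otherwise leave the ticket open for a resolver group
        ticket["status"] = "assigned_to_resolver"
    return ticket

print(on_event("disk_full", "app-server-03"))
```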
Another benefit of this shared component approach is reducing TCO by improved sharing of licenses. The client no longer has to duplicate tool purchasing and dedicate components to individual automations; the platform and its toolset can be shared across each of infrastructure, applications, and business services departments within the enterprise.
CIAP is offered on a fixed-price subscription-based model based on "typical" usage levels, with additional charges only applicable where client volumes necessitate additional third-party execution licenses or storage beyond those already incorporated in the package.
Support for AIOps & DevSecOps
CIAP began life focused on application services, and the platform provides support for AIOps and DevSecOps, not just business services.
In particular, CIAP incorporates AIOps, using the client's application infrastructure logs for reactive and predictive resolution. For reactive resolution, the AIOps capability can identify the dependent infrastructure components and applications, identify the root cause, and apply any available automation.
CIAP also ingests logs and alerts and uses algorithms to correlate them, so that the resolver group only needs to address a smaller number of independent scenarios rather than each alert individually. The platform can also incorporate the enterprise's known error databases so that if an automated resolution does not exist, the platform can still recommend the most appropriate knowledge objects for use in resolution.
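The correlation idea can be illustrated with a naive time-window grouping in Python (Capgemini's actual algorithms are not published, so this is an assumption-laden toy example):

```python
from collections import defaultdict
from datetime import datetime, timedelta

def correlate_alerts(alerts: list[dict], window_minutes: int = 15) -> list[list[dict]]:
    """Group raw alerts into scenarios: alerts for the same host/service within a
    time window collapse into one group, so resolvers handle scenarios, not every alert."""
    groups: dict[str, list[list[dict]]] = defaultdict(list)
    for alert in sorted(alerts, key=lambda a: a["time"]):
        key = f'{alert["host"]}:{alert["service"]}'
        buckets = groups[key]
        if buckets and alert["time"] - buckets[-1][-1]["time"] <= timedelta(minutes=window_minutes):
            buckets[-1].append(alert)       # within the window: same scenario
        else:
            buckets.append([alert])         # outside the window: new scenario
    return [bucket for buckets in groups.values() for bucket in buckets]

now = datetime.now()
raw = [
    {"host": "db01", "service": "mysql", "time": now, "msg": "high latency"},
    {"host": "db01", "service": "mysql", "time": now + timedelta(minutes=3), "msg": "connection errors"},
    {"host": "web02", "service": "nginx", "time": now, "msg": "5xx spike"},
]
print(len(correlate_alerts(raw)))  # 2 scenarios instead of 3 individual alerts
```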
Future enhancements include increased emphasis on proactive capacity planning, including proactive simulation of the impact of change in an estate and enhancing the platform's ability to predict a greater range of possible incidents in advance. Capgemini is also enhancing the range of development enablers within the platform to establish CIAP as a DevSecOps platform, supporting the life cycle from design capture through unit and regression testing, all the way to release within the platform, initially starting with the Java and .NET stacks.
A strong focus on problem elimination & functional health checks
Capgemini perceives that repetitive task automation is now well understood by organizations, and the emphasis is increasingly on using AI-based solutions to analyze data patterns and then trigger appropriate actions.
Accordingly, to extend the scope of automation beyond RPA, CIAP provides built-in problem management capability, with the platform using machine learning to analyze historical tickets, identify the causes of recurring problems and, in many cases, initiate remediation automatically. CIAP then aims to reduce the level of manual remediation on an ongoing basis by recommending emerging automation opportunities.
In addition to bots addressing incident and problem management, the platform also has a major emphasis within its bot store on sector-specific bots providing functional health checks for sectors including energy & utilities, manufacturing, financial services, telecoms, life sciences, and retail & CPG. One example in retail is where prices are copied from a central system to store PoS systems daily. However, unreported errors during this process, such as network downtime, can result in some items remaining incorrectly priced in a store PoS system. In response to this issue, Capgemini has developed a bot that compares the pricing between upstream and downstream systems at the end of each batch pricing update, alerting business users, and triggering remediation where discrepancies are identified. Finally, the bot checks that remediation was successful and updates the incident management tool to close the ticket.
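The core check such a bot performs can be sketched in a few lines of Python (illustrative only; the real bot runs on the CIAP/UiPath stack and also raises and closes the incident ticket):

```python
def reconcile_prices(central: dict[str, float], store_pos: dict[str, float],
                     tolerance: float = 0.005) -> list[dict]:
    """Compare prices in the central system with the store PoS after a batch
    update and return any discrepancies to alert on and remediate."""
    discrepancies = []
    for sku, expected in central.items():
        actual = store_pos.get(sku)
        if actual is None or abs(actual - expected) > tolerance:
            discrepancies.append({"sku": sku, "expected": expected, "actual": actual})
    return discrepancies

central_prices = {"SKU-1": 2.49, "SKU-2": 5.99, "SKU-3": 1.15}
store_prices   = {"SKU-1": 2.49, "SKU-2": 4.99, "SKU-3": 1.15}  # SKU-2 missed the update

for issue in reconcile_prices(central_prices, store_prices):
    print(f'Repricing {issue["sku"]}: store shows {issue["actual"]}, should be {issue["expected"]}')
    # in the bot: push the correct price back to the PoS, then verify and close the ticket
```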
Similarly, Capgemini has developed a validation script for the utilities sector, which identifies possible discrepancies in meter readings leading to revenue leakage and customer dissatisfaction. For the manufacturing sector, Capgemini has developed a bot that identifies orders that have gone on credit hold, and bots to assist manufacturers in shop floor capacity planning by analyzing equipment maintenance logs and manufacturing cycle times.
CIAP has ~200 bots currently built into the platform library.
A final benefit of platforms such as CIAP, beyond their libraries and cost advantages, is operational resilience: they provide orchestrated mechanisms for plugging in the latest technologies in a controlled and cost-effective manner while unplugging or phasing out previous generations of technology, all of which further enhances time to value. This is increasingly important to enterprises as their automation estates grow to take on widespread and strategic operational roles.
Q&A Part 2
JW: What are the main supply chain flows that supply chain executives should look to address?
JJ: Traditionally, there are three main supply chain flows that benefit from automation:
JW: Can you give me examples of where Capgemini has deployed elements of an autonomous supply chain?
JJ: Capgemini has developed capabilities to help our clients not only design but also run their services following best-practice methodologies blending optimal competencies, location mix, and processes powered by intelligent automation, analytics, and world-renowned platforms. We have helped clients transform their processes, and we have run them from our centers of excellence/delivery centers to maximize productivity.
Two examples spring to mind:
Touchless planning for an international FMCG company:
Our client had maxed out their forecasting capabilities using standard ERP-embedded forecasting modules. Capgemini leveraged our Demand Planning framework powered by intelligent automation and combined it with best-in-class machine learning platforms to increase the client’s forecasting accuracy and lower planning costs by over 25%; this company is now moving to a touchless planning function.
Automated order validation and delivery note for an international chemical manufacturing company:
Our client was running fulfillment operations internally at a high operating cost and low productivity. Capgemini transformed the client’s operations and created a lean team in a cost-effective nearshore location. On top of this, we leveraged intelligent automation to create a touchless purchase/sales order to delivery note creation flow, checking that all required information is correct, and either raising exceptions or passing on the data further down the process to trigger the delivery of required goods.
JW: What are the key success factors for enterprises starting the journey to autonomous supply chains?
JJ: Moving to an autonomous supply chain is a major business and digital transformation, not a standalone technology play, and so corporate culture is highly important in terms of the enterprise being prepared to embrace significant change and disruption and to operate in an agile and dynamic manner.
To ensure business value, you also need a consistent and holistic methodology such as Capgemini’s Digital Global Enterprise Model, which combines Six Sigma-based optimization approaches with a five senses-driven automation model, a framework for the deployment of intelligent automation and analytics technology.
Also, a lot depends on the quality of the supply chain data. Enterprises need to get the data right and master their supply chain data because you can’t drive autonomy if the data is not readily available, up-to-date in real-time, consistent, and complete. Supply chain and logistics is not so much about moving physical goods; it's been about moving information for decades. A bit of automation here and there will not make your supply chain touchless and autonomous. It requires integration and consolidation first before you can aim for autonomy.
JW: And how should enterprises start to undertake the journey to autonomous supply chains?
JJ: The first step is to build the right level of skill and expertise within the supply chain personnel. Scaling too fast without considering the human factor will result in a massive mess and a dip in supply chain performance. Also, it is important to set a culture of continuous improvement and constant innovation, for example, by leveraging a digitally augmented workforce.
Secondly, the right approach is to make elements of the supply chain touchless. Autonomy will happen as a staged approach, not as a big bang. It’s a journey. Focus on high-impact areas first, enable quick wins, and start with prototyping. So, supply chain executives should identify the pockets of excellence that are close to being ready, or that can be made ready, to become touchless and where supply chain autonomy can be driven.
One approach to identifying the most appropriate initiatives is to plot them against two axes: the y-axis being the effort to get there and the x-axis being the impact that can be achieved. This will help identify pockets of value that can be addressed relatively quickly, harvesting some quick wins first. As you progress down this journey, further technologies may mature that allow you to address the last pieces of the puzzle and get to an extensively autonomous supply chain.
JW: Which technologies should supply chain executives be considering to underpin their autonomous supply chains in the future?
JJ: Beyond fundamental technologies such as RPA, machine learning has considerable potential to help, for example, in demand planning to increase accuracy, and in fulfillment to connect interaction and decision-making.
Technologies now exist that can, for example, both recognize and interpret the text in an email and automatically respond and send all the information required; for example, for order processing, populating orders automatically, with the order validated against inventory and with delivery prioritized according to corporate rules – and all this without human intervention. This can potentially be extended further with automated carrier bookings against rules. Of course, this largely applies to the “happy flows” at the moment, but there are also proven practices to increase the proportion of “happy orders”.
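In practice this is done with NLP and intelligent document processing tooling rather than regular expressions, but a toy Python sketch conveys the happy-flow logic; every name and rule below is invented for illustration:

```python
import re

INVENTORY = {"WIDGET-A": 120, "WIDGET-B": 0}          # hypothetical stock levels
PRIORITY_CUSTOMERS = {"acme.com"}                     # hypothetical corporate prioritization rule

def process_order_email(sender: str, body: str) -> dict:
    """Parse a simple order email, validate against inventory, and apply
    a priority rule - the 'happy flow' handled without human intervention."""
    match = re.search(r"(\d+)\s*x\s*([A-Z]+-[A-Z])", body)
    if not match:
        return {"status": "exception", "reason": "could not interpret order"}
    qty, sku = int(match.group(1)), match.group(2)
    if INVENTORY.get(sku, 0) < qty:
        return {"status": "exception", "reason": f"insufficient stock for {sku}"}
    priority = sender.split("@")[-1] in PRIORITY_CUSTOMERS
    return {"status": "order_created", "sku": sku, "qty": qty, "priority_delivery": priority}

print(process_order_email("buyer@acme.com", "Please ship 40 x WIDGET-A to our Lyon plant."))
```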
The level of autonomy in supply chain fulfillment can also be increased by using analytics to monitor supply chain fulfillment and predict potential exceptions and problems, then either automating mitigation or proposing next-best actions to supply chain decision-makers.
This is only the beginning, as AI and blockchain still have a long way to go to reach their potential. Companies that harness their power now and are prepared to scale will be the ones coming out on top.
JW: Thank you, Joerg. I’m sure our readers will find considerable food for thought here as they plan and undertake their journeys to autonomous supply chains.
Introduction
Supply chain management is an area currently facing considerable pressure and is a key target for transformation. NelsonHall research shows that less than a third of supply chain executives in major enterprises are highly satisfied with, for example, their demand forecasting accuracy and their logistics planning and optimization, and that the majority perceive there to be considerable scope to reduce the levels of manual touchpoints and hand-offs within their supply chain processes as they look to move to more autonomous supply chains.
Accordingly, NelsonHall research shows that 86% of supply chain executives consider the transformation of their supply chains over the next two years to be highly important. This typically involves a redesign of the supply chain to maximize available data sources to deliver more efficient workflow and goods handling, improving connectivity within the supply chain to enable more real-time decision-making, and improving the competitive edge with better decision-making tools, analytics, and data sources supporting optimized storage and transport services.
Key supply chain transformation characteristics critical for driving supply chain autonomy that are sought by the majority of supply chain executives include supply chain standardization, end-to-end visibility of supply chain performance, ability to predict, sense, and adjust in real-time, and closed-loop adaptive planning across functions.
At the KPI level, there are particularly high expectations of improved demand forecasting accuracy and logistics planning and optimization, leading to higher levels of fulfillment reliability, and of enhanced risk identification, leading to operational cost and working capital reduction.
So, overall, supply chain executives are typically seeking a reduction in supply chain costs, more effective supply chain processes and organization, and improved service levels.
Q&A Part 1
JW: Joerg, to what extent do you see existing supply chains under pressure?
JJ: From a manufacturer looking for increased supply chain resilience and lower costs to a B2C end consumer obsessed with speed, visibility, and aftersales services, supply chains are now under great pressure to transform and adapt themselves to remain competitive in an increasingly demanding and volatile environment.
Supply chain pressure results from increasing levels of supply chain complexity, higher customer expectations, a more volatile environment (e.g., trade wars, Brexit), difficulty in managing costs, and lack of visibility. In particular, global trade has been in a constant state of exception since 2009, creating a need to increase supply chain resilience via increased agility and flexibility and, in sectors such as fast-moving consumer goods and even automotive, hyper-personalization can mean a lot size of one, starting from procurement all the way through production and fulfillment. At the same time, supply chains are no longer simple “chains” but have talent, financial, and physical flows all intertwined in a DNA-like spiral resulting in a (supply chain) ecosystem with high complexity. All this is often compounded by the low level of transparency caused by manual processes. In response, enterprises need to start the journey to autonomous supply chains. However, many supply chains are still not digitized, so there’s a lot of homework to be done before introducing digitalization and autonomous supply chains.
JW: What do you understand by the term “autonomous supply chain”?
JJ: The end game in an “autonomous supply chain” is a supply chain that operates without human intervention. Just imagine a parcel reaching your home, knowing that it took no human intervention to fulfill your order. How much of this is fiction and how much reality?
Well, while some of this certainly depends on major investments and regulatory changes in areas such as drones delivering parcels over your neighborhood or automated trucks crisscrossing the country with nobody behind the steering wheel, major steps in lowering costs and improving customer satisfaction can already be taken using current technologies. Recent surveys show that only a quarter of supply chain leaders perceive that they have reached a satisfactory automation level, leveraging the most innovative end-to-end solutions currently available.
JW: What benefits can companies expect from the implementation of an “autonomous supply chain”?
JJ: Our observations and experience link autonomous supply chains to:
We have seen that automation can do far more than simply cut costs and that there are many ways to implement automation at scale without relying on infrastructure/regulation changes (e.g., drones) – for example, by leveraging a digitally augmented workforce. Companies have been launching proofs of concept (POCs) but often struggle to reap the true benefits due to talent shortages, siloed processes, and a lack of a long-term holistic vision.
JW: What hurdles do organizations need to overcome to achieve an autonomous supply chain?
JJ: We have observed that companies often face the following hurdles when trying to create a more autonomous supply chain:
One thing that many companies have in common is a lack of ability to deploy automation solutions at scale, cost-effectively. Too often, these projects remain at a POC stage and are parked until a new POC (often technology-driven) comes along and yet again fails to scale properly due to high costs, lack of resources, and lack of strategic vision tied to business outcomes.
In Part 2 of the interview, Joerg Junghanns discusses the supply chain flows that benefit from automation, describes client case examples, and highlights the success factors, adoption approach, and key technologies behind autonomous supply chains.
NelsonHall recently attended the IPsoft Digital Workforce Summit in New York and its analyst events in NY and London. For organizations unfamiliar with IPsoft, the company has around 2,300 employees, approximately 70% of them based in the U.S. and 20% in Europe. Europe is responsible for approximately 30% of the IPsoft client base, with clients relatively evenly distributed over the six regions: U.K., Spain & Iberia, France, Benelux, Nordics, and Central Europe.
The company began life with the development of autonomics for ITSM in the form of IPcenter, and in 2014 launched the first version of its Amelia conversational agent. In 2018, the company launched 1Desk, effectively combining its cognitive and autonomic capabilities.
The events outlined IPsoft’s positioning and plans for the future, with the company:
Enhancing Contextual Understanding to Maintain Amelia’s Differentiation from Chatbots
Amelia has often suffered from being seen at first glance as "just another chatbot". Nonetheless, IPsoft continues to position Amelia as “your digital companion for a better customer service” and to invest heavily to maintain Amelia’s lead in functionality as a cognitive agent. Here, IPsoft is looking to differentiate by stressing Amelia’s contextual awareness and ability to switch contexts within a conversation, thereby “offering the capability to have a natural conversation with an AI platform that really understands you.”
Amelia goes through six pathways in sequence within a conversation to understand each utterance, and the pathway with the highest probability wins. The pathways are:
The platform also separates “entities” from “intents”, capturing both using Natural Language Understanding. Both intent and entity recognition are specific to the language used, though IPsoft is now simplifying implementation further by making processes language-independent and removing the need for the client to implement channel-specific syntax.
A key element in supporting more natural conversations is the use of stochastic business process networks, which means that Amelia can identify the required information as it is provided by the user, rather than having to ask for and accept items of information in a particular sequence as would be the case in a traditional chatbot implementation.
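The contrast with a fixed-sequence chatbot can be shown with a simple slot-filling sketch; this is plain regex-based Python for illustration and is far simpler than Amelia's actual stochastic business process networks:

```python
import re

# Slots a hypothetical booking process needs, with naive extractors.
SLOT_PATTERNS = {
    "date":   r"\b(tomorrow|today|\d{4}-\d{2}-\d{2})\b",
    "city":   r"\bto\s+([A-Z][a-z]+)\b",
    "guests": r"\b(\d+)\s+(?:people|guests)\b",
}

def fill_slots(utterance: str, slots: dict[str, str | None]) -> dict[str, str | None]:
    """Capture whichever slot values the user volunteers, in any order."""
    for name, pattern in SLOT_PATTERNS.items():
        if slots.get(name) is None:
            m = re.search(pattern, utterance)
            if m:
                slots[name] = m.group(1)
    return slots

slots: dict[str, str | None] = {"date": None, "city": None, "guests": None}
fill_slots("I need a trip to Paris tomorrow for 3 people", slots)
missing = [k for k, v in slots.items() if v is None]
print(slots, "still to ask for:", missing)   # only prompts for what was not provided
```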
Context switching is also supported within a single conversation, with users able to switch between domains, e.g. from IT support to HR support and back again in a single conversation, subject to the rules on context switching defined by the organization.
Indeed, IPsoft has always had a strong academic and R&D focus and is currently further enhancing and differentiating Amelia through:
The company is also looking to incorporate sentiment analysis within voice. While IPsoft regards basic speech-to-text and text-to-speech as commodity technologies, the company is looking to capture sentiment analysis from voice, differentiate through use of SLM/SRGS technology, and improve Amelia’s emotional intelligence by capturing aspects of mood and personality.
Launching Co-pilot to Remove the Demarcation Between Automated Handling and Agent Handling
Traditionally, interactions have either been handled by Amelia or by an agent if Amelia failed to identify the intent or detected issues in the conversation. However, IPsoft is now looking to remove this strong demarcation between chats handled solely by Amelia and chats handled solely by agents (or handed off to them in their entirety). The company has just launched “Co-pilot”, positioned as a platform allowing hybrid levels of automation and collaboration between Amelia, agents, supervisors, and coaches. The platform is currently in beta with a major telco and a bank.
The idea is to train Amelia on everything that an agent does, to make hand-offs warmer and to increase Amelia’s ability to partially automate, and ultimately handle, edge cases rather than just pass them through to an agent in their original form. Amelia will learn by observing agent interactions when escalations occur and through reinforcement learning via annotations during chat.
When Amelia escalates to an agent using Co-pilot, it will no longer just pass conversation details but will now also offer suggested responses for the agent to select. These responses are automatically generated by crowdsourcing every utterance that every agent has created and then picking those that apply to the particular context, with digital coaches editing the language and content of the preferred responses as necessary.
In the short term, this assists the agent by providing context and potential responses to queries and, in the longer term as this process repeats over queries of the same type, Amelia then learns the correct answers, and ultimately this becomes a new Amelia skill.
Co-pilot is still at an early stage with lots of developments to come and, during 2019, the Co-pilot functionality will be enhanced to recommend responses based on natural language similarity, enable modification of responses by the agent prior to sending, and enable agents to trigger partial automated conversations.
This increased co-working between humans and digital chat agents is key to the future of Amelia since it starts to position Amelia as an integral part of the future contact center journey rather than as a standalone automation tool.
Building Use Cases & Partner Program to Reduce Time to Value
Traditionally, Amelia has been a great cognitive chat technology but a relatively heavy-duty one seeking a use case, rather than an easily implemented general-purpose tool like the majority of RPA products.
In response, IPsoft is treading the same path as the majority of automation vendors and is looking to encourage organizations (well at least mid-sized organizations) to hire a “digital worker” rather than build their own. The company estimates that its digital marketplace “1Store” already contains 672 digital workers, which incorporate back-office automation in addition to the Amelia conversational AI interface. For example, for HR, 1Store offers “digital workers” with the following “skills”: absence manager, benefits manager, development manager, onboarding specialist, performance record manager, recruiting specialist, talent management specialist, time & attendance manager, travel & expense manager, and workforce manager.
At the same time, IPsoft is looking to increase the proportion of sales and service through channel partners. Product sales currently make up 56% of IPsoft revenue, with 44% from services. However, the company is looking to steer this ratio further towards product by targeting 60% per annum growth in product sales and by increasing the proportion of personnel in product-related positions (currently approx. two-thirds), in part by reskilling existing services personnel.
IPsoft has been late to implement its partner strategy relative to other automation software vendors, attributing this early caution in part to the complexity of early implementations of Amelia. Early partners for IPcenter included IBM and NTT DATA, who embedded IPsoft products directly within their own outsourcing services and were supported with “special release overlays” by IPsoft to ensure lack of disruption during product and service upgrades. This type of embedded solution partnership is now increasingly likely to expand to the major CX services vendors as these contact center outsourcers look to assist their clients in their automation strategies.
So, while direct sales still far outweigh partner sales, IPsoft is now recruiting a partner/channel sales team with a view to reversing this pattern over the next few years. IPsoft has now established a partner program targeting alliance and advisory partners (where early partners included major consultancies such as Deloitte and PwC), as well as implementation, solution, OEM, and education partners.
1Desk-based End-to-End Automation is the Future for IPsoft
IPsoft has about 600 clients, including approx. 160 standalone Amelia clients, and about a dozen deployments of 1Desk. However, 1Desk is the fastest-growing part of the IPsoft business with 176 enterprises in the pipeline for 1Desk implementations, and IPsoft increasingly regards the various 1Desk solutions as its future.
IPsoft is positioning 1Desk by increasingly talking about ROAI (the return on AI) and suggesting that organizations can achieve 35% ROAI (rather than the current 6%) if they adopt integrated end-to-end automation and bypass intermediary systems such as ticketing systems.
Accordingly, IPsoft is now offering end-to-end intelligent automation capability by combining the Amelia cognitive agent with “an autonomic backbone” courtesy of IPsoft’s IPcenter heritage and with its own RPA technology (1RPA) to form 1Desk.
1Desk, in its initial form, is largely aimed at internal SSC functions including ITSM, HR, and F&A. However, over the next year, it will increasingly be tailored to provide solutions for specific industries. The intent is to enable about 70% of the solution to be implemented “out of the box”, with vanilla implementations taking weeks rather than many months and with completely new skills taking approximately three months to deploy.
The initial industry solution from IPsoft is 1Bank. As the name implies, 1Bank has been developed as a conversational banking agent for retail banking and contains preformed solutions/skills covering the account representative, e.g. for support with payments & bills; the mortgage processor; the credit card processor; and the personal banker, to answer questions about products, services, and accounts.
1Bank will be followed during 2019 by solutions for healthcare, telecoms, and travel.
Components of Blue Prism's connected-RPA
Blue Prism is positioning itself by offering mature companies the promise of closing the gap with digital disruptors, both technically and culturally. The cultural aspect is important, with Blue Prism technology positioned as a lever to help organizations attract and inspire their workforce and give digitally-savvy entrepreneurial employees the technology to close the “digital entrepreneur gap”, as well as to close the gap between senior executives and the workforce.
Within this vision, the Blue Prism roadmap is based around helping organizations to:
Introducing intelligent document processing capability
When analyzing the interactions on its Digital Exchange (DX), Blue Prism unsurprisingly found that the single biggest use, accounting for 60% of the items downloaded from the DX, was related to unstructured document processing.
Accordingly, Blue Prism has just announced a beta intelligent document processing program, Decipher. Decipher is positioned as an easy on-ramp to document processing: a workflow that can be used to ingest and classify unstructured documents. It can be used “out-of-the-box” without the need to purchase additional licenses or products, and organizations can also incorporate their own document capture technologies, such as Abbyy, or document capture services companies within the Decipher framework.
Decipher will clean documents to ensure that they are ready for processing, apply machine learning to classify the documents, and then extract the data. Finally, it will apply a confidence score to the validity of the extracted data and pass it to a business user where necessary, incorporating human-in-the-loop assisted learning.
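The confidence-threshold, human-in-the-loop pattern Decipher follows can be sketched generically; the Python below is hypothetical (the extraction step is stubbed out) and does not reflect Blue Prism's API:

```python
import random

def classify_and_extract(document_text: str) -> tuple[str, dict, float]:
    """Stand-in for the ML classification/extraction step: returns a document
    type, extracted fields, and a confidence score (randomized here)."""
    fields = {"invoice_number": "INV-1042", "total": "1,250.00"}
    return "invoice", fields, round(random.uniform(0.6, 0.99), 2)

def process_document(document_text: str, threshold: float = 0.85) -> dict:
    doc_type, fields, confidence = classify_and_extract(document_text)
    if confidence >= threshold:
        return {"route": "straight_through", "type": doc_type, "fields": fields}
    # Low confidence: send to a business user; their corrections become
    # labeled examples that refine the models (human-in-the-loop assisted learning).
    return {"route": "human_review", "type": doc_type, "fields": fields,
            "confidence": confidence}

print(process_document("...scanned invoice text..."))
```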
Accordingly, Decipher is viewed by Blue Prism as a first step in the increasingly important move beyond rule-based RPA to introduce machine learning-based human-in-the-loop capability. Not surprisingly, Blue Prism recognizes that, as machine learning becomes more important, people will need to be brought into the loop much more than at present to validate “low-confidence” decisions and to provide assisted learning to the machine learning.
Decipher is starting with invoice processing and will then expand to handle other document types.
Improving control of assets within Digital Exchange (DX)
The Digital Exchange (DX) is another vital component in Blue Prism’s vision of connected-RPA.
Enhancements planned for DX include making it easier for organizations to collaborate and share knowledge, and providing greater security and control by enabling an organization to control which assets are available to it. Assets will be able to be marked as private, effectively providing an enterprise-specific version of the Blue Prism Digital Exchange. Within DX, there will also be a “skills” drag-and-drop toolbar so that users, and not just partners, will be able to publish skills.
Blue Prism, like Automation Anywhere, is also looking to bring an e-commerce flavor to its DX: developers will be able to create skills and then sell them. Initially, Blue Prism will build some artifacts itself. Others will be offered free-of-charge by partners in the short term, with a view to enabling partners to monetize their assets thereafter.
Re-aligning architecture & introducing AI-related skills
Blue Prism has been working closely with cloud vendors to re-align its architecture, and in particular to rework its UI to appeal to a broader range of users and make Blue Prism more accessible to business users.
Blue Prism is also improving its underlying architecture to make it more scalable as well as more cloud-friendly. There will be a new, more native and automated means of controlling bots via a browser interface available on mobiles and tablets that will show the health of the environment in terms of meeting SLAs, and provide notifications showing where interventions are required. Blue Prism views this as a key step in moving towards provision of a fully autonomous digital workforce that manages itself.
Data gateways (available on April 30, 2019 in v6.5) are also being introduced to make Blue Prism more flexible in its use of generated data. Organizations will be able to take data from the Blue Prism platform and send it on to machine learning, reporting, and other downstream tools.
However, Blue Prism will continue to use commodity AI and is looking to expand the universe of technologies available to organizations and bring them into the Blue Prism platform without the necessity for lots of coding. This is being done via continuing to expand the number of Blue Prism partners and by introducing the concept of Blue Prism skills.
At Blue Prism World, the company announced five new partners:
At the same time, the company announced six AI-related skills:
Going forward
Blue Prism recognizes that while the majority of users presenting at its conferences may still be focused on introducing rule-based processes (and on a show of hands, a surprisingly high proportion of attendees were only just starting their RPA journeys), the company now needs to take major strides in making automation scalable, and in more directly embracing machine learning and analytics.
The company has been slightly slow to move in this direction, but launched Blue Prism labs last year to look at the future of the digital worker, and the labs are working on addressing the need for:
So far, downloads from the Bot Store have been free-of-charge, but Automation Anywhere perceives that this approach potentially limits the value achievable from the Bot Store. Accordingly, the company is now introducing monetization to provide value back to developers contributing bots and Digital Workers to the Bot Store, and to increase the value that clients can receive. In effect, Automation Anywhere is looking to provide value as a two-way street.
The timing for introducing monetization to the Bot Store will be as follows:
Pricing, initially in US$ only, will be per bot or Digital Worker, with a 70:30 revenue split between the developer and Automation Anywhere, and with Automation Anywhere handling the billing and paying the developer monthly. Buyers will have a limited free trial period, initially 30 days but under review, and IP protection is being introduced so that buyers will not have access to the source code. The original developer will retain responsibility for building, supporting, maintaining, and updating their bots and Digital Workers. Automation Anywhere is developing some Digital Workers itself in order to seed the Bot Store with examples, but has no desire to develop Digital Workers itself in the medium term and may, once the concept is well-proven, hand over/license the Digital Workers it has developed to third-party developers.
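For illustration, the sketch below shows how a developer’s monthly payout under this 70:30 split might be computed; the list price and subscription count are hypothetical examples rather than Automation Anywhere figures.

```python
# Illustrative only: the list price and bot counts are hypothetical examples,
# not Automation Anywhere figures.
DEVELOPER_SHARE = 0.70   # 70:30 split described above

def monthly_developer_payout(annual_price_per_bot: float, active_subscriptions: int) -> float:
    """Developer's share of one month of subscription revenue for a single bot listing."""
    monthly_revenue = (annual_price_per_bot / 12) * active_subscriptions
    return round(monthly_revenue * DEVELOPER_SHARE, 2)

# e.g. a $1,200/year bot with 50 active subscriptions:
# monthly_developer_payout(1200, 50) -> 3500.0
```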
Automation Anywhere clearly expects that a number of smaller systems integrators will switch their primary business model from professional services to a product model, building bots for the Bot Store, and is offering developers the promise of a recurring revenue stream and global distribution, ultimately not only through the Bot Store but also through Automation Anywhere and its partners. Although payment will be monthly, developers will receive real-time transaction reporting to assist them in their financial management. For professional services firms retaining a strong professional services focus, but used to operating on a project basis, Automation Anywhere perceives that licensing and updating Digital Workers within this model could provide both a supplementary revenue stream and, possibly more importantly, a means to maintain an ongoing relationship with the client organization.
In addition to systems integrators, Automation Anywhere is targeting ISVs who, like Workday, can use the Bot Store and Automation Anywhere to facilitate deployment and operation of their software by introducing Digital Workers that go way beyond simple connectors. Although the primary motivation of these firms is likely to be to reduce the time to value for their own products, Automation Anywhere expects ISVs to be cognizant of the cost of adoption and to price their Digital Workers at levels that will provide both a reduced cost of adoption to the client and a worthwhile revenue stream to the ISV. Pricing of Digital Workers in the range $800 to as high as $12k-$15K per annum has been mentioned.
So far, inter-enterprise bot libraries have largely been about providing basic building blocks that are commonly used across a wide range of processes. The individual bots have typically required little or no maintenance and have been disposable in nature. Automation Anywhere is now looking to transform the concept of bot libraries to that of bot marketplaces to add a much higher, and long-lived, value add and to put bots on a similar footing to temporary staff with updateable skills.
The company is also aiming to steal a lead in the development of such bots and, preferably Digital Workers, by providing third-parties with the financial incentive to develop for its own, rather than a rival, platform.
The company was initially slow to go-to-market in Europe relative to Blue Prism and UiPath, but estimates it has more than tripled its number of customers in Europe in the past 12 months.
NelsonHall attended the recent Automation Anywhere conference in Europe, where the theme of the event was “Delivering Digital Workforce for Everyone” with the following sub-themes:
Automate Everything
Automation Anywhere is positioning itself as “the only multi-product vendor”, though it is debatable whether this is entirely true and also whether it is desirable to position the various components of intelligent automation as separate products.
Nonetheless, Automation Anywhere is clearly correct in stating that, “work begins with data (structured and unstructured) – then comes analysis to get insight – then decisions are made (rule-based or cognitive) – which leads to actions – and then the cycle repeats”.
Accordingly, “an Intelligent RPA platform is a requirement. AI cannot be an afterthought. It has to be part of current processes” and so Automation Anywhere comes to the following conclusion:
Intelligent digital workforce = RPA (attended + unattended) + AI + Analytics
Translated into the Automation Anywhere product range, this becomes:
Adopted by Everyone
Automation Anywhere clearly sees the current RPA market as a land grab and is working hard to scale adoption fast, both within existing clients and with new clients, and for each role within the organization.
The company has traditionally focused on the enterprise market with organizations such as AT&T, ANZ, and Bank of Columbia using 1,000s of bots. For these companies, transformation is just beginning as they now look to move beyond traditional RPA, and Automation Anywhere is working to include AI and analytics to meet their needs. However, Automation Anywhere is now targeting all sizes of organization and sees much of its future growth coming from the mid-market (“automation has to work for all sizes of organization”) and so is looking to facilitate adoption here by introducing a cloud version and a Bot Store.
The company sees reduced “time to value” as key to scaling adoption. In addition to a Bot Store of preconfigured bots, the company has now introduced the concept of downloadable “Digital Workers” designed around personas, e.g. Digital SAP Accounts Payable Clerk. Automation Anywhere had 14 Digital Workers available from its Bot Store as at mid-March 2019. These go beyond traditional preconfigured bots and include pretrained cognitive capability that can process unstructured data relevant to the specific process, e.g. accounts payable.
In addition, Automation Anywhere believes that to automate at the enterprise-wide level you have to onboard your workforce very fast, so that you can involve more of the workforce sooner. Accordingly, the company is providing role-based in-product learning and interfaces.
To enable the various types of user to ramp up quickly, the coming version of Automation Anywhere will provide a customizable user interface to support the differing requirements of the business, IT, and developers, providing unique views for each. For example:
The Automation Anywhere University remains key to adoption for all types of user. Overall, Automation Anywhere estimates that it has trained ~100K personnel. The Automation Anywhere University has:
An increased emphasis on channel sales is also an important element in increasing adoption, with Automation Anywhere looking to increase the proportion of sales through partners from 50% to 70%. The direct sales organization consists of 13 field operating units broken down into pods, and this sales force will be encouraged to leverage partners with a “customer first/partner preferred” approach.
Partner categories include:
In addition, Automation Anywhere is now starting to target ISVs. The company has a significant partnership with Workday to help the ISV automate implementation and reduce implementation times by, for example, assisting in data migration, and the company is hoping that this model can be implemented widely across ISVs.
Automation Anywhere is also working on a partner enablement platform, again seen as a requisite for achieving scale, incorporating training, community+, etc. together with a demand generation platform.
Customer success is also key to scaling. Here, Automation Anywhere claims a current NPS of 67 and a goal to exceed the NPS of 72 achieved by Apple. With that in mind, Automation Anywhere has created a customer success team of 250 personnel, expected to grow to 600+ as the team tries to stay ahead of customer acquisition in its hiring. All functions within Automation Anywhere get their customer feedback solely through this channel, and all feedback to clients is through this channel. In addition, the sole aim of this organization is to increase the adoptability of the product and the company’s NPS. The customer success team does not get involved in up-selling, cross-selling, or deal closure.
Available Everywhere
“Available Everywhere” encompasses both a technological and a geographic perspective. From a hosting perspective, Automation Anywhere is now available on cloud or on-premise, with the company clearly favoring cloud where its clients are willing to adopt this technology. In particular, the company sees cloud hosting as key to facilitating its move from the enterprise to increasingly address mid-market organizations.
At the same time, Automation Anywhere has “taken installation away” with the platform, whether on-premise or on cloud, now able to be accessed via a browser. The complete cloud version “Intelligent Automation Cloud” is aimed at allowing organizations to start their RPA journey in ~4 minutes, while considerably reducing TCO.
In terms of languages, the user interface is now available in eight languages (including French, German, Japanese, Spanish, Chinese, and Korean) and will adjust automatically to the location selected by the user. At the same time, the platform can process documents in 190 languages.
Automation Anywhere also provides a mobile application for bot management.
Summary
In summary, Automation Anywhere regards the keys to winning a dominant market share in the growth phase of the RPA market as being about simultaneously facilitating rapid adoption in its traditional large enterprise market and moving to the mid-market and SMEs at speed.
The company is facilitating ongoing RPA scaling in large enterprises by recognizing the differing requirements of business users, IT, and developers, and establishing separate UIs to increase their acceptance of the platform while increasingly supporting their need to incorporate machine learning and analytics as their use cases become more sophisticated. For the smaller organization, Automation Anywhere has facilitated adoption by introducing free trials, a cloud version to minimize any infrastructure hurdles, and a Bot Store to reduce development time and time to value.
Platforms have been increasingly important in B2C digital transformation in recent years and have been used to disintermediate and create a whole raft of well-known consumer business opportunities. B2B platforms have been less evident during this period outside the obvious ecosystems built up in the IT arena by the major cloud and software companies. However, with blockchain now emerging to complement the increasing power of cognitive and automation technologies, the B2B platform is now once again on the agenda of major corporations.
One IT services vendor assisting corporations in establishing B2B platforms to reimagine certain of their business processes is Capgemini, where B2B platform development is a major initiative alongside smart automation. In this interview, NelsonHall CEO John Willmott talks with Manuel Sevilla, Capgemini’s Chief Digital Officer, about the company’s B2B platform initiatives.
JW: Manuel, welcome. As Chief Digital Officer of Capgemini, what do you regard as your main goals in 2019?
MS: I have two main goals:
JW: What do you see as the keys to success in building a B2B platform?
MS: The investment required to establish a B2B platform is significant by nature and has to be seen in the long-term. This significant and long-term investment is required across the following three areas:
JW: How do the ecosystem requirements differ for a B2B platform as opposed to a B2C platform?
MS: B2B and B2C are very different. In B2C environments, a partial solution is often sufficient for consumers to start using it. In B2B, corporates will not use a partial platform. For example, for corporates to input their private data, the platform has to be fully secured. Also, it is important to bring a service that delivers enough value either by simplifying and reducing process costs or by providing access to new markets, or both. For example, a B2B supply chain platform with a single auto manufacturer will undoubtedly fail. The big components suppliers will only join a platform that provides access to a range of auto manufacturers, not a separate platform for each manufacturer.
Building the ecosystem is perhaps the most difficult task when creating a B2B platform. The value of Capgemini is that the company is neutral and can take the lead in driving the initiatives to make the platform happen. Capgemini recognizes humbly that for a platform to scale, it needs not only a diverse range of partners but also that Capgemini cannot be the only provider; it is critical to involve Capgemini’s partners and competitors.
JW: How does governance differ for a B2B platform?
MS: In a fast-moving B2B environment, defining the governance has to proceed alongside building the ecosystem, and it is essential to have processes in place for taking decisions regarding the platform roadmap in both the short and long-term.
B2B platform governance is not the usual two-way client/vendor governance; it is much more complex. For a B2B platform, you need to have a clear definition of who is a member and how members take decisions. It then needs enough large corporates as founder members to drive initial functionalities and to ensure that the platform will bring value and will be able to scale. Once the platform has critical mass, then the governance mechanism needs to adapt itself to support the future scaling of the platform, often with an accompanying dilution of the influence of the founder members.
The governance for a B2B platform often involves creating a separate legal entity, which can be a consortium, a foundation, or even multiple legal entities.
JW: Can you give me an example of where Capgemini is currently developing a B2B platform?
MS: Capgemini is currently developing four B2B platforms, including one with the R3 consortium to build a B2B platform called KYC Trust that aims to solve the corporate KYC problem between corporates and banks. Capgemini started work on KYC Trust in early 2016 and it is expected to go into scaled production in the next 12-24 months.
JW: What is the corporate KYC problem and how is Capgemini addressing this?
MS: Corporate KYC starts with the data collection process, with, at present, each bank typically asking the corporate several hundred questions. As each bank typically asks its own unique questions, this creates a substantial workload for the corporate across banks. Typically, it takes a month to collect the information for each bank. Then, once a bank has collected the information on the corporate, it needs to check it, which means paying third-parties to validate the data. The bank then typically uses an algorithm to score the acceptability of the corporate as a customer. This process needs to be repeated regularly. Also, the corporate typically has to wait, say, 30 days for its account to be opened.
To simplify and speed up this process, Capgemini is now building the KYC Trust B2B platform. This platform incorporates a standard KYC taxonomy to remove redundancy from, and standardize, data requests and submission, and each corporate will store the documents required for KYC in its own nodes on the platform. Based on the requests received from banks, a corporate can then decide which documents will be shown to whom and when. All these transactions will be traceable in blockchain so that the usage of each document can be tracked in terms of which bank accessed it and when.
The advantage for a bank in onboarding a new corporate using this platform is that a significant proportion of the information required from a corporate will already exist, having already been supplied to another bank. The benefits to corporates include reducing the effort in submitting information and in being able to identify which information has been used by which bank and when, where, and how.
This will speed up the KYC process and simplify data collection operations. It will also simplify how corporates manage their own data such as shareholder information and information on new beneficial owners.
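As a rough illustration of the document-sharing pattern described above, the sketch below shows a corporate node that grants per-bank access to documents and records who accessed what, and when. The class and field names are hypothetical, and a simple in-memory audit trail stands in for the blockchain-based traceability of the actual KYC Trust platform.

```python
import hashlib
from datetime import datetime, timezone

# Minimal sketch of the access pattern described above (hypothetical names;
# the real KYC Trust platform uses blockchain, not an in-memory list).

class CorporateNode:
    def __init__(self, corporate_id: str):
        self.corporate_id = corporate_id
        self.documents = {}      # doc_name -> document bytes
        self.permissions = {}    # doc_name -> set of bank ids allowed to view
        self.audit_trail = []    # traceable record of who accessed what, and when

    def store_document(self, doc_name: str, content: bytes):
        self.documents[doc_name] = content

    def grant_access(self, doc_name: str, bank_id: str):
        """The corporate decides which documents are shown to which bank."""
        self.permissions.setdefault(doc_name, set()).add(bank_id)

    def fetch_document(self, doc_name: str, bank_id: str) -> bytes:
        if bank_id not in self.permissions.get(doc_name, set()):
            raise PermissionError(f"{bank_id} is not authorised to view {doc_name}")
        content = self.documents[doc_name]
        # Record usage so the corporate can see which bank accessed what and when
        self.audit_trail.append({
            "bank": bank_id,
            "document": doc_name,
            "doc_hash": hashlib.sha256(content).hexdigest(),
            "accessed_at": datetime.now(timezone.utc).isoformat(),
        })
        return content
```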
JW: How does governance work in the case of KYC Trust?
MS: A foundation will be established in support of the governance of KYC Trust. The governance has two main elements:
Key principles of the foundation are respect for openness and interoperability, since there cannot be a single B2B platform that meets all the business needs. In order to build scale, it is important to encourage interoperability with other B2B platforms, such as (in this case) the Global Legal Entity Identifier Foundation (GLEIF), to maximize the usefulness and adoption of the platform.
JW: How generally applicable is the approach that Capgemini has taken to developing KYC Trust?
MS: There are a lot of commonalities. Sharing of documents in support of certification & commitments is the first step in many business processes. This lends itself to a common solution that can be applied across processes and industries. Capgemini is building a structure that would allow platforms to be built in support of a wide range of B2B processes. For example, the structure used within KYC Trust could be used to support various processes within supply chain management. Starting with sourcing, it could be used to ensure, for example, that no children are being employed in a factory by asking the factory to submit a document certified by an NGO to this effect every six months. Further along the supply chain, it could also be used, for example, to support the correct use of clinical products sold by pharmaceutical companies.
And across all four B2B platforms currently being developed by Capgemini, the company is incorporating interoperability, openness, and a taxonomy as standard features.
JW: Thank you Manuel, and good luck. The emergence of B2B platforms will be a key development over the next few years as organizations seek to reimagine and digitalize their supply chains, and I look forward to hearing more about these B2B platform initiatives as they mature.
In 2016, Atos was awarded a 13-year life & pensions BPO contract by Aegon, taking over from the incumbent Serco and involving the transfer of ~300 people in a center in Lytham St Annes.
The services provided by Atos within this contract include managing end-to-end operations, from initial underwriting through to claims processing, for Aegon's individual protection offering, which comprises life assurance, critical illness, disability, and income protection products (and for which Aegon has 500k customers).
Alongside this deal, Aegon was separately evaluating the options for its closed book life & pensions activity and subsequently went to market to outsource its U.K. closed book business covering 1.4m customers across a range of group and individual policy types. The result was an additional 15-year deal with Atos, signed recently.
Three elements were important factors in the award of this new contract to Atos:
Leveraging Edinburgh-Based Delivery to Offer Onshore L&P BPS Service
The transfer of the existing Aegon personnel and maintaining their presence in Edinburgh was of high importance to Aegon, the union, and the Scottish government. The circa 800 transferred personnel will continue to be housed at the existing site when the transfer takes place in summer 2019, with Atos sub-leasing part of Aegon’s premises. This arrangement makes sense for Atos since this is the company’s first closed book life contract and it is looking to win additional deals in this space over the next few years (and will be going to market with an onshore rather than offshore-based proposition).
Partnering with Sapiens to Offer Platform-Based Service
While (unlike some other providers of L&P BPS services) Atos does not own its own life platform, the company does recognize that platform-based services are the future of closed book L&P BPS. Accordingly, the company has partnered with Sapiens, and the Sapiens insurance platform will be used as a common platform and integrated with Pega BPM across both Aegon’s protection and closed book policies.
Atos has undertaken to transfer all of the closed block policies from Aegon’s two existing insurance platforms to Sapiens, and these will be transferred over the 24-month period following service transfer. The new Sapiens-based system will be hosted and maintained by Atos.
Aiming for Customer-Centric Operational Excellence
The third consideration is a commitment by Atos to implement customer-centric operational excellence. While Aegon had already begun to measure customer NPS and assess ways of improving the service, Atos has now begun to further employ the customer journey mapping techniques deployed in its Lytham center to identify customer effort and pain points. Use of the Sapiens platform will enable customer self-service and omni-channel service, while this and further automation will be used to facilitate the role of the agent and increase the number of policies handled per FTE.
The contract is priced using the fairly traditional pricing mechanisms of a transition and conversion charge (£130m over a 3-year period) followed by a price per policy, with Atos aiming for efficiency savings of up to £30m per annum across the policy book.
Atos perceives that this service will become the foundation for a growing closed block L&P BPS business, with Atos challenging the incumbents such as TCS Diligenta, Capita, and HCL. Edinburgh will become Atos’ center of excellence for closed book L&P BPS, with Atos looking to differentiate from existing service providers by offering an onshore-based alternative with the digital platform and infrastructure developed as part of the Aegon relationship, offered on a multi-client basis. Accordingly, Atos will be increasingly targeting life & pensions companies, both first-time outsourcers and those with contracts coming up for renewal, as it seeks to build its U.K. closed book L&P BPS business.
Wipro has a long history of building operations platforms in support of shared services and has been evolving its Base))) platform for 10 years. The platform started life as a business process management (i.e. workflow) platform and includes Base))) Core, a BPM platform in use within ~25 Wipro operations floors, and Base))) Prism, providing operational metrics.
These elements of Base))) are now complemented by Base))) Harmony, a SaaS-based process capture and documentation platform.
So why is this important? Essentially, Harmony is appropriate where major organizations are looking to stringently capture and document their processes across multiple SSCs to further harmonize or automate these processes. It is particularly suitable for use where:
Supporting Process Harmonization for SSC Consolidation & Acquisition
Harmony is most appropriate for multinational organizations with multiple SSCs looking to consolidate these further. It has been used in support of standardized process documentation, and library and version control, by major organizations in the manufacturing, telecoms and healthcare sectors.
In recent years, multinationals have typically been on a journey moving away from federated SSCs, each with their own highly customized processes, to a GBS model with more standardized processes. However, relatively few organizations have completed this journey and, typically, scope remains for further process standardization and consolidation. This situation is often exacerbated by a constant stream of acquisitions and the need to integrate the operations of acquired companies. Many multinationals are active acquirers and need to be able to standardize and integrate SSCs within acquired companies into their GBS operations as quickly and painlessly as possible.
Process documentation is a key element in this process standardization and consolidation. However, process documentation in the form of SOPs can often be a manual and time-consuming process suffering from a lack of governance and change & version control.
Harmony is a standalone SaaS platform for knowledge & process capture and harmonization that aims to address this issue. It supports process capture at the activity level, enabling process steps to be captured diagrammatically along with supporting detailed documentation, including attachments, video, and audio.
The documentation is highly codified, capturing the “why, what, who, & when” for each process in a structured form along with the possible actions for each process step, e.g. allocate, approve, or calculate, using the taxonomy developed in the MIT Process Handbook.
From a review perspective, Harmony also provides a view from the perspectives of data, roles, and systems for each process step, so that, for example, it is easy to identify which data, roles or systems are involved in each step. Similarly, the user can click on, say, a specific role to see which steps that role participates in. This assists in checking for process integrity, e.g. checking that a role entering data cannot also be an approver.
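To make the role/data/system views and the integrity check concrete, the sketch below uses a hypothetical process-step model (not Wipro’s actual Harmony schema) and flags steps where a role that enters data also acts as the approver.

```python
from dataclasses import dataclass

# Hypothetical data model sketching the role/data/system views described above;
# not Wipro's actual Harmony schema.

@dataclass
class ProcessStep:
    name: str
    action: str          # e.g. "allocate", "approve", "calculate"
    role: str            # who performs the step
    data: list           # data items touched
    system: str          # system used

def steps_for_role(process: list, role: str) -> list:
    """Role-centric view: which steps does a given role participate in?"""
    return [s.name for s in process if s.role == role]

def segregation_of_duties_violations(process: list) -> list:
    """Integrity check: a role entering data should not also approve it."""
    entering_roles = {s.role for s in process if s.action == "enter"}
    return [s.name for s in process if s.action == "approve" and s.role in entering_roles]

invoice_process = [
    ProcessStep("Enter invoice", "enter", "AP clerk", ["invoice"], "ERP"),
    ProcessStep("Approve invoice", "approve", "AP clerk", ["invoice"], "ERP"),
]
print(segregation_of_duties_violations(invoice_process))  # ['Approve invoice']
```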
Wipro estimates that documentation of a complex process with ~300 pages of SOP takes 2-3 weeks, with documentation of a simple process such as receiving an invoice or onboarding an employee taking 2-3 days, and initial training in Harmony typically taking a couple of days.
Harmony also supports process change governance, notifying stakeholders when any process modifications are made.
Reports available include:
Support for process harmonization and adoption of reference or “golden processes” are also key aspects of Harmony functionality. For example, it enables the equivalent processes in various countries or regions to be compared with each other, or with a reference process, identifying the process differences between regions. The initial reference process can then be updated as part of this review, adding best practices from country or regional activities.
Harmony also plays a role in best practice adoption, including within its process libraries a range of golden processes, principally in finance & accounting and human resources, which can be used to speed up process capture or to establish a reference process.
Facilitating Value Extraction from BPS Contract Renewals
Despite the lack of innovation experienced within many in-force F&A BPS contracts, the lack of robust process documentation across all centers can potentially be a major inhibitor to changing suppliers. Organizations often tend to stick with their incumbent since they are aware of the time and effort that was required for them to acquire process understanding and they are scared of the length and difficulty of transferring process knowledge to a new supplier.
Harmony can potentially assist organizations facing this dilemma in running more competitive sourcing exercises and increasing the level of business value achieved on contract renewal by baselining process maturity, identifying automation potential, and by providing a mechanism for training new associates more assuredly.
Harmony provides a single version of each SOP online. As well as maintaining a single version of the truth, this assists organizations in training associates (with a new associate able to select just the appropriate section of a large SOP, relating to their specific activity, online).
Shortening the RPA Process Assessment Lifecycle
As organizations increasingly seek to automate processes, a key element in Harmony is its “botmap” module. Two of the challenges faced by organizations in adopting automation are the need for manual process knowledge capture and the discrepancies that often arise between out-of-date SOPs and associate practice. This typically leads to a 4-week process capture and documentation period at the front-end of the typically 12-week RPA assessment and implementation lifecycle.
Harmony can potentially assist in shortening, and reducing the cost of, these automation initiatives by eliminating much of the first 4 weeks of this 12-week RPA assessment and implementation lifecycle. It does this by recommending process steps with a high potential for automation. These recommendations are based on an algorithm that takes into account parameters such as the nature of the process step, the sequence of activities, the number of FTEs, the systems used, and the repeatability of the process. The resulting process recommendations assist the RPA business analyst in identifying the most appropriate areas for automation while also providing them with an up-to-date, more reliable, source of process documentation.
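A scoring function of the kind described might look like the sketch below; the weights and the 0-1 scale are illustrative assumptions, not Wipro’s actual botmap algorithm.

```python
# Hypothetical scoring sketch of the kind of algorithm described above;
# the weights and scale are illustrative, not Wipro's actual botmap model.

WEIGHTS = {
    "rule_based":    0.30,  # nature of the process step (rule-based vs judgement)
    "repeatability": 0.30,  # how repeatable the step is
    "fte_count":     0.20,  # effort currently consumed by the step
    "system_count":  0.10,  # number of systems touched (fewer is easier)
    "sequential":    0.10,  # whether activities follow a stable sequence
}

def automation_potential(step: dict) -> float:
    """Return a 0-1 score; higher means a stronger candidate for RPA."""
    fte_score = min(step["ftes"] / 10, 1.0)            # saturate at 10 FTEs
    system_score = 1.0 / max(step["systems"], 1)       # penalise many systems
    return round(
        WEIGHTS["rule_based"] * step["rule_based"]
        + WEIGHTS["repeatability"] * step["repeatability"]
        + WEIGHTS["fte_count"] * fte_score
        + WEIGHTS["system_count"] * system_score
        + WEIGHTS["sequential"] * step["sequential"],
        3,
    )

print(automation_potential(
    {"rule_based": 1.0, "repeatability": 0.9, "ftes": 6, "systems": 2, "sequential": 1.0}
))  # 0.84
```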
SCM is one of Genpact’s “invest-to-grow” service lines, where the company is looking to make disproportionate investments and scale up the business: in this case, to become one of the top two global supply chain transformation services vendors. In its “invest-to-grow” businesses, Genpact is looking to achieve at least twice the level of revenue growth achieved by Genpact overall and to do this by investing in complementary competencies rather than scale.
Genpact identified Barkawi Management Consultants, part of the Barkawi Group, as a potential target by working alongside the company (from now on referred to as Barkawi) within its client base. Discussions began in late 2017, with the deal expected to close this month, August 2018, once the regulatory processes are complete.
The acquisition of Barkawi provides a strong platform for Genpact to deepen its supply chain consulting practice, achieve a revenue balance in SCM between transformation consulting and managed services, strengthen its relationships and expertise in key supply chain technologies, and strengthen its presence in Europe.
Deepening Supply Chain Consulting Capability
In the area of SCM, Genpact had existing capability in planning & inventory optimization & demand analytics and a couple of large managed services contracts. However, the company had limited front-end consulting capability, with just 30 supply chain management consultants. Although Genpact was organically adding SCM consultants, this relative lack of front-end expertise was limiting its ability to handle a significant number of concurrent prospect conversations. The acquisition of Barkawi brings 180 SCM consultants to Genpact, enabling the company to have not only a greater number of simultaneous client and prospect interactions but also to have deeper and more end-to-end conversations across more SCM transformation dimensions (including operating model transformation, technology transformation, digital transformation, and customer-oriented transformation).
Prior to the acquisition, Barkawi had ~200 consultants, with the bulk of these (~180) in the U.S. (principally in a center in Atlanta) and Europe (principally in a center in Munich). These are the operations being acquired by Genpact. The remaining Barkawi personnel were based in the Middle-East and China, which are not markets where Genpact actively generates business, and these personnel will not be transferring to Genpact.
Barkawi principally employs two types of consultant:
The U.S. business was slightly larger than the European business and employed a majority of personnel active as technology consultants, while the European business employed a majority of its personnel in management/process consulting.
Achieving a Balance between Transformation Consulting & Managed Services
Barkawi will be combined with Genpact’s consultants into a single SCM consulting service line, giving a broadly balanced mix across management/process consulting and technology consulting. This global service line will be headed by Mike Landry, previously head of Barkawi Management Consultants’ U.S. entity, and will be organized into supply chain consulting, aftermarket consulting, and technology, with these horizontals matrixed against the following verticals: consumer products, life sciences, industrial machinery, and product manufacturing.
Genpact is aiming to achieve a rough balance between the Genpact specialisms of consumer products and life sciences and the Barkawi specialism in industrial manufacturing. Similarly, Genpact is aiming for a roughly equal revenue split between consulting and managed services, with the CPG sector having a higher proportion of managed services contracts.
Strengthening Supply Chain Technology Relationships
Another advantage of the Barkawi acquisition is that it brings Genpact strong existing relationships with, and expertise in, supply chain planning platform companies Kinaxis and Anaplan. Barkawi is one of the leading partners of Kinaxis, and the company’s partnership with Anaplan on supply chain complements that of Genpact's with Anaplan for EPM.
Strengthening European Presence
In terms of its client base, Genpact estimates that the majority of Barkawi’s clients in the U.S. (where it was typically selling ~$200K technology consulting projects), are prospects for a wider range of Genpact supply chain transformation services. In addition, Barkawi had a strong management/process consulting presence in major manufacturers in Germany, which Genpact will seek to build on.
In addition, while the bulk of Barkawi’s European personnel are in Germany, Genpact will look to extend this capability by growing its team in both Munich and across Europe to address supply chain consulting in the wider European market. Genpact perceives there to be major consulting opportunities within the leading manufacturing companies, assisting them in implementing and optimizing technology, working with data, and creating optimization models. This applies particularly to companies with a strong element of aftermarket services, where these companies need to optimize their aftermarket models and address aftermarket fulfilment, warranty management, and forecasting.
Overall, Genpact is still looking to grow the supply chain management consulting team further and will continue to recruit to support these growth initiatives.
This blog summarizes NelsonHall’s analysis of HCL's approach to Business Process Automation covering HCL’s 3-lever approach, its Integrated Process Discovery Technique, its AI-based information extraction tool Exacto, the company’s offerings for intelligent product support, and its use of its Toscana BPMS to drive retail banking digital transformation.
3-Lever Approach Combining Risk & Control Analysis, Lean & Six Sigma, and Cognitive Automation
The 3-lever approach forms HCL’s basis for any “strategic automation intervention in business processes”. The automation is done using third-party RPA technologies together with a number of proprietary HCL tools, including Exacto, a cutting-edge computer vision and machine learning-based tool, and iAutomate for run book automation.
HCL starts by conducting a 3-lever automation study and then creates comprehensive to-be process maps. As part of this 3-lever study, HCL also conducts complexity analysis to create the RPA and AI roadmap for organizations using its process discovery toolkit. For example, HCL has looked at the entire process repository for several major banks and classified their business processes into four quadrants based on scale and level of standardization.
When generating the “to be” process map, HCL’s Integrated Process Discovery Technique places a high emphasis on ensuring appropriate levels of compliance for the automated processes and on avoiding the automation of process steps that can be eliminated.
The orchestration of business processes is done using HCL’s proprietary orchestration platform, Toscana©. Toscana© supports collaboration, analytics, case management, and process discovery, and incorporates a content manager, a business rules management system, a process simulator, a process modeler, process execution engines, and an integrated offering including social media monitoring & management.
Training Exacto AI-based Information Extraction Tool for Document Triage within Trade Processing, Healthcare, Contract Processing, and Invoice Processing
HCL’s proprietary AI-enabled machine learning solution, Exacto, is used to automatically extract and interpret information from a variety of information sources. It also has natural language and image-based automated knowledge extraction capabilities.
HCL has partnered with a leading U.S. university to develop its own AI algorithms for intelligent data extraction and interpretation for solving industry-level problems, including specialist algorithms in support of trade processing, contract management, healthcare document triage, KYC, and invoice processing.
Trade processing is one of the major areas of focus for HCL. Within capital markets trade capture, HCL has developed an AI/ML solution, Exacto | Trade. This solution is able to capture inputs from incoming fax-based transaction instructions for various trade classes such as derivatives, FX, and margins, with an accuracy of over 99%.
Combining Watson-based Cognitive Agent with Run Book Automation to Provide “Intelligent Product Support”
HCL has developed a cognitive solution for Intelligent Product Support based on the cognitive agent LUCY, Intelligent Autonomics for run book automation, and Smart Analytics with MyXalytics for dashboards and predictive analytics. LUCY is currently being used in support of IT services by major CPG, pharmaceuticals, and high-tech firms, and in support of customer service for a major bank and a telecoms operator.
HCL’s iAutomate tool is used for run book automation, and HCL has already automated 1,500+ run books. It uses NLP, ML, pattern matching, and text processing to recommend the “best matched” run book for a given ticket description. HCL estimates that it currently achieves “match rates” of around 87%-88%.
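As a rough illustration of this kind of ticket-to-run-book matching (not HCL’s proprietary pipeline), the sketch below scores a ticket description against a small library of run books using TF-IDF and cosine similarity; the run book texts and threshold are invented examples.

```python
# Illustrative sketch of ticket-to-run-book matching using simple text similarity;
# iAutomate's actual NLP/ML pipeline is proprietary and not shown here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

run_books = {
    "RB-101": "restart application server after out of memory error",
    "RB-102": "reset user password and unlock account",
    "RB-103": "clear full disk on database server and archive logs",
}

def best_matched_run_book(ticket_description: str, threshold: float = 0.2):
    """Return the closest run book and its score, or None if no match clears the threshold."""
    names = list(run_books)
    corpus = [run_books[n] for n in names] + [ticket_description]
    tfidf = TfidfVectorizer().fit_transform(corpus)
    query_vec, corpus_vecs = tfidf[len(names)], tfidf[:len(names)]
    scores = cosine_similarity(query_vec, corpus_vecs).flatten()
    best = scores.argmax()
    return (names[best], float(scores[best])) if scores[best] >= threshold else None

print(best_matched_run_book("user cannot log in, account locked"))
```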
HCL estimates that it can automate 20%-25% of L1 and L2 transactions and has begun automating internal IT infrastructure help-desks.
Positioning its Toscana Platform to Drive Digital Transformation in Retail Banking
HCL is embarking on digital transformation through this approach and has created predefined domain-specific templates in areas including retail banking, commercial lending, mortgages, and supply chain management. Within account opening for a bank, HCL has achieved an ~80% reduction in AHT and a 40% reduction in headcount.
In terms of bank automation, HCL has, for one major bank, reduced the absolute number of FTEs associated with card services by 48%, equivalent to a 63% decrease once the accompanying increase in workload is taken into account. Elsewhere, for another bank, HCL has undertaken a digital transformation including implementation of Toscana©, resulting in a 46% reduction in the number of FTEs, the implementation of a single view of the customer, a reduction in cycle time of 80%, and a reduction in the “rejection rate” from 12% to 4%.
Capgemini's Carole Murphy
Capgemini has recently redefined its framework for Intelligent Automation, taking an approach based on ‘five senses’. I caught up with Carole Murphy, Capgemini’s Head of BPO Business Transformation Services, to identify how their new approach to Intelligent Automation is being applied to the finance & accounting function.
JW: What is Capgemini’s approach to Intelligent Automation and how does it apply to finance & accounting?
CM: Capgemini is using a ‘five senses’ model to help explain Intelligent Automation to our clients and to act as a design framework in developing new solutions. The ‘five senses’ are:
Capgemini's 'Five Senses' IA model
JW: Why is this approach important?
CM: It changes the fundamental nature of finance operations from reactive to proactive. Historically, within BPS contracts, the vast majority of the process has been in ‘act’ mode, supplemented by a certain amount of analytics. The introduction of Intelligent Automation enables us to move to a much more rounded and proactive approach. For example, a finance organization can now identify what is missing before it starts to cause problems for the organization. In accounts payable, for example, Intelligent Automation enables an organization to anticipate utility bills, and proactively identify any missing or unsent invoices before, say, a key facility is switched off. Similarly, in support of R2R, transactions can be monitored throughout the month and the organization can anticipate their impact on P&L well in advance of the formal month-end close. In summary, Intelligent Automation enables ongoing monitoring and analysis rather than periodic analysis and can incorporate real-time alerts, such as to identify a missing invoice and find out why it wasn’t raised. So, a much more informed and proactive approach to operations.
JW: And what impact does IA have on the roles of the people within the finance function?
CM: Intelligent Automation lifts both the role of the finance function within the organization and those of the individuals within the finance function. One of the benefits of Intelligent Automation is that it improves how we share and deploy rules and knowledge throughout the organization – making compliance more accessible and enabling colleagues to understand how to make good financial decisions. Complex queries can still be escalated but simple questions can be captured and resolved by the ‘knowbot’. This can change how we train and continually develop our people and how we interact across the organization.
JW: So how should organizations approach implementing Intelligent Automation within their finance & accounting functions?
CM: One of the exciting elements of the new technology is that it is designed for users to be able to implement more quickly and easily – it’s all about being agile. Alongside implementing point solutions, the value will ultimately be how to combine the senses to bring the best out of people and technology.
We can re-think and re-imagine how we work. Traditionally, we have organized work in sequential process steps – using the 5 senses and intelligent automation we can reconfigure processes, technology and human intervention in a much more inter-connected manner with constant interaction taking place between the ‘five senses’ discussed earlier.
This means that it’s important to reimagine traditional finance & accounting processes and fundamentally change the level of ambition for the finance function. So, for example, the finance function can now start to have much more impact on the top line, such as by avoiding leakage due to missing orders and missing payments. Here, Intelligent Automation can monitor all transactions and identify any that appear to be missing. Similarly, it’s possible to implement ‘fraud bots’ to identify, for example, duplicate invoices or payments to give much greater levels of insight and control than available traditionally.
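As a rough illustration of the ‘fraud bot’ checks mentioned here (hypothetical field names and exact matching only; real implementations would add fuzzy matching and tolerances), the sketch below flags invoices that share the same supplier, invoice number, and amount.

```python
from collections import defaultdict

# A minimal sketch of the kind of 'fraud bot' check described above
# (hypothetical field names; real implementations add fuzzy matching, tolerances, etc.).

def find_duplicate_invoices(invoices: list) -> list:
    """Flag groups of invoices sharing supplier, invoice number, and amount."""
    seen = defaultdict(list)
    for inv in invoices:
        key = (inv["supplier"], inv["invoice_number"], round(inv["amount"], 2))
        seen[key].append(inv["record_id"])
    return [ids for ids in seen.values() if len(ids) > 1]

invoices = [
    {"record_id": "A1", "supplier": "Acme", "invoice_number": "INV-9", "amount": 1200.00},
    {"record_id": "A2", "supplier": "Acme", "invoice_number": "INV-9", "amount": 1200.00},
    {"record_id": "B1", "supplier": "Beta", "invoice_number": "INV-3", "amount": 480.50},
]
print(find_duplicate_invoices(invoices))  # [['A1', 'A2']]
```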
JW: What does this involve?
CM: There are lots of ways to start engaging with the technology – we see a ‘virtuous cycle’ that follows the following steps:
Reimagining the F&A processes in the light of Intelligent Automation, rather than automating existing processes largely ‘as-is’ is especially important. In particular, it’s important to eliminate, rather than automate, any ‘unnecessary’ process steps. Here, Capgemini’s eSOAR approach is particularly important and covers:
Finally, it’s critical not to be scared of the new technology and possibilities. The return on investment is incredible and the initial returns can be used to fund downstream transformation. At the same time, the cost, and timescale, of failure is relatively low. So, it’s important for finance organizations to start applying Intelligent Automation to get first-mover advantage rather than just watch and wait.
JW: Thank you very much, Carole. That certainly ties in with current NelsonHall thinking and will really help our readership. NelsonHall is increasingly being asked by our clients what constitutes next generation services and what is the art-of-the-possible in terms of new digital process models. And finance and accounting is at the forefront of these developments. Certainly, design thinking is an important element in assisting organizations to rethink both accounting processes and how the finance function can make a greater contribution to the wider enterprise – and the ‘five senses’ approach helps to demystify Intelligent Automation by clarifying the roles of the various technologies such as RPA for process execution, analytics for root cause analysis, and knowledge bases for process knowledge.
Many of the pureplay BPS vendors have been moving beyond individual, often client-specific implementations of RPA and AI and building new digital process models to present their next generation visions within their areas of domain expertise. So, for example, models of this type are emerging strongly in the BFSI space and in horizontals such as source-to-pay.
A key feature of these new digital process models is that they are based on a design thinking-centric approach to the future and heavily utilize new technologies, with the result that the “to-be” process embodied within the new digital process model typically bears little relation to the current “as-is” process whether within a BPS service or within a shared service/retained entity.
These new digital process models are based on a number of principles, emphasizing straight-through processing, increased customer-centricity and proactivity, use of both internal and external information sources, and in-built process learning. They typically encompass a range of technologies, including cloud system of engagement platforms, RPA, NLP, machine learning and AI, computer vision, predictive and prescriptive analytics held together by BPM/workflow and command & control software.
However, while organizations are driving hard towards identifying new digital process models and next generation processes, there are a relatively limited number of examples of these in production right now, their implementations use differing technologies and frameworks, and the rate of change in the individual underlying technology components is potentially very high. Similarly, organizations currently focusing strongly on adoption of, say, RPA in the short-term realize that their future emphasis will be more cognitive and that they need a framework that facilitates this change in emphasis without a fundamental change in framework and supporting infrastructure.
Aiming for a Unifying Framework for New Digital Process Models
In response to these challenges, and in an attempt to demonstrate longevity of next generation digital process models, Genpact has launched a platform called “Genpact Cora” to act as a unifying framework and provide a solid interconnect layer for its new digital process models.
Genpact Cora is organized into three components:
One of the aims of this platform is to provide a framework into which technologies and individual products can be swapped in and out as technologies change, without threatening the viability of the overall process or the command and control environment, or necessitating a change of framework. Accordingly, the Genpact Cora architecture also encompasses an application programming interface (API) design and an open architecture.
Genpact is then building its new digital process models in the form of “products” on top of this platform. Genpact new digital process model “products” powered by Cora currently support a number of processes, including wealth management, commercial lending, and order management.
However, in the many process areas where these “products” are not yet formed, Genpact will typically take a consulting approach, initially building client-specific digital transformations. Then, as the number of assignments in any specific process area gains critical mass, Genpact is aiming to use the resulting cumulative knowledge to build a more standardized new digital process model “product” with largely built-in business rules that just require configuring for future clients. And each of these “products” (or new digital process models) will be built on top of the Genpact Cora platform.
Launching “Digital Solutions” Service in Support of Retained Operations
Another trend started by the desire for digital process transformation and the initial application of RPA is that organizations are keen to apply new digital process models not just to outsourced services but to their shared services and retained organizations. However, there is currently a severe shortage of expertise and capability to meet this need. Accordingly, Genpact intends to offer its Genpact Cora platform not just within BPS assignments but also in support of transformation within client retained services. Here, Genpact is launching a new “Digital Solutions” service that implements new digital process models on behalf of the client shared services and retained organizations and complements its “Intelligent Operations” BPS capability. In this way, Genpact is aiming to industrialize and speed up the adoption of new digital process models across the organization by providing a consistent and modular platform, and ultimately products, for next generation process adoption.
NelsonHall recently attended WNS analyst/adviser events in New York and the U.K., where the theme was “NEXT”, with the sub-text of assisting organizations in “thriving in a state of constant disruption”.
While the trend to digitalization of business processes using technologies such as robotics and cloud, artificial intelligence, and big data is causing concern within some quarters of the BPS industry, WNS is thriving in this new environment. And, while digitalization is reducing revenues in established footprints, it is certainly creating major new opportunities for digitalization in client organizations.
Complementing Traditional Domain Virtues…
So, how is WNS doing this? Well, the company is positioning itself around driving transformation through its domain knowledge, process expertise, technology and automation, analytics, global delivery capability, and client centricity.
However, while the company is becoming increasingly strong in the development of new digital process models in support of its target industry sectors, WNS’ key differentiators continue to lie in its domain knowledge, where both its go-to-market personnel and its delivery personnel are fully aligned by industry, resulting in a depth of process knowledge in a domain context, and in client centricity. In addition, WNS has not fought shy of developing tier-2 delivery capability in support of its global delivery capability, though as automation takes hold the company is now likely to consolidate into existing locations rather than add new tier 2 cities.
WNS has also been ahead of the curve in building up analytics capability, with analytics accounting for 20% of WNS’ revenues, an increasingly important component of new digital process models. Accordingly, its 2017 analyst events focused extensively on analytics capabilities and WNS TRAC.
….with Platforms, Robotics, Analytics, and Cloud
WNS TRAC (Technology, Robotics, Analytics, and Cloud) is described by WNS as “a consolidated suite of next generation, all-encompassing BPM (for which read BPS rather than workflow) enablement technology solutions”. I’m not sure this is entirely the best phrasing, but in NelsonHall-speak it largely equates to “New Digital Process Models”. However, unlike some BPS firms that offer new digital process models in either client-operated or in BPS form, WNS intends to offer TRAC as part of its BPS engagements only.
TRAC encompasses each of WNS’ industry platforms, such as Verifare for the travel sector, increasingly complementing these with robotics, AI, and analytics. WNS has developed both “industry TRAC solutions” (18 solutions covering the travel, insurance, healthcare, shipping & logistics, utilities, and CPG & retail sectors), and “cross-industry TRAC solutions”, covering CFO (F&A), CPO, standalone robotics & digital automation, and CIS (interaction analytics, speech & text analytics, and omni-channel solutions).
However, while WNS has developed strong platform capability, particularly in areas such as the travel sector, there remains work to be done in fully incorporating cognitive technologies such as machine learning (where WNS is now beginning to partner with MIT Media Lab) to fully build out many of its nascent new digital process models. For example, while WNS has a number of sourcing-related platforms plus spend analytics capability within its “CPO TRAC”, the company has yet to fully incorporate the levels of NLP (in support of supplier and contract management) and cognitive technologies (in support of virtual procurement agents) that are beginning to emerge as part of new digital S2P process models.
Looking to Productize its Analytics Services
WNS continues to win major contracts in analytics, citing recent examples in the pharmaceuticals, FMCG, retail, and insurance sectors. While the company has assisted major organizations in establishing analytics CoEs, and offers an end-to-end analytics capability (from data aggregation, through processing, to visualization and consumption), WNS is increasingly looking to productize its analytics services, often incorporating its Brandttitude and SocioSeer platforms in support of specific use cases such as media spend guidance, price & promotion optimization, and shelf space optimization.
Further Investment in Digital Components Planned
Going forward, WNS will continue to focus on its key industry domains, looking to become “an integral partner to organizations in their digital adoption journey”, and incorporating new TRAC solutions in support of next-gen domain capability. While WNS will aim to develop its own core technology stack, the company will increasingly invest in proprietary tools and platforms and undertake acquisitions in key areas such as digital, RPA, AI, and smart meters.
John Willmott and Rachael Stormonth
As well as including WNS in relevant BPS and RPA/AI research areas, NelsonHall also covers WNS in the NelsonHall Quarterly Update Program - for details contact [email protected]
Wipro began partnering with Automation Anywhere in 2014. Here I examine the partnership, looking at examples of RPA deployments across joint clients, at how the momentum behind the partnership has continued to strengthen, and at how the partners are now going beyond rule-based RPA to build new digital business process models.
Partnership Already Has 44 Joint Clients
Wipro initially selected Automation Anywhere based on the flexibility and speed of deployment of the product and the company’s flexibility in terms of support. The two companies also have a joint go-to-market, targeting named accounts with whom Wipro already has an existing relationship, plus key target accounts for both companies.
To date, Wipro has worked with 44 clients on automation initiatives using the Automation Anywhere platform, representing ~70% of its RPA client base. Of these, 17 are organizations where Wipro was already providing BPS services, and 27 are clients where Wipro has assisted in-house operations or supported the application of RPA to processes already outsourced to another vendor.
In terms of geographies, Wipro’s partnership with Automation Anywhere is currently strongest in the Americas and Australia. However, Automation Anywhere has recently been investing in a European presence, including the establishment of a services and support team in the U.K., and the two companies are now focusing on breaking into the major non-English-speaking markets in Continental Europe.
So let’s look at a few examples of deployments.
For an Australian telco, where Wipro is one of three vendors supporting the order management lifecycle, Wipro had ~330 FTEs supporting order entry through to order provisioning. Wipro first applied RPA to these process areas, deploying 45 bots and replacing 143 FTEs. The next stage looked across the whole order management lifecycle. Since the three BPS vendors were handling different parts of the lifecycle, an error or missing information at one stage would result in the transaction being rejected further downstream. To eliminate, as far as possible, exceptions being passed to the downstream BPS vendors, Wipro implemented "checker" bots which carry out validation checks on each transaction before it is released to the next stage in the process, sending failed transactions and the reasons for failure back to the processing bots for reprocessing and, where appropriate, the collection of missing information. This reduced the number of kick-backs by 73%.
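To make the "checker" bot pattern concrete, here is a minimal illustrative sketch in Python; the transaction fields, validation rules, and hand-off functions are hypothetical assumptions for illustration and do not represent Wipro's or Automation Anywhere's actual implementation.

    # Hypothetical "checker" bot gating hand-offs between BPS vendors.
    # Field names and rules are illustrative, not Wipro's actual implementation.

    REQUIRED_FIELDS = ["order_id", "customer_id", "service_type", "site_address"]

    def validate(transaction):
        """Return a list of failure reasons; an empty list means the transaction passes."""
        failures = [f"missing field: {f}" for f in REQUIRED_FIELDS if not transaction.get(f)]
        if transaction.get("service_type") not in {"broadband", "mobile", "fixed_line"}:
            failures.append("unknown service_type")
        return failures

    def checker_bot(transactions, release_downstream, send_back_for_reprocessing):
        """Release clean transactions downstream; return failures to the processing bots."""
        for txn in transactions:
            failures = validate(txn)
            if failures:
                send_back_for_reprocessing(txn, failures)   # include the reasons for failure
            else:
                release_downstream(txn)                     # next stage in the order lifecycle

    # Example usage, with print functions standing in for the real hand-offs
    checker_bot(
        [{"order_id": "A1", "customer_id": "C9", "service_type": "broadband", "site_address": "1 Example St"}],
        release_downstream=lambda t: print("released:", t["order_id"]),
        send_back_for_reprocessing=lambda t, reasons: print("kicked back:", t["order_id"], reasons),
    )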
Other clients where Wipro has used Automation Anywhere in RPA implementations include:
Using The Partnership to Enhance Speed-to-Benefit within Rule-Based Processes
The momentum behind the partnership has continued to strengthen, with Wipro achieving a number of significant wins in conjunction with Automation Anywhere over the past three months, including a contract which will result in the deployment of in excess of 100 bots within a single process area over the next 6 months. In the last quarter, as organizations begin to scale their RPA roll-outs, Wipro has seen average RPA deal sizes increase from 25-40 bots to 75-100 bots.
Key targets for Wipro and Automation Anywhere are banking, global media & telecoms, F&A, and increasingly healthcare, and Wipro has recently been involved in discussions with new organizations across the manufacturing, retail, and consumer finance sectors in areas such as F&A, order management, and industry-specific processing.
Out of its team of ~450 technical and functional RPA FTEs (~600 FTEs if we include cognitive), Wipro has ~200 FTEs dedicated to Automation Anywhere implementations. This concentration of expertise is assisting Wipro in enhancing speed-to-benefit for clients, particularly in areas where Wipro has conducted multiple assignments, for example in:
Overall, Wipro has ~400 curated and non-curated bots in its library. This has assisted in halving implementation cycle times in these areas, to around four weeks.
Wipro also perceives that the ease of deployment and debugging of Automation Anywhere, facilitated by the structuring of its platform into separate orchestration and task execution bots, is another factor that has helped enhance speed-to-benefit.
Wipro’s creation of a sizeable team of Automation Anywhere specialists means it has the bandwidth to respond rapidly to new opportunities and to initiate new projects within 1-2 weeks.
Speed of support on architecture queries is another important factor, both in architecting solutions in the right way and in speed-to-market. Around a third (~100) of Automation Anywhere's personnel are within its support and services organization, providing 24x7 support by phone and email and ensuring a two-day resolution time. This is of particular importance to Wipro in support of its multi-geography RPA projects.
Extending the Partnership: Tightening Integration between Automation Anywhere & Wipro Platforms to Build New Digital Business Process Models
In addition to standard rule-based RPA deployments of Automation Anywhere, Wipro is also increasingly:
In an ongoing development of the partnership, Wipro will use Automation Anywhere cognitive bots to complement Wipro HOLMES, using Automation Anywhere for rapid deployments, probably linked to OCR, and HOLMES to support more demanding cognitive requirements using a range of customized statistical techniques for more complicated extraction and understanding of data and for predictive analytics.
Accordingly, Wipro is strengthening its partnership with Automation Anywhere both to deliver tighter execution of rule-based RPA implementations and as a key platform component in the creation of future digital business process models.
As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).
This is the third and final blog in a series presenting key guidelines for organizations embarking on an RPA project, covering project preparation, implementation, support, and management. Here I take a look at the stages of deployment, from pilot development, through design & build, to production, maintenance, and support.
Piloting & deployment – it’s all about the business
When developing pilots, it’s important to recognize that the organization is addressing a business problem and not just applying a technology. Accordingly, organizations should consider how they can make a process better and achieve service delivery innovation, and not just service delivery automation, before they proceed. One framework that can be used in analyzing business processes is the ‘eliminate/simplify/standardize/automate’ approach.
While organizations will probably want to start with some simple and relatively modest RPA pilots to gain quick wins and acceptance of RPA within the organization (and we would recommend that they do so), it is important as the use of RPA matures to consider redesigning and standardizing processes to achieve maximum benefit. So begin with simple manual processes for quick wins, followed by more extensive mapping and reengineering of processes. Indeed, one approach often taken by organizations is to insert robotics and then use the metrics available from robotics to better understand how to reengineer processes downstream.
For early pilots, pick processes where the business unit is willing to take a ‘test & learn’ approach, and live with any need to refine the initial application of RPA. Some level of experimentation and calculated risk taking is OK – it helps the developers to improve their understanding of what can and cannot be achieved from the application of RPA. Also, quality increases over time, so in the medium term, organizations should increasingly consider batch automation rather than in-line automation, and think about tool suites and not just RPA.
Communication remains important throughout, and the organization should be extremely transparent about any pilots taking place. RPA does require a strong emphasis on, and appetite for, management of change. In terms of effectiveness of communication and clarifying the nature of RPA pilots and deployments, proof-of-concept videos generally work a lot better than the written or spoken word.
Bot testing is also important, and organizations have found that bot testing is different from waterfall UAT. Ideally, bots should be tested using a copy of the production environment.
Access to applications is potentially a major hurdle, with organizations needing to establish virtual employees as a new category of employee and give the appropriate virtual user ID access to all applications that require a user ID. The IT function must be extensively involved at this stage to agree access to applications and data. In particular, they may be concerned about the manner of storage of passwords. What’s more, IT personnel are likely to know about the vagaries of the IT landscape that are unknown to operations personnel!
Reporting, contingency & change management key to RPA production
At the production stage, it is important to implement an RPA reporting tool to:
There is also a need for contingency planning to cover situations where something goes wrong and work is not allocated to bots. Contingency plans may include co-locating a bot support person or team with operations personnel.
The organization also needs to decide which part of the organization will be responsible for bot scheduling. This can either be overseen by the IT department or, more likely, the operations team can take responsibility for scheduling both personnel and bots. Overall bot monitoring, on the other hand, will probably be carried out centrally.
It remains common practice, though not universal, for RPA software vendors to charge on the basis of the number of bot licenses. Accordingly, since an individual bot license can be used in support of any of the processes automated by the organization, organizations may wish to centralize an element of their bot scheduling to optimize bot license utilization.
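As a rough illustration of the licensing arithmetic behind centralizing bot scheduling, the sketch below compares siloed, per-process license purchasing with a shared pool; the workloads and the coincidence factor are invented assumptions, not client or vendor data.

    # Hypothetical illustration of pooling bot licenses across processes vs. per-process silos.
    import math

    # peak number of concurrently running bots each process needs (invented figures)
    peak_bots_needed = {
        "invoice_processing": 6,   # peaks overnight
        "order_entry": 5,          # peaks during business hours
        "report_generation": 3,    # peaks at month-end
    }

    # Siloed purchasing: each process licenses for its own peak
    siloed_licenses = sum(peak_bots_needed.values())

    # Centralized scheduling: if the peaks do not coincide, the shared pool only needs to
    # cover the largest combined demand at any one time (assumed here to be ~60% of the sum)
    assumed_coincidence_factor = 0.6
    pooled_licenses = math.ceil(sum(peak_bots_needed.values()) * assumed_coincidence_factor)

    print(f"siloed: {siloed_licenses} licenses; pooled: {pooled_licenses} licenses")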
At the production stage, liaison with application owners is very important to proactively identify changes in functionality that may impact bot operation, so that these can be addressed in advance. Maintenance is often centralized as part of the automation CoE.
Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December
NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.
Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.
Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on RPA, with your buy-side peers sharing their RPA experiences. To find out more, contact Matthaus Davies.
This is the final blog in a three-part series. See also:
Part 1: How to Lay the Foundations for a Successful RPA Project
As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).
This is the second in a series of blogs presenting key guidelines for organizations embarking on an RPA project, covering project preparation, implementation, support, and management. Here I take a look at how to assess and prioritize RPA opportunities prior to project deployment.
Prioritize opportunities for quick wins
An enterprise level governance committee should be involved in the assessment and prioritization of RPA opportunities, and this committee needs to establish a formal framework for project/opportunity selection. For example, a simple but effective framework is to evaluate opportunities based on their:
The business units should be involved in the generation of ideas for the application of RPA, and these ideas can be compiled in a collaboration system such as SharePoint prior to their review by global process owners and subsequent evaluation by the assessment committee. The aim is to select projects that have a high business impact and high sponsorship level but are relatively easy to implement. As is usual when undertaking new initiatives or using new technologies, aim to get some quick wins and start at the easy end of the project spectrum.
However, organizations also recognize that even those ideas and suggestions that have been rejected for RPA are useful in identifying process pain points, and one suggestion is to pass these ideas to the wider business improvement or reengineering group to investigate alternative approaches to process improvement.
Target stable processes
Other considerations that need to be taken into account include the level of stability of processes and their underlying applications. Clearly, basic RPA does not readily adapt to significant process change, and so, to avoid excessive levels of maintenance, organizations should only choose relatively stable processes based on a stable application infrastructure. Processes that are subject to high levels of change are not appropriate candidates for the application of RPA.
Equally, it is important that the RPA implementers have permission to access the required applications from the application owners, who can initially have major concerns about security, and that the RPA implementers understand any peculiarities of the applications and know about any upgrades or modifications planned.
The importance of IT involvement
It is important that the IT organization is involved, as their knowledge of the application operating infrastructure and any forthcoming changes to applications and infrastructure need to be taken into account at this stage. In particular, it is important to involve identity and access management teams in assessments.
Also, the IT department may well take the lead in establishing RPA security and infrastructure operations. Other key decisions that require strong involvement of the IT organization include:
Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December
NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.
Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.
Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on sourcing topics, including the impact of RPA. The next RPA webinar will be held later this month: to find out more, contact Guy Saunders.
In the third blog in the series, I will look at deploying an RPA project, from developing pilots, through design & build, to production, maintenance, and support.
As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).
This is the first in a series of blogs presenting key guidelines for organizations embarking on RPA, covering establishing the RPA framework, RPA implementation, support, and management. First up, I take a look at how to prepare for an RPA initiative, including establishing the plans and frameworks needed to lay the foundations for a successful project.
Getting started – communication is key
Essential action items for organizations prior to embarking on their first RPA project are:
Communication is key to ensuring that use of RPA is accepted by both executives and staff alike, with stakeholder management critical. At the enterprise level, the RPA/automation steering committee may involve:
Start with awareness training to get support from departments and C-level executives. Senior leader support is key to adoption. Videos demonstrating RPA are potentially much more effective than written papers at this stage. Important considerations to address with executives include:
When communicating to staff, remember to:
Establish a central governance process
It is important to establish a strong central governance process to ensure standardization across the enterprise, and to ensure that the enterprise is prioritizing the right opportunities. It is also important that IT is informed of, and represented within, the governance process.
An example of a robotics and automation governance framework established by one organization was to form:
Avoid RPA silos – create a center of excellence
RPA is a key strategic enabler, so use of RPA needs to be embedded in the organization rather than siloed. Accordingly, the organization should consider establishing an RPA center of excellence, encompassing:
Establish a bot ID framework
While establishing a framework for allocation of IDs to bots may seem trivial, it has proven not to be so for many organizations where, for example, including ‘virtual workers’ in the HR system has proved insurmountable. In some instances, organizations have resorted to basing bot IDs on the IDs of the bot developer as a short-term fix, but this approach is far from ideal in the long-term.
Organizations should also make centralized decisions about bot license procurement, and here the IT department, which has experience in software selection and purchasing, should be involved. In particular, the IT department may be able to play a substantial role in RPA software procurement and negotiation.
Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December
NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.
Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.
Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on sourcing topics, including the impact of RPA. The next RPA webinar will be held in November: to find out more, contact Matthaus Davies.
In the second blog in this series, I will look at RPA need assessment and opportunity identification prior to project deployment.
Within its Lean Digital approach, Genpact is using digital and design thinking (DT) to assist organizations in identifying and addressing what is possible rather than just aiming to match current best-in-class, a concept now made passé by new market entrants.
At a recent event hosted at Genpact’s new center in Palo Alto, one client speaker described Genpact’s approach to DT. The company, a global consumer goods giant, had set up a separate unit within its large and mature GBS organization with a remit to identify major disruptions - with a big emphasis on “major”. It set a target of 10x improvement (rather than, say, 30%) to ensure thinking differently about activities, in order to achieve major changes in approach, not simply incremental improvements within existing process frameworks. The company already had mature best-of-breed processes and was being told by shared service consultants that the GBS organization merely needed to continue to apply more technology to existing order management processes. However, the company perceived a need to “do over” its processes to target fundamental and 10x improvements rather than continue to enhance the status quo.
The establishment of a separate entity within the GBS organization to target this level of improvement was important in order to put personnel into a psychological safety zone separated from the influence of existing operations experts, existing process perceived wisdom, and a tendency to be satisfied with incremental change. The unit then mapped out 160 processes and screened them for disruption potential, using two criteria to identify potential candidates:
The exercise identified five initial areas for disruption with one of these being order management.
On order management, the company then sought external input from an organization that could contribute both subject matter expertise and DT capability. And Genpact, not an existing supplier to the GBS organization for order management, provided a team of 5-10 dedicated personnel supported by a supplementary team of ~30 personnel.
The team undertook an initial workshop of 2-3 days followed by a 6-8 week design thinking and envisioning journey. The key principles here were “to fall in love with the problem, not the solution” (the client perceiving many DT consultancies as being too ready to lock in to a preferred solution too early in the DT exercise) and to use creative inputs, not experts. In this case, personnel with experience in STP in capital markets were introduced in support of generating new thinking, and it was five weeks into the DT exercise before the client’s team was introduced to possible technologies.
This DT exercise identified two fundamental principles for changing the nature of order management:
This company identified the key criteria for selecting a design thinking partner to be a service provider that:
Genpact claims to be ready to cannibalize its own revenue (as do, indeed, all BPS providers we have spoken to – the expected quid pro quo being that the client outsources other activities to them). However, in this example, the order management “agents” being disrupted consist of 200-300 in-house client FTEs and 400-500 FTEs from other BPS service providers, so there is no immediate threat to Genpact revenues.
The Real Impact of RPA/AI is Still Some Way Off
Clearly the application of digital, RPA and AI technologies is going to have a significant impact on the nature of BPS vendor revenues in future, and, of course, on commercial models. However, at present, the level of revenue disruption facing BPS vendors is being limited by:
Nonetheless, organizations are showing considerable interest in concepts such as Lean Digital. Genpact CEO ‘Tiger’ Tyagarajan says he has been involved in 79 CEO meetings (to discuss digital process transformation/disruptive propositions as a result of the company’s lean digital positioning) in 2016 compared to fewer than 10 CEO meetings in the previous 11 years.
Order Management an Activity Where Major Disruption Will Occur
Finally, this example (one of several that we have seen) illustrates that order management, which tends to have significant manual processing and to be client or industry-specific, is becoming a major target for the disruptive application of new digital technologies.
**********************
See also Genpact Combining Design Thinking & Digital Technologies to Generate Digital Asset Utilities by Rachael Stormonth, published this week here.
RPA is a great new technology and one that is yet to be widely deployed by most organizations. Nonetheless, RPA fills one very specific niche and remains essentially a band-aid for legacy processes. It is tremendous for executing on processes where each step is clearly defined, and for implementing continuous improvement in relatively static legacy process environments. However, RPA, as TCS highlights, does have the disadvantages that it fails to incorporate learning and can really only effectively be applied to processes that undergo little change over time. TCS also argues that RPA fails to scale and fails to deliver sustainable value.
These latter criticisms seem unfair in that RPA can be applied on a large scale, though frequently scale is achieved via numerous small implementations rather than one major implementation. Similarly, provided processes remain largely unchanged, the value from RPA is sustained. The real distinction is not scalability but the nature of the process environment in which the technology is being applied.
Accordingly, while RPA is great for continuous improvement within a static legacy process environment where processes are largely rule-based, it is less applicable for new business models within dynamic process environments where processes are extensively judgment-based. New technologies with built-in learning and adaptation are more applicable here. And this is where TCS is positioning Ignio.
TCS refers to Ignio as a “neural automation platform” and as a “Services-as-Software” platform, the latter arguably a much more accurate description of the impact of digital on organizations than the much-copied Accenture “as-a-Service” expression.
TCS summarizes Ignio as having the following capabilities:
TCS Ignio, like IPsoft Amelia, began life as a tool for supporting IT infrastructure management, specifically datacenter operations. TCS Ignio was launched in May 2015 and is currently used by ten organizations, including Nationwide Building Society in the U.K. All ten are using Ignio in support of their IT operations, though the scope of its usage remains limited at present, with Ignio being used within Nationwide in support of batch performance and capacity management. Eventually the software is expected to be deployed to learn more widely about the IT environment and predict and resolve IT issues, and Ignio is already being used for patch and upgrade management by one major financial services institution.
Nonetheless, despite its relatively low level of adoption so far within IT operations, TCS is experiencing considerable wider interest in Ignio and feels it should strike while the iron is hot and take Ignio out into the wider business process environment immediately.
The implications are that the Ignio roll-out will be rapid (expect to see the first public example in the next quarter) and will take place domain by domain, as for RPA, with initial targeted areas likely to include purchase-to-pay and order-to-cash within F&A and order management-related processes within supply chain. In order to target each specific domain, TCS is pre-building “skills” which will be downloadable from the “Ignio store”. One of the initial implementations seems likely to be supporting a major retailer in resolving the downstream implications of delivery failures due to causes such as traffic accidents or weather-related incidents. Other potential supply chain-related applications cited for Ignio include:
Machine learning technologies are receiving considerable interest right now and TCS, like other vendors, recognizes that rapid automation is being driven faster than ever before by the desire for competitive survival and differentiation, and in response is adopting an “if it can be automated, it must be automated” stance. The timescales for implementation of Ignio, cited at 4-6 weeks, are comparable to those for RPA. So Ignio, like RPA, is a relatively quick and inexpensive route to process improvement. And, unlike many cognitive applications, it is targeted strongly at industry-specific and back-office processes and not just customer-facing ones.
Accordingly, while RPA will remain a key technology in the short-term for fixing relatively static legacy rule-based processes, next generation machine learning-based “Services-as-Software” platforms such as Ignio will increasingly be used for judgment-based processes and in support of new business models. And TCS, which a year ago was promoting RPA, is now leading with its Ignio neural automation-based “Services-as-Software” platform.
Sector domain focus
While Infosys is adopting a horizontal approach to taking emerging technologies to the next level, WNS regards sector domain expertise as its key differentiator. Accordingly, while Infosys has moved delivery into a separate horizontal delivery organization, WNS continues to organize by vertical across both sales and delivery and looks to offer its employees careers as industry domain experts – it views its personnel as being sector experts and not just experts in a particular horizontal.
The differing philosophies of the two firms are also reflected in their approach to technology. WNS brought in a CTO nine months ago and now has a technology services organization. But where Infosys is building tools that are applicable cross-domain, WNS is building platforms and BPaaS solutions that address specific pain points within targeted sectors.
Location strategy
And while both firms are seeking (like all service providers) to achieve non-linear revenue growth, they have markedly different location strategies. Infosys is placing its bets on technology and largely leaving its delivery footprint unchanged, whereas WNS is increasingly taking its delivery network to tier 2 cities, not just in India but also within North America, Eastern Europe and South Africa, to continue to combine the cost benefits of labor arbitrage with those of technology. This dual approach should assist WNS in providing greater price-competitiveness and protection for its margins in the face of industry-wide pressure from clients for enhanced productivity improvement. Despite the industry-wide focus and investment on automation, the levels of roll-out of automation across the industry have typically been insufficient to outstrip pricing declines to generate non-linear revenue growth, and will remain so in the short-term, so location strategy still has a role to play.
At the same time, WNS is fighting hard for the best talent within India. For example, the company:
Industry domain credentials & technology strategy
Like a number of its competitors, WNS is increasingly focused on assisting organizations in adopting advanced digital business models that will offer them protection “not just from existing competitors but from competitors that don’t yet exist”. In particular, WNS is strengthening its positioning both in verticals where it is well established, such as insurance and the travel sector, and also in newer target sectors such as utilities, shipping & logistics, and healthcare, aiming to differentiate both with domain-specific technology and with domain-specific people. The domain focus of its personnel is underpinned not just by its organizational structure but also, for example, by the adoption of domain universities.
Accordingly, WNS is investing in digital frameworks, AI models, and “assisting clients in achieving the art of the possible” but within a strongly domain-centric framework. WNS’ overall technology strategy is strongly focused on domain IP, and combining this domain IP with analytics and RPA. WNS sees analytics as key; it has won a number of recent engagements leading with analytics, and is embedding analytics into its horizontal and vertical solutions as well as offering analytics services on a standalone basis. It currently has ~2,500 FTEs deployed on research and analytics, of whom ~1,600 are engaged on “pure” analytics.
But the overriding theme for WNS within its target domains is a strong focus on domain-specific platforms and BPaaS offerings, specifically platforms that digitalize and alleviate the pain points left behind by the traditional industry solutions, and this approach is being particularly strongly applied by WNS in the travel and insurance sectors.
In the travel sector, WNS offers platforms, often combined with RPA and analytics, in support of:
It also offers an RPA-based solution in support of fulfilment, and Qbay in support of workflow management.
In addition, WNS is making bigger bets in the travel sector, investing in larger platform suites in the form of its commercial planning suite, including analytics in support of sales, code shares, revenue management, and loyalty. The emphasis is on reducing revenue leakage for travel companies, and in assisting them in balancing enhanced customer experience with their own profitability.
The degree of impact sought from these platforms is shown by the fact that WNS views its travel sector platforms as having ~$30m revenue potential within three years, though the bulk of this is still expected to come from the established Verifare revenue recovery platform.
WNS’ platforms and BPaaS offerings for the insurance sector include:
In addition, WNS offers two approaches to closed block policy servicing:
The BPaaS service is underpinned by WNS’ ability to act as a TPA across all states in the U.S.
WNS’ vertical focus is not limited to traditional industry-specific processes. The company has also developed 10 industry-specific F&A services, each with 50%+ industry-specific scope, with the domain-specific flavor principally concentrated within O2C.
In summary
Both Infosys and WNS are enhancing their technology and people capabilities with the aim of assisting organizations in implementing next generation digital business models. However, while Infosys is taking the horizontal route of developing new tools with cross-domain applicability and encouraging staff development via design thinking, WNS’ approach is strongly vertical centric, developing domain-specific platforms and personnel with strong vertical knowledge and loyalties. So, two different approaches and differing trajectories, but with the same goal and no single winning route.
To focus further on this sector, HCL has enhanced its capability to offer end-to-end and BPaaS-based services through a number of partnerships and is now going to market targeting four principal pain points for utilities, namely:
Debt Management
Starting with debt management, HCL has carried out consulting assignments around debt for a number of utilities, with the emphasis on assisting utilities in preventing their customers falling into debt, e.g. by making sure that billing is correct. In addition to consulting services, HCL offers end-to-end debt collection services and HCL’s current utilities debt management clients include:
For the U.K. water utility, HCL:
Building on this capability, HCL has partnered with a European-based collections software and field services company in order to offer end-to-end BPaaS collections services, and HCL will increasingly seek to offer collections as a BPaaS service.
Smart Metering
Here, HCL has carried out smart metering realization for four utilities in the U.S. and has consulted in this area in the U.K., helping a utility identify what types of customer to target and how to manage campaigns to increase adoption of smart meters.
HCL’s offering for smart metering consists of an end-to-end service from planning & forecasting through data cleaning & enhancement, customer contact planning & contact, engineer scheduling & co-ordination, smart meter installation, updating supplier records & processing meter exchange, to first bill production, exceptions management, and collections and bill shock management.
To enhance its end-to-end offering for smart meter introduction & management, HCL has partnered with:
Enhancing Customer Engagement
In customer engagement, HCL offers multi-channel coverage across the customer lifecycle. In particular, HCL is assisting utilities in deflecting as much functionality as possible to web self-service, e.g. working with a U.S. utility to ensure that home moves and meter reads can be largely handled through the utility self-serve, increasing self-serve by ~7%. In support of this customer engagement offering, HCL has partnered with:
In particular, the partner software assists HCL in identifying the root cause of customer calls so that the appropriate functionality can be provided on the web site to deflect calls to self-serve. Overall, HCL aims to ensure that the customer receives a standardized and consistent response across channels, while deflecting to self-serve whenever possible.
Aligning the Back-Office and Front-Office
HCL’s final theme is aligning the back-office with the front-office. For example, HCL has, for a U.K. water utility, implemented a workflow establishing specialist queues, so that tasks are automatically queued to the right person (e.g. a home move specialist or billing exceptions specialist). For this utility, HCL guaranteed a reduction in bad debt of $8m and reduced the cost of debt collection by 40% over 5 years while reducing the number of customer disputes by 50%.
Back-office offerings from HCL for the utilities sector include exceptions processing, new account processing, sales maximization investigations, provisioning & processing of optant meter readings, property searches, and handling business utility accounts.
‘Engage’ Service Delivery Model
In addition to targeting these specific pain points, HCL also offers ‘Engage’, an integrated service delivery model for utilities, suitable for smaller utilities and start-ups that require a complete utility solution covering both services and infrastructure, and typically operating on a per subscriber pricing model. ‘Engage’ includes:
HCL has undertaken this complete service for a U.S. utility, using a BPaaS service based on a customer and billing platform utilizing SAP, SAP HANA, OpenText, and Genesys, with per-customer-account-based pricing.
How HCL Differentiates BPS for Utilities
HCL differentiates its BPS services for the utilities sector around six capabilities:
For example, HCL Toscana has delivered 20% cost savings for a U.K. water utility over five years by improving the accuracy and transparency of its workflow. HCL Toscana now incorporates RPA and, going forward, HCL perceives that RPA can be applied to the meter-to-cash cycle for utilities to assist in areas such as:
Other possibilities for automation include:
In analytics, HCL carries out analytics assignments for a number of utilities and has mapped out the utilities customer journey, identifying appropriate analytics for each of the lifecycle stages of ‘sale’, ‘consume’, ‘debt’, and ‘move’.
HCL has also responded to the demand for process maturity benchmarks by developing its ‘Utili-Best’ benchmarking framework. This modelling framework is based on client data from its consulting engagements, together with third-party data including that from OFWAT, OFGEM, FERC, and AER. The framework covers benchmarking for billing, metering, customer service, and debt levels.
The ‘Utili-Best’ benchmarking framework is further complemented by HCL’s ‘Wow & Waste’ framework. This is used to identify ‘wow’ factors that utilities should aim for, identifying potential sources of competitive advantage by matching the wows to opportunities, and the ‘waste’ elements that they should aim to eliminate. Examples of ‘wow’ factors are ‘accurate bills sent to customers’ and ‘customer pays in full’. Examples of ‘waste’ factors are billing errors or late bills. These ‘wow’ and ‘waste’ factors can then be translated into KPIs such as proportion of bills collected and NPS during ‘bill to collect’.
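For illustration, the short sketch below shows how a 'wow' factor and a 'waste' factor of this kind might be translated into simple KPIs; the billing records and field names are invented and are not part of HCL's 'Wow & Waste' framework.

    # Hypothetical example of turning 'wow'/'waste' factors into KPIs (invented billing data).
    bills = [
        {"id": 1, "amount": 120.0, "collected": True,  "billing_error": False},
        {"id": 2, "amount": 80.0,  "collected": False, "billing_error": True},
        {"id": 3, "amount": 200.0, "collected": True,  "billing_error": False},
    ]

    collected_rate = sum(b["collected"] for b in bills) / len(bills)           # 'wow' KPI
    billing_error_rate = sum(b["billing_error"] for b in bills) / len(bills)   # 'waste' KPI

    print(f"proportion of bills collected: {collected_rate:.0%}; billing errors: {billing_error_rate:.0%}")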
RPA is essentially the execution, by a bot mimicking human action, of repeatable, rule-based tasks that require little or no cognition, human expertise, or human intervention (though RPA can also be used to support agents within hybrid tasks). The bot operates enterprise software and applications through existing user interfaces based on pre-defined rules and inputs and is best suited to relatively heavy-duty transaction processing.
The next stage is to complement RPA with newer technologies such as AI where judgment-based tasks are starting to be supported with cognitive platforms. Examples of cognitive technologies include adaptive learning, speech recognition, natural language processing, and pattern identification algorithms. While RPA has typically been in full-swing for about two years and is currently reaching its peak of roll-out, cognitive technologies typically won’t reach wide-scale adoption for another few years. When they do, they promise to have most impact not in data-centric transactional processing activities but around unstructured sales and customer service content and processes.
O.K., so what about BPaaS?
Typically, BPaaS consists of a platform hosted by the vendor, ideally on a one-to-many basis, similarly to SaaS, complemented by operations personnel. These BPaaS platforms have been around for some time in areas such as finance & accounting in the form of systems of engagement surrounding core systems such as ERPs. In this context it is common for ERPs to be supplemented by specialist systems of engagement in support of processes such as order-to-cash and record-to-report. However, initially these implementations tended to be client-specific and one-to-one rather than one-to-many and true BPaaS.
Indeed, BPaaS remains a major trend within finance & accounting. While only start-ups and spin-offs seem likely to use BPaaS to support their full finance & accounting operations in the short-term, suppliers are increasingly spinning off individual towers such as accounts payable in BPaaS form.
However, where BPaaS is arguably coming into its own is in the form of systems of engagement (SoE) to tackle particular pain points, and a number of vendors are developing systems of engagement that can be embedded with analytics to provide packaged, typically BPaaS, services. These systems of engagement and BPaaS services sit on top of systems of record, in the form of, for example, core banking platforms or ERPs, to tackle very specific pain points. Examples in the BFSI sector that are becoming increasingly common are BPaaS services around mortgage origination and KYC. Other areas currently being tackled by BPaaS include wealth management and triage around property & casualty underwriting.
In the same way that systems of engagement are currently required in the back-office to support ERPs, systems of engagement are starting to emerge in the front-office to provide a single view of the customer and to recommend “next best actions” both to agents and, increasingly, direct to consumers via digital channels. While automation in BPO in the form of BPaaS and systems of engagement is still largely centered in the back-office and beginning to be implemented in the middle-office, in future this approach and the more advanced forms of automation will be highly important in supporting sales and service in the front-office.
This change in BPaaS strategy from looking to replace core systems to providing systems of engagement surrounding the core systems offers a number of benefits, namely:
I started with the question “where does automation in BPO go from here?” In summary, BPO automation will only take a significant step forward when RPA becomes AI, and BPaaS emerges from the back-office to support sales and service in the front-office.
These early transformations were also frequently seen by both client and provider as a foundation for utility services, often with the client retaining a stake in the resulting joint venture. So talk of automation and utility services is not new. However, the success rate of these early projects was limited, often being impeded by the challenge, and timescales, of technology implementation and the frequent over-tailoring of initial technology implementations to the client despite the goal of establishing a wider multi-client utility.
So where do we stand with automation today, and will it succeed this time around? Let’s focus on the three overriding topics of the moment in the application of automation to BPO: robotic process automation (RPA), analytics, and BPaaS.
These technologies can be assessed in terms of their impact on a business framework frequently used in IT departments (but equally applicable to BPO) of “change-the-business” vs. “run-the-business”. The area where all the current hype is focused is “change-the-business”. At its most extreme, this involves organizations aiming to become more digital and mirror the success of “disruptors” such as Uber and Amazon. There are a number of technologies that play into this “enabling the organization to become Uber” space, including digital, the Internet of Things (IoT), and software apps, supported by analytics.
However, RPA does not belong in this category. Though RPA is one of a number of technologies that are changing the nature and application of BPO, the current implementation of RPA is essentially “labor arbitrage on labor arbitrage” (i.e. moving delivery to even lower cost n-tier delivery locations) in the “run-the-business” category – principally enabling the organization to run with increased efficiency using existing technology and offshoring frameworks.
The final part of this new BPO delivery framework is governance, where Global Business Services (GBS) plays a role in improving end-to-end process management and supporting the integration of outsourced operations with the retained processes or organization.
Analytics plays a role across the piece, principally by supporting identification of process improvements alongside existing lean six sigma methodologies in “run-the-business”, and by facilitating automated process learning in “change-the-business”. It is clear that drill-down dashboards and analytics now need to be built into all BPO services and to increasingly support business actions as well as much more real-time process improvement.
However, current implementations of RPA are typically not cognitive and don’t undertake judgment-based tasks. Though robotics software will often interface directly with core systems via APIs, the current implementations of RPA software largely repeat the keystrokes that a human would perform following a simple set of rules. In essence, that tends to focus RPA around data reading & extraction, data validation and enrichment, and data copying and pasting, along with reporting.
At the same time, RPA largely cannot operate where voice inputs are involved; it is principally used to manipulate numeric data and structured text.
Accordingly, RPA has so far made most impact in transactional processes such as finance & accounting, and increasingly industry-specific processes in sectors such as banking and insurance, with adoption in insurance ahead of that in banking due to the lesser regulatory concerns. It has had less impact so far in HR processes, though it is applicable to transactional elements here, and its use within customer service is limited, since customer service typically needs more cognitive analytics capability than is readily available in standard RPA.
In the words of one leading financial analyst, RPA can be regarded “as a race to the bottom”, and in an area such as F&A, RPA has the potential to remove a further 20% of the cost base from relatively mature offshored F&A processes over a period of several years. Hence, RPA is having a dramatic impact on existing outsourcing contract commercials. Firstly, the vendors know that their mature BPO contracts in highly transactional areas such as F&A are vulnerable to competitive attack. As a result, suppliers have been rushing to renegotiate their key contracts, and to apply RPA, with typically little desire to remain on FTE-based pricing. If 20% of FTEs are to be replaced with bots, then FTE-based pricing becomes even more unattractive to suppliers than previously. So, suppliers are typically looking to move to fixed price or transactional-based pricing plus built-in productivity targets. The challenge then becomes one of using automation and robotics to drive down the cost curve as rapidly, or more rapidly, than the contractual productivity commitment so as to maintain or increase contract profit margins, with the cost per bot typically quoted as a third of the price of an offshore FTE.
Another commercial alternative is to retain FTE-based pricing within the main contract, but to run each RPA project as a separate gainshare. However, in general rule-based RPA arguably lends itself best to transaction-based pricing, while judgment-based AI technology (when it matures in a few years’ time) will arguably lend itself to more outcome-based pricing.
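To make the cost arithmetic behind these commercial models concrete, here is a back-of-the-envelope calculation using the ratios quoted above; the absolute FTE cost, the scale of the operation, and the one-bot-per-FTE simplification are purely illustrative assumptions, not NelsonHall benchmarks.

    # Back-of-the-envelope illustration of the RPA commercial logic described above.
    # All absolute figures are assumptions for illustration, not NelsonHall benchmarks.

    offshore_fte_annual_cost = 25_000                 # assumed fully loaded cost per offshore FTE ($)
    bot_annual_cost = offshore_fte_annual_cost / 3    # "a third of the price of an offshore FTE"
    ftes_in_scope = 500                               # assumed size of the mature offshored operation
    fte_reduction = 0.20                              # ~20% of FTEs replaced by bots over several years

    ftes_replaced = ftes_in_scope * fte_reduction
    bots_needed = ftes_replaced                       # simplistic 1:1 assumption for illustration

    gross_saving = ftes_replaced * offshore_fte_annual_cost
    bot_cost = bots_needed * bot_annual_cost
    net_saving = gross_saving - bot_cost

    print(f"net annual saving: ${net_saving:,.0f} "
          f"({net_saving / (ftes_in_scope * offshore_fte_annual_cost):.0%} of the baseline cost)")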
So, despite all the hype, RPA has a lot in common with the much maligned “lift-and-shift” approach to BPO. It makes no changes to underlying applications and does nothing to improve the overall state of underlying systems. And, as with “lift-and-shift”, it makes operational process knowledge even more important than previously. The key success factors for successful application of RPA are largely identical to those in “lift-and-shift” BPO and include well-defined process and strong standard operating procedures together with agent change management and process improvement expertise. Indeed, RPA is driving demand for lean six sigma consultants, with many of the suppliers employing this type of skill to drive RPA projects to ensure their success. And, as with “lift-and-shift” BPO, one of the key factors in delays is failure to gain the support of in-house IT at an early stage in the project.
However, despite its relative lack of sophistication, RPA has one really major benefit: it makes continuous improvement possible. One of the traditional failings of BPO and the reason why many contracts failed to achieve their promised productivity gains was that the roadmaps to new process models and ability to identify process improvements lacked a quick and easy means of realization. It is one thing to identify process improvements and potential; it is another to realize them if realization requires a significant change in technology or platform implementation, since these have traditionally required significant timescales and capital investment. RPA considerably lowers the time and investment threshold of implementing a stream of modest process improvements with the low cost of a bot license complemented by typical implementation timescales of less than three months.
So, one way of looking at RPA is that it is the new offshore, but one that operates at much lower sub-process FTE thresholds than offshoring and is achieved even more quickly. It can also be applied to retained process elements to improve end-to-end process flow, a common inhibitor to BPO effectiveness, and not just to those processes already outsourced.
In my next blog, I’ll be looking at where RPA, and automation in BPO generally, goes from here.
Well, the use of workflow and platforms to surround and supplement the client’s core systems has been well established for a number of years. BPO has worked relatively well in these environments. The vendors largely have process models and roadmaps in place for the principal process areas. However, continuous improvement has been somewhat spasmodic in the past, since analytics and lean six sigma projects have tended to be carried out as one-off exercises rather than ongoing programs of activity, and the investment hurdles for further automation in support of process improvement have often been too high to make the process improvements identified readily realizable. This is now starting to change.
Firstly, RPA now provides a mechanism, at present largely restricted to rule-based processes, whereby process improvements identified through lean exercises can now be realized at very low levels of investment and accordingly address areas involving small numbers of FTEs and not just major process areas. This, however, is largely a short-term one-off hit, with most vendors likely to have applied RPA reasonably fully to their major contracts at least by the end of 2016.
The next form of automation within BPO, at a higher level of investment, is the use of BPaaS to address sub-processes where fundamental change is required. These BPaaS implementations will incorporate best-practice processes, and increasingly incorporate analytics and elements of self-learning to ensure that process adjustments are made on a more ongoing basis than in earlier forms of BPO.
The present form of RPA is largely a one-off cost reduction measure in the same way that offshoring is a one-off cost reduction measure. The natural progression from this automation of rule-based transactional process is increased automation of judgment-based processes. This automation of judgment-based processes is still in its infancy and will frequently be used initially to support agent judgments, recommending next courses of action to be taken. This use of AI will move the focus of automation much more into sales and service in the front-office. However, while some of these technologies are beginning to handle natural language processing in text form, further improvements in voice recognition are still required.
In the middle-office, the major transformation in BPO will come from the IoT which has the potential to fundamentally change the nature of service delivery and the value driven by BPO. Arguably, Uber is a form of IoT-based service based on GPS technology. IoT will, however, drive equally fundamental changes in service delivery in areas such as insurance (where it has already started to appear in support of auto insurance), healthcare and telemedicine, and home monitoring services.
For example, within F&A BPS, Hewlett Packard Enterprise (HPE) is investing in a tool to assess the automation potential of organizations’ finance & accounting processes, which is being built into HPE’s FIT (Framework for Innovation & Transformation) framework, and HPE is now developing automation and digitization assessments and roadmaps at the front-end of F&A BPS contracts.
In its targeting of F&A BPS, HPE is becoming more sector specific and incorporating metrics specific to target sectors within FIT, starting with the telecoms and oil & gas sectors.
HPE is also becoming more business-metric focused in its approach to F&A BPS, highlighting that the benefits of automation extend way beyond process cost take-out. Cash acceleration and cash utilization are major areas of focus for HPE within F&A BPS. In particular, HPE is stressing that the benefits of source-to-pay automation go beyond halving the S2P headcount and start to open the door to profit improvement opportunities through dynamic discounting. HPE formerly advised its clients to negotiate longer payment terms with their suppliers; the company has now changed its focus to encouraging clients to negotiate early payment discounts and to automate/digitize their P2P processes to achieve rapid approval of purchase invoices, so that they can optimize their early discounts against these invoices. In many cases, the purchase invoice approval process has been too slow and the knowledge of potential discounts too inaccessible to take advantage of what could amount to a profit improvement opportunity equal to up to 2% of total goods purchased. For example, HPE estimates that the HP GBS organization has saved $2.7bn in early payment discounts over the past three years by taking this approach.
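As a simple illustration of the dynamic discounting arithmetic, the sketch below applies standard early payment terms to an assumed spend figure; the spend, terms, and capture rate are illustrative assumptions, not HPE or HP GBS figures.

    # Illustrative dynamic discounting arithmetic; all figures are assumptions, not HPE's.
    annual_goods_purchased = 1_000_000_000   # assumed $1bn of goods purchased per year
    discount_terms = 0.02                    # e.g. "2/10 net 30": 2% discount for paying within 10 days
    capture_rate = 0.75                      # assumed share of invoices approved quickly enough

    profit_improvement = annual_goods_purchased * discount_terms * capture_rate
    print(f"early payment discounts captured: ${profit_improvement:,.0f} per year")
    # With full capture this approaches 2% of total goods purchased, the upper bound cited above.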
Accordingly, HPE has established a center of excellence for automation in F&A, and is beginning to encourage the use of data PDF technology (PDFs carrying machine-readable data rather than scanned images) to reduce the need for OCR or manual rekeying of invoices. The company has a number of pilots in this area.
In terms of robotics, the company is currently using UiPath and Blue Prism, the latter particularly for connecting with ERP software, and Redwood in support of R2R and month-end close, and has built up a library of ~750 accelerators. The company is also extensively using PDF Cloud and its own Vertica software. HPE’s Business Process Analytics Tool (BPAT) is based on Vertica and is used to provide an F&A dashboard covering both an executive view of KPIs and drill-downs into service performance.
For example, within P2P, HPE is aiming to digitize F&A processes by:
Overall HPE is increasingly seeking to place automation strategy and vision at the forefront of F&A process design, with automation and digitization leading the way in identifying possibilities for straight-through processing. Indeed, based on HPE’s F&A services transformation journey diagram, the company expects ~60% of future F&A BPS productivity improvements to be driven by automation and 40% to be driven by process change and staff reallocation & best-shoring.
Contrary to some expectations, RPA is only one automation component. In HPE’s automation journey in F&A BPS, RPA is expected to deliver around a quarter of the total productivity benefits to be achieved from automation, with a broad range of other tools and platforms contributing the remaining three-quarters.
As usual, one of the major challenges over the past year has been training the company’s solution architects to think digital and to identify benefits beyond those previously achievable. As HPE suggests, many of the existing F&A process benchmarks may need to be rewritten over the next 12 months.
F&A BPS is arguably the most mature of all the BPS services. However, with real-time analytics increasingly identifying the opportunities, RPA lowering the barriers to process improvement, and organizations increasingly willing to automate, F&A BPS is now off on a new journey that promises a step change in productivity. Automation plays to the strengths of HPE, and F&A digitization is an area where the company is intending to strongly invest and compete.
HCL is offering robotics, in the form of robotics software plus operations, to both new and existing BPO contracts. HCL is typically deploying robotics in two forms:
Virtualized workforce, directly replacing the agent with robotics (~70% of current activity by value). Here HCL estimates that 50%-70% efficiency gains are achievable.
Assisted decisioning, empowering the agent by providing them with additional information through non-invasive techniques (~30% of current activity by value) and achieving estimated efficiency gains of 20%-30% (a blended illustration follows below).
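Taking the two deployment forms together, a rough blended efficiency figure can be derived by weighting each gain range by its share of activity. The sketch below uses the shares and ranges quoted above; combining them via simple weighting is our own illustration rather than HCL’s stated methodology.

```python
# Hedged illustration: uses the activity shares and gain ranges quoted above;
# the weighted-average combination is our assumption, not HCL's stated method.

deployments = [
    # (form, share of current activity by value, low gain, high gain)
    ("virtualized workforce", 0.70, 0.50, 0.70),
    ("assisted decisioning",  0.30, 0.20, 0.30),
]

low  = sum(share * lo for _, share, lo, _ in deployments)
high = sum(share * hi for _, share, _, hi in deployments)

print(f"Blended efficiency gain across the portfolio: {low:.0%} to {high:.0%}")
# -> Blended efficiency gain across the portfolio: 41% to 58%
```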
In general, HCL is aiming to co-locate its robots with client systems to avoid the wait times inherent in robots accessing client systems using surface integration techniques through virtual desktop infrastructure (VDI).
The principal sectors currently being targeted by HCL for RPA are retail banking, investment banking, insurance, and telecoms, with the company also planning to apply robotics to the utilities sector, supply chain management, and finance & accounting. Overall, origination support is a major theme in the application of RPA by HCL. In addition, the company has applied robotics to track-and-trace in support of the logistics sector.
HCL currently has ~10 RPA implementations & pilots underway. Examples of where RPA has been applied by HCL include:
Account Opening for a European Bank
Prior to the application of robotics, the agent, having checked that the application data was complete and that the application was eligible, was required to enter duplicate application data separately into the bank’s money laundering and account opening systems.
The implementation of robotics still requires agents to handle AML and checklist verification manually, but automated data entry by ToscanaBot robots and presentation layer integration have removed the subsequent data entry by agents, refocusing agents on QC-related activities and delivering an overall reduction of 42% in agent headcount.
Change of Address for a European Bank
This bank’s “change of address” process involved a number of mandatory checks including field checks and signature verification. However, it then potentially involved the agent in accessing a range of systems covering multiple banking products such as savings accounts, credit card, mortgage, and loan. This led to a lengthy agent training cycle since the agent needed to be familiar with each system supporting each of the full range of products offered by the bank.
As in the first example, the agent is still required to perform the initial verification checks on the customer; robotics is then used to poll the various systems and present the relevant information to the agent. Once the agent authorizes the change, the robot updates the systems. This has led to a 54% reduction in agent headcount.
Financial Reporting for a Large U.S. Bank
HCL has carried out a pilot with a large U.S. bank to address the challenges inherent in the financial reporting process. Through this pilot, HCL proposes to replace manual activities covering data acquisition, data validation, and preparation of the financial reporting templates. HCL estimates that the pilot has achieved a 54% reduction in human effort and a double-digit reduction in error rates. FTEs are now largely responsible for making the manual adjustments (subject to auditor, client, and fund specifications) and reviewing the robot output, instead of the usual maker/checker activities.
Fund Accounting for a U.S. Bank
HCL has also carried out a pilot to address fund accounting processes with a U.S. bank. The principle was again to concentrate agent activity on review and exception handling and to use robots for data input where possible. Once RPA was implemented, and following the introduction of workflow to facilitate hand-offs between agents and robots, the following steps were handled by agents:
Upload investor transactions
Review cash reconciliation
Review monetary value reconciliation
Review net asset value package
Robotics now handles the following steps:
Book trade & non-trade
Prepare cash reconciliation
Price securities
Prepare monetary value reconciliation
Book accruals
Prepare net asset value package.
This shows the potential to automate 60% of those activities formerly handled by agents.
In addition, HCL has implemented assisted decisioning for a telecoms operator, with robots accessing information from three systems (call manager, knowledge management, and billing), and has also applied robotics in support of order management for a telecoms operator. In the latter case, order management data entry required knowledge of a different system for each region, again making agent training a significant issue for the company.
HCL’s robotic automation software is branded ToscanaBot and is an integral part of the Toscana Suite, which also includes HCL’s BPM/workflow software.
ToscanaBot is based on partner robotic software. The current partners used are Blue Prism and Jacada, with Automation Anywhere currently being onboarded in addition. In the future, HCL plans to additionally partner with IPsoft and Celaton as the market becomes more sophisticated and increasingly embraces artificial cognition within RPA.
HCL aims to differentiate its robotics capability by:
Combining robotics within a portfolio of transformational tools including for example ICR/OCR, BPM, text mining & analytics, and machine learning. In particular, HCL is looking to incorporate more intelligence into its robotics offerings, including enhancing its ability to convert non-digital documents to digital format and convert unstructured data to structured data
Process and domain knowledge: HCL has so far largely targeted specialist industry-specific processes requiring significant domain knowledge rather than horizontal services, and is working on creating add-ons for specific core software applications/ERPs to facilitate integration between ToscanaBot and these core domain-specific applications
Creation of IP on top of partner software products.
Within BPO contracts, HCL is aiming to offer outcome-based pricing in conjunction with robotics, but in some instances the company has just sold the tools to the client organization or provided robotics as part of a wider ADM service.
Overall, HCL may be lagging behind some of its competitors in the application of RPA to horizontal processes such as F&A (though HCL is applying RPA to its own in-house finance & accounting processes), but it is at the forefront in the application of RPA to industry-specific processes where the company has strong domain knowledge, in areas such as banking and supply chain management.
However, Genpact has undergone a major change in strategy and positioning since October 2012, which is now starting to bear fruit. Following investment by Bain Capital, McKinsey’s subsequent involvement with Genpact encouraged a strategy based on acquisition of key domain IP/technology supported by domain-specific consulting and go-to-market capability. Genpact has now fine-tuned this strategy; its repositioning seems largely complete, with the company also some way down the path in terms of strategy execution.
Genpact, which used to position on the worthy but somewhat static concept of process excellence, is now positioning against “Genpact Intelligent Operations” with the more forward-looking and business-oriented tag line “Generating Impact”. While, in the current digital-centric BPS environment, it continues to leverage its heritage and positioning against the very valid point that BPS suppliers need to understand process and domain in order to successfully leverage technology, Genpact has expanded its positioning by adding a business focus, and incorporating greater emphasis on analytics and automation. In particular, the company is aiming to differentiate based on combining “process expertise to deliver effectiveness, analytics to provide visibility & direction, and advanced technologies to execute more efficiently”.
Genpact was an early leader in developing process frameworks that identify how process levers impact business outcomes and assist organizations in process innovation and moving towards process excellence, through its SEP (Smart Enterprise Processes) framework, an approach now adopted by the majority of major BPS vendors. While SEP and process excellence remain an important part of Genpact’s DNA, the framework was starting to lag behind that of rivals and clearly needed a major update in line with Genpact’s new strategy and positioning.
Accordingly, the new model, SEP 2.0, has moved beyond process excellence supported by business outcomes and process schema to:
Genpact’s process design principles have concurrently been amended to:
Genpact will clearly continue this approach but with a greater intensity in terms of applying “digital” to relatively end-to-end processes, while complementing these “end-to-end services”, and differentiating, by picking specific pain areas and building services that address more focused processes with “surgical interventions”.
This dual approach is embodied in “Genpact Digital”, which, predictably, is focusing on process-centric new technology, as opposed to platform-centric legacy technology, which can be applied to industry-specific and back-office processes (rather than front-office processes). In support of Genpact’s traditional business, Genpact Digital is focusing on digital integration services, looking at how advanced technologies can be embedded in support of wider end-to-end processes. The advanced technologies that Genpact is increasingly looking to embed within processes include machine learning & process automation, natural language processing, cognitive computing & AI, and algorithmic decisioning & service orchestration. Like many BPS vendors today, Genpact is placing a considerable emphasis on robotic process automation (RPA).
In addition, Genpact is looking to apply IoT for industrial asset optimization. Areas of key impact that it will target include increased planning & budgeting effectiveness & efficiency, reductions in emergency purchases, and reduced MRO and services costs.
Genpact Digital’s second area of focus in support of targeting narrower process pain points is in developing systems of engagement (SoE) that can be embedded with analytics to provide packaged, typically BPaaS, services.
Examples of the latter SoE that have been integrated with analytics and operations to address particular processes/pain points are:
In addition to investments in new service offerings, its new strategy requires Genpact to make significant investments in domain expertise and in IP specific to each of its target domains – which in turn leads to a need for greater focus. Accordingly, the company has considerably sharpened its vertical focus, reducing its previous 23 verticals to 9: retail banking, commercial banking, capital markets, insurance, healthcare, life sciences, CPG, manufacturing, and high-tech. Genpact is deliberately not targeting industries with a predominantly front-office process focus such as retail, telecoms, and utilities.
The company’s vertical approach encompasses not only new service offerings but also enhanced go-to-market capability across both sales and solutioning consultants. In particular, Genpact has increased its S&M expenditure to 6.6% of revenue (previously 4.7%) and changed its recruiting approach to incorporate greater levels of domain experience into its salesforce. This is reflected in the average 15 years of experience within the current Genpact salesforce.
Genpact is looking to use its upgraded go-to-market capability to target transformative discussions and larger deals, and is aiming to create demand (rather than simply reacting to demand) to generate a greater proportion of sole-sourced deals. Clearly, the company’s new portfolio of BPaaS services targeted at focused pain points has a potentially major role to play here. While Genpact’s 2014 revenues increased by a modest 6.9%, its client revenue profile improved, with 89 clients generating $5m+ revenues per annum, up from 78 in 2013. Even more encouragingly, Genpact’s 2014 bookings increased 50% to $2.16bn, an increase reflected in 12.5% CC revenue growth in Q1 2015.
BFSI (including capital markets) accounts for a third of Genpact’s revised target sectors and is a key area for new initiatives. Recent ones include:
Accordingly, Genpact is once again at the forefront in establishing the model for addressing, and deriving business value from, BPS. The main threat to Genpact within this strategy is its emphasis on industry-specific and back-office processes to the exclusion of front-office services. As industries disaggregate further, and even formerly industrial companies move from B2B to B2C models, industry-specific processes are frequently merging with front-office customer service and management. This blurring of front-office and middle-office processes may prove a threat to Genpact unless the company begins to extend its capability through acquisition to handle end-to-end consumer contact through fulfilment in a number of its target sectors.
Nonetheless, the need for cost reduction within the retail, consumer goods, and high-tech sectors is often a strong surrogate for a need for wider process enhancement and changes in ways of working, and there is often a strong correlation between the need to improve service quality and the need to reduce service cost. Across the high-tech, CG, and retail sectors globally, improving the level of service provided by existing shared service centers is highly important to 39% of companies, while reducing the cost base of existing shared service centers is important to 45% of these companies. As such, the following cost reduction targets are strong indicators of process priorities within each sector:
While increasing levels of automation are playing an ever more important role in implementing new ways of working and process cost reduction, retail, CPG, and high-tech companies also exhibit a strong need to restructure their current service delivery, both organizationally, to enhance their end-to-end process perspective, and in terms of shoring, to optimize service quality and efficiency. Consequently, moving to a Global Business Services environment is highly important to 55% of high-tech, CPG, and retail companies, and particularly so to high-tech companies.
In particular:
SSC Initiative Intentions
Overall, 60% of companies express a high level of intent to increasingly move to a Global Business Services environment, accompanied by a significant level of rebalancing of activities between delivery locations. Organizations in the:
The move to GBS environments supports a more hybrid and flexible approach to use of in-sourced and outsourced processes and organizations in the high-tech, CPG and retail sectors globally are in general undergoing shifts in their sourcing strategies in favor of BPO, with this shift most pronounced in the high-tech sector.
Nonetheless, organizations are becoming more demanding in their expectations of BPO service providers and are seeking a much greater contribution from their service providers, both in demonstrating a process vision for the future and in making a more holistic contribution in the short-term. In terms of vision, it is important for BPO service providers to address the client’s top-line revenue growth, and not just cost-to-serve, to provide process consulting, and to demonstrate an end-to-end process vision for the future. In terms of taking a more holistic approach, it is important for vendors to leverage automation and robotics and to support BPO services with complementary application management and (cloud) infrastructure management services.
To read the full NelsonHall report, go to "BPO Opportunities in Retail, CPG, and High-Tech"
The objectives of the quarterly “BPS Confidence Index” are to identify:
The survey is open to all BPS service providers and participation is free-of-charge. To participate, please contact Paul Connolly.
The NelsonHall BPS Confidence Index score for Q1 2015 is 156 (out of 200), reflecting strong supplier confidence in BPS in 2015 relative to 2014.
Overall, the current economic climate is in general assisting the BPS market by maintaining a significant level of cost pressure, while the combination of cost pressure and technological advance is increasingly encouraging companies to look outside their own boundaries for new business models.
Accordingly, while the emphasis on cost reduction remains high, BPS is increasingly being driven by organizations seeking to achieve transformation and to achieve revenue protection and growth. However, the geographic focus of these initiatives has switched back to companies’ core economies, with support for growth in emerging economies becoming a much less important driver of BPS adoption.
Nonetheless, despite this increasing optimism, frozen decision-making and business uncertainty are still significant inhibitors to adoption of BPS services, with the proportion of global clients and prospects whose sourcing decision-making is frozen reported to be 26%, while clients and prospects continue to seek aggressive cost reductions, which can act as an impediment to deal completion.
By service line, F&A BPS was reported as exhibiting above-average growth in Q4 2014, ahead of HR BPS and contact center services, as were the related and emerging areas of procurement BPS and supply chain management, and this pattern is broadly expected to be repeated during the remainder of 2015.
By sector, BPS vendors continue to have very high expectations of the healthcare sector, across both healthcare providers and healthcare payers. Expectations are also high for the manufacturing sector, particularly in those sub-sectors strongly impacted by new technology such as high-tech, pharmaceuticals, and automotive.
By geography, growth expectations are particularly high for North America relative to Europe, where expectations are relatively muted, as is the case for Latin America. Growth expectations are moderate for Asia Pacific, with the exception of an expectation of high growth in Australia.
Contract scope and value are reported as increasing over the past 12 months, as contracts become more end-to-end and support for high-end decision-making increases, though contract lengths remain largely unchanged.
If you would like to register for the next BPO Index webcast, scheduled for the 2nd July, 2015, you can do so here.
In particular, Q4 2014 showed a 6% improvement in BPO TCV year-on-year, with BPO accounting for 31% of total outsourcing TCV. Q1 2015 BPO TCV performance then improved further, up 12% year-on-year, with BPO accounting for 33% of total outsourcing TCV. Nonetheless, while the last couple of quarters are an improvement, they are still underperforming relative to the seasonal average over the past five years, so there is continuing scope for improvement.
By geography, the focus on the major economies continues. There was a significant increase in BPO TCV in North America in Q1 2015 year-on-year. Within Continental Europe, for both Q1 2015 and for the last 12-months there was significant BPO contract activity in the Netherlands with Infosys winning a life BPO contract as well as a supply chain management BPO contract here in the past quarter. Overall though, the downturn in BPO TCV in Europe continues.
If we extrapolate this quarter’s level of BPO activity to the remainder of 2015, then BPO TCV across North America and Europe in 2015 will be 9% higher than that recorded in 2014.
At the sector level, the manufacturing sector is indeed showing progress, moving along the value chain from F&A outsourcing through procurement outsourcing and supply chain management. Within the manufacturing sector over the past 12-months, there was an increase in supply chain management BPO TCV alongside the increase in procurement outsourcing TCV. In Europe, the procurement outsourcing activity was in the pharmaceuticals and food & beverages sectors.
Within customer management services, the telecoms & media sector predictably dominated over the past 12-months with the manufacturing sector taking over second place from the retail sector. And within the manufacturing sector, CMS activity strengthened in both the high-tech sector and the pharmaceuticals sector.
At the global level, the last 12-months have shown a strengthening in HR outsourcing and procurement, with a strengthening in procurement outsourcing in Europe, and a significant strengthening in both HR outsourcing and F&A outsourcing in North America.
The top three places in the league table for BPO TCV over the past 12-months remain unchanged, with Serco and Capita at the top of the table. Behind these companies, Sopra Steria remains unchanged in 8th place, with the remainder of the top ten: Capgemini, State Street, Infosys, SAIC, WNS, and Wipro all moving up the table relative to the prior 12-month period.
Serco is continuing to do well in CMS in the retail industry, particularly in fashion, adding a contract with JD Williams. In Q1 2015, Capita’s government business was back on song with contracts with Sheffield City Council and DEFRA, while its acquisition of government assets continues. Capgemini had 12 months of solid F&A BPO contract wins, announcing a contract expansion with Office Depot in Q1. Infosys has also had a very solid Q1, winning not just F&A BPO contracts, including a contract expansion with AkzoNobel, but also several insurance BPO contracts and a supply chain management BPO contract.
The NelsonHall BPO Index is complemented by the NelsonHall Self-Service Market Forecast Tool, which covers 78 BPO service lines, 30 geographies, and 33 industry sectors. This gives highly accurate and granular views of the market, and complements the NelsonHall BPO Index which gives quarterly snapshots of big deal momentum. To use the NelsonHall Self-Service Market Forecast Tool, click here.
If you would like to register for the next BPO Index webcast, scheduled for the 2nd July, 2015, you can do so here.
The public sector remains at the forefront in driving more sophisticated commercial arrangements in the U.K., with public sector bodies increasingly protecting themselves from administrative over-payments, flexing payments to adjust to levels of transactional activity, using third-party investment to drive transformation, and sharing access to contracts via framework agreements.
Starting with BPO, the rise in sophistication in HR outsourcing is demonstrated by:
The increasing sophistication of customer management outsourcing is demonstrated by the increasing adoption of multi-channel delivery. Whereas relatively recently contact center outsourcing contracts in the U.K. were typically for voice only services, in 2014 it was the norm for customer management services contracts to be multi-channel in nature, with email, web chat, and even social media support commonplace. This was true across both the private and public sectors, with the e-government initiative ensuring that all local government customer services contracts announced were widely multi-channel in nature. At the same time, the move to digital is leading to the emergence of marketing BPO services with the provision of onshore creative design services becoming more commonplace.
U.K. organizations are continuing to adopt procurement outsourcing services as the pressure on costs continues. While procurement outsourcing remains concentrated around indirect procurement categories, it has expanded in scope to place increasing emphases on supplier relationship management and supplier and procurement performance management.
Within industry-specific BPO, the financial services and government sectors have been the mainstay of the U.K. outsourcing industry for many years. Here, in 2014, there was an increasing emphasis on platform-based services, for example within the major mortgage BPO contract awarded by the Co-operative Bank and within policy administration contracts in the insurance sector.
Within local government, the emphasis on local job creation continues to be a major feature of contracts outside London. However, in 2014 London authorities were increasingly adopting service delivery from outside London, typically from the North-East. Regardless of region, local authorities were becoming more sophisticated in their commercial approaches, including for example protecting themselves from administrative over-payments and flexing payments to adjust based on levels of transactional activity. Supplier investment is also increasingly being leveraged to fund transformations that will reduce service costs. Transformation to achieve ongoing and significant cost reduction remains even more firmly on the agenda.
Within IT outsourcing, use of both Cloud and DevOps became more prevalent during 2014.
IT infrastructure outsourcing contracts are increasingly being based around private and hybrid cloud transformation, with notable examples of adoption by WPP, in support of the increasing digitization of their business, Amey, and Unipart Automotive.
At the same time, the level of adoption of IT infrastructure management was at a 4-year high within local government in 2014, with migrations to cloud-based infrastructure also beginning to take place in this sector. As in BPO, the commercial management of these local government IT infrastructure management contracts was also showing increased maturity, with contracts continuing to include local job creation, apprenticeships, and training initiatives but also including the option to purchase services on behalf of additional public sector entities such as the emergency, education, and health services. Framework contracts were also evident in the purchasing of network management services by regional public sector groupings.
Mobile–enabling of apps also continued to gather pace in both the public and private sectors.
In the SME sector, particularly in high-tech businesses the adoption of Infrastructure-as-a-Service (IaaS) contracts is accelerating as SMEs take advantage of the speed-to-market and scalability of third-party cloud infrastructure. Elsewhere SaaS continued to be adopted in support of non-core and specialist processes such as CRM, laboratory information and housing management.
Elsewhere in IT infrastructure management, end-user workplace contracts continued apace with these contracts frequently now including tablets and thin clients in infrastructure refreshes, and with email and office applications increasingly being provided via the Cloud.
Within application management, SAP application management was noticeably back in fashion during 2014. While IT outsourcing in the U.K. typically remains unbundled, with separate contracts and suppliers used for application management and IT infrastructure management, these SAP outsourcing contracts are increasingly going beyond standard application maintenance to combine application re-engineering with infrastructure re-engineering, with, for example, a number of these contracts in 2014 including upgrading of SAP systems and also providing SAP hosting using private cloud infrastructures. U.K. companies are typically not yet ready to use public cloud for core applications such as SAP, but they are increasingly adopting third-party private cloud implementation and hosting in conjunction with SAP application management. These contracts potentially mark the introduction of DevOps thinking into the U.K., with application and infrastructure transformations starting to be co-ordinated within a single contract where there is transformational intent.
So what does this mean for 2015? On the whole, more of the same. Most of the trends described above are at early stages of development. For example in cloud, core systems typically need to complete their migration to private cloud from where they will increasingly incorporate elements of hybrid cloud. At the same time, the role of cloud-based platforms will continue to rise in importance in non-core areas of BPO extending beyond HR which is currently the prime example.
DevOps will increase in importance, not in support of minor application upgrades and maintenance, but where businesses are transforming their applications, and particularly in support of transformations to digital businesses.
Commercially, the trend to transaction-based and usage-based pricing will continue, with this continuing to be supplemented by gainshare based on supplier investment, particularly by organizations such as those in the public sector that have a strong need for cost reduction but lack the means to finance the required process and IT transformations themselves.
For further details, see http://www.arvato.co.uk/sites/default/files/index-2014.pdf
In its early days, BPO was a linear and lengthy process with knowledge transfer followed by labor arbitrage, followed by process improvement and standardization, followed by application of tools and automation. This process typically took years, often the full lifetime of the initial contract. More recently, BPO has speeded up with standard global process models, supported by elements of automation, being implemented in conjunction with the initial transition and deployment of global delivery. This timescale for “time to value” is now being speeded up further to enable a full range of transformation to be applied in months rather than years. Overall, BPO is moving from a slow-moving mechanism for transformation to High Velocity BPO. Why take years when months will do?
Some of the key characteristics of High Velocity BPO are shown in the table below:
Attribute | Traditional BPO | High-Velocity BPO
Objective | Help the purchaser fix their processes | Help the purchaser contribute to wider business goals
Measure of success | Process excellence | Business success, faster
Importance of cost reduction | High | Greater, faster
Geographic coverage | Key countries | Global, now
Process enablers & technologies | High dependence on third-parties | Own software components supercharged with RPA
Process roadmaps | On paper | Built into the components
Compliance | Reactive compliance | Predictive GRC management
Analytics | Reactive process improvement | Predictive & driving the business
Digital | A front-office “nice-to-have” | Multi-channel and sensors fundamental
Governance | Process-dependent | GBS, end-to-end KPIs
As a start point, High Velocity BPO no longer focuses on process excellence targeted at a narrow process scope. Its ambitions are much greater, namely to help the client achieve business success faster, and to help the purchaser contribute not just to their own department but to the wider business goals of the organization, driven by monitoring against end-to-end KPIs, increasingly within a GBS operating framework.
However, this doesn’t mean that the need for cost reduction has gone away. It hasn’t. In fact the need for cost reduction is now greater and faster than ever. And in terms of delivery frameworks, the mish-mash of third-party tools and enablers is increasingly likely to be replaced by an integrated combination of proprietary software components, probably built on Open Source software, with built in process roadmaps, real-time reporting and analytics, and supercharged with RPA.
Furthermore, the role of analytics will no longer be reactive process improvement but predictive and driving real business actions, while compliance will also become even more important.
But let’s get back to the disruptive forces impacting BPO. What forms will the resulting disruption take in both the short-term and the long-term?
Disruption | Short-term impact | Long-term impact
Robotics | Gives buyers 35% cost reduction fast | No significant impact on process models or technology
Analytics | Already drives process enhancement | Becomes much more instrumental in driving business decisions; potentially makes BPO vendors more strategic
Labor arbitrage on labor arbitrage | Ongoing reductions in service costs and employee attrition | “Domestic BPO markets” within emerging economies become major growth opportunity
Digital | Improved service at reduced cost | Big opportunity to combine voice, process, technology, & analytics in a high-value end-to-end service
BPO “platform components” | Improved process coherence | BPaaS service delivery without the third-party SaaS
The Internet of Things | Slow build into areas like maintenance | Huge potential to expand the BPO market in areas such as healthcare
GBS | Help organizations deploy GBS | Improved end-to-end management and increased opportunity; reduced friction of service transfer
Well, robotics is here now, moving at speed, and giving a short-term impact of around 35% cost reduction where applied. It is also fundamentally changing the underlying commercial models away from FTE-based pricing. However, robotics does not involve change in process models or underlying systems and technology, and so is largely short-term in its impact and is a cost play.
Digital and analytics are much more strategic and longer lasting in their impact, enabling vendors to become more strategic partners by delivering higher-value services and driving next best actions and operational business decisions with very high levels of revenue impact.
BPO services around the Internet of Things will be a relatively slow burn in comparison but with the potential to multiply the market for industry-specific BPO services many times over and to enable BPO to move into critical services with real life or death implications.
So what is the overall impact of these disruptive forces on BPO? Well while two of the seven listed above have the potential to reduce BPO revenues in the short-term, the other five have the potential to make BPO more strategic in the eyes of buyers and significantly increase the size and scope of the global BPO market.
Part 1 The Robots are Coming - Is this the end of BPO?
Part 2 Analytics is becoming all-pervasive and increasingly predictive
Part 3 Labor arbitrage is dead - long live labor arbitrage
Part 4 Digital renews opportunities in customer management services
Part 5 Will Software Destroy the BPO Industry? Or Will BPO Abandon the Software Industry in Favor of Platform Components?
Part 6 The Internet of Things: Is this a New Beginning for Industry-Specific BPO?
Sector | Examples
Telemedicine | Monitoring heart operation patients post-op
Insurance | Monitoring driver behavior for policy charging
Energy & utilities | Identifying pipeline leakages
Telecoms | Home monitoring/management - the "next big thing” for the telecoms sector
Plant & equipment | Predictive maintenance
Manufacturing | Everything-as-a-service
So, for example, sensors are already being used to monitor U.S. heart operation patients post-op from India to detect warning signs in their pulses, while a number of insurance companies are using telematics to monitor driver behaviour in support of policy charging. Elsewhere sensors are increasingly being linked to analytics to provide predictive maintenance in support of machinery from aircraft to mining equipment, and home monitoring seems likely to be the next “big thing” for the telecoms sector. And in the manufacturing sector, there is an increasing trend to sell “everything as a service” as an alternative to selling products in their raw form.
This is a major opportunity that has the potential to massively increase the market for industry-specific or middle-office BPO way beyond its traditional more administrative role.
However, it has a number of implications for BPO vendors in that the buyers for these sensor-dependent services are often not the traditional BPO buyer, these services are often real-time in nature and have a high level of requirement for 24X7 delivery, and strong analytics capability is likely to be a pre-requisite. In addition, these services arising out of the Internet of Things potentially take the meaning of risk/reward to a whole new level, as many of them potentially have real life or death implications. Some work for the lawyers on both sides here.
Coming next: High-Velocity BPO – What the client always wanted!
Previous blogs in this series:
Part 1 The Robots are Coming - Is this the end of BPO?
Part 2 Analytics is becoming all-pervasive and increasingly predictive
Part 3 Labor arbitrage is dead - long live labor arbitrage
Part 4 Digital renews opportunities in customer management services
However, BPO vendors have often tended to use their own proprietary tools in key areas such as workflow, which has had the advantage of enabling them to offer a lower price point than via COTS workflow and has also enabled them to achieve more integrated real-time reporting and analytics. This approach of developing pre-assembled components is being further accelerated in conjunction with cloud-based provisioning.
Now, BPO vendors are starting to take this logic a step further and pre-assemble large numbers of BPO platform components as an alternative to COTS software. This approach potentially enables them to retain the IP in-house, an important factor in areas like robotics and AI, reduce their cost to serve by eliminating the cost of third-party licences, and achieve a much more tightly integrated and coherent combination of pre-built processes, dashboards and analytics supported by underlying best practice process models.
It also potentially enables them to offer true BPaaS for the first time and to begin to move to a wider range of utility offerings - the Nirvana for vendors.
Coming next: The Internet of Things – Is this a new beginning for industry-specific BPO
Previous blogs in this series:
Part 1 The Robots are Coming - Is this the end of BPO?
Part 2 Analytics is becoming all-pervasive and increasingly predictive
Part 3 Labor arbitrage is dead - long live labor arbitrage
Part 4 Digital renews opportunities in customer management services
However, the impact of digital is such that it increases the need for voice and data convergence. A common misconception in customer service is that the number of transactions is going down. It isn’t, it is increasing. So while the majority of interactions will be handled by digital rather than voice within three years, voice will not disappear. Indeed, one disruptive impact of digital is that it increases the importance of voice calls, and with voice calls now more complex and emotional, voice agents need to be up-skilled to meet this challenge. So the voice aspect of customer service is no longer focused on cost reduction. It is now focused on adding value to complex and high value transactions, with digital largely responsible for delivering the cost reduction element.
So what are the implications for BPO vendors supporting the front-office? Essentially, vendors now need a strong combination of digital, consulting, automation, and voice capability and:
On the people side, agent recruiting, training, and motivation become more important than ever before, now complicated by the fact that differing channels need different agent skills and characteristics. For example, the recruiting criteria for web chat agents are very different from those for voice agents. In addition, the web site is now a critical part of customer service delivery, and self-serve and web forms, traditionally outside the mandate of customer management services vendors, are a disruptive force that now needs to be integrated with both voice and other digital channels to provide a seamless end-to-end customer journey. This remains an organizational challenge for both client organizations and their suppliers.
Coming next – the impact of the move to platform components
Disruptive Forces and Their Impact on BPO - previous articles in series
Part 1 The Robots are Coming - Is this the end of BPO?
Part 2 Analytics is becoming all-pervasive and increasingly predictive
Part 3 Labor arbitrage is dead - long live labor arbitrage
One side of labor arbitrage within labor arbitrage is relatively defensive: in spite of automation and robotics, mature “International BPO” services are now being transferred to tier-n cities. Here, labor arbitrage within labor arbitrage offers lower price points, reduced attrition, and business continuity. The downside is that travel to these locations might be slightly more challenging than some clients are used to. Nonetheless, tier-n locations are an increasingly important part of the global delivery mix, even for major outsourcers centered around mature geographies such as North America and Europe, and they become even more important as doing business in emerging markets becomes ever more business critical to these multinationals.
However, there’s also a non-defensive side to the use of tier-n cities, which is to support growth in domestic markets in emerging economies, an increasingly important part of the BPO market over the coming years. Lots of activity of this type is already underway in India, but let’s take South Africa as an example, where cities such as Port Elizabeth and Johannesburg are emerging as highly appropriate for supporting local markets cost-effectively in local languages. So, just when you thought all BPO activity had centralized in a couple of major hubs, the spokes are fighting back and becoming more strategic.
But let’s get back to a sexier topic than labor arbitrage. The next blog looks at the impact of Digital.
Part 1 The Robots are Coming - Is this the end of BPO?
Part 2 Analytics is becoming all-pervasive and increasingly predictive
However, analytics is now becoming much more pervasive, much more embedded in processes, and much more predictive & forward-looking in terms of recommending immediate business actions and not just process improvements, as shown below:
BPO Activity | Areas where analytics being applied
Contact center | Speech and text analytics; social media monitoring & lead generation
Marketing operations | Price & promotion analytics
Financial services | Model validation
Procurement | Spend analytics
Indeed, it’s increasingly important that real-time, drill-down dashboards are built into all services, and what-if modelling is becoming increasingly common here, in support of, for example, supply chain optimization.
Analytics is also helping BPO to move up the value chain and open up new areas of possibility such as marketing operations, where traditional areas like store performance reporting are now being supplemented by real-time predictive analytics in support of identifying the most appropriate campaign messaging and the most effective media, or mix of media, for issuing this messaging. So while analytics is still being used in traditional areas such as spend analytics to drive down costs, it is increasingly helping organizations take real-time decisions that have high impact on the top-line. Much more strategic and impactful.
The next disruptive force to be covered is less obvious - labor arbitrage within labor arbitrage.
Analytics becoming all-pervasive and increasingly predictive
Labor arbitrage is dead – long live labor arbitrage
Digital renews opportunities in customer management services
Will Software Destroy the BPO Industry? Or Will BPO Abandon the Software Industry in Favor of Platform Components?
The Internet of Things: Is this a New Beginning for Industry-Specific BPO?
The final blog will evaluate the short- and long-term impact of each of these disruptive forces individually and collectively and their potential to deliver “High Velocity BPO” – What the Client Always Wanted.
Let’s start at the beginning.
Some of the common misconceptions about BPO are that it’s traditionally only been about people and not about innovation. Sorry, it’s always been about both. Another misconception is that BPO used to be about cost reduction but that is no longer the case and it’s now about other types of value. Well even in its infancy, BPO was always as much about service improvement as cost reduction, and to be honest one tends to go with the other anyway. There’s a huge correlation here.
Having said that, client needs do tend to become more focused over time, so let’s take a quick look at how BPO client needs are evolving. Then at six of the potential disruptive forces impacting BPO. Probably should be in a Porter analysis but let’s be less formal. Finally let’s take a look at what this means for BPO going forward. We’ve coined this as “High-Velocity BPO”. A bit of plagiarism here but I think it does the trick.
Clearly, what BPO buyers want varies considerably from service type to service type. But let’s start with the example of a very mature and conservative back-office process. In this area, organizations tend to start by asking for two things:
Within this desire for process improvement, standardization is often a key element, as is a desire for improved business agility. Then, by the time organizations get to second or third generation BPO, they have generally sorted out the first level of process standardization, have implemented lots of global delivery, and want to build on these. So they still want nirvana but they are starting to understand what nirvana looks like. So they are increasingly thinking about business outcomes on an end-to-end basis, & global process owners, & integration into, or setting up, GBS organizations. Also they’ve done labor arbitrage, so are increasingly thinking about increased automation, and controls & compliance are becoming even more important. Above all, they are looking for a process vision to support them in their business vision. Clients have always wanted nirvana, but it takes them time to work out what it might look like & how to get there.
It’s no longer about cost reduction, is it? Well ultimately, there are only three business outcomes that matter:
So the need for cost reduction is as strong as ever. Arguably the new factor here in recent years is the increased need for business agility, which increasingly demands some form of transactional pricing and a willingness to support reduced volumes as well as increased volumes. That can be a real differentiator.
BPO has always worked best when the agenda has been driven by a small number of high level business outcomes; the difficulty has been in managing and making changes to the end-to-end value chain. Certainly disruptions such as GBS should help here, providing an end-to-end process view and a single process owner.
So what are some of the disruptive forces impacting BPO?
Well, the robots are coming; is this the end of BPO? One potentially disruptive force is RPA, which is certainly receiving headlines as a BPO killer. So where is RPA currently being used, and what are the implications for BPO? Initially, the main usage of robotics is for getting data from one or more applications to another, making intelligent deductions & matching in support of data enrichment, and filling in missing fields. A bit like macros on steroids. Loosely coupled with existing systems rather than changing them. The advantage of RPA is that it seems to be achieving a 30%-plus cost take-out where employed, and very quickly. Implementations seem to take 1-3 months, with a further 3-month period for change management. So RPA is quick and easy.
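As a rough illustration of this "macros on steroids" pattern (reading records from one application, applying simple matching and enrichment rules, and writing the results into another, without changing either system), the sketch below shows the kind of loosely coupled, rule-based transfer involved. The field names and enrichment rules are hypothetical.

```python
# Minimal sketch of the RPA pattern described above: move data between applications,
# apply simple matching/enrichment rules, and fill in missing fields.
# The record layouts and rules are hypothetical illustrations.

def enrich(record, reference_data):
    """Fill a missing field via a lookup against reference data (data enrichment)."""
    if not record.get("country") and record.get("postcode"):
        record["country"] = reference_data.get(record["postcode"][:2], "UNKNOWN")
    return record

def transfer(source_records, reference_data):
    """Copy records from the source application into the target application's layout."""
    target_records = []
    for rec in source_records:
        rec = enrich(dict(rec), reference_data)
        target_records.append({
            "customer_name": rec["name"].strip().title(),  # normalize formatting
            "account_type":  rec.get("type", "STANDARD"),   # default a missing field
            "country":       rec["country"],
        })
    return target_records

if __name__ == "__main__":
    source = [{"name": "  jane smith ", "postcode": "SW1A 1AA"}]
    print(transfer(source, {"SW": "UK"}))
    # -> [{'customer_name': 'Jane Smith', 'account_type': 'STANDARD', 'country': 'UK'}]
```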
So where vendors have sometimes been slow to implement the wider process change they knew was possible, due to potential impact on FTE-based revenues, robotics has been making vendors act fast. So partly about getting to their clients before anyone else, including their internal IT department, does. Also robotics has probably been the biggest single driver of pricing changes from FTE-based to fixed price and transactional based. No supplier wants to be caught implementing robotics with a FTE-based pricing model still in place. So robotics has probably generated a bigger change in renegotiation of pricing mechanisms than many years of process cost benchmarking.
Robotics also generates additional challenges for BPO vendors beyond service pricing, including whether to make or buy the underlying robotics tools. If as a vendor you want to develop your own IP and not share it with your competitors, then you might want to develop your own form of robotics rather than use third-party software, and indeed a number of vendors are doing just this.
The next disruptive force, which I will consider in tomorrow's blog, is analytics.
HCL’s original contract with Chesnara dates back to 2005. However, HCL also had a separate L&P BPO contract with Save & Prosper, which Chesnara purchased from JPMorgan Asset Management in December 2010. More recently, Chesnara also purchased Direct Line Life, whose L&P operations are currently in the process of being transitioned to HCL. This latest acquisition by Chesnara adds a further 150,000 policyholders to the portfolio currently administered by HCL.
Accordingly, these three historically separate contracts are now being consolidated by Chesnara and HCL into a single contract to provide a consistent suite of services and SLAs across policy administration services, fund accounting, investment administration, and certain actuarial valuation and reporting services.
At the service delivery level, this involves handling all policies within a similar operating model, with workflow for all policies handled through HCL’s OpEX (Operational Excellence) work management and quality assurance tools. In addition, policies are being migrated, where feasible, onto HCL’s ALPS insurance platform. For example, the policies transitioned from Direct Line Life are currently being migrated onto ALPS. However, as usual, it is not feasible to handle all policies on a single platform, and a small number of legacy systems will remain in place across the various books, handling approx. 50,000 policies.
In addition, within the new contract, HCL is working to ensure that service levels and service metrics are consistent across all Chesnara books of business, and HCL will be enhancing the SLAs and service metrics in place to ensure consistency with regulatory conduct risk expectations over the next 18 months. This involves developing an increased focus on customer experience, e.g. introducing proactive calling where there may have been a customer service issue, before the issue turns into a formal complaint. The benefits of this type of approach include operations cost reduction, by reducing the level of formal complaints handling, as well as improved customer experience.
The pricing mechanisms used across the various books are also being standardized and moved from pure per-policy charging to a combination of per-policy and activity-based pricing. For example, the pricing of core policy administration services will still tend to be per-policy driven but will be rationalized by policy type across books. However, the costs of many accounting-based activities are not sensitive to the number of policies managed and so will be priced differently to achieve better alignment between service pricing and the underlying cost drivers.
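The sketch below illustrates how such a blended charge might be computed, with core administration charged per policy by product type and accounting-based activities charged per occurrence. All rates, policy counts, and activity types are hypothetical and are not Chesnara or HCL figures.

```python
# Hypothetical figures only: illustrates the mix of per-policy and activity-based
# charging described above, not the actual rates in the Chesnara contract.

per_policy_rates = {"life": 12.00, "pension": 15.00}     # annual fee per policy, by type
activity_rates   = {"fund_accounting_run": 450.00,        # fee per accounting run
                    "actuarial_valuation": 2500.00}       # fee per valuation exercise

def blended_charge(policy_counts, activity_counts):
    """Total charge = per-policy fees plus per-occurrence activity fees."""
    policy_charge   = sum(per_policy_rates[t] * n for t, n in policy_counts.items())
    activity_charge = sum(activity_rates[a] * n for a, n in activity_counts.items())
    return policy_charge + activity_charge

print(blended_charge({"life": 100_000, "pension": 50_000},
                     {"fund_accounting_run": 12, "actuarial_valuation": 2}))
# per-policy element: 1,950,000; activity-based element: 10,400; total: 1,960,400
```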
Elsewhere in the industry, changes in the fiscal treatment of retirement income are forcing companies to re-evaluate their retirement product strategies and their approaches to the administration of annuity and retirement books. The new legislation coming into force in the U.K. means that companies with small annuity books may no longer be adding significantly to these and so may need to treat them differently in future. This potentially creates opportunities both for consolidators and for companies such as HCL that support their operations.
Potentially this could have a major impact on the rate of hybrid cloud adoption in highly regulated industries such as financial services, utilities, and the government sector.
The technique allows companies to mark or tag their data and use an intelligent cloud management system to store files in the appropriate location. For example, if a business needs to ensure that all of its financial data is stored in a specific cloud data center, the associated files are tagged appropriately and the cloud management system ensures that the files are stored in the correct location(s).
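A minimal sketch of this tagging idea, under our own assumptions, is shown below: each file carries one or more tags, a placement policy maps tags to permitted locations, and the management layer stores the file only where every tag's constraints are satisfied. The tags, policy, and data center names are hypothetical; real cloud management platforms implement this through their own policy engines.

```python
# Hedged sketch of tag-driven data placement. Tags, policy, and data center
# names are hypothetical illustrations, not any specific vendor's model.

PLACEMENT_POLICY = {
    "financial":   ["eu-frankfurt"],                 # regulated data pinned to one location
    "personal":    ["eu-frankfurt", "eu-dublin"],    # must stay within the EU
    "unregulated": ["eu-dublin", "us-east", "public-cloud"],
}

def placement_targets(file_tags):
    """Return the locations where a file carrying these tags may be stored.

    A file must satisfy every tag it carries, so the allowed set is the
    intersection of each tag's permitted locations.
    """
    allowed = None
    for tag in file_tags:
        locations = set(PLACEMENT_POLICY.get(tag, []))
        allowed = locations if allowed is None else allowed & locations
    return sorted(allowed or [])

print(placement_targets(["financial"]))                # ['eu-frankfurt']
print(placement_targets(["personal", "unregulated"]))  # ['eu-dublin']
```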
Indeed, CSC is looking to reduce its revenue contribution from traditional IT infrastructure management services and instead drive revenue growth from an increased ‘as a service’ orientation. ‘Emerging Services’ of importance to CSC include:
In particular, the latest extension to CSC's partnership with IBM builds on CSC's acquisition of ServiceMesh, its subsequent partnership with Microsoft to integrate its ServiceMesh Agility Platform with Microsoft Systems Center, and its global partnership with AT&T, and involves incorporating IBM's SoftLayer IaaS service and Bluemix web and mobile application development platform into CSC's ServiceMesh Agility Platform. In return, IBM will add CSC's ServiceMesh Agility Platform to the IBM Cloud Marketplace. In particular, the new partnership speeds up CSC's implementation of its strategy in a number of key areas, including:
This new tool is aimed at assisting executives in accessing the precise market size, growth, and vendor share information they require rapidly and cost-effectively, in support of more-informed decision-making.
According to NelsonHall CEO John Willmott, “NelsonHall has been working hard to develop automated tools that connect decision-makers with the insight they need as quickly and directly as possible. We see this new self-service market forecasting tool as a big step forward compared to relying on standard reports lacking focus or potentially time-consuming custom requests for information.”
The new “NelsonHall Self-Service Market Forecasting tool” from NelsonHall enables executives to tailor the scope of any market forecast by selecting one or multiple service lines, geographies, and industry sectors and downloading market size, growth and vendor share information against these parameters. Furthermore, decision-makers are no longer dependent on analysts or knowledge workers to produce customized reports on their behalf.
The tool initially supports 78 BPO service lines, 30 geographies, and 33 industry sectors and will be expanded to support additional service lines including ITO shortly.
The EFaaS service has arisen from HCL’s Next Gen BPO tenets, namely domain orientation, a focus on innovation and improvement, output/outcome-based and flexible commercial constructs, use of HCL’s Integrated Global Delivery Model (IGDM), and addressing risk and compliance. In particular, the EFaaS service aims to deliver business function services as utilities by undertaking elements of business operations transformation, IT standardization (e.g. SAP/Oracle transformation, unified chart of accounts, reduced reporting platforms, data warehouses, etc.), platform transformation, and infrastructure consolidation, and to achieve 25%-35% cost reduction within each utility. Accordingly, HCL is:
HCL has a five-step approach, typically spread over 24-30 months, to implementing EFaaS, namely:
HCL is working with global strategic partners in the development of these utilities, with partners assisting in:
HCL initially targeted a number of major banks, all of which are looking to achieve multi-billion dollars of cost take-out from their operations. In particular, these banks typically face the following issues:
HCL has so far signed two contracts for EFaaS, both in the banking sector. In HCL’s initial EFaaS contract, the scope covered four principal business processes within the client organization:
Across these four process areas, HCL signed a multi-year contract, committing to take out 35% of cost while simplifying the IT environment, with no up-front IT investment required by the client organization. In addition, the client organization was looking to establish a private utility or utilities across these functions that could then be taken to wider banking organizations.
In response, HCL established a private utility for the client organization across all four of these process areas and identified external reporting as the area that could be most readily replicated and taken to market. In addition, the process knowledge, but not the technology aspects, of the “cost utility” processes could be replicated, whereas management reporting is typically very specific to each bank and cannot readily be replicated. Accordingly, while private utilities have been established for the initial banking client organization across all four target process areas, only external reporting is being commercialized to other organizations at this stage.
Process improvement and service delivery location shifts have been made across all four process areas. For example, prior to the contract with HCL, 60% of the bank's reporting was done in Excel; HCL has standardized much of this reporting using various report-writing tools. In addition, HCL has implemented workflow in support of the close process, enabling the life-cycle of the close to be managed online and increasing transparency on a global basis.
Within the external reporting function, the approach taken by HCL has been to use Axiom software to establish and pre-populate templates for daily, monthly, and quarterly external reporting, extracting the appropriate data from SAP and Oracle ERPs.
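The general pattern described here, i.e. extracting ledger data from the ERPs and pre-populating recurring external-reporting templates, can be illustrated with the hypothetical sketch below. The account names, mappings, and figures are invented, and this is a generic illustration rather than Axiom's actual API or data model.

```python
# Hypothetical illustration of pre-populating a recurring external-reporting
# template from ERP extracts. Generic sketch only; not Axiom's API or data model.

from collections import defaultdict

# Simplified ERP extract rows (source system, account, period, amount).
erp_extract = [
    {"source": "SAP",    "account": "1000-Cash",        "period": "2015-Q1", "amount": 125_000},
    {"source": "Oracle", "account": "1000-Cash",        "period": "2015-Q1", "amount": 40_000},
    {"source": "SAP",    "account": "2000-Receivables", "period": "2015-Q1", "amount": 310_000},
]

# Template line items mapped to the ledger accounts that feed them.
TEMPLATE_MAPPING = {
    "Cash and cash equivalents": ["1000-Cash"],
    "Trade receivables": ["2000-Receivables"],
}

def prepopulate_template(rows, period):
    """Aggregate extract rows by template line item for the reporting period."""
    totals = defaultdict(float)
    for row in rows:
        if row["period"] != period:
            continue
        for line_item, accounts in TEMPLATE_MAPPING.items():
            if row["account"] in accounts:
                totals[line_item] += row["amount"]
    return dict(totals)

# Quarterly run: produces the starting figures that reviewers then adjust and sign off.
print(prepopulate_template(erp_extract, "2015-Q1"))
# -> {'Cash and cash equivalents': 165000.0, 'Trade receivables': 310000.0}
```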
In terms of delivery, HCL is creating global hubs in India (~80% of activity), with regional centers in Cary in the U.S. and Krakow in Europe. HCL has also put in place training in support of local country regulations, covering, for example, the differences between U.S. GAAP and U.K. GAAP.
HCL is continuing to take EFaaS to market, initially approaching existing accounts among major banking and insurance firms. In terms of geographies, HCL is selectively targeting major banks and insurers in the U.S., the U.K., and Continental Europe.
The banks and insurers are expected to retain their existing ERPs. However, HCL perceives that it can assist banks and insurers in adopting best-in-class chart-of-accounts design and governance, as well as best practices around data management and simplifying their various ERP instances.
In general, within its EFaaS offering, HCL is prepared to fund cost take-out projects for banks and insurers where it can recover fees downstream against outcomes it controls.
HCL perceives “speed of replication” to be a key differentiator of the EFaaS approach, and the EFaaS framework initially used has now been replicated for another banking institution in support of its finance operations and external reporting processes.
This service is a timely response to the needs of capital markets firms in particular, which have been seeking to take considerable costs out of their operations and to carve out and commercialize non-core functions into separate third-party-owned utilities. Capital markets firms are likely to carve out a relatively large number of narrowly-focused utilities, with some of these being successfully commercialized by third parties. Retail banks are likely to follow this pattern subsequently, though probably to a lesser extent than capital markets firms.
In addition to finance, HCL’s EFaaS model will subsequently be developed to target other enterprise functions such as procurement, HR, risk & compliance, legal, and marketing.