NelsonHall: F&A & Supply Chain Transformation blog feed
https://research.nelson-hall.com//sourcing-expertise/f-a-supply-chain-transformation/?avpage-views=blog

NelsonHall's F&A Services Program provides expert support and advice to organizations considering, or actively engaged in, the outsourcing of all or part of their finance and accounting or supply chain function.

What Characterizes Leading Supply Chain Transformation Vendors?

 

The pandemic stress-tested many supply chains beyond previous expectations, identifying and magnifying any process shortfalls. The current economic downturn and additional disruption of supply chains by geopolitical factors have further exacerbated the difficulties enterprises face in their day-to-day supply chain management.

Accordingly, enterprises are typically seeking increased digitalization and efficiency in their supply chains, greater supply chain agility, and much, much better supply chain visibility.

However, organizations are typically not self-sufficient in driving supply chain transformation and look for partners to assist them in this journey.

So, what should organizations look for in a supply chain transformation partner?

Ability to Identify End-to-End Digital Supply Chain Operating Model

Larger organizations will be looking not just for a service provider but for a trusted advisor who can proactively deliver innovation and best practices. This typically entails finding a supply chain vendor that will work with them, using design thinking, to identify a new target operating model that can achieve the key business outcomes sought. Such vendors typically have innovation labs where they bring together domain consultants and technology experts to co-create with client executives. As well as innovation expertise, the vendor should bring its own benchmarks to measure the client’s current performance, assist in establishing target KPIs within the to-be operating model, and understand the best-practice processes that are key to achieving these desired KPIs and business outcomes. This level of process knowledge typically comes from operational process expertise obtained through many years of running and enhancing client supply chain processes. Hence, it is important to choose a vendor with both consulting and operational supply chain expertise to reimagine and deliver supply chain transformation.

End-to-end supply chain expertise is also becoming increasingly important. Supply chain transformation requires an ability to break down existing silos and integrate data and automation across silos, so vendors need to be able to design and implement an end-to-end operating model reaching from demand identification and planning & supply planning through order fulfillment and transport planning & optimization, to warranty & returns operations. Indeed, manufacturing optimization could even be included as part of the end-to-end supply chain view for a more enterprise-wide perspective of the supply chain. This holistic perspective is necessary to eliminate friction between departments and automate transactional activities to the extent possible.

Ability to Establish the Data Framework

Data is an immensely important part of this perspective. The existence of supply chain silos within organizations has often resulted in multiple data sources that are fragmented and potentially inconsistent and out-of-date. In the new supply chain operating model, the vendor must incorporate real-time data quality and consistency into the design to provide a solid data analytics and visualization framework. Accordingly, the new operating model should have strong master data management holding a single version of the truth. This will typically require the vendor to undertake ERP standardization and optimization, including improving existing ERPs' controls and implementing a data lake. It will probably also involve improving the level of integration downstream with distributors and customers, upstream with raw material and component suppliers, and within the supply chain itself, e.g., with logistics firms.

It may even be appropriate to undertake cloud migration of existing ERPs.
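As a simple illustration of the kind of master data consolidation described above, the sketch below is a minimal, hypothetical example using pandas; the field names and the "most recently updated record wins" rule are assumptions for illustration, not any vendor's actual MDM method.

```python
import pandas as pd

# Hypothetical material master extracts from two regional ERP instances.
erp_eu = pd.DataFrame({
    "material_id": ["M-001", "M-002"],
    "description": ["Hex Bolt M8", "Washer 8mm"],
    "supplier": ["Acme GmbH", "Acme GmbH"],
    "updated_at": pd.to_datetime(["2023-01-10", "2022-06-01"]),
})
erp_us = pd.DataFrame({
    "material_id": ["M-001", "M-003"],
    "description": ["HEX BOLT M8", "Bearing 6204"],
    "supplier": ["ACME Inc.", "Bearing Co."],
    "updated_at": pd.to_datetime(["2023-03-05", "2023-02-20"]),
})

# Stack both sources, then keep the most recently updated record per material
# as the provisional "golden record" (a real MDM rulebook would be far richer).
combined = pd.concat([erp_eu, erp_us], ignore_index=True)
golden = (combined.sort_values("updated_at")
                  .groupby("material_id", as_index=False)
                  .last())

print(golden)
```

In practice this single-table view would feed the data lake and the analytics and visualization layer described above, so that every planning and fulfillment silo reads the same record.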

Ability to Deploy Pre-Built Plug-and-Play Components

The vendor should provide a comprehensive integration platform, control towers, and dashboards within the new supply chain operating model. This integration platform will typically access a wide range of pre-developed plug-and-play models. These models will be based on proprietary tools and platforms, and will also be able to seamlessly integrate specialist point platforms for areas such as transportation planning and inventory management.

While the vendor is likely to have preferred partners for many of the point solutions required, it should have a broad alliance ecosystem with pre-built APIs and supply chain integrations, and should build longevity into the digital supply chain operating model by providing the ability to switch platforms in and out of its integration framework as new platforms and opportunities become available. These platforms, whether proprietary or third-party, will predominantly be cloud-based to provide greater supply chain scalability and resilience.
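To illustrate what "switching platforms in and out" can look like in practice, here is a minimal, hypothetical adapter sketch; the interface and class names are illustrative assumptions, not a description of any vendor's actual integration framework. Each point solution sits behind a common interface, so the surrounding operating model does not change when a platform is replaced.

```python
from typing import Protocol

class TransportPlanner(Protocol):
    """Common interface the integration layer depends on."""
    def plan_route(self, origin: str, destination: str) -> list[str]: ...

class VendorAPlanner:
    # Adapter around a hypothetical third-party transportation platform.
    def plan_route(self, origin: str, destination: str) -> list[str]:
        return [origin, "Hub-1", destination]

class VendorBPlanner:
    # Drop-in replacement: only the adapter changes, not the callers.
    def plan_route(self, origin: str, destination: str) -> list[str]:
        return [origin, "Hub-2", "Hub-3", destination]

def build_delivery_plan(planner: TransportPlanner, order_dest: str) -> list[str]:
    # The operating-model code is written against the interface only.
    return planner.plan_route("Central DC", order_dest)

print(build_delivery_plan(VendorAPlanner(), "Retailer-42"))
print(build_delivery_plan(VendorBPlanner(), "Retailer-42"))
```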

Ability to Apply Machine Learning in Support of Supply Chain Simulations

Enhanced demand forecasting is currently a key component of supply chain transformation. Depending on the type of business, the vendor should be able to integrate customer data from point-of-sale, social platforms, and third-party data sources, ingest data from the organization's systems and apply this data within pre-built machine learning models. Machine learning models should be used to improve forecasting accuracy and to run simulations, for example, to indicate the impact of marketing campaigns and price adjustments.
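As a rough sketch of the kind of model involved (purely illustrative: the features, synthetic data, and library choice are assumptions, not a vendor's actual forecasting stack), a machine learning regressor can combine internal order history with external signals such as price and promotion flags, and the fitted model can then be reused to run a what-if simulation such as a price cut.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic weekly history: price, promotion flag, and a social-buzz index.
n = 120
price = rng.uniform(8, 12, n)
promo = rng.integers(0, 2, n)
buzz = rng.uniform(0, 1, n)
demand = 500 - 30 * price + 80 * promo + 60 * buzz + rng.normal(0, 10, n)

X = np.column_stack([price, promo, buzz])
model = GradientBoostingRegressor().fit(X, demand)

# What-if simulation: next week's demand at current price vs. a 10% price cut.
base = np.array([[10.0, 0, 0.5]])
discounted = np.array([[9.0, 0, 0.5]])
print("baseline forecast:", model.predict(base)[0])
print("after 10% price cut:", model.predict(discounted)[0])
```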

However, machine learning and analytics have a much wider role in underpinning simulations across the supply chain, including in supply forecasting and logistics optimization. In addition, digital twins are starting to be deployed to test and refine transformational approaches before their adoption, while process mining should be used to check process conformance and further opportunities for process automation.
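Process conformance checking can be illustrated very simply. The toy sketch below uses assumed event data and a deliberately strict rule; it is not a description of any specific process mining product. Each observed order's event trace is compared against the expected happy-path sequence, and deviations are flagged as candidates for investigation or further automation.

```python
EXPECTED = ["order_created", "credit_checked", "picked", "shipped", "invoiced"]

# Hypothetical event log: one trace (ordered list of activities) per order.
event_log = {
    "SO-1001": ["order_created", "credit_checked", "picked", "shipped", "invoiced"],
    "SO-1002": ["order_created", "picked", "shipped", "invoiced"],                    # skipped credit check
    "SO-1003": ["order_created", "credit_checked", "picked", "invoiced", "shipped"],  # out of order
}

def conforms(trace: list[str], expected: list[str]) -> bool:
    # A trace conforms only if it matches the happy path exactly;
    # a real conformance check would support richer process models.
    return trace == expected

for order_id, trace in event_log.items():
    status = "OK" if conforms(trace, EXPECTED) else "DEVIATION"
    print(order_id, status)
```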

Summary

In summary, leading supply chain vendors will possess:

  • A combination of consulting, technology, and operations expertise in supply chain management supported by design thinking labs
  • Best-practice supply chain solutions based on integrated combinations of process models, industry platforms, and automation technologies
  • A supply chain integration platform and a pre-built portfolio of supply chain plug-and-play models for process automation
  • A strong alliance ecosystem for access to best-in-class supply chain tools and platforms
  • The ability to think end-to-end, breaking down supply chain silos and automating transactional activities across the supply chain
  • Dedicated supply chain talent with specialized skills and capabilities
  • Predictive and cognitive supply chain capabilities, including strong forecasting and digital twin capability.
Moving to an Autonomous Supply Chain: Q&A with Capgemini’s Joerg Junghanns – Part 2

Read Part 1 here.

 

Q&A Part 2

JW: What are the main supply chain flows that supply chain executives should look to address?

JJ: Traditionally, there are three main supply chain flows that benefit from automation:

  • Physical flow (the flow of goods, e.g., from a DC to a retailer; the most visible and tangible flow) – some flows are more obvious than others, such as parcels delivered to your door or raw materials arriving at a plant. To automate this flow, the industry is getting ready (or is ready) to adopt drones, automated trucking, and automated guided vehicles (AGVs). But to achieve true end-to-end physical delivery, major infrastructure and regulatory changes are yet to happen to fully unleash the potential of physical automation in this field. In the short term, however, let’s not forget the critical paper flow associated with these flows of goods, such as a courier sending Bills of Lading to a given port on time for customs clearance and vessel departure, a procedure that often leads to unexpected delays
  • Financial flow (the flow of money) – here the industry is adopting new technologies to alleviate common issues, e.g., interbank communication in support of letters of credit
  • Information flow (the flow of information connecting systems and stakeholders alike and ensuring that relevant data is shared, ideally in real time, between, e.g., a supplier, a manufacturer, and its end customers) – this is the information you share via email/spreadsheets or through a platform connecting you with your ecosystem partners. This flow is also a perfect candidate for automation, starting with a platform to break silos or, for smaller transformations, with tactical RPA deployments. More ambitious firms will also want to look into blockchain solutions to, for instance, transparently access information about their suppliers and ensure that they are compliant (directly connecting to a blockchain containing information provided by a certification institution such as ISO). While the need for drones and automated trucking/shipping is largely contingent on infrastructure changes, regulations, and incremental discoveries, the financial and information flows have reached a degree of maturity at scale that has already been generating significant quantifiable benefits for years.

JW: Can you give me examples of where Capgemini has deployed elements of an autonomous supply chain?

JJ: Capgemini has developed capabilities to help our clients not only design but also run their services following best-practice methodologies blending optimal competencies, location mix, and processes powered by intelligent automation, analytics, and world-renowned platforms. We have helped clients transform their processes, and we have run them from our centers of excellence/delivery centers to maximize productivity.

Two examples spring to mind:

Touchless planning for an international FMCG company:

Our client had maxed out their forecasting capabilities using standard ERP embedded forecasting modules. Capgemini leveraged our Demand Planning framework powered by intelligent automation and combined it with best-in-class machine learning platforms to increase the client’s forecasting accuracy and lower planning costs by over 25%, and this company is now moving to a touchless planning function.

Automated order validation and delivery note for an international chemical manufacturing company:

Our client was running fulfillment operations internally at a high operating cost and low productivity. Capgemini transformed the client’s operations and created a lean team in a cost-effective nearshore location. On top of this, we leveraged intelligent automation to create a touchless purchase/sales order to delivery note creation flow, checking that all required information is correct, and either raising exceptions or passing on the data further down the process to trigger the delivery of required goods.

JW: What are the key success factors for enterprises starting the journey to autonomous supply chains?

JJ: Moving to an autonomous supply chain is a major business and digital transformation, not a standalone technology play, and so corporate culture is highly important in terms of the enterprise being prepared to embrace significant change and disruption and to operate in an agile and dynamic manner.

To ensure business value, you also need a consistent and holistic methodology such as Capgemini’s Digital Global Enterprise Model, which combines Six Sigma-based optimization approaches with a five senses-driven automation model, a framework for the deployment of intelligent automation and analytics technology.

Also, a lot depends on the quality of the supply chain data. Enterprises need to get the data right and master their supply chain data because you can’t drive autonomy if the data is not readily available, up-to-date in real-time, consistent, and complete. Supply chain and logistics is not so much about moving physical goods; it's been about moving information for decades. A bit of automation here and there will not make your supply chain touchless and autonomous. It requires integration and consolidation first before you can aim for autonomy.

JW: And how should enterprises start to undertake the journey to autonomous supply chains?

JJ: The first step is to build the right level of skill and expertise within the supply chain personnel. Scaling too fast without considering the human factor will result in a massive mess and a dip in supply chain performance. Also, it is important to set a culture of continuous improvement and constant innovation, for example, by leveraging a digitally augmented workforce.

Secondly, the right approach is to make elements of the supply chain touchless. Autonomy will happen as a staged approach, not as a big bang. It’s a journey. Focus on high-impact areas first, enable quick wins, and start with prototyping. So, supply chain executives should identify those pockets of excellence that are close to being ready, or that can be made ready, to become touchless, and where supply chain autonomy can be driven.

One approach to identifying the most appropriate initiatives is to plot them against two axes: the y-axis being the effort to get there and the x-axis being the impact that can be achieved. This will help identify pockets of value that can be addressed relatively quickly, harvesting some quick wins first. As you progress down this journey, further technologies may mature that allow you to address the last pieces of the puzzle and get to an extensively autonomous supply chain.
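A minimal sketch of this effort/impact prioritization (the initiative names and scores below are hypothetical, chosen only to illustrate the two-axis idea) simply scores each candidate and surfaces the high-impact, low-effort quick wins first.

```python
# Hypothetical candidate initiatives scored 1 (low) to 5 (high).
initiatives = [
    {"name": "Touchless order entry",     "effort": 2, "impact": 4},
    {"name": "Automated carrier booking", "effort": 4, "impact": 3},
    {"name": "Self-healing demand plan",  "effort": 5, "impact": 5},
    {"name": "Invoice exception triage",  "effort": 1, "impact": 3},
]

# Rank by impact per unit of effort; quick wins float to the top.
ranked = sorted(initiatives, key=lambda i: i["impact"] / i["effort"], reverse=True)

for item in ranked:
    print(f'{item["name"]}: impact {item["impact"]}, effort {item["effort"]}')
```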

JW: Which technologies should supply chain executives be considering to underpin their autonomous supply chains in the future?

JJ: Beyond fundamental technologies such as RPA, machine learning has considerable potential to help, for example, in demand planning to increase accuracy, and in fulfillment to connect interaction and decision-making.

Technologies now exist that can recognize and interpret the text in an email, automatically respond, and send all the information required. In order processing, for example, orders can be populated automatically, validated against inventory, and prioritized for delivery according to corporate rules – all without human intervention. This can potentially be extended further with automated carrier bookings against rules. Of course, this largely applies to the “happy flows” at the moment, but there are also proven practices to increase the proportion of “happy orders”.
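A toy sketch of such a happy flow follows; the email format, inventory data, and prioritization rule are all assumptions for illustration, not a description of Capgemini's tooling. The order is extracted from an email, validated against inventory, and either released with a priority or routed to an exception.

```python
import re

inventory = {"SKU-100": 250, "SKU-200": 0}       # hypothetical stock levels
priority_customers = {"ACME-RETAIL"}             # hypothetical corporate rule

email_body = "Customer: ACME-RETAIL\nItem: SKU-100\nQuantity: 40"

def process_order(text: str) -> str:
    # Extract the order fields from the free-text email.
    customer = re.search(r"Customer:\s*(\S+)", text).group(1)
    sku = re.search(r"Item:\s*(\S+)", text).group(1)
    qty = int(re.search(r"Quantity:\s*(\d+)", text).group(1))

    # Validate against inventory; route to an exception if it cannot be fulfilled.
    if inventory.get(sku, 0) < qty:
        return f"EXCEPTION: insufficient stock for {sku}"

    # Prioritize delivery according to the corporate rule.
    priority = "express" if customer in priority_customers else "standard"
    return f"RELEASED: {qty} x {sku} for {customer} ({priority} delivery)"

print(process_order(email_body))
```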

The level of autonomy in supply chain fulfillment can also be increased by using analytics to monitor supply chain fulfillment and predict potential exceptions and problems, then either automating mitigation or proposing next-best actions to supply chain decision-makers.

This is only the beginning, as AI and blockchain still have a long way to go to reach their potential. Companies that harness their power now and are prepared to scale will be the ones coming out on top.

JW: Thank you, Joerg. I’m sure our readers will find considerable food for thought here as they plan and undertake their journeys to autonomous supply chains.

 

Read Part 1 here.

Moving to an Autonomous Supply Chain: Q&A with Capgemini’s Joerg Junghanns – Part 1

 

Introduction

Supply chain management is an area currently facing considerable pressure and is a key target for transformation. NelsonHall research shows that less than a third of supply chain executives in major enterprises are highly satisfied with, for example, their demand forecasting accuracy and their logistics planning and optimization, and that the majority perceive there to be considerable scope to reduce the levels of manual touchpoints and hand-offs within their supply chain processes as they look to move to more autonomous supply chains.

Accordingly, NelsonHall research shows that 86% of supply chain executives consider the transformation of their supply chains over the next two years to be highly important. This typically involves a redesign of the supply chain to maximize available data sources to deliver more efficient workflow and goods handling, improving connectivity within the supply chain to enable more real-time decision-making, and improving the competitive edge with better decision-making tools, analytics, and data sources supporting optimized storage and transport services.

Key supply chain transformation characteristics critical for driving supply chain autonomy that are sought by the majority of supply chain executives include supply chain standardization, end-to-end visibility of supply chain performance, ability to predict, sense, and adjust in real-time, and closed-loop adaptive planning across functions.

At the KPI level, there are particularly high expectations of improved demand forecasting accuracy and logistics planning and optimization, leading to higher levels of fulfillment reliability, and of enhanced risk identification, leading to operational cost and working capital reduction.

So, overall, supply chain executives are typically seeking a reduction in supply chain costs, more effective supply chain processes and organization, and improved service levels.

 

Q&A Part 1

JW: Joerg, to what extent do you see existing supply chains under pressure?

JJ: From a manufacturer looking for increased supply chain resilience and lower costs to a B2C end consumer obsessed with speed, visibility, and aftersales services, supply chains are now under great pressure to transform and adapt themselves to remain competitive in an increasingly demanding and volatile environment.

Supply chain pressure results from increasing levels of supply chain complexity, higher customer expectations, a more volatile environment (e.g., trade wars, Brexit), difficulty in managing costs, and lack of visibility. In particular, global trade has been in a constant state of exception since 2009, creating a need to increase supply chain resilience via increased agility and flexibility and, in sectors such as fast-moving consumer goods and even automotive, hyper-personalization can mean a lot size of one, starting from procurement all the way through production and fulfillment. At the same time, supply chains are no longer simple “chains” but have talent, financial, and physical flows all intertwined in a DNA-like spiral resulting in a (supply chain) ecosystem with high complexity. All this is often compounded by the low level of transparency caused by manual processes. In response, enterprises need to start the journey to autonomous supply chains. However, many supply chains are still not digitized, so there’s a lot of homework to be done before introducing digitalization and autonomous supply chains.

JW: What do you understand by the term “autonomous supply chain”?

JJ: The end game in an “autonomous supply chain” is a supply chain that operates without human intervention. Just imagine a parcel reaching your home, knowing it took no human intervention to fulfill your order. How much of this is fiction and how much reality?

Well, some of this certainly depends on major investments and changes to regulations in areas such as drones delivering your parcels, flying over your neighborhood, or automated trucks crisscrossing the country with nobody behind the steering wheel. However, major steps in lowering costs and improving customer satisfaction can already be taken using current technologies. Recent surveys show that only a quarter of supply chain leaders perceive that they have reached a satisfactory automation level, leveraging the most innovative end-to-end solutions currently available.

JW: What benefits can companies expect from the implementation of an “autonomous supply chain”?

JJ: Our observations and experience link autonomous supply chains to:

  • Lower costs – it is no surprise that supply chain automation already helps to lower costs (and will do so even more in the future), combining FTE savings and lower exception handling costs with productivity and quality gains
  • Improved customer satisfaction – as a customer you may ask, why should I care that the processes leading to the delivery of my products are “no touch” and required hardly any human intervention? Well, you will when your products are delivered faster and your experience from order to delivery is transparent and seamless, requiring no tedious phone calls to locate your product(s) and no complaints about delivery or invoicing errors!
  • Increased revenue – as companies process more, faster, with fewer handling and processing errors along the way, they create added value for their customers and benefit from capacity gains that eventually affect their top line, particularly when operational savings are passed on through lower delivery/product prices, thus allowing for a healthy combination of margin and revenue increase.

We have seen that automation can do far more than simply cut costs and that there are many ways to implement automation at scale without relying on infrastructure/regulation changes (e.g., drones) – for example, by leveraging a digitally augmented workforce. Companies have been launching proofs of concept (POCs) but often struggle to reap the true benefits due to talent shortages, siloed processes, and a lack of a long-term holistic vision.

JW: What hurdles do organizations need to overcome to achieve an autonomous supply chain?

JJ: We have observed that companies often face the following hurdles when trying to create a more autonomous supply chain:

  • Lack of visibility and transparency – due to 1) outdated process flows, and 2) siloed information systems often requiring email-based information exchange (non-standardized spreadsheets and flat files sent back and forth)
  • Lack of agility (impacting the overall resilience of the supply chain) – the inability to execute on insights due to slow information velocity and rigidity in processes, which are often focused on functions as opposed to value-added processes cutting across the organization
  • Lack of the right talent – difficulty in finding talent in a very competitive industry with new technologies making typical supply chain profiles less relevant and new digital profiles often costly to train and hard to retain
  • Lack of centralization and consolidation – leading to high costs, poor productivity, and disjointed technology landscapes, often unable to scale across the organization due to a lack of a holistic transformation approach and proper governance.

One thing that many companies have in common is a lack of ability to deploy automation solutions at scale, cost-effectively. Too often, these projects remain at a POC stage and are parked until a new POC (often technology-driven) comes along and yet again fails to scale properly due to high costs, lack of resources, and lack of strategic vision tied to business outcomes.

 

In Part 2 of the interview, Joerg Junghanns discusses the supply chain flows that benefit from automation, describes client case examples, and highlights the success factors, adoption approach, and key technologies behind autonomous supply chains.

Genpact Acquires Barkawi Management Consultants, Targets 25%+ Growth in Supply Chain Management

 

SCM is one of Genpact’s “invest-to-grow” service lines, where the company is looking to make disproportionate investments and scale up the business: in this case, to become one of the top two global supply chain transformation services vendors. In its “invest-to-grow” businesses, Genpact is looking to achieve at least twice the level of revenue growth achieved by Genpact overall and to do this by investing in complementary competencies rather than scale.

Genpact identified Barkawi Management Consultants, part of the Barkawi Group (from now on referred to as Barkawi), as a potential target by working alongside the company within its client base. Discussions began in late 2017, with the deal expected to close this month, August 2018, once the regulatory processes are complete.

The acquisition of Barkawi provides a strong platform for Genpact to deepen its supply chain consulting practice, achieve a revenue balance in SCM between transformation consulting and managed services, strengthen its relationships and expertise in key supply chain technologies, and strengthen its presence in Europe.

Deepening Supply Chain Consulting Capability

In the area of SCM, Genpact had existing capability in planning, inventory optimization & demand analytics, and a couple of large managed services contracts. However, the company had limited front-end consulting capability, with just 30 supply chain management consultants. Although Genpact was organically adding SCM consultants, this relative lack of front-end expertise was limiting its ability to handle a significant number of concurrent prospect conversations. The acquisition of Barkawi brings 180 SCM consultants to Genpact, enabling the company not only to sustain a greater number of simultaneous client and prospect interactions but also to have deeper and more end-to-end conversations across more SCM transformation dimensions (including operating model transformation, technology transformation, digital transformation, and customer-oriented transformation).

Prior to the acquisition, Barkawi had ~200 consultants, with the bulk of these (~180) in the U.S. (principally in a center in Atlanta) and Europe (principally in a center in Munich). These are the operations being acquired by Genpact. The remaining Barkawi personnel were based in the Middle East and China, which are not markets where Genpact actively generates business, and these personnel will not be transferring to Genpact.

Barkawi principally employs two types of consultant:

  • Management/process consultants active in supply chain and aftermarket services
  • Digital/technology consultants where the larger part of the practice consisted of assessment/implementation/optimization projects around partner technologies such as Kinaxis and Anaplan.

The U.S. business was slightly larger than the European business and employed a majority of personnel active as technology consultants, while the European business employed a majority of its personnel in management/process consulting.

Achieving a Balance between Transformation Consulting & Managed Services

Barkawi will be combined with Genpact’s consultants into a single SCM consulting service line, giving a broadly balanced mix across management/process consulting and technology consulting. This global service line will be headed by Mike Landry, previously head of Barkawi Management Consultants’ U.S. entity, and will be organized into supply chain consulting, aftermarket consulting, and technology, with these horizontals matrixed against the following verticals: consumer products, life sciences, industrial machinery, and product manufacturing.

Genpact is aiming to achieve a rough balance between the Genpact specialisms of consumer products and life sciences and the Barkawi specialism in industrial manufacturing. Similarly, Genpact is aiming for a roughly equal revenue split between consulting and managed services, with the CPG sector having a higher proportion of managed services contracts.

Strengthening Supply Chain Technology Relationships

Another advantage of the Barkawi acquisition is that it brings Genpact strong existing relationships with, and expertise in, the supply chain planning platform companies Kinaxis and Anaplan. Barkawi is one of the leading partners of Kinaxis, and the company’s partnership with Anaplan on supply chain complements Genpact’s existing partnership with Anaplan for EPM.

Strengthening European Presence

In terms of its client base, Genpact estimates that the majority of Barkawi’s clients in the U.S. (where it was typically selling ~$200K technology consulting projects) are prospects for a wider range of Genpact supply chain transformation services. In addition, Barkawi had a strong management/process consulting presence among major manufacturers in Germany, which Genpact will seek to build on.

In addition, while the bulk of Barkawi’s European personnel are in Germany, Genpact will look to extend this capability by growing its team both in Munich and across Europe to address supply chain consulting in the wider European market. Genpact perceives there to be major consulting opportunities within the leading manufacturing companies, assisting them in implementing and optimizing technology, working with data, and creating optimization models. This applies particularly to companies with a strong element of aftermarket services, which need to optimize their aftermarket models and address aftermarket fulfillment, warranty management, and forecasting.

 

Overall, Genpact is still looking to grow the supply chain management consulting team further and will continue to recruit to support these growth initiatives.

 

RPA Operating Model Guidelines, Part 3: From Pilot to Production & Beyond – The Keys to Successful RPA Deployment

As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).

This is the third and final blog in a series presenting key guidelines for organizations embarking on an RPA project, covering project preparation, implementation, support, and management. Here I take a look at the stages of deployment, from pilot development, through design & build, to production, maintenance, and support.

Piloting & deployment – it’s all about the business

When developing pilots, it’s important to recognize that the organization is addressing a business problem and not just applying a technology. Accordingly, organizations should consider how they can make a process better and achieve service delivery innovation, and not just service delivery automation, before they proceed. One framework that can be used in analyzing business processes is the ‘eliminate/simplify/standardize/automate’ approach.

While organizations will probably want to start with some simple and relatively modest RPA pilots to gain quick wins and acceptance of RPA within the organization (and we would recommend that they do so), it is important as the use of RPA matures to consider redesigning and standardizing processes to achieve maximum benefit. So begin with simple manual processes for quick wins, followed by more extensive mapping and reengineering of processes. Indeed, one approach often taken by organizations is to insert robotics and then use the metrics available from robotics to better understand how to reengineer processes downstream.

For early pilots, pick processes where the business unit is willing to take a ‘test & learn’ approach, and live with any need to refine the initial application of RPA. Some level of experimentation and calculated risk taking is OK – it helps the developers to improve their understanding of what can and cannot be achieved from the application of RPA. Also, quality increases over time, so in the medium term, organizations should increasingly consider batch automation rather than in-line automation, and think about tool suites and not just RPA.

Communication remains important throughout, and the organization should be extremely transparent about any pilots taking place. RPA does require a strong emphasis on, and appetite for, management of change. In terms of effectiveness of communication and clarifying the nature of RPA pilots and deployments, proof-of-concept videos generally work a lot better than the written or spoken word.

Bot testing is also important, and organizations have found that bot testing is different from waterfall UAT. Ideally, bots should be tested using a copy of the production environment.

Access to applications is potentially a major hurdle, with organizations needing to establish virtual employees as a new category of employee and give the appropriate virtual user ID access to all applications that require a user ID. The IT function must be extensively involved at this stage to agree access to applications and data. In particular, they may be concerned about the manner of storage of passwords. What’s more, IT personnel are likely to know about the vagaries of the IT landscape that are unknown to operations personnel!

Reporting, contingency & change management key to RPA production

At the production stage, it is important to implement an RPA reporting tool to:

  • Monitor how the bots are performing
  • Provide an executive dashboard with one version of the truth
  • Ensure high license utilization.

There is also a need for contingency planning to cover situations where something goes wrong and work is not allocated to bots. Contingency plans may include co-locating a bot support person or team with operations personnel.

The organization also needs to decide which part of the organization will be responsible for bot scheduling. This can be overseen by the IT department or, more likely, the operations team can take responsibility for scheduling both personnel and bots. Overall bot monitoring, on the other hand, will probably be carried out centrally.

It remains common practice, though not universal, for RPA software vendors to charge on the basis of the number of bot licenses. Accordingly, since an individual bot license can be used in support of any of the processes automated by the organization, organizations may wish to centralize an element of their bot scheduling to optimize bot license utilization.
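As an illustration of the utilization arithmetic behind that decision (the run log and the available-hours assumption below are hypothetical), license utilization can be computed from bot run logs as busy hours divided by available hours per license.

```python
# Hypothetical bot run log: hours of work executed per bot license per day.
run_log = {
    "bot-license-01": [6.5, 7.0, 5.5, 8.0, 6.0],
    "bot-license-02": [2.0, 1.5, 3.0, 2.5, 2.0],
}
AVAILABLE_HOURS_PER_DAY = 20  # assumed scheduling window per license

for license_id, hours in run_log.items():
    utilization = sum(hours) / (AVAILABLE_HOURS_PER_DAY * len(hours))
    print(f"{license_id}: {utilization:.0%} utilized")

# Low utilization (e.g. bot-license-02) is the signal to consolidate work
# from several processes onto fewer licenses via centralized scheduling.
```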

At the production stage, liaison with application owners is very important to proactively identify changes in functionality that may impact bot operation, so that these can be addressed in advance. Maintenance is often centralized as part of the automation CoE.

Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December

NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.

Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.

Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on RPA, with your buy-side peers sharing their RPA experiences. To find out more, contact Matthaus Davies.  

This is the final blog in a three-part series. See also:

Part 1: How to Lay the Foundations for a Successful RPA Project

Part 2: How to Identify High-Impact RPA Opportunities

RPA Operating Model Guidelines, Part 2: How to Identify High-Impact RPA Opportunities

 

As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).

This is the second in a series of blogs presenting key guidelines for organizations embarking on an RPA project, covering project preparation, implementation, support, and management. Here I take a look at how to assess and prioritize RPA opportunities prior to project deployment.

Prioritize opportunities for quick wins

An enterprise level governance committee should be involved in the assessment and prioritization of RPA opportunities, and this committee needs to establish a formal framework for project/opportunity selection. For example, a simple but effective framework is to evaluate opportunities based on their:

  • Potential business impact, including RoI and FTE savings
  • Level of difficulty (preferably low)
  • Sponsorship level (preferably high).

The business units should be involved in the generation of ideas for the application of RPA, and these ideas can be compiled in a collaboration system such as SharePoint prior to their review by global process owners and subsequent evaluation by the assessment committee. The aim is to select projects that have a high business impact and high sponsorship level but are relatively easy to implement. As is usual when undertaking new initiatives or using new technologies, aim to get some quick wins and start at the easy end of the project spectrum.
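One simple way to operationalize this impact/difficulty/sponsorship framework is a weighted score in which impact and sponsorship count in favor and difficulty counts against; the candidate names, scores, and weights below are illustrative assumptions, not a prescribed model.

```python
# Hypothetical RPA candidate processes scored 1 (low) to 5 (high).
candidates = [
    {"name": "Vendor invoice posting", "impact": 4, "difficulty": 2, "sponsorship": 5},
    {"name": "Bank reconciliation",    "impact": 5, "difficulty": 4, "sponsorship": 3},
    {"name": "New-hire data entry",    "impact": 2, "difficulty": 1, "sponsorship": 4},
]

WEIGHTS = {"impact": 0.5, "difficulty": -0.3, "sponsorship": 0.2}  # assumed weighting

def score(c: dict) -> float:
    # Weighted sum: high impact and sponsorship raise the score, difficulty lowers it.
    return sum(WEIGHTS[k] * c[k] for k in WEIGHTS)

for c in sorted(candidates, key=score, reverse=True):
    print(f'{c["name"]}: {score(c):.2f}')
```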

However, organizations also recognize that even those ideas and suggestions that have been rejected for RPA are useful in identifying process pain points, and one suggestion is to pass these ideas to the wider business improvement or reengineering group to investigate alternative approaches to process improvement.

Target stable processes

Other considerations that need to be taken into account include the level of stability of processes and their underlying applications. Clearly, basic RPA does not readily adapt to significant process change, and so, to avoid excessive levels of maintenance, organizations should only choose relatively stable processes based on a stable application infrastructure. Processes that are subject to high levels of change are not appropriate candidates for the application of RPA.

Equally, it is important that the RPA implementers have permission to access the required applications from the application owners, who can initially have major concerns about security, and that the RPA implementers understand any peculiarities of the applications and know about any upgrades or modifications planned.

The importance of IT involvement

It is important that the IT organization is involved, as their knowledge of the application operating infrastructure and any forthcoming changes to applications and infrastructure need to be taken into account at this stage. In particular, it is important to involve identity and access management teams in assessments.

Also, the IT department may well take the lead in establishing RPA security and infrastructure operations. Other key decisions that require strong involvement of the IT organization include:

  • Identity security
  • Ownership of bots
  • Ticketing & support
  • Selection of RPA reporting tool.

Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December

NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.

Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.

Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on sourcing topics, including the impact of RPA. The next RPA webinar will be held later this month: to find out more, contact Guy Saunders.  

In the third blog in the series, I will look at deploying an RPA project, from developing pilots, through design & build, to production, maintenance, and support.

RPA Operating Model Guidelines, Part 1: Laying the Foundations for Successful RPA

 

As well as conducting extensive research into RPA and AI, NelsonHall is also chairing international conferences on the subject. In July, we chaired SSON’s second RPA in Shared Services Summit in Chicago, and we will also be chairing SSON’s third RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December. In the build-up to the December event we thought we would share some of our insights into rolling out RPA. These topics were the subject of much discussion in Chicago earlier this year and are likely to be the subject of further in-depth discussion in Atlanta (Braselton).

This is the first in a series of blogs presenting key guidelines for organizations embarking on RPA, covering establishing the RPA framework, RPA implementation, support, and management. First up, I take a look at how to prepare for an RPA initiative, including establishing the plans and frameworks needed to lay the foundations for a successful project.

Getting started – communication is key

Essential action items for organizations prior to embarking on their first RPA project are:

  • Preparing a communication plan
  • Establishing a governance framework
  • Establishing an RPA center of excellence
  • Establishing a framework for allocation of IDs to bots.

Communication is key to ensuring that use of RPA is accepted by both executives and staff alike, with stakeholder management critical. At the enterprise level, the RPA/automation steering committee may involve:

  • COOs of the businesses
  • Enterprise CIO.

Start with awareness training to get support from departments and C-level executives. Senior leader support is key to adoption. Videos demonstrating RPA are potentially much more effective than written papers at this stage. Important considerations to address with executives include:

  • How much control am I going to lose?
  • How will use of RPA impact my staff?
  • How/how much will my department be charged?

When communicating to staff, remember to:

  • Differentiate between value-added and non value-added activity
  • Communicate the intention to use RPA as a development opportunity for personnel. Stress that RPA will be used to facilitate growth, to do more with the same number of people, and give people developmental opportunities
  • Use the same group of people to prepare all communications, to ensure consistency of messaging.

Establish a central governance process

It is important to establish a strong central governance process to ensure standardization across the enterprise, and to ensure that the enterprise is prioritizing the right opportunities. It is also important that IT is informed of, and represented within, the governance process.

An example of a robotics and automation governance framework established by one organization was to form:

  • An enterprise robotics council, responsible for the scope and direction of the program, together with setting targets for efficiency and outcomes
  • A business unit governance council, responsible for prioritizing RPA projects across departments and business units
  • An RPA technical council, responsible for RPA design standards, best practice guidelines, and principles.

Avoid RPA silos – create a centre of excellence

RPA is a key strategic enabler, so use of RPA needs to be embedded in the organization rather than siloed. Accordingly, the organization should consider establishing an RPA center of excellence, encompassing:

  • A centralized RPA & tool technology evaluation group. It is important not to assume that a single RPA tool will be suitable for all purposes and also to recognize that ultimately a wider toolset will be required, encompassing not only RPA technology but also technologies in areas such as OCR, NLP, machine learning, etc.
  • A best-practice function for establishing standards, such as naming conventions, to be applied to RPA across processes and business units
  • An automation lead for each tower, to manage the RPA project pipeline and priorities for that tower
  • IT liaison personnel.

Establish a bot ID framework

While establishing a framework for allocation of IDs to bots may seem trivial, it has proven not to be so for many organizations where, for example, including ‘virtual workers’ in the HR system has proved insurmountable. In some instances, organizations have resorted to basing bot IDs on the IDs of the bot developer as a short-term fix, but this approach is far from ideal in the long-term.
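A lightweight alternative to forcing bots into the HR system is a simple bot identity register owned by the CoE. The sketch below is a hypothetical illustration of the minimum attributes worth recording (the field names and vault reference format are assumptions), not a recommendation of any particular tool.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BotIdentity:
    bot_id: str            # e.g. "BOT-FIN-0007", assigned from a central sequence
    display_name: str
    owning_department: str
    business_owner: str    # accountable human, not the bot developer
    credential_ref: str    # pointer to the password vault entry, never the password itself
    created_on: date

registry = [
    BotIdentity("BOT-FIN-0007", "AP Invoice Poster", "Finance",
                "j.smith", "vault://rpa/ap-invoice-poster", date(2016, 9, 1)),
]
print(registry[0].bot_id, "owned by", registry[0].owning_department)
```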

Organizations should also make centralized decisions about bot license procurement, and here the IT department, which has experience in software selection and purchasing, should be involved. In particular, the IT department may be able to play a substantial role in RPA software procurement and negotiation.

Find out more at the SSON RPA in Shared Services Summit, 1st to 2nd December

NelsonHall will be chairing the third SSON RPA in Shared Services Summit in Braselton, Georgia on 1st to 2nd December, and will share further insights into RPA, including hand-outs of our RPA Operating Model Guidelines. You can register for the summit here.

Also, if you would like to find out more about NelsonHall’s extensive program of RPA & AI research, and get involved, please contact Guy Saunders.

Plus, buy-side organizations can get involved with NelsonHall’s Buyer Intelligence Group (BIG), a buy-side only community which runs regular webinars on sourcing topics, including the impact of RPA. The next RPA webinar will be held in November: to find out more, contact Matthaus Davies.  

 

In the second blog in this series, I will look at RPA need assessment and opportunity identification prior to project deployment.

 

HPE: Digitizing F&A BPS to Realize Profit Maximization

NelsonHall recently attended the Hewlett Packard Enterprise (HPE) “Empowering the Customer to Win in the Digital Age” event hosted by HPE BPS. The theme was strongly around digital and empowering organizations to own the (increasingly digital) interface between customers, suppliers, and employees. In support of this theme, HPE is investing heavily in automation, both in its own platforms and in centers of excellence, partnerships, and methodologies.

For example, within F&A BPS, HPE is investing in a tool to assess the automation potential of organizations’ finance & accounting processes, which is being built into HPE’s Framework for Innovation & Transformation (FIT), and HPE is now developing automation and digitization assessments and roadmaps at the front end of F&A BPS contracts.

In its targeting of F&A BPS, HPE is becoming more sector specific and incorporating metrics specific to target sectors within FIT, starting with the telecoms and oil & gas sectors.

HPE is also becoming more business-metric focused in its approach to F&A BPS and highlighting that the benefits of automation extend way beyond process cost take-out. Cash acceleration and cash utilization are major areas of focus for HPE within F&A BPS. In particular, HPE is stressing that the benefits of source-to-pay automation go beyond halving the S2P headcount and start to open the door to profit improvement opportunities through dynamic discounting. HPE formerly advised its clients on negotiating longer payment terms with their suppliers; the company has now changed its focus to encouraging its clients to negotiate early payment discounts and automate/digitize their P2P processes to achieve rapid approval of purchase invoices so that they can optimize their early discounts against these invoices. In many cases, the purchase invoice approval process has been too slow and the knowledge of potential discounts too inaccessible to take advantage of what could amount to a profit improvement opportunity equal to up to 2% of total goods purchased. For example, HPE estimates that the HP GBS organization has saved $2.7bn in early payment discounts over the past three years by taking this approach.
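The attraction of early payment discounts is easiest to see with a worked example. The terms below are a common textbook illustration (e.g. "2/10 net 30"), not HPE's actual client terms: paying 20 days early to capture a 2% discount is equivalent to a very high annualized return on the cash used.

```python
invoice_value = 100_000          # hypothetical invoice amount
discount_rate = 0.02             # 2% discount for early payment
days_early = 20                  # pay on day 10 instead of day 30

discount_captured = invoice_value * discount_rate   # 2,000 saved
cash_used = invoice_value * (1 - discount_rate)     # 98,000 paid early
annualized_return = (discount_captured / cash_used) * (365 / days_early)

print(f"Discount captured: {discount_captured:,.0f}")
print(f"Approx. annualized return on early payment: {annualized_return:.0%}")  # ~37%
```

The profit opportunity only materializes if invoices are approved fast enough to hit the discount window, which is why rapid, automated P2P approval is the prerequisite.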

Accordingly, HPE has established a center of excellence for automation in F&A, and is beginning to encourage use of PDF data capture technology to reduce the need for OCR or manual rekeying of invoices. The company has a number of pilots in this area.

In terms of robotics, the company is currently using UiPath and Blue Prism, the latter particularly for connecting with ERP software, and Redwood to support R2R and month-end close, and has built up a library of ~750 accelerators. The company is also extensively using PDF Cloud and its own Vertica software. HPE’s Business Process Analytics Tool (BPAT) is based on Vertica, which is used to provide an F&A dashboard covering both an executive view of KPIs and drill-downs into service performance.

For example, within P2P, HPE is aiming to digitize F&A processes by:

  • Reducing use of paper and scanning through use of PDF data capture and its partnerships with Tungsten and Tradeshift
  • Further automation of invoice data entry and processing using RPA
  • Identifying further opportunities for automation via BPAT.

Overall HPE is increasingly seeking to place automation strategy and vision at the forefront of F&A process design, with automation and digitization leading the way in identifying possibilities for straight-through processing. Indeed, based on HPE’s F&A services transformation journey diagram, the company expects ~60% of future F&A BPS productivity improvements to be driven by automation and 40% to be driven by process change and staff reallocation & best-shoring.

Contrary to some expectations, RPA is only one automation component. In HPE’s automation journey in F&A BPS, RPA is expected to deliver around a quarter of the total productivity benefits to be achieved from automation, with a whole range of tools and platforms contributing around 75% of the automation benefits to be achieved.

As usual, one of the major challenges over the past year has been in training the company’s solution architects to think digital and identify benefits beyond those previously achievable. As HPE suggested, many of the existing F&A process benchmarks may need to be rewritten over the next 12 months.

F&A BPS is arguably the most mature of all the BPS services. However, with real-time analytics increasingly identifying the opportunities, RPA lowering the barriers to process improvement, and organizations increasingly willing to automate, F&A BPS is now off on a new journey that promises a step change in productivity. Automation plays to the strengths of HPE, and F&A digitization is an area where the company is intending to strongly invest and compete.

NelsonHall Business Process Services Confidence Index Shows High Expectations for 2015

NelsonHall’s Business Process Services (BPS) Confidence Index is a quarterly index that monitors changes in industry confidence in the global business process services market.

The objectives of the quarterly “BPS Confidence Index” are to:

  • Identify whether the demand for business process services has strengthened or softened in the past quarter
  • Identify expectations of how the relative demand for business process services will change over the coming year
  • Identify the factors that are influencing change in demand
  • Identify the characteristics of contracts that are showing relatively high and low acceptance
  • Produce a “Business Process Services Index of Confidence” that will be widely reported and used in reference to the business process services market.

The survey is open to all BPS service providers and participation is free-of-charge. To participate, please contact Paul Connolly.

The NelsonHall BPS Confidence Index score for Q1 2015 is 156 (out of 200), reflecting strong supplier confidence in BPS in 2015 relative to 2014.

Overall, the current economic climate is assisting the BPS market by maintaining a significant level of cost pressure, while the combination of cost pressure and technological advance is increasingly encouraging companies to look outside their own boundaries for new business models.

Accordingly, while emphasis on cost reduction remains high, BPS is increasingly being driven by organizations seeking to achieve transformation, revenue protection, and growth. However, the geographic focus of these initiatives has switched back to companies’ core economies, with support for growth in emerging economies becoming a much less important driver of BPS adoption.

Nonetheless, despite this increasing optimism, frozen decision-making and business uncertainty are still significant inhibitors to adoption of BPS services: the proportion of global clients and prospects whose sourcing decision-making is frozen is reported to be 26%, and clients and prospects are continuing to seek aggressive cost reductions, which can act as an impediment to deal completion.

By service line, F&A BPS was reported as exhibiting above-average growth in Q4 2014, ahead of HR BPS and contact center services, as were the related and emerging areas of procurement BPS and supply chain management. This pattern is broadly expected to be repeated during the remainder of 2015.

By sector, BPS vendors continue to have very high expectations of the healthcare sector, across both healthcare providers and healthcare payers. Expectations are also high for the manufacturing sector, particularly in those sub-sectors strongly impacted by new technology such as high-tech, pharmaceuticals, and automotive.

By geography, growth expectations are particularly high for North America relative to Europe, where expectations are relatively muted, as is also the case for Latin America. Growth expectations are moderate for Asia Pacific, with the exception of high growth expected in Australia.

Contract scope and value are reported as increasing over the past 12 months, as contracts become more end-to-end and support for high-end decision-making increases, though contract lengths remain largely unchanged.

If you would like to register for the next BPO Index webcast, scheduled for the 2nd July, 2015, you can do so here.

NelsonHall BPO Index Shows Continuing Upturn in BPO Contract Activity in Q1 2015

BPO contract TCV in Q1 2015 continued the improvement in contract activity seen in Q4 2014, with BPO TCV picking up and gaining momentum over the past two quarters following relatively low levels of BPO TCV awarded in Q2 and Q3 2014.

In particular, Q4 2014 showed a 6% improvement in BPO TCV year-on-year, with BPO accounting for 31% of total outsourcing TCV. Q1 2015 BPO TCV performance then improved further, up 12% year-on-year and accounting for 33% of total outsourcing TCV. Nonetheless, while the last couple of quarters are an improvement, they are still underperforming relative to the seasonal average over the past five years, and so there is continuing scope for improvement.

By geography, the focus on the major economies continues. There was a significant increase in BPO TCV in North America in Q1 2015 year-on-year. Within Continental Europe, for both Q1 2015 and the last 12 months, there was significant BPO contract activity in the Netherlands, with Infosys winning a life BPO contract as well as a supply chain management BPO contract there in the past quarter. Overall, though, the downturn in BPO TCV in Europe continues.

If we extrapolate this quarter’s level of BPO activity to the remainder of 2015, then BPO TCV across North America and Europe in 2015 will be 9% higher than that recorded in 2014.

At the sector level, the manufacturing sector is indeed showing progress, moving along the value chain from F&A outsourcing through procurement outsourcing to supply chain management. Within the manufacturing sector over the past 12 months, there was an increase in supply chain management BPO TCV alongside the increase in procurement outsourcing TCV. In Europe, the procurement outsourcing activity was in the pharmaceuticals and food & beverages sectors.

Within customer management services, the telecoms & media sector predictably dominated over the past 12 months, with the manufacturing sector taking over second place from the retail sector. And within the manufacturing sector, CMS activity strengthened in both the high-tech and pharmaceuticals sectors.

At the global level, the last 12 months have shown a strengthening in HR outsourcing and procurement, with increased procurement outsourcing in Europe and significant increases in both HR outsourcing and F&A outsourcing in North America.

The top three places in the league table for BPO TCV over the past 12 months remain unchanged, with Serco and Capita at the top of the table. Further down, Sopra Steria remains unchanged in 8th place, while the remainder of the top ten (Capgemini, State Street, Infosys, SAIC, WNS, and Wipro) have all moved up the table relative to the prior 12-month period.

Serco is continuing to do well in CMS in the retail industry, particularly in fashion, adding a contract with JD Williams. In Q1 2015, Capita's government business was back on song with contracts with Sheffield City Council and DEFRA, while its acquisition of government assets continues. Capgemini had 12 months of solid F&A BPO contract wins, announcing a contract expansion with Office Depot in Q1. Infosys has also had a very solid Q1, winning not just F&A BPO contracts, including a contract expansion with AkzoNobel, but also several insurance BPO contracts and a supply chain management BPO contract.

The NelsonHall BPO Index is complemented by the NelsonHall Self-Service Market Forecast Tool, which covers 78 BPO service lines, 30 geographies, and 33 industry sectors. This gives highly accurate and granular views of the market, complementing the quarterly snapshots of big-deal momentum provided by the NelsonHall BPO Index. To use the NelsonHall Self-Service Market Forecast Tool, click here.

If you would like to register for the next BPO Index webcast, scheduled for 2nd July 2015, you can do so here.

]]>
<![CDATA[Disruptive Forces and Their Impact on BPO: Part 7 - High Velocity BPO - What the Client Always Wanted]]> This is the final blog in a series looking at various disruptive forces and their impact on BPO. The combined effect of these disruptive forces is that BPO is now changing into something the client has always wanted, namely "High Velocity BPO".

In its early days, BPO was a linear and lengthy process: knowledge transfer, followed by labor arbitrage, followed by process improvement and standardization, followed by the application of tools and automation. This typically took years, often the full lifetime of the initial contract. More recently, BPO has speeded up, with standard global process models, supported by elements of automation, being implemented in conjunction with the initial transition and deployment of global delivery. This "time to value" is now being compressed further, enabling a full range of transformation to be applied in months rather than years. Overall, BPO is moving from a slow-moving mechanism for transformation to High Velocity BPO. Why take years when months will do?

Some of the key characteristics of High Velocity BPO are shown in the table below:

Attribute | Traditional BPO | High-Velocity BPO
Objective | Help the purchaser fix their processes | Help the purchaser contribute to wider business goals
Measure of success | Process excellence | Business success, faster
Importance of cost reduction | High | Greater, faster
Geographic coverage | Key countries | Global, now
Process enablers & technologies | High dependence on third parties | Own software components supercharged with RPA
Process roadmaps | On paper | Built into the components
Compliance | Reactive compliance | Predictive GRC management
Analytics | Reactive process improvement | Predictive & driving the business
Digital | A front-office "nice-to-have" | Multi-channel and sensors fundamental
Governance | Process-dependent | GBS, end-to-end KPIs

As a starting point, High Velocity BPO no longer focuses on process excellence targeted at a narrow process scope. Its ambitions are much greater: to help the client achieve business success faster, and to help the purchaser contribute not just to their own department but to the wider business goals of the organization, driven by monitoring against end-to-end KPIs, increasingly within a GBS operating framework.

However, this doesn’t mean that the need for cost reduction has gone away. It hasn’t. In fact, the need for cost reduction is now greater than ever, and it needs to be delivered faster. In terms of delivery frameworks, the mish-mash of third-party tools and enablers is increasingly likely to be replaced by an integrated combination of proprietary software components, probably built on open source software, with built-in process roadmaps, real-time reporting and analytics, and supercharged with RPA.

Furthermore, the role of analytics will no longer be reactive process improvement but predictive and driving real business actions, while compliance will also become even more important.

But let’s get back to the disruptive forces impacting BPO. What forms will the resulting disruption take in both the short-term and the long-term?

Disruption | Short-term impact | Long-term impact
Robotics | Gives buyers 35% cost reduction fast; faster introduction of non-FTE based pricing | No significant impact on process models or technology
Analytics | Already drives process enhancement | Becomes much more instrumental in driving business decisions; potentially makes BPO vendors more strategic
Labor arbitrage on labor arbitrage | Ongoing reductions in service costs and employee attrition; improved business recovery | "Domestic BPO markets" within emerging economies become major growth opportunity
Digital | Improved service at reduced cost | Big opportunity to combine voice, process, technology, & analytics in a high-value end-to-end service
BPO "platform components" | Improved process coherence | BPaaS service delivery without the third-party SaaS
The Internet of Things | Slow build into areas like maintenance | Huge potential to expand the BPO market in areas such as healthcare
GBS | Help organizations deploy GBS | Improved end-to-end management and increased opportunity; reduced friction of service transfer

Well, robotics is here now and moving at speed, giving a short-term impact of around 35% cost reduction where applied. It is also fundamentally changing the underlying commercial models, away from FTE-based pricing. However, robotics does not involve changing process models or underlying systems and technology, so its impact is largely short-term and it is essentially a cost play.

Digital and analytics are much more strategic and longer-lasting in their impact, enabling vendors to become more strategic partners by delivering higher-value services and driving next best actions and operational business decisions with very high levels of revenue impact.

BPO services around the Internet of Things will be a relatively slow burn in comparison, but with the potential to multiply the market for industry-specific BPO services many times over and to enable BPO to move into critical services with real life-or-death implications.

So what is the overall impact of these disruptive forces on BPO? While two of the seven listed above have the potential to reduce BPO revenues in the short term, the other five have the potential to make BPO more strategic in the eyes of buyers and to significantly increase the size and scope of the global BPO market.

 

Part 1 The Robots are Coming - Is this the end of BPO?

Part 2 Analytics is becoming all-pervasive and increasingly predictive

Part 3 Labor arbitrage is dead - long live labor arbitrage

Part 4 Digital renews opportunities in customer management services

Part 5 Will Software Destroy the BPO Industry? Or Will BPO Abandon the Software Industry in Favor of Platform Components?

Part 6 The Internet of Things: Is this a New Beginning for Industry-Specific BPO?

]]>
<![CDATA[Disruptive Forces and Their Impact on BPO: Part 5 - Will Software Destroy the BPO Industry? Or Will BPO Abandon the Software Industry in Favor of Platform Components?]]> BPO has always depended on partnerships with third-party software providers to supply supplementary platforms around client core systems, providing specialist functionality in areas like procurement, collections, & reconciliation handling. However, there is a danger that this leads to a Heath Robinson (or Rube Goldberg) combination of applications, involving expensive software or SaaS licences, that can be difficult to integrate with process models and analytics, and where IP is shared with or handed to the software company. So BPO vendors are increasingly looking at alternatives to COTS software.

Indeed, BPO vendors have often tended to use their own proprietary tools in key areas such as workflow, which has enabled them to offer a lower price point than COTS workflow and to achieve more integrated real-time reporting and analytics. This approach of developing pre-assembled components is now being further accelerated in conjunction with cloud-based provisioning.

Now, BPO vendors are starting to take this logic a step further and pre-assemble large numbers of BPO platform components as an alternative to COTS software. This approach potentially enables them to retain the IP in-house, an important factor in areas like robotics and AI, reduce their cost to serve by eliminating the cost of third-party licences, and achieve a much more tightly integrated and coherent combination of pre-built processes, dashboards and analytics supported by underlying best practice process models.

It also potentially enables them to offer true BPaaS for the first time and to begin to move to a wider range of utility offerings - the Nirvana for vendors.

Coming next: The Internet of Things – Is this a new beginning for industry-specific BPO?

Previous blogs in this series:

Part 1 The Robots are Coming - Is this the end of BPO?

Part 2 Analytics is becoming all-pervasive and increasingly predictive

Part 3 Labor arbitrage is dead - long live labor arbitrage

Part 4 Digital renews opportunities in customer management services

]]>
<![CDATA[Disruptive Forces and Their Impact on BPO: Part 4 - Digital Renews Opportunities in Customer Management Services]]> There has always been a big divide between those suppliers that are comfortable handling voice and those that are comfortable handling data, with very few comfortable with both.

However, the impact of digital is such that it increases the need for voice and data convergence. A common misconception in customer service is that the number of transactions is going down. It isn’t; it is increasing. So while the majority of interactions will be handled by digital rather than voice within three years, voice will not disappear. Indeed, one disruptive impact of digital is that it increases the importance of voice calls, and with voice calls now more complex and emotional, voice agents need to be up-skilled to meet this challenge. So the voice aspect of customer service is no longer focused on cost reduction; it is now focused on adding value to complex and high-value transactions, with digital largely responsible for delivering the cost reduction element.

So what are the implications for BPO vendors supporting the front office? Essentially, vendors now need a strong combination of digital, consulting, automation, and voice capability, and they:

  • Need to be able to provide a single view of the customer & linked multi-channel delivery
  • Need to be able to analyse and optimize customer journeys
  • Need the analytics to be able to recommend next best actions both to agents and through digital channels.

On the people side, agent recruiting, training, and motivation become more important than ever before, now complicated by the fact that different channels need different agent skills and characteristics. For example, the recruiting criteria for web chat agents are very different from those for voice agents. In addition, the website is now a critical part of customer service delivery, and self-service and web forms, traditionally outside the mandate of customer management services vendors, are a disruptive force that now needs to be integrated with both voice and other digital channels to provide a seamless end-to-end customer journey. This remains a challenge for both organizations and their suppliers.

Coming next – the impact of the move to platform components

Disruptive Forces and Their Impact on BPO - previous articles in series

Part 1 The Robots are Coming - Is this the end of BPO?

Part 2 Analytics is becoming all-pervasive and increasingly predictive

Part 3 Labor arbitrage is dead - long live labor arbitrage

 

 

 

]]>
<![CDATA[Disruptive Forces and Their Impact on BPO: Part 3 - Labor arbitrage is dead – long live labor arbitrage]]> There’s another disruptive force in BPO that no-one likes to talk about. It’s called labor arbitrage. Everyone is keeping a bit quiet about this one. It’s nothing like as sexy as robotics, or analytics, or SMAC, but it’s also a disruptive force.

One side of labor arbitrage within labor arbitrage is relatively defensive: in spite of automation and robotics, mature “International BPO” services are now being transferred to tier-n cities. Here, labor arbitrage within labor arbitrage offers lower price points, reduced attrition, and business continuity. The downside is that travel to these locations may be slightly more challenging than some clients are used to. Nonetheless, tier-n locations are an increasingly important part of the global delivery mix, even for major outsourcers centered on mature geographies such as North America and Europe, and they become even more important as doing business in emerging markets becomes ever more business-critical to these multinationals.

However, there’s also a non-defensive side to the use of tier-n cities, which is to support growth in domestic markets within emerging economies, an increasingly important part of the BPO market over the coming years. Lots of activity of this type is already underway in India, but let’s take South Africa as an example, where cities such as Port Elizabeth and Johannesburg are emerging as highly appropriate for supporting local markets cost-effectively in local languages. So, just when you thought all BPO activity had centralized in a couple of major hubs, the spokes are fighting back and becoming more strategic.

But let’s get back to a sexier topic than labor arbitrage. The next blog looks at the impact of Digital.

Part 1 The Robots are Coming - Is this the end of BPO?

Part 2 Analytics is becoming all-pervasive and increasingly predictive

]]>
<![CDATA[Disruptive Forces and Their Impact on BPO: Part 2 - Analytics is becoming all-pervasive and increasingly predictive]]> Robotics has moved incredibly fast over the past year, but so has analytics. Analytics has been around in support of process improvement initiatives & Lean Six Sigma projects for many years. It has also long been present in areas like fraud analytics, which, at a personal level, means you now have to re-instate your credit card most weeks.

However, analytics is now becoming much more pervasive, much more embedded in processes, and much more predictive & forward-looking, recommending immediate business actions and not just process improvements, as shown below:

BPO Activity | Areas where analytics is being applied
Contact center | Speech and text analytics; next best action/propensity engines; social media monitoring & lead generation
Marketing operations | Price & promotion analytics; campaign and media mix analysis; store performance
Financial services | Model validation; KYC/compliance
Procurement | Spend analytics

Indeed, it is increasingly important that real-time, drill-down dashboards are built into all services, and what-if modelling is becoming increasingly common here, in support of, for example, supply chain optimization.
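As a simple illustration of the kind of what-if modelling referred to here, the sketch below (in Python) compares safety stock and annual holding cost under two alternative lead-time scenarios. All figures and scenario names are hypothetical, invented purely for illustration; a real supply chain model would of course be far richer.

# A minimal, illustrative what-if model for supply chain optimization:
# compare safety stock and holding cost under alternative lead-time scenarios.
# All figures (demand variability, lead times, costs, service level) are hypothetical.

import math
from statistics import NormalDist

def safety_stock(daily_demand_sd: float, lead_time_days: float, service_level: float) -> float:
    """Classic safety stock approximation: z * sigma_daily * sqrt(lead time)."""
    z = NormalDist().inv_cdf(service_level)
    return z * daily_demand_sd * math.sqrt(lead_time_days)

def annual_holding_cost(stock_units: float, unit_cost: float, holding_rate: float) -> float:
    """Holding cost expressed as a percentage of the value of stock held."""
    return stock_units * unit_cost * holding_rate

# Hypothetical scenarios to compare
scenarios = {
    "current supplier (14-day lead time)": 14,
    "nearshore supplier (5-day lead time)": 5,
}

DAILY_DEMAND_SD = 120   # units per day (standard deviation of demand)
UNIT_COST = 8.50        # cost per unit
HOLDING_RATE = 0.20     # 20% of stock value per year
SERVICE_LEVEL = 0.98    # target cycle service level

for name, lead_time in scenarios.items():
    ss = safety_stock(DAILY_DEMAND_SD, lead_time, SERVICE_LEVEL)
    cost = annual_holding_cost(ss, UNIT_COST, HOLDING_RATE)
    print(f"{name}: safety stock ~ {ss:,.0f} units, holding cost ~ {cost:,.0f} per year")

The point of embedding this kind of model in a dashboard is that operational teams can flex the scenario inputs in real time and see the cost and service implications immediately, rather than waiting for a periodic reporting cycle.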

Analytics is also helping BPO to move up the value chain and open up new areas of possibility such as marketing operations, where traditional areas like store performance reporting are now being supplemented by real-time predictive analytics in support of identifying the most appropriate campaign messaging and the most effective media, or mix of media, for issuing this messaging. So while analytics is still being used in traditional areas such as spend analytics to drive down costs, it is increasingly helping organizations take real-time decisions that have high impact on the top-line. Much more strategic and impactful.

The next disruptive force to be covered is less obvious - labor arbitrage within labor arbitrage.

Part 1 The Robots are Coming - Is this the end of BPO?

]]>
<![CDATA[Disruptive Forces and Their Impact on BPO: Part 1 The Robots are Coming – Is this the end of BPO?]]> This blog is the first of seven in a series looking at six disruptive forces and their implications for BPO. Some of these are widely talked about, others less so. This first blog sets the scene and looks at the impact of robotics. Subsequent blogs will consider the implications of:

Analytics is becoming all-pervasive and increasingly predictive

Labor arbitrage is dead – long live labor arbitrage

Digital renews opportunities in customer management services

Will Software Destroy the BPO Industry? Or Will BPO Abandon the Software Industry in Favor of Platform Components?

The Internet of Things: Is this a New Beginning for Industry-Specific BPO?

The final blog will evaluate the short- and long-term impact of each of these disruptive forces individually and collectively and their potential to deliver “High Velocity BPO” – What the Client Always Wanted.

Let’s start at the beginning. 

One common misconception about BPO is that it has traditionally only been about people and not about innovation. Sorry, it’s always been about both. Another misconception is that BPO used to be about cost reduction but that this is no longer the case, and it’s now about other types of value. Well, even in its infancy, BPO was always as much about service improvement as cost reduction, and to be honest one tends to go with the other anyway; there’s a huge correlation here.

Having said that, client needs do tend to become more focused over time, so let’s take a quick look at how BPO client needs are evolving, then at six of the potential disruptive forces impacting BPO (this should probably be a Porter analysis, but let’s be less formal), and finally at what this means for BPO going forward. We’ve termed this “High-Velocity BPO”. A bit of plagiarism perhaps, but I think it does the trick.

Clearly, what BPO buyers want varies considerably from service type to service type. But let’s start with the example of a very mature and conservative back-office process. In this area, organizations tend to start by asking for two things:

  • Can you get my organization into the top quartile in terms of cost?
  • Can you help my organization improve my processes? I’m not quite sure what that means, but you are the experts, so show me.

Within this desire for process improvement, standardization is often a key element, as is a desire for improved business agility. Then, by the time organizations get to second- or third-generation BPO, they have generally sorted out the first level of process standardization, have implemented lots of global delivery, and want to build on these. They still want nirvana, but they are starting to understand what nirvana looks like. So they are increasingly thinking about business outcomes on an end-to-end basis, global process owners, and integration into, or setting up, GBS organizations. They have also done labor arbitrage, so they are increasingly thinking about increased automation, and controls & compliance are becoming even more important. Above all, they are looking for a process vision to support them in their business vision. Clients have always wanted nirvana, but it takes them time to work out what it might look like and how to get there.

So it’s no longer about cost reduction, is it? Well, ultimately, there are only three business outcomes that matter:

  • Can I grow my top line?
  • Can I increase my margin?
  • Can I do both of these while maintaining a healthy cash flow and not going bankrupt?

So the need for cost reduction is as strong as ever. Arguably, the new factor in recent years is the increased need for business agility, which increasingly demands some form of transactional pricing and a willingness to support reduced volumes as well as increased volumes. That can be a real differentiator.
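To illustrate why transactional pricing supports business agility, here is a small worked example in Python. All rates, volumes, and productivity figures are hypothetical and chosen only to show the mechanics: under FTE-based pricing the charge is fixed to the committed headcount, whereas transactional pricing flexes down with volumes.

# Hypothetical worked example: how FTE-based and transaction-based pricing
# respond when transaction volumes fall by 20%. All numbers are invented.

FTE_RATE = 30_000              # assumed annual charge per FTE
TRANSACTIONS_PER_FTE = 12_000  # assumed productivity per FTE per year
PRICE_PER_TRANSACTION = 2.75   # assumed unit price under transactional pricing

def fte_based_charge(contracted_volume: int) -> float:
    """FTE pricing: headcount is sized for the contracted volume and billed
    regardless of how many transactions actually arrive."""
    ftes = -(-contracted_volume // TRANSACTIONS_PER_FTE)  # ceiling division
    return ftes * FTE_RATE

def transactional_charge(actual_volume: int) -> float:
    """Transactional pricing: the client pays only for the volume processed."""
    return actual_volume * PRICE_PER_TRANSACTION

baseline = 240_000
reduced = int(baseline * 0.8)  # a 20% drop in volumes

fixed_charge = fte_based_charge(baseline)  # committed headcount does not flex down
for actual in (baseline, reduced):
    print(f"actual volume {actual:,}: FTE-based charge {fixed_charge:,.0f}, "
          f"transactional charge {transactional_charge(actual):,.0f}")

In this invented example, a 20% fall in volumes leaves the FTE-based charge unchanged while the transactional charge falls in line with the work actually done, which is exactly the flexibility buyers seeking business agility are asking for.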

BPO has always worked best when the agenda has been driven by a small number of high-level business outcomes; the difficulty has been in managing and making changes to the end-to-end value chain. Certainly, disruptions such as GBS should help here, providing an end-to-end process view and a single process owner.

So what are some of the disruptive forces impacting BPO?

Well, the robots are coming; is this the end of BPO? One potentially disruptive force is RPA, which is certainly receiving headlines as a BPO killer. So where is RPA currently being used, and what are the implications for BPO? Initially, the main use of robotics is getting data from one or more applications to another, making intelligent deductions and matches in support of data enrichment, and filling in missing fields. A bit like macros on steroids: loosely coupled with existing systems rather than changing them. The advantage of RPA is that it seems to be achieving a 30%-plus cost take-out where employed, and very quickly: implementation seems to take 1-3 months, with a further 3-month period for change management. So RPA is quick and easy.
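To make the “macros on steroids” description concrete, the sketch below (in Python) shows the kind of rule-based record matching and field-filling an RPA robot typically automates between two applications. The record layouts, names, and matching rule are all hypothetical; real RPA tooling would drive the applications’ own interfaces rather than work on in-memory lists.

# Minimal illustrative sketch of RPA-style data enrichment: take records from a
# source application, match them against a reference master list, and fill in
# missing fields. All record layouts and matching rules here are hypothetical.

from difflib import SequenceMatcher

# Hypothetical extract from a source application (e.g., invoices missing vendor IDs)
source_records = [
    {"invoice": "INV-001", "vendor_name": "Acme Corp.", "vendor_id": None, "country": None},
    {"invoice": "INV-002", "vendor_name": "Globex Ltd", "vendor_id": None, "country": None},
]

# Hypothetical vendor master data held in a second application
vendor_master = [
    {"vendor_id": "V100", "vendor_name": "ACME Corporation", "country": "US"},
    {"vendor_id": "V200", "vendor_name": "Globex Limited", "country": "UK"},
]

def similarity(a: str, b: str) -> float:
    """Crude fuzzy match on normalized names, standing in for the 'intelligent matching' step."""
    return SequenceMatcher(None, a.lower().strip("."), b.lower().strip(".")).ratio()

def enrich(records, master, threshold=0.6):
    """Fill missing fields in each record from the best-matching master entry."""
    for rec in records:
        best = max(master, key=lambda m: similarity(rec["vendor_name"], m["vendor_name"]))
        if similarity(rec["vendor_name"], best["vendor_name"]) >= threshold:
            rec["vendor_id"] = rec["vendor_id"] or best["vendor_id"]
            rec["country"] = rec["country"] or best["country"]
    return records

if __name__ == "__main__":
    for rec in enrich(source_records, vendor_master):
        print(rec)

The essential point is that nothing in the underlying applications changes; the robot simply sits on top, moving and completing data faster and more consistently than a human operator would.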

So where vendors have sometimes been slow to implement the wider process change they knew was possible, due to the potential impact on FTE-based revenues, robotics has been making them act fast. This is partly about getting to their clients before anyone else, including the client's internal IT department, does. Robotics has also probably been the biggest single driver of pricing changes from FTE-based to fixed-price and transaction-based models. No supplier wants to be caught implementing robotics with an FTE-based pricing model still in place. So robotics has probably generated a bigger change in the renegotiation of pricing mechanisms than many years of process cost benchmarking.

Robotics also generates additional challenges for BPO vendors beyond service pricing, including whether to make or buy the underlying robotics tools. If, as a vendor, you want to develop your own IP and not share it with your competitors, then you might want to develop your own form of robotics rather than use third-party software, and indeed a number of vendors are doing just this.

The next disruptive force, which I will consider in tomorrow's blog, is analytics. 

]]>