posted on Jul 19, 2018 by Dominique Raviart
Tags: Tata Consultancy Services, Application Testing Management, IT outsourcing
In the past five years, NelsonHall has observed software testing services vendors adapting their portfolios around digital testing, focusing initially on agile and DevOps, with a sense of urgency given the accelerating adoption of agile development methodologies. The transformation towards DevOps/continuous testing is ongoing, with most vendors now having their DevOps testing building blocks in place.
Another aspect of the digital testing journey has been around UX testing. Digital testing is no longer restricted to mobile apps/responsive websites and dealing with the multitude of device/OS/browser combinations; now, the focus has shifted to UX testing activities. This brings new challenges to the way IT departments conduct testing: testing tools are different from those used in functional and non-functional testing, the tool landscape is very fragmented, and the automation level is much lower.
In this blog, I look at what TCS’ Quality Engineering & Transformation Group (QET) is doing in the digital space with its newly-launched CX Assurance Platform (CXAP), which combines a focus on digital testing with security and performance testing. CXAP focuses on five key attributes of a web application: compatibility, usability, security, accessibility, and performance (CUSAP).
TCS has structured its CXAP offering into four components, focusing on CUSAP:
- An assessment and benchmarking of the client’s web application CX (‘dipstick assessment’)
- A KPI-based assessment that complements the dipstick assessment, drawing on other data sources, e.g. Google Analytics
- Sentiment analysis
- Test execution.
Dipstick assessment
With the dipstick assessment, TCS QET assesses the five CUSAP attributes, provides a quantitative score, and makes recommendations for removing the technical issues identified:
- Compatibility testing is based on the automated comparison of screens across the most widely-used browsers
- Usability testing relies on TCS guidelines for page presentation, content, interaction, and website navigation
- Security testing is structured around the OWASP top ten vulnerabilities
- Accessibility testing is based on the Web Content Accessibility Guidelines (WCAG) 2.0
- Performance testing benchmarks the web application’s performance from an end-user perspective.
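TCS has not disclosed how its accessibility checks are implemented; as a minimal sketch of the kind of WCAG 2.0 criterion that is straightforward to automate, the hypothetical checker below flags `<img>` tags that lack alt text:

```python
# Hypothetical sketch (not TCS's tooling): spot-check one simple WCAG 2.0
# criterion by flagging <img> elements with missing or empty alt text.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Counts images and how many of them lack usable alt text."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            if not dict(attrs).get("alt"):  # absent or empty alt attribute
                self.missing += 1

checker = AltTextChecker()
checker.feed('<img src="a.png" alt="logo"><img src="b.png">')
print(f"{checker.missing} of {checker.total} images lack alt text")
```

Real accessibility testing covers far more of WCAG 2.0 (contrast, keyboard navigation, ARIA roles), which is partly why automation levels in this space remain low.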
The assessment is essentially based on a sample approach, covering 10%-20% of a web application’s pages or mobile application’s screens.
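The automated screen comparison used for compatibility testing can be pictured as a pixel-level match score between renderings of the same page in different browsers. The sketch below is a simplified illustration (screenshots modeled as grids of RGB tuples), not TCS's implementation:

```python
# Illustrative sketch (not TCS's implementation): score how closely two
# browser renderings of the same page match, pixel by pixel.
# Screenshots are modeled as 2D grids of RGB tuples for simplicity.

def screen_match_score(shot_a, shot_b):
    """Return the fraction of pixels identical in both captures (0.0-1.0)."""
    if len(shot_a) != len(shot_b) or len(shot_a[0]) != len(shot_b[0]):
        return 0.0  # different dimensions: layouts diverged entirely
    total = len(shot_a) * len(shot_a[0])
    same = sum(
        1
        for row_a, row_b in zip(shot_a, shot_b)
        for px_a, px_b in zip(row_a, row_b)
        if px_a == px_b
    )
    return same / total

# Two 2x2 "screenshots" of the same page, one pixel rendered differently
chrome = [[(255, 255, 255), (0, 0, 0)], [(0, 0, 0), (255, 255, 255)]]
firefox = [[(255, 255, 255), (0, 0, 0)], [(0, 0, 0), (250, 250, 250)]]
print(screen_match_score(chrome, firefox))  # 0.75
```

In practice such comparisons also need tolerance thresholds for anti-aliasing and font-rendering differences between browsers, which is where the fragmented tool landscape mentioned earlier comes in.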
KPI-based assessment
The KPI-based assessment relies on the analysis of data captured by several web analytics and application performance monitoring software tools, e.g. Adobe Analytics, Google Analytics, AppDynamics, and Dynatrace.
A KPI assessment provides a short analysis identifying any potential issues with a web application and recommending next steps. An example of this approach, for a merchant site: understanding how many customers engaged in a transaction and how many completed it, comparing this number against the projected number of transactions, and explaining the difference through a CUSAP analysis.
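The merchant-site funnel described above can be reduced to two numbers: a conversion rate and a shortfall against projection. The function below is a hypothetical sketch of that arithmetic (the figures are invented for illustration):

```python
# Hypothetical sketch of the merchant-site funnel analysis: compare
# completed transactions against customer engagement and projections.

def funnel_kpis(engaged, completed, projected):
    """Return conversion rate and shortfall vs. projection (both 0.0-1.0)."""
    conversion = completed / engaged if engaged else 0.0
    shortfall = max(0.0, (projected - completed) / projected) if projected else 0.0
    return {"conversion_rate": conversion, "shortfall_vs_projection": shortfall}

# Invented example: 10,000 customers started checkout, 1,800 completed,
# against a projection of 2,400 completed transactions.
kpis = funnel_kpis(engaged=10_000, completed=1_800, projected=2_400)
print(kpis)  # conversion_rate 0.18, shortfall_vs_projection 0.25
```

The CUSAP analysis then tries to attribute that 25% shortfall to specific attributes, e.g. slow page performance or a confusing checkout flow.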
Sentiment analysis
With its sentiment analysis, TCS QET expands its data sources further, to social media and forums, still with the five CUSAP attributes in mind. QET points to causes other than IT, e.g. lack of product availability or uncompetitive pricing, that influence KPIs such as conversion rate. This service relies on the traditional analysis of Twitter, Facebook, Google+, and user forum data.
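TCS has not described its sentiment tooling; as a minimal lexicon-based sketch (the word lists and CUSAP mapping below are invented), this is the general shape of scoring a social post and tagging which CUSAP attribute a complaint touches:

```python
# Minimal lexicon-based sketch (hypothetical word lists, not TCS's tooling):
# score a social post's sentiment and map complaint words to CUSAP attributes.

POSITIVE = {"fast", "easy", "great", "love"}
NEGATIVE = {"slow", "broken", "crash", "confusing"}
CUSAP_HINTS = {
    "performance": {"slow", "fast", "lag"},
    "usability": {"confusing", "easy", "navigation"},
}

def analyze(post):
    """Return (sentiment score, list of CUSAP attributes mentioned)."""
    words = set(post.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    topics = [attr for attr, hints in CUSAP_HINTS.items() if words & hints]
    return score, topics

print(analyze("checkout is slow and confusing"))  # (-2, ['performance', 'usability'])
```

Production-grade sentiment analysis would use trained models rather than word lists, but the output shape is the same: a polarity score plus the quality attribute it likely relates to.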
Execution services
Execution services are essentially an extension of the ‘dipstick’ assessment explained earlier. While dipstick is based on samples, execution services aim to test the entire application. An implication is the level of automation required: TCS QET is investing in automation, planning to reach ~50% test automation in execution services in its next release this year, and eventually 100%.
Towards a non-linear business?
Three evident features of CXAP are:
- It is a comprehensive set of offerings that covers most UX testing activities
- TCS is creating further UX testing automation
- TCS is centralizing its UX testing IP, a good move as UX testing suffers from tool fragmentation.
TCS says it has about twenty-five clients for CXAP, most of whom are opting for bundled dipstick, KPI assessment, and execution services.
The ambitions of QET with CXAP go beyond having a central, subscription-based UX testing IP aggregation point. In the short term, TCS wants to develop a self-service portal, where prospects and existing clients will register, select the service of their choice (based on a service catalog approach), request a quote, and get the service provided. Some of the underlying technology for this self-service portal is already in place. Also, CXAP will expand to functional testing.
And QET’s ambitions do not stop there. QET also wants to add AI/cognitive capabilities to CXAP, conduct automated root cause analysis of defects, and predict defects based on past release data. This is a bold ambition, and we will monitor developments with interest.