
Infosys Applying AI to Test Automation

Infosys has briefed NelsonHall on several automation testing initiatives based on analytics, AI/machine learning and robotics.

The software testing services industry continues to impress with its level of investment in IP and automation. Gone are the days when testing service vendors talked mainly about their testing frameworks, standalone accelerators, and test case repositories. Those remain core elements of any IP portfolio, but the automation priority of most testing service vendors has shifted to platforms and analytics, with machine learning/AI now being rolled out. (Incidentally, Infosys recently unveiled its AI tool, MANA, targeting L3 support; MANA is not yet used by Infosys' testing practice.)

Currently, the industry seems to focus on three main topics:

  • Creating DevOps platforms, and in particular addressing the Dev-to-Ops process, i.e. automating the traditional test life cycle from test process design through test execution, including test support services, up to build integration/deployment into production environments
  • Creating digital platforms based on sentiment analysis and UX, mostly through performance testing and usability testing
  • Deploying analytics first, and then AI/machine learning, across testing operations. This is a far bigger stretch than before, when analytics mostly took the form of two main IPs: reporting capabilities on top of HPE ALM, and data testing for ETL projects.

Infosys has been working on analytics and AI/machine learning, defining different testing use cases largely across these three main topics, and has unveiled its "Infosys AI led QA platform". The main use cases are described below.

Defect and Log Data Analysis

Infosys has created a tool, AUTUMN, which analyzes data from several sources (e.g. defect management tools such as HP ALM, server logs, and ITSM tools), aggregating log data and identifying applications that have a history of raising defects.
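The aggregation idea behind a tool like AUTUMN can be sketched in a few lines. This is a minimal illustration, not Infosys' implementation: the record shape and source names are assumptions, and real pipelines would parse ALM exports, server logs, and ITSM tickets into this common form first.

```python
from collections import Counter

def rank_defect_prone_apps(records):
    """Count defects per application across records aggregated from
    multiple hypothetical sources (defect tracker, server logs, ITSM)
    and return applications ranked by historical defect volume."""
    counts = Counter(r["app"] for r in records)
    return counts.most_common()

# Toy records merged from hypothetical sources
records = [
    {"app": "payments", "source": "alm"},
    {"app": "payments", "source": "itsm"},
    {"app": "catalog",  "source": "server_log"},
    {"app": "payments", "source": "alm"},
]
print(rank_defect_prone_apps(records))  # payments ranks first with 3 defects
```

Applications at the top of such a ranking become candidates for deeper regression coverage.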

Sentiment Analysis/UX

Sentiment analysis has become a common offering, and Infosys is able to extract trends and analysis from data drawn from social media (Facebook and Twitter), app stores (Apple's App Store and Google Play), and, depending on client requirements, consumer web sites (e.g. a retailer's). Why does this fit into testing? Largely because consumer feedback is taken into account when improving the UX of web sites and mobile apps, and helps identify application issues from a UX perspective. Infosys says it is starting to see testing departments being at least partially assessed on sentiment analysis trends and consumer satisfaction.
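To make the mechanics concrete, here is a deliberately minimal lexicon-based sentiment scorer over app-store-style reviews. The word lists and reviews are invented for illustration; production systems (Infosys' included, presumably) would use trained models rather than a hand-built lexicon.

```python
# Tiny positive/negative lexicons (assumed for this sketch)
POS = {"great", "love", "fast", "easy"}
NEG = {"crash", "slow", "broken", "hate"}

def sentiment(review):
    """Classify a review as positive/negative/neutral by counting
    lexicon hits among its (whitespace-split, lowercased) words."""
    words = set(review.lower().split())
    score = len(words & POS) - len(words & NEG)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

reviews = [
    "love the new checkout, fast and easy",
    "app crash on login, so slow",
]
print([sentiment(r) for r in reviews])  # ['positive', 'negative']
```

A testing team could track the share of negative reviews per release as one of the satisfaction signals mentioned above.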

Test Case Number Optimization

This is based on capturing details of test cases (from different sources, including HP ALM and test case repositories) and clustering them into groups using natural language processing algorithms applied to test case descriptions. The company uses its Infosys Information Platform (IIP) to analyze and visualize clusters and identify correlated test cases, and applies other statistical test case optimization techniques such as pairwise and CBT.
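A toy version of description-based clustering can be built with simple token overlap. This is a sketch of the general technique, not IIP's algorithm: real NLP pipelines would use embeddings or TF-IDF rather than raw Jaccard similarity, and the threshold here is an arbitrary assumption.

```python
def tokens(desc):
    """Naive tokenization: lowercase, whitespace-split."""
    return set(desc.lower().split())

def cluster_test_cases(descriptions, threshold=0.5):
    """Greedy clustering: a test case joins the first cluster whose
    representative description has Jaccard token similarity >= threshold,
    otherwise it starts a new cluster. Returns lists of indices."""
    clusters = []
    for i, d in enumerate(descriptions):
        placed = False
        for c in clusters:
            rep, cur = tokens(descriptions[c[0]]), tokens(d)
            if len(rep & cur) / len(rep | cur) >= threshold:
                c.append(i)
                placed = True
                break
        if not placed:
            clusters.append([i])
    return clusters

cases = [
    "verify login with valid credentials",
    "verify login with invalid credentials",
    "export monthly sales report to csv",
]
print(cluster_test_cases(cases))  # [[0, 1], [2]]
```

Clusters with many members are where redundant or near-duplicate test cases tend to hide.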

The approach has several purposes, all related to reducing test execution effort: first, identifying redundant test cases; second, executing only a few of a set of correlated test cases (i.e. test cases that are not identical but share similarities and exhibit the same pass/fail patterns); and third, creating test scripts that can accommodate relatively similar test cases.
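The second purpose, finding correlated test cases via shared pass/fail patterns, can be sketched directly. Grouping on identical run histories is an assumed simplification; a realistic version would use a correlation measure tolerant of occasional divergence.

```python
def correlated_groups(history):
    """Group test case IDs whose pass/fail pattern across past runs
    is identical; one representative per group could be executed."""
    by_pattern = {}
    for case_id, runs in history.items():
        by_pattern.setdefault(tuple(runs), []).append(case_id)
    return [g for g in by_pattern.values() if len(g) > 1]

history = {
    "TC1": ["pass", "fail", "pass"],
    "TC2": ["pass", "fail", "pass"],   # mirrors TC1 -> candidate to skip
    "TC3": ["pass", "pass", "pass"],
}
print(correlated_groups(history))  # [['TC1', 'TC2']]
```

Running only TC1 from the first group would preserve the observed failure signal while cutting execution effort.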

Defect Analysis and Visualization

In a somewhat more traditional approach, Infosys provides a data visualization tool for identifying defects, correlating them across applications, domains, and teams, and driving root cause analysis. The tool is based on the Pareto principle (which assumes, from a testing perspective, that 80% of defects are caused by 20% of applications/teams/technologies).
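The Pareto view amounts to finding the smallest set of applications (or teams, or technologies) that accounts for most of the defects. A minimal sketch, with invented counts:

```python
def pareto_set(defect_counts, share=0.8):
    """Return the smallest set of applications that together account
    for at least `share` of all defects (the classic 80/20 cut)."""
    total = sum(defect_counts.values())
    ranked = sorted(defect_counts.items(), key=lambda kv: -kv[1])
    picked, cum = [], 0
    for app, n in ranked:
        if cum >= share * total:
            break
        picked.append(app)
        cum += n
    return picked

counts = {"billing": 50, "auth": 30, "search": 10, "admin": 6, "docs": 4}
print(pareto_set(counts))  # ['billing', 'auth'] -> 80% of 100 defects
```

Two of five applications carry 80% of the defects here, which is exactly the kind of concentration root cause analysis would then target.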

Defect Prediction (For Agile Projects)

The approach is specific to agile projects. Based on historical pass/fail data for test cases in a given sprint, Infosys helps predict test pass/fail outcomes for a specific sprint. Infosys uses up to eight parameters (e.g. application complexity, degree of change in the application, past defect history, and the identity of developers working on the application) to provide these predictions.

Again, the benefit of this approach is reducing the test workload by avoiding testing what does not need to be tested. It is especially relevant when test cases have not been scripted or need to be enhanced. Infosys is currently developing its own IP based on open source algorithms.
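Prediction from parameters like change size and defect history is, at its core, a supervised classification problem. The sketch below trains a tiny pure-Python logistic regression on two invented features; it stands in for whichever open-source algorithms Infosys actually uses, which the post does not name.

```python
import math

def train_logreg(X, y, lr=0.5, epochs=2000):
    """Minimal logistic regression via stochastic gradient descent.
    Features are assumed normalized to roughly [0, 1]."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))      # predicted failure probability
            g = p - yi                       # gradient of log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict_fail(w, b, x):
    """True if the model predicts the test case will fail this sprint."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 / (1 + math.exp(-z)) > 0.5

# Invented training data: (normalized change size, past defects / 10)
X = [(0.9, 0.8), (0.8, 0.9), (0.1, 0.2), (0.2, 0.1)]
y = [1, 1, 0, 0]                 # 1 = test case failed in that sprint
w, b = train_logreg(X, y)
print(predict_fail(w, b, (0.85, 0.7)))  # large, defect-prone change
```

A real deployment would use all eight parameters and far more sprint history, but the skip/execute decision would be driven by the same kind of predicted probability.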

All in all, Infosys provides a good idea of where the most advanced testing service vendors are in terms of analytics and AI/machine learning; far from all testing service vendors are at this stage of investment. The approach that has guided the automation programs of Infosys and others for the past decade has been linear, starting with point-solution accelerators and test case repositories. Clients are usually not ready to go beyond incremental productivity improvements (witness the relatively limited adoption of model-based testing in spite of its high potential).

Infosys is also going beyond pure software testing: it has worked with its product engineering services colleagues on building a robot for testing physical transactions on ATMs. These robots are used for remote testing, and use cases include PoS and peripheral systems.
