
How generative AI is changing Infosys from within

AI is automating some tasks, changing skill mixes for existing roles, and creating a new generation of specialists, says CTO Mohammed Rafee Tarafdar

Harichandan Arakali
Published: Mar 5, 2024 03:01:16 PM IST
Updated: Mar 6, 2024 09:13:24 AM IST

Mohammed Rafee Tarafdar, CTO, Infosys

Infosys CTO Mohammed Rafee Tarafdar offers his views on how artificial intelligence (AI) is changing the company and the IT services industry. Tarafdar, who is also an executive vice president and the head of the strategic technology group at Infosys, explains how becoming an "AI-first" company, coming on the heels of the industry's "cloud-first" digital transformation, is changing Infosys from within. Edited excerpts.

Q. You recently teamed up with Infosys Knowledge Institute to study the AI opportunity. What were your top findings?

Over the past two years, our focus at Infosys has been on transforming into an AI-first organisation. This journey encompasses three main pillars: Reimagining client work by embedding AI into services, incorporating AI into our own workplace, from applications to energy and water management, and transforming our workforce into an AI-first entity.

To achieve this, we've initiated programmes to make all Infosys employees AI-aware and implemented an "AI Builders Masters" initiative. Amidst this transformation, we conducted surveys on generative AI through the Infosys Knowledge Institute across the US, Europe, and Asia Pacific.

One revelation is the perishable nature of AI models. While the industry sees constant advancements in bigger and better models, our emphasis is on using organisational data and knowledge to fine-tune existing models. Instead of building pre-trained models from scratch, we create specialised, fine-tuned models tailored to specific tasks, such as generating financial code.


Another finding addresses the talk surrounding hallucinations in AI models. We recognise that the data ingested into these models inherently carries hallucinations, often stemming from diverse human perspectives. Rather than viewing hallucinations as flaws, we consider them as features, developing techniques to minimise their impact for more effective use in knowledge-based tasks.

Our experience also underscores the significance of specialised models in an enterprise context. While there's a race for generalised models, we've successfully implemented a narrow transformer approach, using open-source models as a base and swiftly fine-tuning them with organisational data. This approach has proven both time-efficient and cost-effective, especially in scenarios like wealth advisory and mortgage loan ratings.
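
To make the narrow-transformer approach concrete, here is a minimal sketch of fine-tuning an open-source base model on a handful of organisational examples using LoRA adapters, so that only small adapter weights are trained while the base model stays frozen. The model name, the wealth-advisory examples, and the hyperparameters are illustrative assumptions, not Infosys' actual configuration.

```python
# Minimal sketch: fine-tune an open-source model on organisational data
# with LoRA adapters. All names and hyperparameters here are illustrative.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "mistralai/Mistral-7B-v0.1"  # any open-source base model will do
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# Wrap the frozen base model with small trainable LoRA adapters, which is
# what keeps this approach fast and cheap compared with full fine-tuning.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM"))

# Hypothetical organisational examples, e.g. wealth-advisory prompts.
examples = [
    {"text": "Q: Summarise this client's risk profile.\nA: ..."},
    {"text": "Q: Draft a rebalancing note for a conservative portfolio.\nA: ..."},
]
dataset = Dataset.from_list(examples).map(
    lambda row: tokenizer(row["text"], truncation=True, max_length=512))

Trainer(
    model=model,
    args=TrainingArguments(output_dir="narrow-model",
                           num_train_epochs=3,
                           per_device_train_batch_size=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```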

Q. Give us a sense of what these mean in terms of your AI strategy or generative AI strategy.

In shaping Infosys' AI strategy, our primary focus is on reimagining the workplace and workforce with three overarching objectives. First, we aim to use AI to amplify human potential, aligning with our company's purpose.

This involves implementing AI assistants tailored to various roles. For developers, a code assistant enhances productivity in tasks like coding, testing, and documentation. A personalised learning assistant supports continuous learning, while a sales assistant consolidates collective knowledge for client-facing teams.

Over 35,000 users currently benefit from these AI assistants, with plans to expand their availability across functions.

The second objective centres on unlocking value, drawing on our extensive industry knowledge. We've developed a digital brain, providing employees with quick access to valuable information, reducing busy work, and enhancing productivity.

Our applied AI platform streamlines the entire AI development process, promoting efficiency and automation. Additionally, we're creating specialised models for specific tasks, both internally and externally, further unlocking value.

The third objective is to create exponential impact, focusing on integrating AI into every client project. Our digital operating model incorporates AI elements, emphasising AI-first design, engineering, and analytics for meaningful processes. Talent transformation programmes (AI Aware, AI Builder, and AI Master) ensure a skilled workforce.

These initiatives collectively form the Topaz offering, encompassing knowledge assets, IP platforms, partnerships with AI-first companies, hyper-scalers, and talent. Topaz represents a comprehensive solution for our clients.


Q. Providing AI assistants to free up people to be more creative: Give us a couple of examples.

In the past, our sales colleagues faced challenges when clients sought information on complex topics like ERP consolidation, for example. The conventional process involved reaching out to experts across different time zones, leading to time-consuming efforts in collecting and presenting information.

However, with our Navi Sales assistant, an internal co-pilot, accessing 40 years of knowledge has become instantaneous. Sales colleagues can swiftly retrieve information on topics like ERP consolidation, identify experts, gather case studies, and seamlessly embed them into documents, significantly reducing turnaround time. Moreover, the Navi Sales assistant supports discussions with clients by providing on-the-spot information lookup, enhancing efficiency and client interactions.

In the realm of learning, our platform Lex revolutionises the traditional learning approach. It not only offers information in a ChatGPT-style manner but also guides individuals through learning paths and connects them with industry experts.

The platform facilitates collaborative learning in small cohorts, fostering quick and effective knowledge acquisition. This becomes particularly significant in a large organisation like ours, with 330,000 employees. AI assistants and tools simplify and streamline these processes, making them accessible and efficient for our entire workforce.

Q. How is this changing the workday for experienced programmers?

In our experience, AI assistants prove exceptionally beneficial for experienced architects and programmers, boosting their productivity even more than that of less skilled colleagues. For skilled professionals, these assistants speed up daily work, especially tasks requiring code generation or information retrieval.

As an expert developer or techie, I can swiftly review and assess the generated code or provided information, making corrections efficiently. This becomes crucial when working on client projects, such as upgrading a database, where AI assistants provide checklists for quick execution.

Q. What kind of applications have you been able to build and what is the impact they are beginning to have?

Many enterprises express reluctance to share their core information, such as code or knowledge, with third parties due to security and privacy concerns. This hesitation makes approaches like creating fine-tuned models or using retrieval augmentation techniques crucial, helping organisations keep their information private and secure, even in cloud environments. We've successfully applied this approach in various scenarios.
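
As an illustration of the retrieval-augmentation pattern described here, the minimal sketch below embeds proprietary documents locally, retrieves only the snippets relevant to a question, and places just those snippets in the prompt. The embedding model, the sample documents, and the answer_with_llm() call are assumptions made for the sketch, not a description of Infosys' implementation.

```python
# Minimal retrieval-augmentation sketch: documents stay in a local index,
# and only the retrieved snippets are sent to whichever model is approved.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "COBOL module PAY001 calculates monthly interest on savings accounts.",
    "Module LOAN320 validates mortgage applications against credit rules.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # runs on-premise
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q = embedder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q  # cosine similarity on normalised vectors
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

question = "Which module handles mortgage validation?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# answer = answer_with_llm(prompt)  # hypothetical call to the chosen model
```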

For instance, in the telecom sector, we tailored a code assistant specifically for a legacy application. Another example involves a bank seeking knowledge extraction from COBOL code for transformation purposes. The versatility of knowledge assistants extends to social analytics for retailers, providing tonality analysis of customer feedback to improve consumer engagement and campaign strategies.

In marketing, we've streamlined content creation for product descriptions across multiple digital channels, accelerating the process, especially for global operations requiring translation into various languages. Our clients derive value from applying these technologies in nine business areas: Customer service, contact centre and operations, sales and marketing, business operations, IT operations, software engineering, risk and compliance, employee experience, and learning.

Q. How is all this changing IT services itself?

Today, our clients expect AI, particularly generative AI, to be integrated into the core offerings of IT services companies. The focus is on embedding AI as an integral part of ongoing maintenance, application development, and transformation projects, rather than treating it as a separate initiative.

There's an emphasis on making every member of the IT services company AI-aware. The transformation also brings about new roles and three main shifts: Automation of certain tasks, evolving skill mixes for existing roles, and the emergence of specialised AI masters. Talent transformation becomes essential to adapt to these changes.

Project delivery methods need to evolve, as demonstrated by our evolved digital operating model, emphasising the incorporation of AI into our methods. A crucial aspect is adopting a business-centric view, exploring the potential of AI in reimagining experiences and processes.

There is a need for AI models with smaller footprints to manage high running costs, and a recognition that AI's efficiency and output improve gradually, which requires effective change management to drive higher adoption rates.


Q. Have you seen any impact of AI, and more specifically generative AI, on the transition to the cloud model?

First, generative AI revolutionises self-service capabilities on hybrid cloud platforms. Instead of traditional methods, users can now engage in natural language conversations, instructing generative AI to perform tasks such as provisioning a new virtual machine.
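
A minimal sketch of that self-service flow appears below: a natural-language request is converted into a structured action, validated, and only then dispatched to the provisioning backend. The llm_extract_action() and provision_vm() functions are hypothetical placeholders, not a real cloud provider's API.

```python
# Minimal sketch: natural-language request -> structured, validated action.
# llm_extract_action() and provision_vm() are hypothetical placeholders.
import json

ALLOWED_SIZES = {"small", "medium", "large"}

def llm_extract_action(request: str) -> str:
    """Stand-in for a generative-AI call that maps the request to JSON;
    a real deployment would prompt the model to emit this structure."""
    return json.dumps({"action": "provision_vm",
                       "size": "medium",
                       "region": "eu-west-1"})

def provision_vm(size: str, region: str) -> None:
    """Hypothetical wrapper around a cloud provider's provisioning API."""
    print(f"Provisioning a {size} VM in {region}...")

def handle_request(request: str) -> None:
    action = json.loads(llm_extract_action(request))
    # Always validate the model's output before touching infrastructure.
    if action.get("action") == "provision_vm" and action.get("size") in ALLOWED_SIZES:
        provision_vm(action["size"], action["region"])
    else:
        raise ValueError(f"Unsupported or unsafe action: {action}")

handle_request("Spin up a medium VM in eu-west-1 for the test team")
```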

Second, generative AI plays a pivotal role in the cloud migration phase. It automates the generation of scripts essential for operationalising workloads on the cloud, thereby enhancing efficiency. It also addresses testing, ensuring the robustness of the migrated workloads.

Beyond migration, the second layer involves establishing the GPU (graphics processing unit, the computer chip favoured for AI) infrastructure essential for AI experimentation. Creating an AI stack within each cloud infrastructure is crucial. This can be achieved using existing open AI application programming interfaces from major cloud providers, or by building a native stack with specialised chips from Nvidia, AMD, or Intel, depending on the workload.

The third layer focuses on the tooling necessary for operationalising AI development on the cloud and implementing corresponding controls.

Q. What are some of the main challenges you see along the way?

One significant challenge in the AI landscape revolves around data readiness. While organisations have invested in traditional data, incorporating user-generated content like documents, audio, video, and marketing material presents a unique hurdle.

Establishing a foundation for data management, involving normalisation, consolidation, and ease of access for internal AI builders, becomes imperative. Another challenge lies in responsible AI practices, emphasising ethical and unbiased model development.

Talent availability is a persistent concern due to the rapid evolution of AI, particularly in large language models. Co-innovation collaborations emerge as a strategy to address the scarcity of experts.

Crafting impactful AI use cases is the fourth challenge. Rather than a use-case-centric approach, organisations should focus on strategic value-chain analysis to derive compounding business impact and value.

Scaling AI solutions poses a fifth consideration. Ensuring these models can accommodate the demands of a large user base globally is crucial for a seamless experience.
