The explosion of data is a defining characteristic of our times. Billions of terabytes of data are generated every day, and millions more algorithms scour this data for patterns for our consumption. And yet, the more data we have, the harder it becomes to distill meaningful information and insights from it.

With the rise of generative AI technologies, such as ChatGPT, knowledge workers have new opportunities to process and extract insights from vast amounts of information. These models can generate human-like text, answer questions, provide explanations, and even engage in creative tasks like writing stories or composing music.

Built on powerful LLMs, generative AI has taken the world by storm and led to a flurry of companies keen to build with this technology. It has indeed revolutionized the way we interact with information.

And yet, in this era of ever-increasing information overload, the ability to ask the right question has become more critical than ever before.

While the technology has evolved faster than we imagined, its potential is limited by the ways we use it. And while there is scope for immense benefit, there is also a risk of harm if users don’t exercise judgment or if the right guardrails are not built into Gen AI applications.

As knowledge workers and technology creators, empowering ourselves and our users relies heavily on the ability to ask the right questions.

Here are three key considerations to keep in mind:

1. The Art of Framing Questions:

To harness the true potential of generative AI, knowledge workers must master the art of framing questions effectively. This involves understanding the scope of the problem, identifying key variables, and structuring queries in a way that elicits the desired information. A poorly constructed question can lead to misleading or irrelevant responses, hindering the value that generative AI can provide.

Moreover, knowledge workers should also consider the limitations of generative AI. While these models excel at generating text, they lack true comprehension and reasoning abilities. Hence, it is crucial to frame questions that play to their strengths, allowing them to provide valuable insights within their domain of expertise.
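
To make this concrete, here is a minimal sketch contrasting a vague prompt with a well-framed one. The `ask` helper is hypothetical, a stand-in for whichever LLM client library is in use:

```python
# Illustrative only: 'ask' is a hypothetical stand-in for a real LLM client call.
def ask(prompt: str) -> str:
    raise NotImplementedError("Wire this up to your LLM provider's library.")

# A vague question invites a vague or misleading answer.
vague_prompt = "Tell me about our sales data."

# A well-framed question states scope, key variables and the expected output.
framed_prompt = (
    "Using the attached 2023 quarterly sales figures by region, "
    "list the three regions with the largest quarter-over-quarter decline, "
    "suggest one plausible driver for each, and flag any figure you are unsure of."
)
```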

2. The Need for Precise Inquiry:

Despite the power of generative AI, it is essential to remember that these models are not flawless. They rely heavily on the input they receive, and the quality of their output is strongly influenced by the questions posed to them. Hence, the importance of asking the right question cannot be overstated.

Asking the right question is a skill that knowledge workers must cultivate to extract accurate and relevant insights from generative AI. Instead of relying solely on the model to generate information, knowledge workers need to approach it with a thoughtful mindset.

3. Collaboration between Humans and AI:

Generative AI should be viewed as a powerful tool that complements human expertise rather than replacing it. Knowledge workers must leverage their critical thinking, domain knowledge, and creativity to pose insightful questions that enable generative AI to augment their decision-making processes. The synergy between human intelligence and generative AI has the potential to unlock new levels of productivity and innovation.

Think of Gen AI as a powerful Lego block, a valuable component within the intricate structure of problem-solving. It’s not a replacement but an enhancement, designed to work in harmony with human capabilities to solve a problem.

In conclusion, in the age of generative AI, asking the right questions is fundamental. Careful framing of queries unlocks generative AI’s true power, enhancing our decision-making. Cultivating this skill and fostering human-AI collaboration empowers knowledge workers to navigate the information age and seize new growth opportunities.

To say the payments industry is going through disruption is certainly not hyperbole these days. Fundamental shifts in how commerce gets done have begun to impact the way payments have been made all these years. On one side, the payments industry has seen the entry of diverse fintech players, from giants like Facebook and Tencent to start-ups, presenting increased competition for banks and corporations. On the other, the threat from fintechs is being further fuelled by rapidly evolving customer expectations, which continue to push the boundaries for the industry as a whole. It is increasingly apparent that the payments marketplace will look fundamentally different a decade from now: new form factors, real-time infrastructure and greater levels of integration with social media and e-commerce, to name a few of the changes. In effect, the revolution that has completely disrupted the consumer payments industry over the last decade or so is finally reaching the corporate payments industry, too.

I see three big trends that are likely to shake up this segment, which stands at half a trillion dollars a year and growing:

  1. Direct-to-consumer models: As firms across industries move to direct engagement with their customers, it is becoming increasingly necessary to deliver the same level of digital experience that the consumer payments industry does.
  2. Global payment flows: Cross-border payments now make up over 10 per cent of all corporate payments, and they are growing. These flows are almost always digital in nature, with the added complexity of regulatory compliance and risk management.
  3. Data monetization: Bank treasury services have had the advantage of managing and servicing fund flows between their corporate clients. And as these fund flows become increasingly digital, they have enabled banks to build a data goldmine. Banks are now actively looking to leverage this data to deepen their service offerings.

Banks and corporations have started responding to the call of digital, and the payments processing industry is currently going through a wave of infrastructure modernization. I see significant technology investments by CIOs across firms that are setting the stage for the next wave of digital transformation. The payments industry will look fundamentally different a few years from now. By enabling digital channels, embracing automation, adopting open standards and making smart bets in technology, banks and corporations can emerge as winners in the payments marketplace.

Making digital transformation happen

As digital transformation initiatives in payments pick up steam, there are four main areas of focus, each important not just for building a solid foundation for a digital payments ecosystem, but also for laying the groundwork to unlock the revenue potential of treasury and payment operations, something yet to be tapped in most organizations:

  1. Enabling digital channels: Buoyed by the consumer payments industry, there is a rapidly growing array of digital payment channels that need to be integrated into digital payments service offerings.
  2. Process automation: Payment processes typically span across entities (bank-corporation-consumer), and integration across disparate systems will make for a critical foundation to enable scalable implementations.
  3. Payment analytics: Payment processes have always been data rich, and even more so as digital channels continue to grow. Effective use of this data to make better decisions (e.g., manage risk, prevent fraud) and, furthermore, to explore data monetization opportunities is becoming important.
  4. Adoption of standards: For any multi-entity ecosystem with entities across the globe to scale with technology, it is essential to establish standards. Adoption of open banking standards is essential for digital payments to succeed – and we are at an inflection point, given the increasing adoption of these standards.

Digital channels

The ubiquitous cheque has been the staple of corporate payments for decades, despite being the most expensive payment instrument. It is not just the processing cost of the paper cheque that makes it a burden for banks; it is also a security headache. Paper cheques are well known to be the largest vehicle of payment fraud.

All that is changing. And, as usually happens, it started with the consumer payments business. As direct-to-consumer models continue to evolve, the B2C payments business is growing rapidly (an annual growth rate of 15 per cent, led by digital e-commerce)[6]. And the digital payment technologies coming out of the consumer payments industry (Zelle, PayPal, digital wallets, etc.) offer a rich choice for banks and corporates to deliver digital experiences to their customers:

  • Disbursement of funds: Transfer volumes of Medicare/Medicaid funds by healthcare providers to their members continue to rise and should, by and large, be digital.
  • Refund management: In a direct-to-consumer world, corporations need to manage refunds to customers from excess payments and product returns. Customers used to instant digital payments from the e-commerce world are expecting a similar experience everywhere.
  • Loyalty/reward disbursements: As corporations build deep relationships with their customers, they continue to adopt customer engagement strategies from e-commerce retailers. These include cash-back payments and encashing of loyalty points, which need to be executed through digital channels.

A similar revolution is around the corner in B2B payments, with the expansion of consumer-like payment rails, such as digital wallets, in addition to existing ones like ACH, Wire and virtual cards. We believe the convenience of digital payments is only a starting point. There is much more to it by way of benefits:
  • Streamlining of payment processes has a direct impact on working capital management. Trade finance is key to enabling global supply chains, and fintechs are coming up with specific solutions.
  • Cross-border payments to suppliers and subsidiaries need to stand up to heavy regulatory requirements in addition to managing the risk of fraud. Digital payments are increasingly the safest alternative. In addition to enforcing compliance, digital channels can ensure transparency of global fund flows.
  • Of late, acquiring a deeper understanding of the supplier ecosystem has become an important factor (see the section on payment analytics below).

Automation

Seventy per cent of corporate treasury and payments professionals list manual and inefficient processes among their top challenges. In addition to their high costs, manual processes are error-prone, difficult to scale in response to variable volumes, and increasingly susceptible to fraud.

Process simplification and automation opportunities extend across the value chain – from establishing the payment exchange with suppliers (B2B) and consumers (B2C) to creating a variety of services around three-way (PO, invoice and receipt) matching, and all the way to the disbursement of funds through different digital payment channels. Several fintechs see this as a big area of opportunity and are building auxiliary platforms to this end that can integrate with corporate systems and automate end-to-end processes.

Bank treasury services offer a slew of products and services – from cheque processing to ACH/Wire – to their corporate clients. For instance, a large bank helps one of the largest healthcare providers in the US process over 4 million transactions annually, covering the entire value chain, from providers – corporate hospitals (B2B) and individual doctors (B2C) – to pharmaceuticals (B2B).

  • There is a clear opportunity to digitally onboard these entities onto the payments network in a rapid, secure manner using DIY portals, as well as to create an ‘omni-channel’-like experience that minimizes onboarding friction.
  • Automating a payment network of this size and complexity is undoubtedly an integration challenge, given the multiple legacy systems at enterprises and, increasingly, ERP systems. With the increasing adoption of API (application programming interface)-based data exchanges, end-to-end automation with multiple payment rails is a necessary building block for digital payments.

Data and analytics

Payment processing through digital channels is data rich: analytics-led strategies and execution on the transaction data can help in a variety of ways, such as improving revenues, cutting operating costs, and detecting fraud and other anomalous behaviour.

Risk and fraud analytics: As payments migrate to digital platforms, it is almost inevitable that fraud becomes more sophisticated too. And, as the volume and complexity of payments grow, fraud is becoming just as hard to track, identify and prevent. Fraud prevention will have to move beyond transaction-centric assessment to leveraging AI for detection and prevention of emerging fraud. Broadly speaking, organizations need to think of payments fraud at two levels:

  • Account fraud: Digital identity theft is a leading cause of fraud. Methods like ATO (account takeover) and synthetic identity creation can be used to gain access to accounts and siphon funds. Tracking and preventing this requires going beyond traditional knowledge-based authentication methods to monitor authentication journeys, looking for anomalous patterns.
  • Phantom payments: Businesses lose significant amounts to fraudulent payments, triggered by employees (e.g., initiating a phantom payment) or payees (e.g., creating double invoices). Monitoring and flagging them requires a range of methods, from rule-based systems (e.g., proximity of transaction requests) to more sophisticated machine learning methods (e.g., payee behaviour risk-scoring and setting of guardrails sensitive to each risk segment), as in the sketch following this list.
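
As a rough illustration of these two levels, the sketch below pairs a simple rule with an ML anomaly score using scikit-learn's IsolationForest. It is a minimal sketch, not a production fraud engine; the features, thresholds and data are assumptions:

```python
# A minimal sketch of two-level payment fraud screening: a simple rule plus an
# ML anomaly score. Feature names and thresholds are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical historical features: amount, hour_of_day, payee_tenure_days
history = rng.normal(loc=[500, 13, 400], scale=[200, 4, 150], size=(1000, 3))

model = IsolationForest(contamination=0.01, random_state=0).fit(history)

def screen_payment(amount, hour_of_day, payee_tenure_days):
    # Level 1: rule-based check, e.g. a large payment to a very new payee.
    if payee_tenure_days < 7 and amount > 10_000:
        return "hold: new payee with a large amount"
    # Level 2: anomaly score against historical behaviour; lower = more anomalous.
    score = model.decision_function([[amount, hour_of_day, payee_tenure_days]])[0]
    return "flag for review" if score < 0 else "pass"

print(screen_payment(amount=12_000, hour_of_day=3, payee_tenure_days=2))
```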

Data monetization: A bank managing the payment flows between its corporate clients and their network of suppliers and customers has the unique ability to understand the financial behaviour of all the firms in this network. This can be a powerful tool for the bank to develop targeted strategies to drive superior experience and value for its clients:

  • Working capital optimization: Banks can help their clients optimize their working capital by forecasting fund flows and using that to transfer the optimal funds into their payment accounts.
  • Service bundling: Using a combination of behavioural and transactional patterns, banks can help define optimal service bundles for their clients. For example, corporate payments that span multiple countries can be optimized with a combination of exchange-rate hedging and currency-float solutions.

Adoption of standards

Open Banking

Starting in 2015, when the European Parliament adopted open banking standards (PSD2), there has been growing momentum in the adoption of standards. And, as happens with standards, once they reach a critical mass of adoption they can catalyse innovation and efficiency across the world of payments. Open banking regulations require banks to open up their systems and data to third-party providers through secure channels. This has the potential to accelerate:

  1. Seamless transfer of funds between banks, using standards as opposed to relying on the current practice of point-to-point software integrations.

  2. The capability for corporations with multiple bank accounts across currencies to efficiently aggregate bank account data into a single accounting portal for automated reconciliation, mitigating issues in one of the most complex payment transactions – cross-border payments (a simplified aggregation sketch follows).
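
As an illustration of the second point, the sketch below aggregates account data from multiple banks into a single normalized list. The endpoints, token handling and response fields are hypothetical; real open banking providers each define their own schemas and consent flows:

```python
# Illustrative sketch of multi-bank account aggregation over open banking APIs.
# URLs and response field names are hypothetical assumptions.
import requests

BANKS = {
    "bank_a": "https://api.bank-a.example/open-banking/v3/accounts",
    "bank_b": "https://api.bank-b.example/open-banking/v3/accounts",
}

def fetch_balances(access_tokens: dict) -> list:
    """Pull account balances from each bank into one normalized list."""
    rows = []
    for bank, url in BANKS.items():
        resp = requests.get(
            url,
            headers={"Authorization": f"Bearer {access_tokens[bank]}"},
            timeout=10,
        )
        resp.raise_for_status()
        for acct in resp.json().get("accounts", []):  # field names are assumptions
            rows.append({
                "bank": bank,
                "account_id": acct["id"],
                "currency": acct["currency"],
                "balance": acct["balance"],
            })
    return rows  # a single view, ready for automated reconciliation
```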

Blockchain technology

Several financial services firms are increasingly looking to blockchain technology to mitigate the risk of fraud. The three fundamental underpinnings of the technology are a distributed ledger, immutability and permissioned access. Taken together to underpin a payment processing service, they make it possible to trace the entire sequence of wire transfers. Visa launched its B2B Connect platform based on a private blockchain with the aim of enabling faster cross-border payments. Similarly, a host of banks, including HSBC, BNP Paribas and ING, launched Contour, a blockchain-inspired platform designed to make the $18-trillion trade finance market more efficient and secure. I expect this space to see a lot more action in the coming years.

After decades of plodding along with archaic systems, the $2-trillion behemoth that is the global payments industry is waking up, shaken by the fintechs (a revolution of sorts that PayPal ignited)[12]. And, as often happens, innovation in one sector rapidly spills over to adjacent areas; the dramatic change that started in consumer payments created the technology building blocks for digital disruption in corporate payments too. Combined with the adoption of standards and, most notably, the maturing of blockchain technologies, the corporate payments industry is primed for a burst of innovation.

A Complementary Partnership

The phrase “Data is the new currency” has gained immense popularity in recent years, as data is now a highly valuable and sought-after resource. Over time, data continues to accumulate and is becoming increasingly abundant. The focus has now shifted from acquiring data to effectively managing and protecting it. As a result, the design and structure of data systems have become a crucial area of interest, and research into the most effective methods for unlocking their potential is ongoing.

While innovation keeps coming to the fore, the leading ideas currently consist of two distinct approaches: data mesh and data fabric. Although both aim to address the challenge of managing data in a decentralized and scalable manner, they differ in their philosophy, implementation, and focus.

Data Mesh

Data mesh is an architectural pattern, introduced by Zhamak Dehghani, for data management platforms that emphasizes decentralized data ownership, discovery, and governance. It is designed to help organizations achieve data autonomy by empowering teams to take ownership of their data and providing them with the tools to manage it effectively. Data mesh enables organizations to create and discover data faster through data autonomy. This contrasts with the more prevalent monolithic, centralized approach, where data creation, discovery, and governance are the responsibility of just one or a few domain-agnostic teams. The goal of data mesh is to promote data-driven decision-making, increase transparency, break down data silos, and create a more agile and efficient data landscape while reducing the risk of data duplication.

Building Blocks of Data Mesh

[Figure: building blocks of data mesh]

Data Mesh Architecture

Since data mesh involves a decentralized form of architecture and depends heavily on the various domains and stakeholders, the architecture is often customized to organizational needs. The technical design of a data mesh thus becomes specific to an organization’s team structure and its technology stack. The diagram below depicts a possible data mesh architecture.

It is crucial that every organization designs its own roadmap to data mesh with the conscious and collective involvement of all teams, departments, and lines of business (LoBs), each with a clear understanding of its own set of responsibilities in maintaining the data mesh.

Data mesh is primarily an organizational approach, and that's why you can't buy a data mesh from a vendor.
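
The interface a domain team publishes can still be made concrete. Below is a minimal sketch, with assumed names, of a data product "contract" a domain team might declare so the rest of the organization can discover and consume its data:

```python
# A minimal sketch of a data product "contract" in a data mesh: ownership,
# schema and an SLA, published for discovery. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str
    domain: str              # owning domain team, e.g. "sales"
    owner_email: str         # a single accountable owner
    schema: dict             # column -> type; the published interface
    freshness_sla_hours: int
    tags: list = field(default_factory=list)

orders = DataProduct(
    name="orders_daily",
    domain="sales",
    owner_email="sales-data@company.example",  # hypothetical
    schema={"order_id": "string", "amount": "decimal", "order_date": "date"},
    freshness_sla_hours=24,
    tags=["pii-free", "gold"],
)

# A central catalog (e.g. the semantic overlay of a data fabric) could index
# such contracts so consumers discover products without knowing each team's stack.
catalog = {orders.name: orders}
```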

Data Fabric

Data Fabric is not an application or software package; it’s an architectural pattern that brings together diverse data sources and systems, regardless of location, for enabling data discovery and consumption for a variety of purposes while enforcing data governance. A data fabric does not require a change to the ownership structure of the diverse data sets like in a data mesh. It strives to increase data velocity by overlaying an intelligent semantic fabric of discoverability, consumption, and governance on a diverse set of data sources. Data sources can include on-prem or cloud databases, warehouses, and data lakes. The common denominator in all data fabric applications is the use of a unified information architecture, which provides a holistic view of operational and analytical data for better decision-making. As a unifying management layer, data fabric provides a flexible, secure, and intelligent solution for integrating and managing disparate data sources. The goal of a data fabric is to establish a unified data layer that hides the technical intricacies and variety of the data sources it encompasses.  

Data Fabric Architecture

It is an architectural approach that simplifies data access in an organization and facilitates self-service data consumption. Ultimately, this architecture facilitates the automation of data discovery, governance, and consumption through integrated end-to-end data management capabilities. Irrespective of the target audience and mission statement, a data fabric delivers the data needed for better decision-making.

Principles of Data Fabric

Data mesh and data fabric can be compared along a few parameters:

  • Data ownership: decentralized in a data mesh; ownership-agnostic in a data fabric.
  • Focus: data mesh emphasizes high data quality and ownership based on domain expertise; data fabric emphasizes accessibility and integration of data sources.
  • Architecture: data mesh is domain-centric and customized to an organization's needs and structure; data fabric is agnostic to internal design, overlaying an intelligent semantic layer on top of existing diverse data sources.
  • Scalability: data mesh is designed to scale horizontally, with each team having its own scalable data product stack; data fabric supports a unified layer across the enterprise, with the scalability of the managed semantic layer abstracted away in the implementation.

Both data mesh and data fabric aim to address the challenge of managing data in a decentralized and scalable manner, and both are worth considering as potential solutions. The choice between the two will depend on the specific needs of the organization, such as the level of data ownership, the focus on governance or accessibility, and the desired architecture.

Enhancing Data Management: The Synergy of Data Mesh and Data Fabric

A common prevailing misunderstanding is that data mesh and data fabric infrastructures are mutually exclusive, i.e., that only one of the two can exist. Fortunately, that is not the case. Data mesh and data fabric can be architected to complement each other in a way that brings the benefits of both to the fore, to the advantage of the organization.

Organizations can implement data fabric as a semantic overlay to access data from diverse sources while using data mesh principles to manage and govern distributed data creation at a more granular level. Thus, data mesh can be the architecture for the development of data products and act as the data source, while data fabric can be the architecture for the data platform that seamlessly integrates the different data products from the data mesh and makes them easily accessible within the organization. The combination of a data mesh and a data fabric can provide a flexible and scalable data management solution that balances accessibility and governance, enabling organizations to unlock the full potential of their data.

Data mesh and data fabric can complement each other by addressing different aspects of data management and working together to provide a comprehensive and effective data management solution.

In conclusion, both data mesh and data fabric have their own strengths but are complementary and thus can coexist synergistically. The choice between the two depends on the specific needs and goals of the organization. It’s important to carefully evaluate the trade-offs and consider the impact on the culture and operations of the organization before making a decision.

What is Contract Pull Through?

The pharma sales team engages in contracts with brands, hospitals, clinics, infusion centers, doctor offices, IDNs, ONAs, GPOs, and other networks. These networks are often referred to as pharma accounts, and contracts are lined up to improve overall sales, market share, and profitability. Contracts with these accounts are based on various factors such as rebate percentages, formulary tiers, and performance-based fees.

Now, the current size of pharma gross contracted sales is to the tune of 50B USD, projected to grow to 85B USD over the next 5 years[1], making this a big area for pharma commercial teams to look at closely to improve sales effectiveness while engaging with these accounts. Pharma companies are interested in the contracted sales, rebates, terms, and tiers data from the accounts to measure effectiveness, mainly to figure out which of the existing accounts are underperforming and which are performing above benchmark. This aspect of pulling contract data across accounts to measure account effectiveness is the key objective of Contract Pull Through.

We may define Contract Pull Through as the analysis of the following (a brief worked sketch follows the list) –

  1. How much an account has purchased (sales) by contract program and by brand;
  2. How much it has received in discounts (rebates, chargebacks);
  3. How it is doing against its baselines; and
  4. Where the opportunities are to buy more and save more.
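
As a rough illustration, the pandas sketch below computes these four views from a hypothetical contracted-sales extract; the column names and figures are assumptions:

```python
# A minimal pandas sketch of the four pull-through questions, computed from a
# hypothetical contracted-sales extract. Column names and values are assumptions.
import pandas as pd

sales = pd.DataFrame({
    "account":     ["A", "A", "B", "B"],
    "brand":       ["X", "Y", "X", "X"],
    "program":     ["GPO", "340B", "GPO", "GPO"],
    "gross_sales": [120_000, 80_000, 60_000, 90_000],
    "rebates":     [12_000, 10_000, 4_000, 9_000],
    "baseline":    [100_000, 90_000, 70_000, 85_000],
})

# 1. Purchases by contract program and brand
by_program_brand = sales.groupby(["account", "program", "brand"])["gross_sales"].sum()

# 2. Discounts received
discounts = sales.groupby("account")["rebates"].sum()

# 3. Performance vs. baseline
perf = sales.groupby("account")[["gross_sales", "baseline"]].sum()
perf["vs_baseline_pct"] = 100 * (perf["gross_sales"] / perf["baseline"] - 1)

# 4. Opportunity: accounts below baseline are candidates to buy more, save more
opportunities = perf[perf["vs_baseline_pct"] < 0]
print(by_program_brand, discounts, perf, opportunities, sep="\n\n")
```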

Why does pharma need to focus on Contract Pull Through?

Large and mid-size pharma companies treat Contract Pull Through from the accounts as a top-of-mind problem, as even increasing effectiveness by 2-5% would mean savings to the tune of millions of dollars.

Based on our experience working with a client’s market access team, we realized that the organization had delegated the task of supporting field pull-through entirely to its payer account managers. These executives reported spending 75% of their time creating and pulling reports, time that could have been better spent with customers or in more strategic dialogue with field members.

The key stakeholders for Contract Pull Through are the field team members, i.e., Business Engagement Managers (BEMs) and Healthcare Market Directors (HDs), who need to focus on Contract Pull Through for –

  1. Generating contract awareness and pull through for major providers in their ecosystems
  2. Creating awareness around the contracts/terms offered by pharma firms for the products
  3. Engaging with customers to show their historical and current performance

Some specific Contract Pull Through use cases the pharma account team focuses on include –

  • A portfolio purchasing summary of the account, which enables the account to understand how much volume it has bought and the savings it has received from pharma products
  • Product contribution at an account level, which enables the understanding of how much volume is coming from each pharma product
  • Contract eligibility of an account, to understand which contracts are available at a certain account

Other business insights that Contract Pull Through data can help pharma companies with include –

  • How is the account performing compared to others in its ecosystem/region?
  • Which is the account’s dominant payer, and how does that payer operate at a national level?
  • How much more can this account purchase to reach the next tier?

Use case deep dive: Portfolio purchasing summary of an account

A portfolio purchasing summary of the account is one of the critical use cases handled through Contract Pull Through data. What the Contract Pull Through team is looking to understand about a particular account within a regional ecosystem, across all periods of the contract or for a particular period, is –

  • What have gross sales been? Have they gone up or down compared to the last quarter?
  • What part of the total gross sales is contracted vs non-contracted? What part of sales is attributed to specialty pharmacy? What percentage of account savings is attributed to contracting?

Another key insight to look at in the portfolio purchasing summary is the product mix/segment mix (GPO, 340B, NCCN, etc.) across the portfolio: double-clicking on which product/segment contribution has gone up or down over previous periods, and how it fares against its anticipated baseline numbers.

Insights of this kind can immensely help field team members understand how much volume the account has bought and the savings received from contracted pharma products, how the accounts are performing against their expectations, and where there are opportunities to buy more and save more.

How to generate pull-through business insights from data?

To arrive at these insights, the key data elements that must be looked into are contracts, chargebacks, 867 sales, non-contracted sales, rebates, terms and tiers, account hierarchy, and zip-to-territory mapping data.

These data sets, coming from different source systems, are ingested, assimilated, and presented as output for improved decision-making (a minimal pipeline sketch follows the list) –

  • First, the data is ingested into cloud-based or on-prem databases using RPA tools like UiPath.
  • The ingested data then goes through a set of data quality engine checks to ensure it is of the expected quality.
  • The clean dataset is then transformed through an ETL process, where complex calculations based on business-defined rules are applied; and
  • Finally, it is presented through BI reports providing visual and tabular data on gross sales, savings, performance, opportunities for an account, and other pull-through insights.
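
A simplified sketch of this ingestion, quality-check, transform and report flow is below; the rules and column names are illustrative assumptions:

```python
# A simplified sketch of the ingestion -> quality checks -> transform -> report
# flow described above. Rules and column names are illustrative assumptions.
import pandas as pd

def quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Basic data quality engine: drop duplicates, reject rows missing keys."""
    df = df.drop_duplicates(subset=["account_id", "invoice_id"])
    return df.dropna(subset=["account_id", "gross_sales"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply a business-defined rule, e.g. net sales = gross sales - chargebacks."""
    df = df.copy()
    df["net_sales"] = df["gross_sales"] - df["chargebacks"].fillna(0)
    return df

raw = pd.DataFrame({  # in practice, ingested via RPA tools or connectors
    "account_id": ["A1", "A1", None],
    "invoice_id": [1, 1, 2],
    "gross_sales": [1000.0, 1000.0, 500.0],
    "chargebacks": [50.0, 50.0, None],
})

clean = transform(quality_checks(raw))
clean.to_csv("pull_through_feed.csv", index=False)  # handed off to the BI layer
```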

Conclusion: CPT – Top of mind for strategic and financial objectives

Contract Pull Through insight generation occupies a huge part of BEMs’ and HDs’ mind share. Thus, large and mid-size pharma organizations are investing, or should invest, in building a robust Contract Pull Through system that enables them to understand how accounts are performing against expectations and, in turn, identify opportunities to sell more and optimize rebate payments. By doing so, they stand to gain financial benefits to the tune of millions of dollars in optimized rebates and savings, as well as the strategic insight to identify the top accounts to concentrate on and the key underperforming accounts where contracting terms and tiers should be renegotiated.

Self-service AI refers to the intelligence that business users (analysts and executives) can acquire on their own from the data without the extensive involvement of data scientists and engineers. It means enabling them to acquire the actionable intelligence to serve their business needs by leveraging the low-code paradigm. This results in reduced dependency on other skills such as core IT and programming and makes faster iterations possible at the hands of business users.

As the data inside and outside organizations grows in size, frequency and variety, classical challenges such as poor shareability across BUs, lack of single ownership and quality issues (missing data, stale data, etc.) increase. For IT teams owning the data sources, ensuring provisioning of data in the requisite format, quality, frequency and volume becomes an additional task for the ever-growing analytics needs of various BU teams, each treating its own request as the priority. Think of the several dashboards floating around organizations, created at the behest of various BU teams: even if, with great effort, they are kept updated, it is still tough to draw the exact insights that will help take direct actions and measure their impact on the ground. Different teams have different interaction patterns, workflows and unique output requirements, making it very hard for IT to provide canned solutions in a dynamic business environment.

Self-service intelligence is therefore imperative for organizations that want their business users to make critical decisions faster every day, leveraging the true power of data.

Enablers of self-service AI platform – Incedo LighthouseTM

Incedo LighthouseTM is a next-generation, AI-powered Decision Automation platform targeted to support business executives and decision-makers with actionable insights generation and their consumption in daily workflows. Key features of the platform include:

  • Specific workflow for each user role: Incedo LighthouseTM is able to cater to different sets of users, such as business executives, business analysts, data scientists and data engineers. The platform supports unique workflows for each of the roles thereby addressing specific needs:
    • Business Analysts: Define the KPIs as business logic formulations over the raw data, and define the inherent relationships among various KPIs as a tree structure (a minimal KPI tree sketch follows this list)
    • Data Scientists: Develop, train, test, implement, monitor and retrain the ML models specific to the use cases on the platform in an end-to-end model management process
    • Data Engineers: Identify data quality issues and define and apply remediation across various dimensions of quality, feature extraction and serving, using online analytical processing as a connected process on the platform
    • Business Executives: Consume the actionable insights (anomalies, root causes) auto-generated by the platform, define action recommendations, test the actions via controlled experiments and push confirmed actions into implementation
  • Autonomous data and model pipelines: One of the common pain points of business users is the slow speed of data-to-insight delivery, and further on to action recommendation, which may take weeks even for simple questions asked by a CXO. To address this, the process of generating insights from raw big data, through to action recommendation via controlled experimentation, has been made autonomous in Incedo LighthouseTM using combined data and model pipelines that are configurable in the hands of business users.
  • Integrable with external systems: Incedo LighthouseTM can be easily integrated with multiple Systems of Record (e.g. various DBs and cloud sources) and Systems of Execution (e.g. SFDC), based on client data source mapping.
  • Functional UX: The design of Incedo LighthouseTM is intuitive and easy to use. The workflows are structured so that users naturally click and navigate to the right features to supply inputs (e.g. drafting a KPI tree, publishing the trees, training the models, etc.) and consume the outputs (e.g. anomalies, customer cohorts, experimentation results, etc.). Visualization platforms such as Tableau and PowerBI are natively integrated with Incedo LighthouseTM, making it a one-stop shop for insights and actions.
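
To make the KPI tree idea concrete, here is a minimal sketch of such a structure. It is illustrative only and not the platform's internal representation:

```python
# A minimal sketch of a KPI tree: each node is a KPI defined as business logic
# over raw metrics, with parent-child relationships enabling drill-downs.
class KPINode:
    def __init__(self, name, formula=None, children=None):
        self.name = name
        self.formula = formula          # callable over a metrics dict
        self.children = children or []

    def evaluate(self, metrics, depth=0):
        value = self.formula(metrics) if self.formula else None
        print("  " * depth + f"{self.name}: {value}")
        for child in self.children:
            child.evaluate(metrics, depth + 1)

revenue_tree = KPINode(
    "revenue", lambda m: m["units"] * m["price"],
    children=[
        KPINode("units sold", lambda m: m["units"]),
        KPINode("average price", lambda m: m["price"]),
    ],
)

revenue_tree.evaluate({"units": 1200, "price": 49.0})
```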

Incedo LighthouseTM as self-serve AI at a Pharmaceutical Clinical Research Organization (CRO)

In a recent deployment of Incedo LighthouseTM, the key user base was the Commercial and Business Development team of a pharma CRO. The client, being a CRO, had drug manufacturers as its customers. The client’s pain point revolved around low conversion rates, leading to lost revenue and added inefficiencies in the targeting process. A key reason was the misprioritization of leads with lower conversion propensity and/or lower total lifetime value, mainly due to judgment-driven, ad hoc, simplistic, static, rule-based identification of leads for the Business Development Associates (BDAs) to work on.

Specific challenges that came in the way of application of data science for lead generation and targeting were:

  • The raw data related to prospects – from which features are to be developed for predictive lead generation modeling – was lying in different silos inside the client’s tech infrastructure. This created inertia against developing high-accuracy predictive lead generation models in the absence of a common platform to bring the data and models together.
  • Even in the few exceptional cases where the data was stitched together by hand and predictive models were built, the team found it difficult to keep the models updated in the absence of integrated data and model pipelines working in tandem.

To overcome these challenges, the Incedo LighthouseTM platform was deployed, allowing the client to:

  • Combine information from all the data sources into a Customer 360-degree view, enabling the BDAs to look at the bigger picture effortlessly. This was achieved by pointing the readily available connectors within the Incedo LighthouseTM platform to the right data sources and establishing data ELT pipelines scheduled to run in tandem with the data refresh frequency (typically weekly). This allowed the client’s business analysts to efficiently stitch together, in a self-serve model, various data elements that were earlier lying in silos, and to include custom region- and product-specific considerations during the data engineering stage.
  • Develop and deploy AI/ML predictive models for conversion propensity using the Data Science Workbench, which is part of the Incedo LighthouseTM platform, after developing the data engineering pipelines that create ‘single-version-of-the-truth’ data every time the raw data is refreshed. This is done by leveraging the pre-built model accelerators for predictive modeling, helping the BDAs sort prospects in descending order of conversion propensity, thereby maximizing the return on the time invested in developing them (a simplified propensity-scoring sketch follows this list). The Data Science Workbench also helped with the operationalization of the various ML models built in the process, while connecting model outputs to various KPI Trees and powering other custom visualizations.
  • Deliver key insights in a targeted, attention-driving manner to enable BDAs to make the most of the information in a short span of time. This is achieved through well-designed dashboards that rank-order the leads based on model-reported conversion propensity, time-based priority and various other custom filters (e.g. geographies, areas of expertise). Intuitive drill-downs were encoded using region-specific KPI Trees to let BDAs know the exact account portfolios of their business that were lagging behind. These KPI Trees were designed by the client’s business analysts within the platform’s self-serve KPI Tree Builder, saving multiple iterations with the IT teams. The KPI Trees allowed the BDAs to double-click on their individual targets, understand deviations between targets and actuals, and review comments from earlier BDAs who may have been involved, to decide the next best action for each lead.
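
As a simplified illustration of the propensity-scoring step, the sketch below trains a classifier on synthetic historical lead outcomes and rank-orders open leads by predicted conversion probability; the features are hypothetical stand-ins for the stitched Customer 360 attributes:

```python
# A simplified propensity-model sketch: train on historical lead outcomes,
# then rank open leads by predicted conversion probability. Data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
# Hypothetical features: [past_studies, engagement_score, pipeline_value]
X_hist = rng.normal(size=(500, 3))
y_hist = (X_hist[:, 1] + 0.5 * X_hist[:, 0]
          + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X_hist, y_hist)

open_leads = rng.normal(size=(10, 3))
propensity = model.predict_proba(open_leads)[:, 1]

# Rank-order so BDAs work the highest-propensity prospects first
ranking = np.argsort(-propensity)
for rank, idx in enumerate(ranking, start=1):
    print(f"{rank:2d}. lead #{idx} propensity={propensity[idx]:.2f}")
```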

The deployment of Incedo LighthouseTM not only brought about real improvements in target conversions, but also helped transform the workflow for the BDAs by leveraging Data and AI.

HCP engagement in the new era

The engagement strategies for pharma representatives to connect with HCPs were already in a state of transformation; Covid-19 has only accelerated the process. With fewer HCPs now preferring in-person meetings and with the advent of new technologies, there has been a steady rise in the use of digital channels like email, social and virtual connects. Statistics show that the volume of emails sent to HCPs increased by almost 300% in the last year, and the average interaction duration in virtual meetings has increased manifold. All of these developments have compelled drug companies to reimagine their engagement strategies, maintain healthy relationships with HCPs and use the right channels to make the desired impact.

Personalization – The need of the hour:

With different technologies at their disposal and HCP preferences varying, pharma companies have realized that they have to change their marketing and engagement tactics to meet the engagement needs of each doctor. Each HCP’s expectation from an interaction is different, and the education required for each is largely dictated by the patient cohort the HCP serves. HCPs’ interest points differ, as do their responses to channels and to the various incentives offered by pharma companies. For example, in a recent analysis executed using the Incedo LighthouseTM platform, we found that pediatricians (PDs) respond to nutritional rebates much more than their non-PD counterparts.

Personalization, and sometimes hyper-personalization, therefore is the central theme of customer engagement across domains. HCPs now prefer to be connected to a digital platform of their choice e.g. mobile, email, social, call activity, etc. This behavior may differ across various HCP segments e.g. across therapeutic areas, affiliation, years of experience, and geography, apart from the patient cohorts they serve.

With response data now available to the drug companies, it is possible to derive insights on the HCP preference by various cuts such as segment, sub-segment, geography, etc. The HCP preference and behavior patterns shed light on how receptive they are to digital engagement. This data analysis is leveraged by organizations to analyze the context and content for a digital interaction to derive Next Best Action by answering critical questions related to the message and channel strategies.

Helping Pharmaceuticals understand High Impact Channels using Incedo LighthouseTM

In a recent deployment of Incedo LighthouseTM at a pharma organization, the client wanted to understand the most impactful channels for engaging with HCPs. This was also driven by the CMO agenda of identifying the profitable marketing channels to get the most bang for the buck. Using the visualization and segmentation powered by the Incedo LighthouseTM platform to understand the segments and sub-segments of HCPs, ML models were built for different therapeutic areas. The marketing investment was translated into input variables specific to each channel, and the impact was measured on overall sales. Using the Data Science Workbench of the Incedo LighthouseTM platform, different linear and non-linear ML models were built. The insights from the models were used to derive the contribution of different channels to both baseline and promotional sales.

Using this, the ROI of channels was determined by various cuts. For example, at a broad level, every marketing dollar spent returned an extra 40 cents. It also emerged that, among all the channels, digital channels were underinvested, and the HCPs responded best to them.
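
As a rough illustration of the approach, the sketch below fits a simple linear model of weekly sales against per-channel spend and reads channel contributions and ROI off the coefficients; the data and coefficients are synthetic:

```python
# A simplified marketing-mix sketch: regress sales on per-channel spend to
# estimate channel contributions and ROI. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
weeks = 104
spend = rng.uniform(10, 100, size=(weeks, 3))  # columns: email, digital, field
true_lift = np.array([0.8, 2.0, 0.5])          # assumed: digital responds best
sales = 500 + spend @ true_lift + rng.normal(scale=20, size=weeks)

model = LinearRegression().fit(spend, sales)

for channel, coef in zip(["email", "digital", "field"], model.coef_):
    # Incremental sales per dollar spent; net return = coef - 1 per dollar
    print(f"{channel}: +${coef:.2f} sales per $1 spend (net {coef - 1:+.2f})")
```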

Using the KPI Tree and Cohort Analyzer functionality of the platform, one could see which HCPs were under-reached and had responded well and which were overreached and didn’t respond on certain channels. Using the deep drill-downs, one could go to the affiliation/hospital levels and identify the next action and specific ways to reach them.

Lastly, Incedo LighthouseTM’s advanced visualization capabilities help generate response curves for each channel, with further drill-down capabilities. These can be instrumental in simulating the performance of HCP cohorts and channels and identifying the break-even dollar spend. By layering optimization algorithms on top, leveraged from Incedo LighthouseTM’s pre-built accelerators, organizations can arrive at a channel strategy designed to optimize HCP engagement through each medium while minimizing the investment.

The global financial services industry has seen disruption over the past few years, with new fintech players and digital giants eating into the market share of legacy banking institutions. Whether it is digital credit lending platforms, payment tools or new credit card issuers, the common narrative to drive user adoption and build market share is enhanced customer experience through personalization.

The Value of Personalization for Retail Banking

There has been a sudden shift in the way some banks are leveraging personalization across the customer lifecycle. The key focus areas and objectives to enhance customer experience and deliver incremental impact for the bank are mentioned below:

  1. Building a Growth Engine to capture new markets and customer segments
    The next-gen fintech players have been able to create data-enabled products that allow faster underwriting, using the prospect’s bureau scores, digital KPIs, social media data, etc. to identify the creditworthiness of a prospect. The big banks need to move fast to enable similarly fast turnaround times for loan fulfillment and higher acquisitions from new customer segments.
  2. Maximizing customer lifetime value for existing customer base
    In order to maximize their share of customers’ wallets, banks have the following levers – improved cross-sell, better servicing & higher retention rates. The best-in-class firms are leveraging AI-enabled next-best-action recommendations to identify which products should be offered to a customer, at what time and through what channel. The focus has also moved from reactive retention interventions to proactive, data-enabled retention strategies personalized to each customer.
  3. Improved risk assessment and mitigation controls
    The use of personalization is not limited to marketing interventions; the digital world needs personalized controls when it comes to risk management, fraud detection, anti-money laundering and other control processes. Sophisticated AI models for risk, fraud & AML detection, coupled with real-time implementation, are critical to build strong defense mechanisms against fraudsters.

The overall impact of personalization in retail banking is dramatic, with added opportunities to improve experiences across customer touchpoints. Based on Incedo’s deployment of solutions for banking and fintech clients, given below is an illustrative example of use cases and potential opportunities.

[Figure: value potential of personalization]

Building a personalization engagement engine requires integrated capabilities across Data, AI/ML and Digital Experiences

Reimagined customer engagement built on personalization requires a clear understanding of customers’ needs, behaviors and requirements, and the ability to integrate these with customer front-end platforms and channels (website, mobile app, etc.). This requires capabilities interwoven across the spectrum of Data, AI/ML and Digital Experiences.

  1. Data foundation and the strategy to enable a 360-degree view of the customer:
    Most banks struggle to stitch together a holistic profile of the customer covering their products, lifestyle behavior, transactional patterns, purchase history, offers used, preferred channels, digital engagement, etc. This necessitates a clear data strategy, right from capturing customer touchpoints to building 360-degree data lakes for the customer. Given the huge data storage requirements, this may also mean building modern cloud platforms that capture not only customers’ purchase history but also granular data points like digital clickstream data.
  2. AI, ML and analytics-enabled decisioning layer to drive next-best-action recommendations:
    Identifying the right product/offer/service for the customer, at the right time and through the right channel of engagement, is important to ensure it converts into an optimal experience for the customer. This is done through a series of AI models, customer segmentation, optimizations, etc. (a minimal scoring sketch follows this list). These AI/ML models are built on historical data and are tested, monitored and enhanced on an ongoing basis to ensure any new feedback is incorporated into the models.
    Building an AI/ML engine needs expert data scientists and business intelligence experts with a background in ML, statistics and contextual domain knowledge.
  3. Optimal Digital Experience to capture customer attention and maximize conversions:
    Data-enabled recommendations do not work unless supplemented with the right creatives and simplified communication that drive a call to action for the customer. Digital experiences that enable impactful omnichannel journeys, whether through email, website, mobile app, ATMs, branches or service reps, are very important in this regard. A/B testing of digital experiences, which may include application forms, digital journeys, website interstitials, campaign banners, etc., is critical to building best-in-class customer experiences.
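
As a minimal sketch of the decisioning idea in point 2, the snippet below scores (offer, channel) combinations for a customer and picks the top one. The scores are toy stand-ins for trained model outputs:

```python
# A minimal next-best-action sketch: score each (offer, channel) combination
# for a customer and pick the best. Scores are toy conversion-probability
# estimates; in practice they come from trained AI/ML models.
from itertools import product

offers = ["cashback_card", "personal_loan", "fd_topup"]
channels = ["email", "mobile_push", "relationship_manager"]

def conversion_estimate(customer, offer, channel):
    """Stand-in for model output: P(conversion | customer, offer, channel)."""
    base = customer["affinity"].get(offer, 0.01)
    channel_factor = customer["channel_response"].get(channel, 0.5)
    return base * channel_factor

customer = {
    "id": "C-1001",
    "affinity": {"cashback_card": 0.20, "personal_loan": 0.05},
    "channel_response": {"mobile_push": 0.9, "email": 0.6},
}

best = max(product(offers, channels),
           key=lambda oc: conversion_estimate(customer, *oc))
print(f"Next best action for {customer['id']}: offer {best[0]} via {best[1]}")
```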

While data, AI and digital experiences are the three key building blocks of the personalization-enabled engagement layer, there is a need to orchestrate and integrate these capabilities to ensure that banks are able to deliver value from their personalization initiatives. Building these capabilities is no mean task and can take long cycles to reap tangible benefits, especially where firms are building them from scratch.

How to turn the personalization opportunity into reality?

While many traditional banking institutions have tried to build personalization-enabled engagement, it has been observed that the efforts either fail or do not scale over time, leading to suboptimal ROI on the investments made.

Apart from building capabilities across data, AI/ML and digital experiences, it is critical to embed personalization recommendations into enterprise users’ workflows, whether as part of CRM, Salesforce, credit risk decisioning systems, etc. End-to-end decision automation of the workflow is critical to drive adoption and the actual delivery of personalized experiences to customers.

Incedo LighthouseTM-enabled CX personalization for banks is an enterprise-grade solution that enables shorter time to market for data/AI-enabled marketing personalization and accelerated realization of the personalization opportunity.

The solution enables automated AI/ML decisioning right from the data layer through to customer reach-out, ensuring the personalized product, offer or service is delivered to the customer when it matters. A prebuilt library of Customer 360 data lakes and AI/ML models enables accelerated implementation of personalization initiatives. This is supplemented with digital command centers and one-click operationalization of recommendations to deliver omnichannel engagement.

[Figure: Incedo personalization solution]

No matter where banking clients are in their personalization journey, the solution is implemented in a way that helps realize business impact within weeks, not years. Implementation is supplemented with a personalization roadmap for the organization, where Incedo’s team of experts works with client teams not only to implement solutions but also to help the firm build its in-house personalization capabilities over time.

It is critical for banking institutions to acquire new customers, maximize customer value and retain their best customers with greater speed and accuracy than ever before. From a personalized account opening experience to hyper-personalized cross-sell product and offer recommendations to trigger-based retention strategies, providing a “wow” experience to the customer needs personalization capabilities. Complementing the trust that traditional banks and credit unions enjoy with these capabilities would ensure that they maintain their competitive advantage over the new fintech players and digital giants.

In today’s world, where the evolution of digital technology, AI and machine learning algorithms has come to influence human lives, the concept of AI-driven, personalized experiences across customer touchpoints has been gaining traction for some time.

“85% of businesses say they are providing somewhat personalized experiences to customers, and 60% of consumers agree with that.” – Twilio Segment Report

“72% of customers rate personalization as ‘highly important’ in today’s financial services landscape” – Capco research report “Insights for Investments to Modernize Digital Banking”

The application of personalization is becoming ubiquitous now – from the kind of articles that search sites show us, to the posts and reels that come in front of us on social media, to the kind of products that get recommended to us everywhere on the Internet. Personalized recommendations have become the smart marketer’s greatest tool and weapon for reaching out to their customers and creating a differentiation from the competitors.

For early adopters of personalization in the banking sector, the focus today is on investing in increasingly better and faster ways of personalizing. Personalization today combines the features bank customers want and are willing to pay for, inspired by digital banking, with the human touch that remains vital for effective customer engagement. Some banks, however, are still new to the journey and are still formulating their strategy for AI-driven personalization.

Below we discuss some of the avenues where the CMO’s office has been able to unleash the power of AI-driven personalization and reap huge benefits from it.

1. Personalized Product & Service Recommendations

Retailers have been using personalization extensively to sell to and engage with their customers better. While e-tailers pioneered this space, we are starting to see companies across the banking, telecom, FMCG and electronics sectors use the power of personalized recommendations to drive their cross-selling and up-selling campaigns.

Banks look at factors like demographics, income & employment, transactional activity levels, spending patterns, creditworthiness & repayment, etc. to build a 360-degree view of their customers. Using this intricate knowledge of their customers’ financials and spending behavior, they are able to create extremely personalized offers aimed at providing customers with the right financial tools for their lifestyle and needs. This level of precision helps banks not just attract customers better, but also trim the costs of traditional mass-reach channels like call centers.

Wealth management firms are using similar techniques as well. As part of a more services-driven business, these firms are helping financial advisors cater to investors with more personally tailored advice. Based on knowledge of an investor’s behavior and life goals, advisors get access to recommendations on the next best action to take for their clients. Also, by understanding the advisors themselves, the WM firms are able to offer them a suite of services more aligned with each advisor’s personal style of research and portfolio building.

2. Personalized Marketing Communication

Not only are companies able to tailor their products and services, but they are also able to personalize the way they communicate these to their customers. By measuring the effectiveness of past campaigns, marketers can tweak the following levers for marketing personalization:

Messaging: Depending on the buyer segment, the messaging for a specific product can focus on discounts for discount-diggers, detailed product features for heavy-research users, or product comparisons for more flexible early-stage users.

Channel personalization: Using channel affinity models, marketers run focused campaigns targeting the right customers on the right channels. Banks, for example, target customers with high lifestyle spends with credit card display ads on shopping sites. At the same time, HNI customers are offered a more personal touch, with relationship managers calling them with special customized offers.

Communication-time personalization: Knowing a customer’s travel search history, telecom companies can offer international roaming plans at the perfect time. Banks also use this strategy to offer instant lines of short-term credit to customers with a low account balance. More generally, based on an understanding of when a customer is most active on social media or their smartphone, marketers can run social media campaigns or send notifications for maximum impressions.

3. Personalized Digital Experiences

Beyond trying to influence buying behavior directly, the most effective form of personalization is to improve the way customers engage with the firm on a regular basis. By knowing what customers are doing on company websites, marketers can get a far deeper understanding of customers’ needs and expectations. This vision led CMOs to realize how inadequate the digital behavior tracking on their own portals and applications was. What followed was a surge in digital data collection platforms like Google Analytics and Adobe Analytics. A few companies have managed to deploy these solutions effectively to understand their customers better than ever. This led them to build models for journey personalization, aimed at providing customers with the fastest path to conversion based on their interests and preferences.

Businesses that have leveraged the power of personalization have consistently created a differentiated positioning relative to their competition. This has allowed them not only to attract customers away from competitors but also to command a premium price. The clear business advantage has led them to invest heavily in more use cases and push their models from good to great.

Incedo Lighthouse™ – A platform to natively support personalization use cases

With our proprietary platform Incedo Lighthouse™, we help clients successfully deploy multiple use cases for AI-driven customer personalization. The platform brings together big data (millions of customers, updated daily, across several dimensions), data engineering, and data science in an efficient, use-case-centric, self-serve manner. It can serve multiple personalization use cases together, e.g., cross-sell offers on the right channel for the right customer cohorts at specific times of the year. This leads to faster, automated implementation of the journey:
– From data to critical insights, e.g., identifying cohorts of customers that would respond to deep discounts
– From insights to action recommendations, e.g., statistically evaluating the level of deep discounting required to optimize ROI
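As a purely illustrative example of the first step, the data-to-insights hop above, here is a hypothetical sketch of flagging discount-responsive cohorts. It is not how Incedo Lighthouse™ implements this, and the column names are invented.

    # Hypothetical "data to insights" step: cluster customers on promo
    # sensitivity and surface the cohort most responsive to deep discounts.
    import pandas as pd
    from sklearn.cluster import KMeans

    PROMO_FEATURES = [
        "pct_purchases_on_promo",   # share of purchases made on promotion
        "coupon_redemption_rate",   # coupons redeemed / coupons received
        "price_drop_click_rate",    # clicks on price-drop notifications
    ]

    def discount_responsive_cohort(customers: pd.DataFrame, k: int = 5) -> pd.DataFrame:
        customers = customers.copy()
        customers["cohort"] = KMeans(n_clusters=k, n_init=10).fit_predict(
            customers[PROMO_FEATURES]
        )
        # Rank cohorts by mean promo affinity; the top one is the candidate
        # audience for deep-discount offers.
        ranking = customers.groupby("cohort")[PROMO_FEATURES].mean().mean(axis=1)
        return customers[customers["cohort"] == ranking.idxmax()]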

The significant success of AI-driven, personalized recommendations has not come without its fair share of speeding tickets. A couple of examples:
– Compromising Personally Identifiable Information (PII) inside the machine learning lifecycle, thus jeopardizing customers’ privacy
– Inadvertently introducing biases into the recommendation algorithms, leading to discrimination and unfair business practices

Incedo Lighthouse™ helps protect against these issues in a very direct manner – more on that in the next blog!

Migration to the cloud has paved the way for heavily automated deployment processes. Teams rely on deployment automation not just for shipping regular updates to their applications, but for the underlying cloud infrastructure as well. Deployment tools on the market can set up pipelines for almost anything we can think of, and faster delivery, less manual effort, and easier rollbacks are now driving the agenda for Zero Touch Deployments.

What does Zero Touch in Cloud mean?

The ideal is a cloud environment where the workload AWS accounts, especially the production account, require no console login to design, implement, and operate infrastructure and application resources. The team may have read access to view resources, but that is as far as it goes. This avoids human errors such as modifying or deleting a resource with an AWS CLI command without first checking its ARN, a mistake many developers have made. Eliminating that class of error is the idea behind Zero Touch, and with pipelines and IaC (Infrastructure as Code) tools it becomes practical to apply. A minimal sketch of one such guardrail follows.
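One simple way to enforce the read-only rule for humans is to attach AWS’s managed ReadOnlyAccess policy to every human-assumable role in the production account, reserving write permissions for the pipeline’s deployment role. A minimal sketch in Python with boto3, assuming a hypothetical role name:

    # Guardrail sketch: humans in Prod get read-only access; all writes
    # must flow through the pipeline's deployment role.
    # "Prod-Human-ReadOnly" is an illustrative role name.
    import boto3

    iam = boto3.client("iam")
    iam.attach_role_policy(
        RoleName="Prod-Human-ReadOnly",
        PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",  # AWS managed policy
    )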

[Picture (a): zero-touch-cloud-deployment]

In picture (a), the IAM role “Shared-Deployment-Role” in the “Shared Deployment” account assumes IAM roles in the workload accounts to deploy resources. The workload accounts can have additional roles that users assume to log in to a specific account; users may have read-only access in the Prod account to view services and resources. The “Deployment-Role” in each workload account is created along with the initial infrastructure layer using an IaC tool (AWS CloudFormation/Terraform/AWS CDK) and pipelines (CodePipeline/GitLab/Jenkins/Bitbucket). AWS CodePipeline is configured in the Shared Deployment account, and the IaC templates are stored in an AWS CodeCommit repository for version control.
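The cross-account hop in picture (a) boils down to an STS AssumeRole call. Here is a minimal sketch in Python with boto3; the account ID and role names are placeholders:

    # Sketch: the pipeline in the Shared Deployment account assumes a
    # workload account's Deployment-Role and gets a scoped session.
    import boto3

    def workload_session(account_id: str, role_name: str = "Deployment-Role") -> boto3.Session:
        sts = boto3.client("sts")
        creds = sts.assume_role(
            RoleArn=f"arn:aws:iam::{account_id}:role/{role_name}",
            RoleSessionName="shared-deployment-pipeline",
        )["Credentials"]
        return boto3.Session(
            aws_access_key_id=creds["AccessKeyId"],
            aws_secret_access_key=creds["SecretAccessKey"],
            aws_session_token=creds["SessionToken"],
        )

    # e.g., verify we are operating inside the Dev workload account
    dev = workload_session("111111111111")
    print(dev.client("sts").get_caller_identity()["Arn"])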

[Picture (b): zero-touch-cloud-deployment]

Picture (b) gives a high-level view of how application and infrastructure deployment pipelines would look in the AWS Cloud.

Infrastructure Layer:

Using CloudFormation templates, CodeBuild, and CodePipeline, we deploy resources such as (but not limited to) deployment IAM roles, VPCs, subnets, Transit Gateways and attachments, and Route 53 hosted zones. These services and resources are necessary to deploy and launch the application. The resource ID/ARN values are stored in Parameter Store for consumption by the application’s IaC templates. Parameter Store helps in developing reusable IaC templates. How? By creating Parameter Store keys with the same names across all the workload accounts and letting the infrastructure templates update the values dynamically. Deployment of the infrastructure layer is generally managed by the organization’s IT team, using approved AWS services and the organization’s cloud best practices.
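To illustrate the same-name-across-accounts pattern, here is a minimal sketch: the infrastructure layer publishes resource IDs under fixed key names, and any application template or script resolves them in whichever account it runs. The parameter names are illustrative.

    # Infra side: publish resource IDs under the same key names in every
    # workload account.
    import boto3

    def publish_network_outputs(session: boto3.Session, vpc_id: str, subnet_ids: list) -> None:
        ssm = session.client("ssm")
        ssm.put_parameter(Name="/infra/vpc-id", Value=vpc_id,
                          Type="String", Overwrite=True)
        ssm.put_parameter(Name="/infra/private-subnet-ids", Value=",".join(subnet_ids),
                          Type="StringList", Overwrite=True)

    # App side: the same lookup works unchanged in Dev, UAT, and Prod.
    def lookup_vpc_id(session: boto3.Session) -> str:
        return session.client("ssm").get_parameter(
            Name="/infra/vpc-id")["Parameter"]["Value"]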

Application Layer:

Every application in an organization can differ in the services required to host it in the cloud. Application developers or DevOps teams can choose any one, or a combination, of the approved CI/CD and IaC tools to design and host the application in the workload accounts. Teams can leverage CodePipeline, CodeBuild, and CodeDeploy in the Shared Deployment account to build and deploy applications into the workload accounts by assuming the respective “Deployment” roles. Recall that the IT team has created parameters holding the resource IDs/ARNs that application templates can consume. Teams are encouraged to adopt an Agile model for developing, testing, and deploying application templates, ensuring that only clean, tested code and templates go into production.
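Putting the pieces together, a deployment step might assume the workload account’s role, resolve the shared parameters, and launch the application stack. A hypothetical sketch; the stack name, template, and parameter names are placeholders:

    # Sketch: deploy an application stack into a workload account using an
    # assumed-role session (see the earlier workload_session sketch).
    import boto3

    def deploy_app_stack(session: boto3.Session, template_body: str) -> None:
        vpc_id = session.client("ssm").get_parameter(
            Name="/infra/vpc-id")["Parameter"]["Value"]
        session.client("cloudformation").create_stack(
            StackName="my-app",
            TemplateBody=template_body,
            Capabilities=["CAPABILITY_NAMED_IAM"],
            Parameters=[{"ParameterKey": "VpcId", "ParameterValue": vpc_id}],
        )

Note that CloudFormation can also resolve SSM values natively via the AWS::SSM::Parameter::Value<String> parameter type, which removes even this lookup from the script.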

Conclusion:

There is no single best way to design infrastructure and application deployments; size, complexity, cost, and time determine what is optimal. A Zero Touch cloud deployment strategy can combine infrastructure and application components in many permutations, but the motive behind the approach is the same: minimizing human errors, and with them, many sleepless nights.

DevOps is not a new term in the software world, but it has certainly been the magic wand that sped up digital transformation; in a sense, the entire SaaS product story has been written with the help of DevOps. In today’s VUCA world, digital services aren’t simply nice to have but are a basic expectation of consumers and enterprise customers alike. Across the digital transformation journey, DevOps aligns closely with business goals, ensuring the experiences companies deliver form a seamless, customer-delighting whole.

Continuous integration and delivery, supported by excellent tooling, allow companies to build entire products as individual chunks of functionality. These chunks, captured as user stories, can be developed and deployed to production in a day or two rather than weeks or months. That has genuinely changed the game in product development.

The Product Led Approach (PLA) driven by DevOps has created a culture in which the goal goes beyond delivering a fixed set of requirements on time and on budget. Scripts that set up the entire deployment infrastructure, including software-defined networking, are managed just like the source code of the services running on them. Business-centric services that evolve quickly and independently, combined with frequent and reliable releases, finally put the old dream of reusable, re-combinable components within companies’ reach.

How can DevOps help in Digital Transformation?

  • Maturity Model: DevOps is the aggregation of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity, evolving and improving products at a faster pace than organizations using conventional software processes. Enterprises are moving from large, monolithic applications to smaller, loosely coupled microservices. This enables them to act faster, better adapt to changing markets, and grow more effectively toward their business goals. Companies use DevOps continuous delivery practices that help teams take ownership of these services and release updates faster.
  • Break Organization Silos to Collaborate: DevOps drives a collaborative thought process and a change in mindset. It helps organizations achieve digital transformation by breaking down silos and paving the way for continuous innovation and agile experimentation. With a DevOps model, development and operations teams are no longer “isolated”; instead, DevOps encourages better communication between the two and creates development channels that enable continuous integration. Software problems are identified and resolved faster, and fixes are deployed sooner.
  • Organize Process around Customers: The increased speed allows companies to serve their customers better and compete in the marketplace. Processes can be seamlessly designed and refined around customers’ business needs, helping them achieve higher-value growth. Combined with rich digital telemetry from modern monitoring and observability tools, this yields a strong understanding of our systems that helps reduce mean time to recovery (MTTR) and lets teams truly take ownership of production services.
  • Build an Experimental Mindset: Experimentation is fundamental to success amid today’s rapidly changing technology stack. DevOps provides the speed of experimentation the business needs to reliably implement ideas, launch them into the market, and start learning again.
  • DevOps and Cloud: Cloud is part of almost every digital transformation journey, and DevOps and cloud are thoroughly synergistic. This powerful combination empowers developers to respond to business needs in near real time; the latency of software development has become a thing of the past. The partnership of DevOps with cloud has given rise to a new term, “CloudOps”. Advances in CloudOps have lowered organizations’ total cost of ownership, with a direct impact not only on top-line revenue and market share but also on innovation capability and response time. The cloud was created largely to tackle the challenges of availability, scalability, and elasticity under dynamic demand, and CloudOps applies the DevOps principles of CI/CD to realize high-availability best practices by refining and optimizing business processes.