Large Language Models are reshaping the telecoms industry across multiple applications and domains.
They can boost information and data exchange, unlocking new levels of efficiency in obtaining crucial information from different inventory platforms to drive process automation.
Large Language Models (LLMs) will transform the power, usefulness, and efficiency of telecoms networks. That’s because, like other industries, ours has access to huge volumes of customer data, which can be analysed and processed to create large data sets for business strategy and customer behaviour insights, and for predicting and rectifying potential network and QoS issues. This enlarged and enriched data set can be used by LLMs to rapidly process queries and exchange information.
LLMs are a branch of Artificial Intelligence (AI) that serve as foundational models for AI applications. They can generate text on any subject, extract key information or data, translate data sources into useful or common formats, provide business and customer insight, and make decisions to rectify issues; they can also be embedded into workflows to enable intelligent data handling and automation.
While there are many promising avenues being explored for the adoption of LLMs, one that has immediate benefits is for handling data accessibility and availability from network inventory platforms.
Large Language Models – processing inventory data
The inventory platform is central to the operation of today’s networks. It’s not only a key repository of asset information, but it’s also a resource that is fundamental to the processing of multiple workflows and processes.
Operators may run a number of inventory processing systems, each including data on a related set of network assets and services. This data needs to be accessible to other systems – you can’t, for example, provision a new connection to a business if you don’t know what resources are available and their status.
So, during workflows, other systems may request information from an inventory platform, which then determines the actions that can be taken. To use the example above, a business customer may make a request to upgrade an existing connection from, say, 500Mbps to 1Gbps.
For this to be processed automatically, we need to know if the existing connection can be upgraded to the desired capacity – and the answer to this question depends on information held in the inventory platform.
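As an illustration, the capacity check behind such an automated upgrade decision might look like the following sketch. The record fields and the headroom rule here are assumptions made for the example, not the schema or logic of any particular inventory platform.

```python
# Hypothetical sketch of an automated upgrade-eligibility check.
# The inventory record structure below is an illustrative assumption,
# not the schema of a real inventory platform.

def can_upgrade(record: dict, requested_mbps: int) -> bool:
    """Return True if the port serving this connection has headroom
    for the requested rate."""
    port_capacity = record["port_capacity_mbps"]
    provisioned = record["provisioned_mbps"]
    current = record["current_service_mbps"]
    # Headroom = total port capacity, minus everything already
    # provisioned, plus the service being replaced by the upgrade.
    headroom = port_capacity - provisioned + current
    return requested_mbps <= headroom

# Example: a 1 Gbps port carrying a 500 Mbps service with no other load.
record = {"port_capacity_mbps": 1000,
          "provisioned_mbps": 500,
          "current_service_mbps": 500}
print(can_upgrade(record, 1000))  # upgrade 500 Mbps to 1 Gbps
```

The point is that the decision is only as good as the inventory data behind it, which is why accessibility and accuracy of that data matter so much.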
To achieve this, we need integration between the inventory and the process that needs to retrieve the information. However, raw inventory data can exist in different formats, which are not compatible or readable by the system requesting the information. This raises challenges for integration, particularly if an inventory platform uses a proprietary or legacy format that is not well understood.
Of course, anything can be integrated with anything else – with the right tools – but LLMs offer a potentially more efficient way to manage this integration and data exchange in the future and to close silos that may exist in many networks.
That’s because, with the right training, an LLM can convert different formats of data (and other kinds of input) into a standard format so that it can be more easily parsed and processed, without human intervention.
In other words, the data ingested can then become the resource available for interrogation – and is thus made accessible to other systems that can obtain this information through the interfaces available.
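The mediation idea can be sketched as follows. Here `llm_normalise` stands in for a real model call; it is stubbed with a trivial parsing rule so the example runs, and the legacy format, target schema, and all field names are illustrative assumptions.

```python
import json

# Sketch of LLM-based format mediation. llm_normalise() stands in for a
# real LLM call; here it is stubbed with a simple rule so the example
# runs. The schema and field names are illustrative assumptions.

TARGET_SCHEMA = ["asset_id", "asset_type", "capacity_mbps", "status"]

def llm_normalise(raw_record: str) -> dict:
    """Stub for an LLM prompted to map a proprietary, semi-structured
    inventory record onto the target schema as JSON."""
    # A real implementation would send raw_record plus the target
    # schema to a model; this stub parses a 'key=value;' legacy format.
    fields = dict(p.split("=") for p in raw_record.strip(";").split(";"))
    return {
        "asset_id": fields["id"],
        "asset_type": fields["type"],
        "capacity_mbps": int(fields["cap"]),
        "status": fields["state"].lower(),
    }

legacy = "id=PORT-0042;type=ethernet_port;cap=1000;state=ACTIVE;"
record = llm_normalise(legacy)
print(json.dumps(record))
```

Once every source is mapped onto one schema, any requesting system can consume the data without knowing each platform’s native format.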
In this case, the LLM acts as a data mediation bridge between different platforms – unlocking access to data scattered, for example, across multiple network inventory solutions. We’ve already seen in early trials that applying an LLM can dramatically accelerate information retrieval from inventory solutions and enable data to be automatically requested and passed to the interrogating system.
In turn, this enhances existing automation flows – and accelerates data collection and presentation to the processes that consume it, so when one system needs information from the inventory platform, the LLM can obtain the necessary data, from the correct solution.
Using LLMs to close data gaps in the inventory
But LLMs can do more than simply convert data into a consistent, usable format. They can check data formats for expected information and parameters, to ensure data consistency. Data consistency is central to the role of inventory systems, since these are supposed to provide an accurate ‘single source of truth’ for the network.
Even if multiple systems are in place, covering, for example, different network domains – legacy access, fibre, transport, business services and so on — each needs to be as accurate as possible. However, gaps in the data stored will inevitably emerge: exposing such gaps through the automated use of LLMs is therefore a key additional benefit.
Where gaps are found, these can be raised as alerts, forcing actions to take place – reconciliation from OSS systems outside of normal procedures, manual interventions to add missing data, and so on.
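A minimal sketch of that gap-and-alert flow, checking normalised records for expected fields, might look like this; the required fields and the alert structure are assumptions for the example, not a real alerting interface.

```python
# Sketch of a consistency check over normalised inventory records.
# Field names and the alert structure are illustrative assumptions.

REQUIRED_FIELDS = {"asset_id", "asset_type", "capacity_mbps", "status"}

def find_gaps(records: list[dict]) -> list[dict]:
    """Return one alert per record that is missing expected fields, so
    downstream workflows can trigger reconciliation or manual fixes."""
    alerts = []
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            alerts.append({
                "asset_id": rec.get("asset_id", "<unknown>"),
                "missing_fields": sorted(missing),
                "action": "reconcile_from_oss",
            })
    return alerts

records = [
    {"asset_id": "PORT-0042", "asset_type": "ethernet_port",
     "capacity_mbps": 1000, "status": "active"},
    {"asset_id": "PORT-0043", "asset_type": "ethernet_port"},  # gap
]
alerts = find_gaps(records)
print(alerts)
```

Each alert then becomes a trigger for the corrective actions described above, rather than a silent inconsistency.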
As a result, the overall accuracy of the inventory platforms in place (which are foundational to operational performance) can also be enhanced through use of the LLM, as well as driving efficiency for workflows.
The capabilities of LLMs mean that the telecoms industry is holding a powerful tool in its hands. They provide multiple benefits, including analysis of complex data and information far beyond human capabilities, correlation of trends in customer behaviour, combining data sets from multiple sources, predicting potential network faults and maintenance needs, enabling automation, providing auditing and compliance data, and much more.
Automated standardisation of data formats using LLMs
A further bonus is that LLMs can interpret and combine data from the multiple inventory systems that have been deployed, almost haphazardly, over the years, helping to maintain efficient, integrated operations and equipment planning. Inventory systems both collect and expose data: as system status changes, as networks expand, as devices are shipped and activated, and so on.
The We Are CORTEX (WAC) platform collects data from relevant inventory systems – a process that can be automated and enhanced over time, accelerating data collection, translation, and presentation. If there are gaps in the data collection or clashes of formats, our solution can then raise alerts, enforce remedial actions, and through LLMs, improve over time.
For telecoms, LLMs can improve accuracy and efficiency across multiple domains, including service assurance, customer service, network efficiency, data management, auditing and compliance, and much more.
To find out how WAC’s platform can help you apply LLMs to drive automation between systems, contact us today.