2025 in Review: From Complexity to Cognitive Operations | Charting Our Course for 2026 

Reflections on 2025: Progress and Transformation at Digis Squared 

As we approach the end of the year, it is worth pausing to reflect on the significant technological shifts that have characterized 2025. At Digis Squared, our focus has been not only on innovation, but also on achieving measurable progress in addressing the ongoing challenges faced by Mobile Network Operators (MNOs) and the telecommunications sector. This year has marked a turning point, where the concept of intelligent networks has begun to materialise in practical operations. This transformation has been fuelled by the urgent need to manage increasingly complex systems, respond to mounting cost pressures, and navigate a rapidly changing threat environment. 

Operational Consolidation and Application 

In my role as Chief Technology Officer, I have witnessed 2025 becoming the year of consolidation and practical application. Our collective efforts have moved beyond the initial excitement surrounding emerging technologies such as generative AI. Instead, we have dedicated ourselves to the tangible development of networks that are capable of self-awareness and self-optimisation. The momentum established through the deployment of our solutions and the insights gained from real-world implementations now positions us strongly for an even more dynamic year ahead in 2026. 

Looking Back and Ahead: Addressing Core Challenges and Future Trends 

This review aims to summarize the fundamental challenges currently facing the telecoms industry, to highlight the ways in which our technologies are directly tackling these issues, and to share my perspective on the trends that are likely to shape the future of our sector. 

The Complexity Crisis in Modern Telecom Networks 

The modern telecommunications network presents a formidable challenge for operators due to its intricate composition. These networks are built from a vast array of equipment sourced from multiple vendors, layered with technologies that span generations—from 2G through to 5G—and encompass a diverse range of domains such as Radio Access Networks (RAN), Core networks, Transport, Business Support Systems (BSS), and IP Multimedia Subsystems (IMS). 

This inherent complexity has evolved into a significant crisis for Mobile Network Operators (MNOs). The interplay of different vendors and technologies, combined with the multifaceted nature of network operations, has resulted in a landscape where operational efficiency is frequently compromised. Critical challenges have emerged, each demanding the implementation of cognitive solutions to restore control, streamline processes, and ensure the ongoing viability and security of telecom operations. 

  • Operational Inefficiency: the sheer volume of alarms and data makes root cause analysis slow and manual. Impact: long Mean Time To Resolution (MTTR) and high operational expenditure (OPEX).
  • Budget Constraints: pressure to reduce OPEX while simultaneously investing in 5G, 6G R&D, and network modernization. Impact: a need for highly efficient, automated tools that minimize human intervention.
  • Cybersecurity Exposure: the vast attack surface of IoT devices and distributed networks, coupled with sophisticated AI-driven threats. Impact: non-negotiable security requirements, demanding proactive, predictive defense and Zero Trust adoption.
  • Vendor Lock-in & Silos: disparate vendor systems create data silos, hindering end-to-end visibility. Impact: costly, inflexible contracts and an inability to optimize across the entire network.
  • Sustainability Mandate: massive energy consumption from 5G and data centers drives a rapid transformation toward Net Zero. Impact: economic and regulatory pressure to integrate energy efficiency into network design.

Reflections on 2025: From Potential to Practice 

As I reflect on the entirety of 2025, it becomes evident that the year was marked not by a wave of groundbreaking inventions, but rather by the steady advancement and practical implementation of robust, established technologies. Over the past twelve months, the narrative shifted decisively from the anticipation of future possibilities to the tangible realisation of enterprise-ready solutions. 

Across several critical sectors, we observed a transition: technologies that once held only theoretical promise are now being deployed at scale and integrated into real-world business operations. This maturation has underscored the value of reliable, effective tools, demonstrating how innovation is as much about refinement and application as it is about invention. 

  1. Generative AI Democratizes Network Operations 

If previous years were about the novelty of Generative AI, 2025 was about its utility. We moved beyond simple automation to true cognitive operations by integrating GenAI directly into the operational workflow. 

  2. The Strategic Importance of Digital Twin Technology 

The concept of a Digital Twin—a virtual replica of a physical system—is not new, but its application in telecom reached a critical maturity point in 2025. This technology allows MNOs to simulate complex changes, test new configurations, and predict the impact of traffic surges before they occur on the live network. 

  3. Private 5G Powering Industry 4.0 

The private network market accelerated rapidly, with projections showing a 65.4% CAGR in private 5G connections through 2030. Industries moved beyond proof-of-concept to live deployment, particularly in logistics and manufacturing. 
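
A quick back-of-the-envelope calculation shows what a 65.4% CAGR implies over that horizon; the starting base of 1.0 below is a normalized placeholder, not a real connection count.

```python
# What a 65.4% CAGR (the projection cited above) means when compounded
# from 2025 to 2030. The base of 1.0 is normalized, not a real count.

CAGR = 0.654

def projected_growth(years: int, cagr: float = CAGR) -> float:
    """Cumulative growth multiple after `years` of compounding."""
    return (1 + cagr) ** years

# Five years of compounding, 2025 -> 2030:
multiple = projected_growth(5)
# A 65.4% CAGR compounds to roughly a 12x increase over five years.
```

In other words, the projection implies the private 5G connection base growing by an order of magnitude within the decade.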

  4. Cybersecurity as a Core Business Function 

The increasing complexity of our networks and the rise of sophisticated AI-driven threats made cybersecurity a top priority in 2025. The focus shifted from reactive defense to proactive, predictive security postures. We saw greater adoption of Zero Trust architectures and AI-powered threat detection systems. For telecom operators, securing the vast attack surface of IoT devices and distributed networks became a non-negotiable aspect of service delivery. 

  5. 5G Expansion and the Dawn of 6G Research 

Throughout 2025, 5G continued its global expansion, particularly in emerging markets where its impact on enterprise connectivity is most profound. 6G research also accelerated sharply in 2025, targeting specific enablers such as cell-free MIMO (CF-MIMO), Reconfigurable Intelligent Surfaces (RIS), and Integrated Sensing and Communication (ISAC). 

  6. Field Testing Takes to the Skies (Green Operations) 

2025 marked a turning point for field-testing sustainability. Traditional drive testing, with its high fuel consumption and limited access to difficult terrains, began to give way to drone-based solutions. 

  7. AI-Automated Planning & The 15-Minute Optimization 

The era of manual network planning effectively ended in 2025. The complexity of multi-layer networks made manual parameter tuning obsolete, and we also saw a massive increase in demand for SMART CAPEX and OPEX solutions. 
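
To see why manual tuning breaks down, consider just one of the parameters involved: Physical Cell Identity (PCI) assignment is essentially a graph-coloring problem, where neighboring cells must not share an identity. The toy greedy sketch below is illustrative only, not any vendor's production algorithm.

```python
# PCI assignment framed as graph coloring: neighboring cells must not
# share a Physical Cell Identity. A toy greedy sketch -- illustrative
# only, not a production planning algorithm. (LTE defines 504 PCIs.)

def assign_pcis(neighbors: dict[str, set[str]], num_pcis: int = 504) -> dict[str, int]:
    """Greedily give each cell the lowest PCI unused by any of its neighbors."""
    pci: dict[str, int] = {}
    for cell in sorted(neighbors):
        used = {pci[n] for n in neighbors[cell] if n in pci}
        pci[cell] = next(p for p in range(num_pcis) if p not in used)
    return pci

# Four cells in a chain: A-B, B-C, C-D.
topology = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}
plan = assign_pcis(topology)
# No two neighboring cells end up with the same PCI.
```

Scale that from four cells to tens of thousands, across dozens of interacting parameters, and the case for AI-driven planning makes itself.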

  8. Active Probing & Service Assurance 

As networks became more complex, “passive” monitoring was no longer enough. 2025 saw a rise in Active Service Assurance, in which continuously running test traffic verifies service quality end to end rather than waiting for faults to surface. 

  9. Quantum Computing’s Cautious Breakthroughs 

While still in its early stages, quantum computing made tangible progress in 2025. We saw advancements in qubit stability and error correction, moving quantum systems closer to solving real-world optimization problems that are intractable for classical computers. For telecom, this holds immense promise for complex tasks like network routing optimization and spectrum management. Though widespread application remains on the horizon, the progress this year has solidified its potential as a game-changing technology. 

2026 Technological Outlook and Predictions 

As the telecommunications industry builds upon the momentum generated in 2025, the year ahead is set to deliver significant advances in network intelligence and integration. The following are detailed forecasts for the principal trends that are expected to shape 2026: 

  1. AI-Native Networks Become the Industry Standard 

The application of artificial intelligence will evolve beyond mere overlays. In 2026, the emergence of AI-native networks will see machine learning deeply embedded within the air interface and protocol stack. Key network elements, such as Radio Units and Distributed Units, will develop “Sense-Think-Act” capabilities, empowering them to make micro-adjustments in beamforming and spectrum allocation within milliseconds. 

  2. The Dark NOC as a Strategic Priority 

In response to ongoing budgetary pressures, the Dark NOC—where up to 80% of operational tasks are autonomously managed—will become a primary strategic objective for mobile network operators (MNOs). This shift will redefine the role of human engineers, transitioning their focus from routine monitoring and repairs to governance and design. The reliance on vendor-agnostic platforms will increase, aiding in the unification of data across fragmented ecosystems. 

  3. Transition to Post-Quantum Cryptography (PQC) 

The migration towards Post-Quantum Cryptography will progress from the planning stages to phased execution. Telecom operators are anticipated to begin upgrading encryption keys and hardware security modules, aiming to counteract the risk posed by potential adversaries who may harvest encrypted data now, with the intention of decrypting it in the future as quantum computing matures. 
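
The first concrete step of such a migration is usually a cryptographic inventory: finding every asset still protected by classical public-key algorithms. The sketch below is a hypothetical illustration of that triage step; the asset records and field names are invented, not a real API.

```python
# Hypothetical first step of a PQC migration: inventory key material and
# flag algorithms exposed to "harvest now, decrypt later" attacks.
# Asset names and record fields are illustrative only.

QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "ECDH-P256"}

def triage(inventory: list[dict]) -> list[dict]:
    """Return assets that should be prioritized for PQC key rotation."""
    return [a for a in inventory if a["algorithm"] in QUANTUM_VULNERABLE]

assets = [
    {"name": "hsm-core-01", "algorithm": "RSA-2048"},
    {"name": "tls-edge-07", "algorithm": "ECDH-P256"},
    {"name": "pqc-pilot-01", "algorithm": "ML-KEM-768"},  # already post-quantum
]

at_risk = triage(assets)  # the two classical-crypto assets
```

Only once this inventory exists can operators sequence the upgrade of encryption keys and hardware security modules in order of exposure.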

  4. Intensification of 6G Standards Prototyping 

The development of 6G standards will move from theoretical discussions to tackling targeted engineering challenges. One of the central focuses will be Semantic Communications, which shifts the paradigm from transmitting raw bits to communicating actual meaning. By leveraging AI to filter out irrelevant data at the source, 6G networks are expected to realise substantial efficiency gains. 

  5. Agentic AI and the Advancement of Cybersecurity Autonomy 

2026 is poised to be recognised as the “Year of the Defender” due to the emergence of Agentic AI—autonomous systems capable of independent decision-making. These intelligent agents will prove indispensable in defending against AI-driven identity attacks. Unlike conventional static playbooks, Agentic AI can reason, adapt, and autonomously develop defence strategies to address vulnerabilities in real-time. 

  6. Private Networks Will Drive Enterprise Digitalization 

The demand for private networks will accelerate in 2026 as more enterprises recognize their value in achieving secure and reliable connectivity for mission-critical operations. 

2025: A Year of Achievement 

Throughout 2025, we played a pivotal role in driving technological advancements for our clients and customers. Acting as a cornerstone of progress, our team delivered a series of significant accomplishments that underscored our commitment to innovation and excellence. The following is a sample of the achievements realised during the year, reflecting our ongoing dedication to supporting our clients’ evolving needs and enabling their success in an increasingly complex technological landscape. 

  • Our KATANA platform proved essential in tackling vendor fragmentation, offering a vendor-agnostic “Single Pane of Glass” that integrates data from Huawei, Nokia, Ericsson, and new Open RAN vendors into one topology view. This enabled Zero Touch Provisioning (ZTP) via our iMaster module, allowing devices to be automatically configured and integrated upon connection, regardless of the hardware vendor. 

  • We introduced ConnectSphere, an extension of our testing capabilities that deploys static probes in VIP areas and strategic locations. Unlike traditional drive tests, which are periodic, these probes run 24/7 service testing to assure quality in real time. This allows for proactive fault detection in both the Core and Radio domains, ensuring that high-value enterprise clients receive the Quality of Service (QoS) they demand. 

  • We doubled down on OctoMind, our AI-based planning platform. By leveraging modules like X-Planner (for PCI/RSI planning) and ACP (Automatic Cell Planning), we moved from manual processes that took days to AI-driven optimization that takes just 15 minutes. This automation improved network performance metrics from 68% to 94% while eliminating human error in parameter configuration. 

  • We saw a surge in specialized use cases where Wi-Fi was insufficient. For instance, in pharmaceutical warehouses, we deployed Private 5G to support mission-critical forklift automation, autonomous mobile robots (AMR), and collision avoidance systems that require ultra-low latency. By implementing Managed Services that include L1 maintenance and radio optimization, we ensured these networks met stringent SLAs for reliability and security, keeping data strictly on-premises. 

  • Through our INOS Air solution (in collaboration with DJI Enterprise), we enabled operators to conduct 5G, 4G, and 3G testing at specific altitudes (5m to 20m). This allows for testing in recreational areas, dense industrial compounds, and over water (marinas and ports) where cars simply cannot go. This shift not only expanded accessibility but also slashed OPEX and carbon footprints, aligning perfectly with the industry’s Net Zero goals by utilizing electric drones instead of fuel-heavy vehicles. 

  • We introduced GenAI Chatbots within our KATANA platform to democratize data access. Engineers no longer need complex SQL skills; they can simply ask natural language questions like “Which sites have hardware faults right now?” or “Why is the throughput low in site X?” to get instant, actionable insights. This capability allows non-technical staff to create business intelligence reports and resolve performance issues faster, significantly reducing the barrier to entry for advanced network analysis.  

  • We are actively leveraging and integrating Digital Twin technology into our cognitive solutions. By integrating the Digital Twin with the Network Data, we provide MNOs with a powerful tool for predictive Coverage and Capacity. This capability is a game-changer for planning and optimization, allowing MNOs to move from simply reacting to network events to proactively optimizing their infrastructure, ensuring network slicing integrity, and managing resources with unprecedented foresight. 

Digis Squared Focus in the Next Chapter 

Our strategic focus for the coming year is clear: to leverage our cognitive solutions to solve the industry’s most pressing challenges. 

Private 5G: Cognitive Operations at the Edge 

The successful deployment of Private 5G will hinge on the ability to deliver cognitive operations at the edge. Our focus will be on ensuring the success of these mission-critical networks by providing the necessary tools to manage unique demands. Tools like KATANA and ConnectSphere will be vital for ensuring real-time fault isolation, predictive capacity planning, and dynamic resource allocation for industrial automation. 

Net Zero: Energy Efficiency as a Strategic Priority 

The transformation toward Net Zero is a strategic priority. We will continue to refine our AI and cognitive tools to optimize network elements, dynamically power down resources based on predictive traffic patterns, and ensure that energy efficiency is a core design principle. 

  • Example: Our cognitive engine can predict low-traffic periods 72 hours in advance to automatically place specific 5G-enabled sites into an ultra-low power state, maximizing OPEX savings. 
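
A minimal sketch of the decision step such an engine might feed is shown below: given a traffic forecast for a site, schedule an ultra-low power state only if the entire window stays below a threshold. The threshold and forecast values are illustrative placeholders, not figures from our deployments.

```python
# Toy forecast-driven power-state decision: schedule an ultra-low power
# state for a 5G site only if predicted traffic stays below a threshold
# for the whole window. Threshold and data are illustrative.

LOW_TRAFFIC_MBPS = 50.0

def schedule_power_down(forecast_mbps: list[float]) -> bool:
    """True if every forecast sample in the window is below the threshold."""
    return all(v < LOW_TRAFFIC_MBPS for v in forecast_mbps)

night_window = [12.0, 8.5, 9.1, 14.7]    # predicted 02:00-06:00 traffic
busy_window  = [220.0, 180.0, 95.0]      # predicted evening traffic

assert schedule_power_down(night_window)      # eligible for low power
assert not schedule_power_down(busy_window)   # stays fully powered
```

The real value comes from the forecasting model feeding this rule; the power-down decision itself should stay simple and auditable.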

Expanding the Cognitive Toolkit 

We will continue to expand the capabilities of our core platforms, further integrating the predictive power of the Digital Twin with the real-time insights of KATANA and the customer-centric metrics of INOS. Our goal is to provide a unified, vendor-agnostic platform that allows MNOs to achieve true operational autonomy and move confidently toward the Dark NOC. 

Closure  

The path forward is clear: the only way to manage the complexity, cost, and security demands of modern telecom networks is through intelligence. For telecom operators in emerging markets, these advancements offer a chance to leapfrog legacy systems and build agile, efficient, and intelligent networks. 

At Digis Squared, we are committed to turning these technological possibilities into operational realities for our partners. By focusing on cognitive operations powered by KATANA, INOS, OctoMind, and Digital Twin technology, we are not just optimizing networks; we are building the future of telecommunications—a future that is more efficient, more secure, and more sustainable. 

The Road to 6G: Engineering Breakthroughs in the Terahertz Spectrum

While the theoretical possibilities are exciting, I’ve learned throughout my career that theory represents only one side of the equation. On the other side lies reality: signal loss, energy constraints, component limitations, and the unforgiving properties of our atmosphere. Today, I want to examine the engineering challenges that 6G must overcome to transform its spectral ambitions into practical, deployable technology.

Perhaps the most fundamental challenge we face is propagation loss. As frequencies increase, free-space path loss grows with the square of frequency (another 20 dB of loss for every tenfold increase), a physical reality that cannot be engineered away. At 100 GHz, signal attenuation is already substantially higher than in traditional 5G bands. By the time we reach 1 THz, even a few meters of distance can drastically degrade signal strength. This isn’t merely an inconvenience; it fundamentally reshapes how we must approach network architecture. 6G will require advanced beamforming techniques, ultra-short-range cells, or reconfigurable intelligent surfaces (RIS) just to maintain basic communication links at these frequencies.
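
The standard free-space path loss formula makes the scaling concrete. The short computation below compares the loss over a fixed 100 m link at a mid-band 5G carrier, a candidate sub-THz window, and a 1 THz carrier.

```python
# Free-space path loss in dB:
#   FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
# Comparison over a fixed 100 m link at three carrier frequencies.
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

d = 100.0  # meters
loss_35ghz  = fspl_db(d, 3.5e9)   # mid-band 5G
loss_140ghz = fspl_db(d, 140e9)   # candidate 6G sub-THz window
loss_1thz   = fspl_db(d, 1e12)    # upper end of 6G ambitions

# Going from 3.5 GHz to 1 THz adds about 49 dB of path loss --
# nearly a factor of 100,000 in received power, from frequency alone.
extra_loss = loss_1thz - loss_35ghz
```

Antenna gain can claw some of this back, which is exactly why the beamforming and RIS techniques mentioned above become mandatory rather than optional at these frequencies.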

Atmospheric absorption presents another significant hurdle. In sub-THz and THz ranges, atmospheric gases—particularly water vapor and oxygen—absorb electromagnetic waves in ways that create distinct challenges for wireless communication. Absorption peaks occur at specific frequencies: water vapor lines at 183 GHz and 325 GHz, and oxygen lines near 60 GHz and 118 GHz, effectively creating “spectral dead zones” where long-range communication becomes impractical. Our strategy must therefore focus on identifying and utilizing transparency windows (such as 140 GHz) for viable communication links, while allocating other frequency bands for indoor or ultra-dense deployment scenarios where atmospheric effects are minimized.

The hardware requirements for THz communication represent perhaps the most immediate practical challenge. Today’s RF integrated circuits and front-end modules simply weren’t designed for terahertz operation. Silicon CMOS technology, the workhorse of modern wireless systems, begins to hit fundamental performance limits beyond 200 GHz. Alternative semiconductor technologies like Gallium Arsenide (GaAs) and Indium Phosphide (InP) show promise but remain expensive and less amenable to mass production. Beyond the semiconductors themselves, waveguide components, antennas, and packaging become highly lossy and mechanically delicate at these frequencies. Innovation pathways include hybrid integration approaches, nanophotonic technologies, plasmonic antennas, and metamaterials, all of which require substantial research investment before commercial viability.

Power efficiency emerges as another critical bottleneck. Power amplifiers operating at THz frequencies currently suffer from poor efficiency, generating excessive heat while delivering limited output power. In battery-constrained mobile devices, this inefficiency could render many theoretical applications impractical. Addressing this challenge will require multifaceted approaches: AI-driven energy management systems, novel energy harvesting techniques, and beam-aware hardware designs that minimize power consumption when full-power transmission isn’t necessary.

Precision timing and synchronization take on new importance at these frequencies. With the ultra-short wavelength characteristic of THz signals, even nanosecond-level timing errors can destroy link integrity. This impacts not just data transmission reliability but also the accuracy of sensing and positioning applications that 6G promises to enable. Meeting these requirements will demand high-stability clock sources, potentially including quantum timing references, and integrated sensing-transmission designs that maintain phase coherence across multiple functions.
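
A two-line calculation shows why the tolerance is so brutal at these frequencies: count how many full carrier cycles elapse within a given timing error.

```python
# Why nanosecond-scale timing errors destroy THz links: count the
# carrier cycles that fit inside a given timing error.

def cycles_of_error(freq_hz: float, timing_error_s: float) -> float:
    """Carrier cycles spanned by a timing error of the given duration."""
    return freq_hz * timing_error_s

# At a 300 GHz carrier the period is ~3.3 ps, so a 1 ns timing error
# slips the signal by roughly 300 full cycles of phase.
slip = cycles_of_error(300e9, 1e-9)
```

At sub-6 GHz the same 1 ns error spans only a few cycles; at 300 GHz it spans hundreds, which is why picosecond-class clock sources move from a luxury to a requirement.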

The testing and simulation infrastructure for THz systems remains underdeveloped. Existing RF testbeds rarely extend beyond 100 GHz, creating a gap between theoretical models and practical verification. Simulation models for THz propagation are still evolving, and standards for THz-specific channel models are under development but not yet finalized. Without robust tools for repeatability and comprehensive test systems, mass deployment of THz technology remains speculative at best.

Finally, ecosystem fragmentation presents a strategic challenge. Unlike 5G, which benefited from relatively rapid ecosystem convergence around specific bands and technologies, 6G’s spectral frontiers are being explored in different frequency ranges across various countries and research institutions. Technical definitions and key performance indicators lack harmonization, and mainstream OEM and chipset vendor roadmaps have yet to fully incorporate these advanced frequency bands. This fragmentation could slow development and increase costs unless addressed through coordinated international efforts.

Despite these formidable challenges, I see tremendous beauty in the struggle to overcome them. These obstacles aren’t roadblocks; they’re invitations to innovate in ways that will transform not just telecommunications but multiple scientific and engineering disciplines.

The development of 6G will require an unprecedented fusion of telecommunications engineering, quantum physics, and materials science. Those who successfully bridge these domains will lead the industry forward not just in products and services, but in establishing entirely new paradigms for how we understand and utilize the electromagnetic spectrum.

As we navigate these challenges, I believe we’ll discover that the limitations imposed by physics aren’t constraints but catalysts forcing us to think more creatively, collaborate more effectively, and ultimately develop solutions that extend far beyond telecommunications into healthcare, environmental monitoring, security, and countless other domains that will benefit from mastery of the terahertz frontier.

This blog post was written by Mohamed Sayyed, Head of Products at Digis Squared.

Semantic Communications: Use Cases, Challenges, and the Path Forward

Today, I want to delve deeper into the practical applications of semantic communications, examine the challenges we face in implementation, and outline what I believe is the most effective path forward.

Let’s begin by exploring the transformative potential of semantic communication across various domains.

In the realm of 6G and beyond, semantic communication will enable significantly leaner, context-aware data exchange for ultra-reliable low-latency communications (URLLC). This isn’t merely an incremental improvement; it represents a fundamental shift in how we approach network efficiency and reliability.

For Machine-to-Machine (M2M) and IoT applications, the implications are particularly profound. Devices will be able to understand intent without requiring verbose data transmission, resulting in substantial savings in both spectrum usage and energy consumption. In a world moving toward billions of connected devices, this efficiency gain becomes not just beneficial but necessary.
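
A toy comparison makes the scale of the saving tangible: a device that shares context with its receiver can transmit a compact intent code instead of a verbose status report. The message formats below are invented for illustration only.

```python
# Toy illustration of semantic vs. verbose M2M signaling: transmit a
# 2-byte intent code instead of a raw JSON status report. Both message
# formats here are invented for illustration.
import json
import struct

verbose = json.dumps({
    "device": "valve-17", "temp_c": 21.4, "pressure_kpa": 101.2,
    "status": "nominal", "request": "none",
}).encode()

# Shared context: both endpoints have agreed that intent 0 means
# "all readings nominal, no action needed".
semantic = struct.pack(">H", 0)  # 2 bytes on the air interface

saving = 1 - len(semantic) / len(verbose)  # well over 90% fewer bytes
```

The catch, of course, is the shared context itself: maintaining that agreement between endpoints is exactly the synchronization challenge discussed later in this post.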

Autonomous systems present another compelling use case. When vehicles and robots can communicate purpose rather than raw data, we see marked improvements in decision-making speed and safety. This shift from data-centric to meaning-centric communication could be the difference between an autonomous vehicle stopping in time or not.

The future of immersive experiences, including extended reality, holographic communication, and digital humans, will increasingly rely on shared context and compressed meaning. These applications demand not just bandwidth but intelligent use of that bandwidth, making semantic communication an ideal approach.

Finally, Digital Twins and Cognitive Networks will benefit tremendously from real-time mirroring and network self-awareness based on semantics rather than full datasets. This allows for more sophisticated modelling and prediction with less overhead.

Despite these promising applications, several significant challenges stand in our way.

Perhaps the most fundamental is what I call “semantic noise”: errors in understanding, not just in transmission. This represents an entirely new category of “noise” in the communication channel that our traditional models aren’t equipped to address.

Context synchronization presents another hurdle. How do we ensure that sender and receiver share enough background knowledge to interpret messages correctly? Without this shared foundation, semantic communication breaks down.

From a theoretical perspective, modelling meaning mathematically remains a complex challenge. We need to move beyond bits to quantify and encode “meaning” in ways that are both efficient and reliable.

The dependence on advanced AI also presents practical challenges. Semantic communication requires deep integration with natural language processing, reasoning models, and adaptive learning technologies that are still evolving rapidly.

Finally, standardization poses a significant obstacle. Our current network protocols simply weren’t built for semantic intent exchange, requiring substantial rethinking of our fundamental approaches.

So what is the path forward? I see it unfolding in three phases. In the first phase, Awareness & Modelling, we need to define semantic entropy, capacity, and metrics while developing proof-of-concept systems in research settings. This foundational work should include embedding semantic layers into AI-enhanced protocols, establishing the technical groundwork for what follows.

The second phase, Prototyping in 6G Environments, involves integrating semantic communication with URLLC and mMTC (massive Machine Type Communications). We should test these integrations with Digital Twin networks and edge AI, while simultaneously establishing pre-standardization working groups to ensure alignment across the industry.

The final phase, Ecosystem Integration & Commercialization, will require embedding semantic modules into chipsets and network functions, deploying them in smart cities, Industry 4.0 environments, and immersive media applications. Standardization through bodies like 3GPP and ITU will be crucial during this phase to ensure global interoperability.

This journey toward semantic communication isn’t just a technical evolution; it’s a reimagining of how networks understand and transmit meaning. The challenges are substantial, but the potential rewards in efficiency, intelligence, and new capabilities make this one of the most exciting frontiers in telecommunications.

This blog post was written by Amr Ashraf, Product Architect and Support Director at Digis Squared.

Why 6G Spectrum Matters: The Invisible Anchor of the Next Wireless Revolution

As I reflect on the trajectory of mobile communications, I find myself at a fascinating inflection point. We stand at the threshold of another major leap forward, and the promise of 6G extends far beyond incremental improvements in speed or latency. What truly excites me is how 6G represents a fundamental reimagining of how intelligence, presence, and connectivity converge in our networks and devices. At the core of this transformation lies an often overlooked but absolutely critical element: spectrum.

I’d like to explore why spectrum will once again shape not just our networks, but our societies and the very fabric of our digital existence.

Each generation of wireless technology has been defined by the spectrum it unlocked. 3G introduced us to mobile internet, fundamentally changing how we access information. 4G gave birth to the mobile economy, enabling video streaming, social media, and real-time applications that have transformed business models and social interactions. 5G pushed into millimeter wave frequencies, delivering industrial-grade responsiveness for critical applications.

But 6G represents something more profound. The leap isn’t merely technological—it’s philosophical. Connectivity is evolving to become contextual and cognitive. Our networks won’t just react to demands; they’ll anticipate needs. Devices are transforming from communication tools into intelligent sensors and agents that understand and interact with their environment. To enable this vision, 6G will require access to new spectral frontiers, particularly the sub-terahertz and terahertz (THz) ranges that have remained largely untapped for communications.

The relationship between spectrum and 6G innovation is multifaceted and critical. First, we face the fundamental challenge of data hunger meeting bandwidth bottlenecks. Applications like immersive extended reality, holographic communication, and digital twins demand terabit-per-second scale bandwidth capacities that can only be provided through the vast, underutilized frequency bands far above today’s cellular allocations.

Second, moving into terahertz bands introduces entirely new physics to our communication systems. This isn’t just about higher speeds; it means fundamentally different signal behaviours, novel hardware challenges, and revolutionary ways of sensing the environment. The properties of these frequencies will enable capabilities we’ve barely begun to imagine.

Third, spectrum is increasingly becoming a strategic national resource. The countries and companies that shape the 6G spectrum narrative will effectively shape the rules of digital engagement for the next decade and beyond. This geopolitical dimension adds another layer of complexity to spectrum allocation and standardization.

As we develop these new frequency bands, we’ll need new ways to describe and categorize them. Just as 5G required a new “language” to describe its frequency bands (such as n78 or FR2), 6G will demand new spectrum notations to handle wider bandwidths (tens or hundreds of gigahertz), account for dynamic spectrum sharing and AI-managed allocation, describe multi-layered integration across space, air, and terrestrial networks, and reflect new use-case mappings for sensing, localization, and environmental feedback.

Without clear and intelligent spectrum notations, we risk fragmenting the global 6G conversation—both technically and geopolitically—at precisely the moment when unified approaches are most needed.

We often discuss spectrum in abstract terms as an invisible field of energy we harness for communication. But the spectrum has language. It has a notation. And as we transition from 5G into the far more complex realm of 6G, that language is evolving in significant ways.

To understand the future of wireless, we must first understand how we describe it. At the most basic level, frequency measurements tell us about radio wave oscillation: 1 Hz represents one cycle per second, 1 MHz is one million cycles per second, 1 GHz is one billion cycles per second, and 1 THz is one trillion cycles per second. Higher frequencies oscillate faster, enabling more data to be carried per unit time but also introducing greater signal loss, narrower coverage, and new technical challenges.

The evolution of mobile communications has consistently moved toward higher frequencies: 2G operated in hundreds of MHz, 4G and early 5G exploited sub-6 GHz bands, 5G NR expanded into millimeter wave (24–100 GHz), and 6G will push from 100 GHz to potentially 10 THz. This progression reflects our growing appetite for bandwidth and the technological innovations that make higher frequencies viable for communication.

In 5G, standardized notations were introduced to simplify discussions about specific frequency bands. Designations like n78 (3300–3800 MHz, a widely deployed mid-band 5G range) and broader categories like FR1 (sub-6 GHz frequencies) and FR2 (24.25–52.6 GHz millimeter wave) have streamlined regulatory, engineering, and operational conversations. However, as we move into sub-THz and THz frequencies, these notation schemes begin to show their limitations.

As we begin to propose bands like 140 GHz, 275 GHz, and even 1 THz for 6G, new spectrum notation systems will be required to unify wider frequency ranges under flexible identifiers, account for hybrid use cases where a single band supports sensing, communication, and computing simultaneously, and enable AI interpretation through machine-readable notations for real-time spectrum management.

We might see notations like fTH1 (Fixed THz Band 1: 275–325 GHz), dTHx (Dynamic Terahertz experimental block), or sT1 (Sensing THz Band 1, dedicated for RF-based environment detection). While these are speculative examples, they illustrate the fundamental need: our notation must evolve alongside our use cases and technology.
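
A machine-readable notation of the kind called for above could start as a simple structured record. The sketch below encodes the speculative identifiers from the text; the band edges for dTHx and sT1, and all field names, are invented for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SpectrumBand:
    """Speculative machine-readable band descriptor; all fields are illustrative."""
    notation: str      # hypothetical identifier, e.g. "fTH1"
    low_ghz: float     # lower band edge in GHz
    high_ghz: float    # upper band edge in GHz
    allocation: str    # "fixed", "dynamic", or "sensing"
    use_cases: tuple   # hybrid roles a single band may serve

    @property
    def bandwidth_ghz(self) -> float:
        return self.high_ghz - self.low_ghz

# Encoding the speculative examples from the text
bands = [
    SpectrumBand("fTH1", 275, 325, "fixed", ("communication",)),
    SpectrumBand("dTHx", 100, 140, "dynamic", ("communication", "computing")),
    SpectrumBand("sT1", 140, 160, "sensing", ("sensing", "localization")),
]

# An AI spectrum manager could filter machine-readable records like this:
dynamic = [b.notation for b in bands if b.allocation == "dynamic"]
```

The point is not the particular schema but that once bands carry structured metadata, allocation policy becomes something software can query and negotiate over.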

The importance of well-defined spectrum notation extends across multiple stakeholder groups. For engineers, poorly defined notation creates confusion in hardware design, simulation, and deployment. For regulators, a lack of harmonized notation leads to regional incompatibility and inefficiencies in global rollout. For innovators, a shared, evolving language opens doors to collaborative research, efficient prototyping, and even machine-to-machine spectrum negotiation.

It’s worth noting that notation isn’t neutral; it embodies power. Whoever defines the language often shapes the outcome. As we collectively create 6G, spectrum notation represents a strategic touchpoint—a bridge between science, policy, and geopolitics that will influence the development trajectory of next-generation wireless technology.

The future of 6G is being written not just in laboratories or boardrooms but in the electromagnetic spectrum itself. If 5G reached into the millimeter-wave frontier, 6G is preparing for a quantum leap into the sub-terahertz and terahertz bands. These frequency ranges, once considered the domain of theoretical physics or space science, are now firmly in the telecom spotlight.

Before exploring specific frequencies, it’s important to understand that 6G isn’t simply “5G, but faster.” It aims to support terabit-per-second data rates for holographic and immersive applications, microsecond-level latency for real-time control and tactile internet, native AI and sensing capabilities embedded directly in the spectrum layer, and multi-dimensional connectivity spanning terrestrial, airborne, and satellite networks. To support these capabilities, we need wider bandwidths than ever before—and that’s only possible at higher frequencies.

Several spectrum ranges are emerging as candidates for 6G deployment. Upper Mid-Bands (7–24 GHz), sometimes called FR3, offer a potential balance between coverage and capacity for early 6G deployments. Candidate bands in this range include 7–15 GHz, with particular interest in the 10–14.5 GHz range being explored by ITU. These frequencies could support urban macro deployments with extended coverage and decent capacity, though existing satellite usage presents challenges that will require robust coexistence frameworks.

Sub-Terahertz bands (100–300 GHz) represent the range where true 6G performance begins to shine. Particular interest has focused on 100–140 GHz (under exploration in Europe, Korea, and Japan) and 275–325 GHz (proposed as a new THz communication block). These frequencies could enable indoor ultra-high-speed access, device-to-device communications, and real-time augmented, virtual, and extended reality applications. However, they face challenges including severe path loss, line-of-sight requirements, and hardware immaturity.

Terahertz Bands (0.3–10 THz) push beyond traditional RF into new physical domains. These bands, currently under early-stage scientific study, could support wireless cognition, high-speed backhaul, and environmental sensing. The challenges here are substantial: limited current RF integrated circuits, lack of global regulatory frameworks, and energy efficiency concerns.

Low-Band Spectrum (Sub-1 GHz) remains essential even in the 6G era. While not new, these frequencies provide critical coverage for massive IoT, rural areas, and emergency communications. The primary challenge is that this spectrum is already heavily saturated with legacy systems.

International harmonization efforts are underway across multiple organizations. ITU-R (WP 5D) is actively evaluating candidate frequencies for IMT-2030 (the official designation for 6G). The FCC in the United States has opened exploratory windows at 95–275 GHz. Europe’s Hexa-X project advocates for coordinated research into 100+ GHz spectrum. China, Korea, and Japan are conducting field trials at 140 GHz and above. Global harmonization will be crucial—not just to avoid interference, but to enable cross-border 6G roaming, manufacturing scale, and effective spectrum diplomacy.

Rather than depending on any single band, 6G will likely employ a layered spectrum approach: low bands for resilient, wide-area coverage; mid bands for urban macro deployment and balanced rollout; sub-THz for immersive services and fixed wireless; and THz for sensing, cognition, and backhaul. All of these layers will be dynamically orchestrated, likely through AI and real-time feedback systems, to create a seamless connectivity experience across diverse environments and use cases.
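
As a toy illustration of the layered orchestration idea, a selector might map service requirements onto a spectrum layer. The thresholds below are invented for illustration and are not drawn from any standard:

```python
def select_layer(range_km: float, peak_gbps: float, sensing: bool = False) -> str:
    """Toy spectrum-layer selector; thresholds are illustrative, not standardized."""
    if sensing:
        return "THz"          # sensing, cognition, backhaul
    if range_km > 5:
        return "low band"     # resilient wide-area coverage
    if peak_gbps > 10:
        return "sub-THz"      # immersive services, fixed wireless
    return "mid band"         # urban macro, balanced rollout

# A rural IoT link and an indoor XR session land on different layers
print(select_layer(range_km=20, peak_gbps=0.001))  # low band
print(select_layer(range_km=0.05, peak_gbps=50))   # sub-THz
```

A real orchestrator would weigh live load, interference, and mobility via AI-driven feedback rather than static thresholds, but the layering principle is the same.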

Author: Obeidallah Ali, R&D Director at DIGIS Squared

Obeidallah Ali leads the Research & Development efforts at DIGIS Squared, driving innovation in AI-powered telecom solutions. With deep expertise in 5G Network Design, Optimization, and Automation, he focuses on developing tools like INOS™ and Katana™ that help operators diagnose, troubleshoot, and enhance network performance worldwide.

For inquiries, please contact:
Email: info@digis2.com

Semantic Communications: Rethinking How Networks Understand Meaning

Traditional communication models, like Shannon’s theory, have always focused primarily on the accuracy of bit transmission from sender to receiver. But in today’s world, dominated by AI, IoT, and immersive experiences, this approach is becoming increasingly insufficient. The challenge isn’t just about transmitting data anymore; it’s about transmitting the right data, with the right context, at precisely the right moment.

At its core, semantic communication represents a model that prioritizes understanding over mere accuracy. Rather than sending every bit of information, semantic systems intelligently transmit only what’s necessary for the receiver to reconstruct the intended meaning. This represents a profound shift in how we conceptualize network communication.

Consider this practical example: a device needs to send the message “I need a glass of water.” In classical communication, this entire sentence would be encoded, transmitted, and decoded bit by bit, regardless of context. But in a semantic communication system, if the context already indicates the user is thirsty, simply transmitting the word “glass” might be sufficient to trigger complete understanding. This approach is inherently context-aware, knowledge-driven, and enhanced by artificial intelligence.
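
The glass-of-water example can be sketched as a toy context-aware codec: the sender transmits only the tokens the receiver cannot already infer from shared context. The context model here is invented for illustration; a real semantic system would use learned representations, not word sets:

```python
def semantic_encode(message: str, shared_context: set) -> list:
    """Transmit only tokens not already inferable from the shared context."""
    return [w for w in message.lower().split() if w not in shared_context]

def semantic_decode(tokens: list, shared_context: set) -> str:
    """Toy reconstruction: a real receiver would rebuild intent with an AI model."""
    return " ".join(tokens) if tokens else "(fully inferred from context)"

# The receiver already knows the user is thirsty, so most words are redundant
context = {"i", "need", "a", "of", "water"}
sent = semantic_encode("I need a glass of water", context)
# Only ["glass"] crosses the channel; context restores the full intent.
```

Even this crude sketch cuts the payload from six words to one, which is the essence of the bandwidth and energy savings the article describes.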

The necessity for semantic communication becomes increasingly apparent when we consider its practical benefits. It substantially reduces redundant data transmission, which conserves both bandwidth and energy, critical resources in our increasingly connected world. For latency-sensitive applications like critical IoT systems, autonomous vehicles, and holographic communication, this efficiency translates to meaningful performance improvements. Furthermore, it enhances machine-to-machine understanding, enabling more intelligent edge networks, while aligning communication more closely with human-like reasoning patterns, making our interactions with technology more natural and efficient.

When we examine these advantages collectively, it becomes evident that semantic communication isn’t merely a beneficial addition to our technological toolkit; it represents a fundamental paradigm shift in communications technology.

The enabler of this transformation is undoubtedly artificial intelligence, particularly in domains such as natural language understanding, knowledge graphs, semantic representations, and the ability to learn shared context between sender and receiver. When integrated with Digital Twins and Cognitive Networks, semantic communication becomes even more powerful, allowing systems to predict, understand, and take proactive action rather than simply reacting to inputs.

At Digis Squared, we view Semantic Communication as a cornerstone of future AI-native networks. I believe it will fundamentally reshape how we design, operate, and optimize telecom systems, not only by increasing efficiency but by making networks truly intelligent.

As Head of Product, I find myself increasingly asking a question that challenges conventional thinking: What if our networks could understand why we communicate, not just what we communicate? This perspective shifts our focus from merely building faster networks to creating smarter, more meaningful ones that truly understand human intent.

Author: Mohamed Sayyed, Head of Product at DIGIS Squared

Diagnosing and Resolving “FAILURE_MSG4_CT_TIMER_EXPIRED” in 5G Standalone Networks

In the deployment and optimization of 5G Standalone (SA) networks, ensuring the robustness of the Random Access Channel (RACH) procedure is critical. DIGIS Squared identified and resolved a recurring RACH failure – “FAILURE_MSG4_CT_TIMER_EXPIRED” – during performance testing using our proprietary tools: INOS™ and Katana™. This white paper outlines the nature of the problem, diagnostic process, root cause analysis, and optimization strategies applied to restore optimal network performance.


Background

The 5G NR contention-based RACH procedure is essential for initial access, handover, and beam recovery. It involves a four-message handshake:

  1. Msg1: RACH preamble from UE
  2. Msg2: Random Access Response (RAR) from gNB
  3. Msg3: MAC CE or RRC message from UE
  4. Msg4: Contention resolution from gNB

The failure occurs when the UE sends Msg3 but does not receive Msg4 within the contention resolution timer window, resulting in an aborted RACH attempt.
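
The failure mode reduces to a timer check: if Msg4 does not arrive within the configured window after Msg3, the attempt is aborted. A simplified sketch (real contention resolution also involves MAC CE identity matching; subframe arithmetic is simplified to 1 ms per subframe, as at 15 kHz SCS):

```python
from typing import Optional

def rach_outcome(msg3_tx_ms: float, msg4_rx_ms: Optional[float], timer_sf: int = 64) -> str:
    """Classify a contention-based RACH attempt.

    timer_sf: ra-ContentionResolutionTimer in subframes (1 subframe = 1 ms at
    15 kHz SCS); Msg4 must arrive within this window after Msg3 transmission."""
    if msg4_rx_ms is None or msg4_rx_ms - msg3_tx_ms > timer_sf:
        return "FAILURE_MSG4_CT_TIMER_EXPIRED"
    return "CONTENTION_RESOLVED"

# A Msg4 delayed 80 ms by scheduling congestion fails under sf64 but not sf128
print(rach_outcome(0, 80, timer_sf=64))   # FAILURE_MSG4_CT_TIMER_EXPIRED
print(rach_outcome(0, 80, timer_sf=128))  # CONTENTION_RESOLVED
```

This is exactly the lever exploited later in the optimization actions: widening the window tolerates scheduling delay at the gNB without forcing the UE to restart the whole procedure.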


Problem Identification

Failure Type: FAILURE_MSG4_CT_TIMER_EXPIRED
Detection Tool: INOS™ (field testing)
Confirmation Tool: Katana™ (OSS KPI analysis)

During extensive drive tests in both urban and suburban environments, INOS flagged multiple instances of RACH failure where Msg1, Msg2, and Msg3 were correctly transmitted, but Msg4 was not received. This was corroborated through Katana’s analysis of OSS counters, revealing high contention timer expiries in cells with:

  • Low SS-RSRP values (< -110 dBm)
  • High load and scheduling delays
  • Specific PRACH configurations

Root Cause Analysis

The following contributing factors were identified:

  • Msg3 Decoding Failures: UL signal degradation or beam misalignment prevented gNB from decoding Msg3.
  • Delayed Msg4 Scheduling: Resource contention at gNB delayed the contention resolution message.
  • Timer Misconfiguration: The default timer (sf64) was too short for specific TDD configurations.

Standards Reference:

  • 3GPP TS 38.331: RRC Protocol for NR
  • 3GPP TS 38.321: MAC Protocol for NR

Optimization Actions

Network-Side Adjustments

  • Increased ra-ContentionResolutionTimer from sf64 to sf128.
  • Reviewed and optimized PRACH Configuration Index and ZeroCorrelationZone settings.
  • Prioritized Msg4 scheduling at MAC layer in high-load scenarios.

Coverage Optimization

  • Fine-tuned beamforming and UL power control.
  • Extended PRACH monitoring duration in gNB firmware.


Post-Optimization Results

Metric                         Before Optimization   After Optimization
RACH Success Rate              89%                   97%
Msg4 Timer Expiry Rate         11.8%                 <1.2%
Initial Access Latency (avg)   440 ms                260 ms
RRC Setup Drop Rate            Moderate              Near Zero

Verification was conducted using both INOS (field KPIs) and Katana (OSS trends), confirming significant improvement across all measured metrics.


Conclusion

This case study highlights the necessity of cross-layer observability in managing 5G SA network performance. By leveraging both real-time field data from INOS and OSS intelligence from Katana, DIGIS Squared successfully diagnosed and mitigated a complex RACH failure. The resolution not only improved RACH success rates but also enhanced user experience and access reliability.


Author: Obeidallah Ali, R&D Director at DIGIS Squared


Diagnosing the Invisible: How We Enhanced CDN Caching Visibility to Prevent 404 Failures

Milliseconds matter in today’s hyper-connected digital world, and content delivery must be seamless, reliable, and globally scalable. At DIGIS Squared, we’re committed to going beyond surface-level metrics to detect and resolve the subtle issues that impact end-user experience at scale.

One such challenge we’ve recently tackled involved intermittent 404 errors and browsing failures caused by CDN (Content Delivery Network) caching problems. What appeared to be random access issues turned out to be symptoms of deeper inefficiencies in how content was cached—and more importantly, how that caching was monitored.


The Hidden Problem: When the Cache Misses

CDNs are the unsung heroes of modern web performance. By distributing content across global edge servers, they reduce latency, offload origin traffic, and enable resilient access for users worldwide. But when caching fails, whether due to misconfigured TTLs, cache-busting headers, or regional edge-node discrepancies, the impact can be significant:

  • End-users encounter 404 errors or content that fails to load
  • The origin server receives unnecessary load, reducing scalability
  • Diagnostics become harder due to lack of cache-level transparency

We noticed these exact patterns in our browsing analytics: certain requests, particularly through Akamai and Cloudflare, were returning failures that didn’t align with backend health or application logic. This pointed to a cache-layer issue, not an application bug.


The Solution: A New Dashboard to Measure CDN Caching Effectiveness

To combat this, we built and deployed a new internal dashboard that focuses on one core KPI: CDN Caching Hit Success Rate.

Here’s what it includes:

CDN Hit/Miss Analytics:

We track whether content is being successfully served from the cache or fetched from the origin, giving us clear indicators of performance degradation.

Provider-Specific Breakdown:

The dashboard separately monitors:

  • Akamai
  • Cloudflare

…two of the world’s most widely used CDN providers, with distributed edge networks and high cache sensitivity.

Unified KPI:

To give a macro-level view, we also calculate a global hit ratio that consolidates data across all CDN providers we observe in browsing sessions, helping us detect broader trends or cross-provider anomalies.
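
Computing the per-provider and unified hit ratios described above is straightforward once each browsing record carries a cache status. A minimal sketch; the `provider` and `cache_status` field names are assumptions for illustration, not our actual schema:

```python
from collections import defaultdict

# Sample browsing records; field names are illustrative
records = [
    {"provider": "Akamai", "cache_status": "HIT"},
    {"provider": "Akamai", "cache_status": "MISS"},
    {"provider": "Cloudflare", "cache_status": "HIT"},
    {"provider": "Cloudflare", "cache_status": "HIT"},
]

def hit_ratios(records):
    """Per-provider CDN cache hit ratios plus the consolidated global ratio."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["provider"]] += 1
        hits[r["provider"]] += r["cache_status"] == "HIT"
    per_provider = {p: hits[p] / totals[p] for p in totals}
    global_ratio = sum(hits.values()) / sum(totals.values())
    return per_provider, global_ratio

per_provider, global_ratio = hit_ratios(records)
```

In production the same aggregation runs over session-level telemetry, with the global ratio feeding the alerting thresholds mentioned below.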

Root Cause Visibility:

Combined with error codes like 404, we can now correlate browsing failures directly to cache misses. This has already enabled us to:

  • Identify content types with poor caching behavior
  • Advise clients on improving their CDN TTL, cache-control headers, and edge rule configurations
  • Proactively alert when hit ratios drop below optimal thresholds


Why This Matters to Telecom & Digital Experience Teams

For operators, OTT providers, and enterprises relying on global content delivery, cache efficiency is no longer a back-end concern; it’s a frontline performance metric. Here’s why this matters:

  • A single percentage-point drop in cache hit ratio can significantly increase origin load, affecting cost and latency
  • In telecom, real-time browsing quality KPIs are vital to SLA monitoring and customer retention
  • Cache failures often go unnoticed because traditional monitoring tools don’t surface them unless there’s a full outage

By adding this caching intelligence into our performance analytics suite, we’re enabling smarter diagnostics, better QoE benchmarking, and deeper insights across the full delivery chain from device to content edge.

The Evolution of Self-Organizing Networks: From SON to Cognitive SON to LTMs

As we approach 2030, the telecommunications industry is at a point where traditional network automation methods are merging with advanced AI technologies. Based on my experience over the past decade with network optimization solutions, I would like to share some insights on potential future developments.

Two Perspectives on SON Evolution

When discussing the future of Self-Organizing Networks (SON), it’s crucial to distinguish between two perspectives:

SON as a Conceptual Framework

The fundamental principles of self-configuration, self-optimization, and self-healing will remain essential to network operations. These core concepts represent the industry’s north star – autonomous networks that can deploy, optimize, and repair themselves with minimal human intervention.

These principles aren’t going away. Rather, they’re being enhanced and reimagined through more sophisticated AI approaches.

Vendor-Specific SON Implementations

The feature-based SON solutions we’ve grown familiar with – ANR (Automatic Neighbour Relations), CCO (Coverage & Capacity Optimization), MLB (Mobility Load Balancing), and others – are likely to undergo significant transformation or potential replacement.

These siloed, rule-based features operate with limited contextual awareness and struggle to optimize for multiple objectives simultaneously. They represent the first generation of network automation that’s ripe for disruption.

Enter Large Telecom Models (LTMs)

The emergence of Large Telecom Models (LTMs) – specialized AI models trained specifically on telecom network data – represents a paradigm shift in how we approach network intelligence.

Just as Large Language Models revolutionized natural language processing, LTMs are poised to transform network operations by:

  1. Providing holistic, cross-domain optimization instead of siloed feature-specific approaches
  2. Enabling truly autonomous decision-making based on comprehensive network understanding
  3. Adapting dynamically to changing conditions without explicit programming
  4. Learning continuously from network performance data

The Path Forward: Integration, or Replacement?

The relationship between traditional SON, Cognitive SON, and emerging LTMs is best seen as evolutionary rather than revolutionary.

  • Near-term (1-2 years): LTMs will complement existing SON features, enhancing their capabilities while learning from operational patterns
  • Mid-term (3-4 years): We’ll see the emergence of agentic AI systems that can orchestrate multiple network functions autonomously
  • Long-term (5+ years): Many vendor-specific SON implementations will likely be replaced by more sophisticated LTM-driven systems

The most successful operators will be those who embrace this transition strategically – leveraging the proven reliability of existing SON for critical functions while gradually adopting LTM capabilities for more complex, multi-domain challenges.

Real-World Progress

We’re already seeing this evolution in action. SoftBank recently developed a foundational LTM that automatically reconfigures networks during mass events.

These early implementations hint at the tremendous potential ahead as we move toward truly intelligent, autonomous networks.

Prepared By: Abdelrahman Fady | CTO | Digis Squared

NWDAF: How 5G is AI Native by Essence

The evolution of telecommunications networks has always been characterized by increasing complexity and intelligence. As we’ve moved through successive generations of wireless technology, I’ve observed a consistent trend toward more adaptive, responsive systems. With 5G, this evolution has reached a critical inflection point through the introduction of the Network Data Analytics Function (NWDAF), a component that fundamentally transforms how networks operate and adapt.

NWDAF, introduced in the 5G Core architecture starting from Release 15 and continuing to evolve toward 6G, represents a pivotal element in the Service-Based Architecture (SBA). More than just another network component, it embodies a philosophical shift toward data-driven, intelligent network operations that anticipate the needs of both users and applications.

At its core, NWDAF serves as a standardized network function that provides analytics services to other network functions, applications, and external consumers. Its functionality spans the entire analytics lifecycle: collecting data from various network functions (including AMF, SMF, PCF, and NEF), processing and analyzing that data, generating actionable insights and predictions, and feeding decisions back into the network for optimization and policy enforcement.

I often describe NWDAF as the “central intelligence of the network”—a system that transforms raw operational data into practical insights that drive network behavior. This transformation is not merely incremental; it represents a fundamental reimagining of how networks function.

The necessity for NWDAF becomes apparent when we consider the demands placed on modern networks. Autonomous networks require closed-loop automation for self-healing and self-optimization—capabilities that depend on the analytical insights NWDAF provides. Quality of Service assurance increasingly relies on the ability to predict congestion, session drops, or mobility issues before they impact user experience. Network slicing, a cornerstone of 5G architecture, depends on real-time monitoring and optimization of slice performance. Security analytics benefit from NWDAF’s ability to detect anomalies or attacks through traffic behavior pattern analysis. Furthermore, NWDAF’s flexible deployment model allows it to operate in either central cloud environments or Multi-access Edge Computing (MEC) nodes, enabling localized decision-making where appropriate.

The integration of NWDAF with other network functions occurs through well-defined interfaces. The Np interface facilitates data collection from various network functions. The Na interface enables NWDAF to provide analytics to consumers. The Nnef interface supports interaction with the Network Exposure Function, while the Naf interface enables communication with Application Functions. This comprehensive integration ensures that NWDAF can both gather the data it needs and distribute its insights effectively throughout the network.

The analytical capabilities of NWDAF span multiple dimensions. Descriptive analytics provide visibility into current network conditions, including load metrics, session statistics, and mobility patterns. Predictive analytics enable the network to anticipate issues before they occur, such as congestion prediction, user experience degradation forecasts, and mobility failure prediction. Looking forward, prescriptive analytics will eventually allow NWDAF to suggest automated actions, such as traffic rerouting or slice reconfiguration, further enhancing network autonomy.
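
As a toy illustration of the predictive dimension, a trend-based congestion predictor over collected load samples might look like the sketch below. The linear extrapolation and the 85% threshold are illustrative stand-ins, not the 3GPP-defined analytics or any real NWDAF model:

```python
def predict_load(samples, horizon=3):
    """Linear extrapolation of recent load samples (toy stand-in for NWDAF ML models)."""
    if len(samples) < 2:
        return samples[-1] if samples else 0.0
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)
    return samples[-1] + slope * horizon

def congestion_alert(samples, threshold=0.85):
    """Flag a cell whose extrapolated load crosses the threshold."""
    return predict_load(samples) >= threshold

cell_load = [0.60, 0.65, 0.70, 0.75]   # rising utilisation over recent windows
print(congestion_alert(cell_load))      # alert fires before congestion occurs
```

The value of the pattern is the closed loop: a consumer NF subscribing to such a prediction can reroute traffic or rebalance a slice before users ever experience degradation.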

As we look toward 6G, NWDAF is poised to evolve into an even more sophisticated component of network architecture. I anticipate the development of an AI/ML-native architecture where NWDAF evolves into a Distributed Intelligence Function. Federated learning approaches will enable cross-domain learning without requiring central data sharing, addressing privacy and efficiency concerns. Integration with digital twin technology will allow simulated networks to feed NWDAF with predictive insights, enhancing planning and optimization. Perhaps most significantly, NWDAF will increasingly support intent-based networking, where user intentions are translated directly into network behavior without requiring detailed technical specifications.

The journey toward truly intelligent networks is just beginning, and NWDAF represents a crucial step in that evolution. By embedding analytics and intelligence directly into the network architecture, 5G has laid the groundwork for networks that don’t just connect—they understand, anticipate, and adapt. This foundation will prove essential as we continue to build toward the even more demanding requirements of 6G and beyond.

Prepared By: Amr Ashraf | Head of Solution Architect and R&D | Digis Squared

ACES NH & DIGIS Squared Partnership Milestone

We are proud to announce the successful delivery and deployment of DIGIS Squared’s advanced cloud-native testing and assurance solution, INOS, to ACES NH, the leading telecom infrastructure provider and neutral host in the Kingdom of Saudi Arabia.

As part of this strategic partnership, DIGIS Squared has delivered:

  • INOS Lite Kits for 5G Standalone (5G SA) and IBS testing.
  • INOS Watcher Kits for field and service assurance.
  • Full deployment of the INOS Platform on ACES NH’s cloud hosted inside the Kingdom, ensuring data localization and privacy compliance.

The ACES NH team is now leveraging INOS across all testing and assurance operations, with:

  • Comprehensive, detailed field KPIs and service KPIs for the telecom network.
  • Automated root-cause analysis (RCA) for field-detected issues.
  • Full automation of testing and reporting workflows, enabling higher testing volumes in shorter timeframes.
  • AI-powered modules for virtual testing and predictive assurance.
  • A flexible licensing model that supports all technologies.

This partnership highlights both companies’ shared vision of strengthening local capabilities and equipping ACES NH with deeper network performance insights—supporting their mission to provide top-tier services, in line with Saudi Arabia’s Vision 2030.

We look forward to continued collaboration and delivering greater value to the Kingdom’s digital infrastructure.

About ACES NH:

ACES NH is a digital infrastructure neutral host licensed by CST in Saudi Arabia and by DoT in India. The company provides In-Building Solutions, Wi-Fi, DAS, Fiber Optics, Data Centers, and Managed Services, designing, building, and managing connectivity for telecom operators, airports, metros, railways, smart and safe cities, and mega projects. With an operations footprint spanning Asia, Europe, APAC, the GCC, and North Africa, a diverse project portfolio, and a focus on forward-looking ICT technologies such as small cells, Open RAN, and cloud computing, ACES NH serves nearly 2 billion users worldwide annually.