All Publications

89 Publications

Forthcoming

Quantum Annealing (QA)-accelerated MIMO detection is an emerging research approach in the context of NextG wireless networks, with the opportunity to enable large MIMO systems and thus improve wireless performance. The approach aims to leverage QA to expedite the computation required for the theoretically optimal but computationally demanding Maximum Likelihood (ML) detection, overcoming the limitations of the currently deployed linear detectors. This paper presents X-ResQ, a QA-based MIMO detector system featuring fine-grained quantum task parallelism that is uniquely enabled by the Reverse Annealing (RA) protocol. Unlike prior designs, X-ResQ has many desirable system properties for a parallel QA detector, and its detection performance improves as more qubits are assigned. In our evaluations on a state-of-the-art quantum annealer, fully parallel X-ResQ achieves near-optimal throughput (over 10 bits/s/Hz) for 4 × 6 MIMO with 16-QAM, using six levels of parallelism with 240 qubits and 220 μs of QA compute time, a 2.5–5× gain over the other detectors we tested. For more comprehensive evaluations, we implement and evaluate X-ResQ in a non-quantum digital setting. This non-quantum X-ResQ demonstration showcases the potential to realize ultra-large 1024 × 1024 MIMO, significantly outperforming other MIMO detectors, including the state-of-the-art RA detector classically implemented in the same way.
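To illustrate why ML detection is the computational bottleneck that QA aims to accelerate, here is a minimal, hedged sketch of exhaustive ML detection for a toy real-valued MIMO channel; the function name, dimensions, and BPSK constellation are illustrative choices, not part of X-ResQ itself:

```python
import itertools
import numpy as np

def ml_detect(H, y, constellation):
    """Exhaustive maximum-likelihood detection: argmin_x ||y - H x||^2.
    The search space grows as |constellation|^(num tx antennas),
    which is what motivates quantum-annealing acceleration."""
    best, best_cost = None, np.inf
    num_tx = H.shape[1]
    for cand in itertools.product(constellation, repeat=num_tx):
        x = np.array(cand)
        cost = np.linalg.norm(y - H @ x) ** 2
        if cost < best_cost:
            best, best_cost = x, cost
    return best

# Toy 2x2 real-valued channel with BPSK symbols {-1, +1} and light noise.
rng = np.random.default_rng(0)
H = rng.normal(size=(2, 2))
x_true = np.array([1.0, -1.0])
y = H @ x_true + 0.05 * rng.normal(size=2)
x_hat = ml_detect(H, y, [-1.0, 1.0])
```

Even this toy search visits 2^2 candidates; a 1024 × 1024 MIMO system with 16-QAM would face an astronomically larger candidate set, hence the interest in reformulating the problem for annealing hardware.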

2024

Rapid delay variations in today's access networks impair the QoE of low-latency, interactive applications, such as video conferencing. To tackle this problem, we propose Athena, a framework that correlates high-resolution measurements from Layer 1 to Layer 7 to remove the fog from the window through which today's video-conferencing congestion-control algorithms see the network. This cross-layer view of the network empowers the networking community to revisit and re-evaluate their network designs and application scheduling and rate-adaptation algorithms in light of the complex, heterogeneous networks that are in use today, paving the way for network-aware applications and application-aware networks.

NextG cellular networks are designed to meet Quality of Service requirements for various applications in and beyond smartphones and mobile devices. However, lacking introspection into the 5G Radio Access Network (RAN), application and transport layer designers are ill-equipped to cope with the vagaries of the wireless last hop to a mobile client, while 5G network operators run mostly closed networks, limiting their potential for co-design with the wider internet and user applications. This paper presents NR-Scope, a passive, incrementally and independently deployable Standalone 5G network telemetry system that can stream fine-grained RAN capacity, latency, and retransmission information to application servers, enabling better millisecond-scale, application-level decisions on offered load and bit rate adaptation than end-to-end latency or packet loss measurements currently permit. Our experimental evaluation on various 5G Standalone base stations demonstrates that NR-Scope achieves less than 0.1% throughput estimation error for every UE in a RAN. The code is available on GitHub.

Today’s wireless networks are evolving rapidly, experiencing an unprecedented surge in traffic volume, radio density, and spectral efficiency demands. This thesis addresses the critical challenges arising from this evolution of next-generation (NextG) wireless networks, focusing on three primary objectives: achieving high data rates, ultra-low latency, and massive connectivity.

To meet these diverse and demanding requirements, this thesis poses a central question: Can we build a smarter radio environment, controlled and learned by software, capable of self-configuring in real time to meet different application needs? Current approaches handle uncontrolled wireless signals end-to-end, but communication endpoints are limited in their ability to shape inherent propagation behavior. By changing the environment itself rather than the endpoints, this thesis seeks to enhance key aspects of modern wireless networks.

Millimeter-wave technology enables multi-Gbps data rates, but its high-frequency signals are vulnerable to blockage, limiting its practical use. This thesis presents two innovative solutions to overcome this challenge. mmWall is a programmable smart surface, installed on buildings and composed of over 4,000 metamaterial elements. It can steer signals through the surface to extend outdoor mmWave signals indoors or reflect them to bypass obstacles. Wall-Street is a vehicle-mounted smart surface designed to provide robust mmWave connectivity in high-mobility environments, ensuring reliable communication even in dynamic scenarios. Extending our smart radio concepts to ultra-reliable, low-latency satellite networks, we introduce Wall-E, a dual-band smart surface that mitigates signal blockage by relaying full-duplex satellite-to-ground links, and Monolith, a smart surface that boosts data rates for inter-satellite communication. To address the growing overhead in massive Internet of Things (IoT) networks, we propose CLCP, a machine learning technique that predicts the radio environment to reduce communication overhead. This AI-driven approach complements our programmable surfaces, forming a comprehensive smart radio solution.

Given the highly complex nature of real-world systems, conceptual models alone are insufficient to fully explain them. Our solutions are implemented in physical hardware prototypes, integrated with existing network protocols, and rigorously tested through experimentation. This thesis thus offers a concrete answer to the above central question, laying the foundation for software-controlled smart radio environments in NextG wireless networks.

Error correction codes are essential for reliability and capacity in wireless networks. By correcting errors in real time, they reduce re-transmissions, conserve bandwidth, and enhance network performance. However, these advantages come at the price of high decoding complexity and latency, which compel network designers to make sub-optimal deployment choices, such as adopting approximate decoding algorithms and limiting parallelism, bit precision, and iteration counts, sacrificing potential capacity and performance gains. Moreover, the ever-increasing user demand in wireless networks poses additional challenges in managing power consumption, operational costs, and the carbon footprint of base stations and terminals. This highlights the need for continued innovation in wireless network baseband architecture and implementation strategies.

This dissertation introduces quantum computing-based processing architectures for decoding error correction codes, offering new computational paradigms to address these challenges at scale. By harnessing the principles of quantum mechanics, we propose a transformative shift in the way decoding is achieved, benefiting wireless performance and capacity, through the design and implementation of the following systems: (1) QBP, a quantum annealing decoder for LDPC codes; (2) HyPD, a hybrid classical–quantum annealing decoder for Polar codes; (3) QGateD, a quantum amplitude amplification decoder for generic XOR-based error correction codes; (4) FDeQ, a quantum gate decoder flexible to both LDPC and Polar codes; and (5) QAVP, a quantum annealing approach to vector perturbation precoding (a multi-user MIMO downlink baseband optimization problem). These systems collectively support the thesis that quantum computing is a promising approach for baseband processing, warranting further justification from an economic and environmental impact perspective. To address this and to make the case for quantum computing in the wireless industry, (6) the dissertation presents a comprehensive cost and carbon footprint analysis of quantum hardware, both quantitative and qualitative. This analysis may be of interest to designers of NextG wireless networks and quantum architectures.

Forward Error Correction (FEC) provides reliable data flow in wireless networks despite the presence of noise and interference. However, its processing demands a significant fraction of a wireless network’s resources, due to its computationally expensive decoding process. This forces network designers to compromise between performance and implementation complexity. In this paper, we investigate a novel processing architecture for FEC decoding, one based on the quantum approximate optimization algorithm (QAOA), to evaluate the potential of this emerging quantum compute approach in resolving the decoding performance–complexity tradeoff.

We present FDeQ, a QAOA-based FEC decoder design targeting the popular NextG wireless Low Density Parity Check (LDPC) and Polar codes. To accelerate QAOA-based decoding towards practical utility, FDeQ exploits temporal similarity among the FEC decoding tasks. This similarity is enabled by the fixed structure of a particular FEC code, which is independent of any time-varying wireless channel noise, ambient interference, and even the payload data. We evaluate FDeQ at a variety of system parameter settings in both ideal (noiseless) and noisy QAOA simulations, and show that FDeQ achieves successful decoding with error performance on par with state-of-the-art classical decoders at low FEC code block lengths. Furthermore, we present a holistic resource estimation analysis, projecting quantitative targets for future quantum devices in terms of the required qubit count and gate duration, for the application of FDeQ in practical wireless networks, highlighting scenarios where FDeQ may outperform state-of-the-art classical FEC decoders.
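To make the decoding problem concrete, here is a minimal sketch of syndrome computation for a toy parity-check code; the specific matrix and codeword below are illustrative, not FDeQ's actual codes:

```python
import numpy as np

# Toy parity-check matrix H of a length-6 code: c is a codeword iff H @ c = 0 (mod 2).
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
])

def syndrome(word):
    """Nonzero entries flag violated parity checks."""
    return H @ word % 2

codeword = np.array([1, 1, 0, 0, 1, 1])   # satisfies all three checks
received = codeword.copy()
received[2] ^= 1                          # the channel flips one bit
# Decoding seeks the most likely codeword consistent with `received`:
# an optimization over binary variables, which is the problem a QAOA
# formulation encodes as a cost Hamiltonian. Note that H itself is fixed
# per code, the structural regularity FDeQ exploits across decoding tasks.
```

Because the parity structure is fixed while only the received word varies, successive decoding instances share the same cost-Hamiltonian skeleton, which is the temporal similarity the abstract refers to.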