All Publications



Cellular networks are becoming ever more sophisticated and overcrowded, imposing the most delay, jitter, and throughput damage to end-to-end network flows in today's Internet. We therefore argue for fine-grained mobile endpoint-based wireless measurements to inform a precise congestion control algorithm through a well-defined API to the mobile's cellular physical layer. Our proposed congestion control algorithm is based on Physical-Layer Bandwidth measurements taken at the Endpoint (PBE-CC), and captures the latest 5G New Radio innovations that increase wireless capacity, yet create abrupt rises and falls in available wireless capacity that the PBE-CC sender can react to precisely and rapidly. We implement a proof-of-concept prototype of the PBE measurement module on software-defined radios and the PBE sender and receiver in C. An extensive performance evaluation compares PBE-CC head to head against the cellular-aware and wireless-oblivious congestion control protocols proposed in the research community and in deployment, in mobile and static scenarios, and over busy and idle networks. Results show that PBE-CC achieves 6.3% higher average throughput than BBR, while simultaneously reducing 95th percentile delay by 1.8x.
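
As a minimal, hypothetical sketch of the endpoint-driven idea, the snippet below converts a physical-layer capacity estimate into a pacing rate and congestion window; the function names, headroom factor, and fair-share assumption are illustrative and not taken from the PBE-CC implementation.

```python
# Illustrative sketch only: converts a physical-layer capacity estimate into a
# pacing rate and congestion window, in the spirit of endpoint-based,
# capacity-driven congestion control. Names and constants are assumptions,
# not the PBE-CC implementation.

def target_cwnd_bytes(capacity_bps, min_rtt_s, n_competing_flows=1, headroom=0.95):
    """Bandwidth-delay-product share for this flow, with a small headroom margin."""
    fair_share_bps = capacity_bps / max(n_competing_flows, 1)
    return headroom * fair_share_bps * min_rtt_s / 8.0  # bits -> bytes

def pacing_rate_bps(capacity_bps, n_competing_flows=1, headroom=0.95):
    """Pace packets at (a fraction of) the flow's fair share of measured capacity."""
    return headroom * capacity_bps / max(n_competing_flows, 1)

if __name__ == "__main__":
    # Example: the PHY-layer monitor reports 80 Mbit/s of available capacity,
    # the minimum observed RTT is 30 ms, and two flows share the cell.
    cap, rtt, flows = 80e6, 0.030, 2
    print(f"cwnd ~ {target_cwnd_bytes(cap, rtt, flows)/1e3:.1f} kB")
    print(f"pace ~ {pacing_rate_bps(cap, flows)/1e6:.1f} Mbit/s")
```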

In this paper, we explore an enhanced CIM model and propose a novel Ising formulation, which together are shown to be the first Ising solver that provides significant gains in the BER performance of large and massive MIMO systems, such as 16x16 and 16x32, and sustains its performance gain even at 256-QAM modulation. We further perform a spectral efficiency analysis and show that, for a 16x16 MIMO system with Adaptive Modulation and Coding, our method can provide substantial throughput gains over MMSE, achieving 2x throughput for SNR <= 25 dB, and up to 1.5x throughput for SNR >= 30 dB.
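
For context, the standard reduction of BPSK maximum-likelihood MIMO detection to an Ising energy minimization looks as follows; this is the generic textbook mapping, not the enhanced formulation the paper proposes, and the dimensions and variable names are illustrative.

```python
# Illustrative sketch: the standard reduction of BPSK maximum-likelihood MIMO
# detection to an Ising energy minimization. This shows the generic mapping,
# not the enhanced formulation proposed in the paper.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 4                                  # small 4x4 real-valued BPSK MIMO for brevity
H = rng.normal(size=(N, N))            # channel matrix
x_true = rng.choice([-1.0, 1.0], N)    # transmitted BPSK symbols
y = H @ x_true + 0.1 * rng.normal(size=N)

# ||y - Hx||^2 = const - 2 y^T H x + x^T (H^T H) x ; for x in {-1,+1}^N the
# diagonal of H^T H is constant, leaving an Ising problem with couplings J
# and local fields h:
G = H.T @ H
J = np.triu(G, k=1)                    # pairwise couplings (i < j)
h = -(H.T @ y)                         # local fields

def ising_energy(s):
    return 2.0 * (s @ J @ s + h @ s)

def ml_metric(s):
    return np.sum((y - H @ s) ** 2)

# Brute force over all 2^N spin configurations: both objectives pick the same x.
cands = [np.array(c, dtype=float) for c in itertools.product([-1, 1], repeat=N)]
best_ising = min(cands, key=ising_energy)
best_ml = min(cands, key=ml_metric)
assert np.array_equal(best_ising, best_ml)
print("ML/Ising estimate:", best_ising, " transmitted:", x_true)
```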

Mobile video applications have gained increasing popularity and become part of everyone's daily experience. The quality of video has a significant impact on both the quality of users' experience for video streaming and the accuracy of video analytics systems, which further impacts application revenue. The challenge in building a consistently high-quality video delivery system lies in two aspects. On the application side, emerging video applications are becoming more user-interactive, where existing prefetch and buffering algorithms cannot work properly. On the network side, the wireless network itself is fundamentally dynamic and unreliable due to multipath effects and interference on the wireless channel. In this thesis, we present cross-layer optimizations spanning the application layer, network layer, and physical layer to improve the quality of video streaming over wireless networks, through the design and implementation of the following systems: Dashlet, a short-video streaming system tailored for a high quality of experience by adapting to dynamic user actions. Dashlet proposes a novel out-of-order video chunk pre-buffering mechanism that leverages a simple, non-machine-learning model of users' swipe statistics to determine the pre-buffering order and bitrate. Spider, a multi-hop, millimeter-wave (mmWave) wireless relay network designed to maximize the analytics accuracy of the delivered video. Spider integrates a low-latency Wi-Fi control plane with a mmWave relay data plane, allowing agile re-routing around blockages. Spider proposes a novel video bit-rate allocation algorithm coupled with a scalable routing algorithm that maximizes application-layer video analytics accuracy. LAIA, a system to programmatically control the wireless channel so that the wireless network can achieve consistently high throughput for robust video delivery. With its programmable interface to the wireless channel, LAIA can improve wireless channels on the fly for single- and multi-antenna links, as well as for nearby networks operating on adjacent frequency bands. Put together, this thesis demonstrates a set of optimizations across the network stack for building a high-quality, robust wireless video delivery system. Extensive evaluation demonstrates significant improvements in both quality of experience for video streaming and accuracy for video analytics.
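
As a hypothetical illustration of out-of-order pre-buffering, the sketch below ranks chunk downloads by the probability that they will be watched under a toy swipe model; the user model, parameters, and function names are assumptions for illustration, not Dashlet's actual scheduler.

```python
# Illustrative sketch only: pick a pre-buffering order for short-video chunks
# from simple swipe statistics. The user model and parameters are assumptions
# for illustration; this is not Dashlet's actual scheduler.

def watch_prob(v, c, p_continue, p_swipe, chunks_per_video):
    """P(chunk c of feed video v is watched), assuming that after each chunk the
    user continues with prob. p_continue, swipes to the next video with prob.
    p_swipe, or quits otherwise, and auto-advances after the last chunk."""
    p_advance = (sum(p_continue**k * p_swipe for k in range(chunks_per_video))
                 + p_continue**chunks_per_video)        # leave video by swipe or finish
    return (p_advance ** v) * (p_continue ** c)         # reach video v, then chunk c

def prebuffer_order(n_videos=4, chunks_per_video=5, p_continue=0.6, p_swipe=0.35):
    chunks = [(v, c) for v in range(n_videos) for c in range(chunks_per_video)]
    return sorted(chunks, key=lambda vc: -watch_prob(*vc, p_continue, p_swipe,
                                                     chunks_per_video))

if __name__ == "__main__":
    # Early chunks of upcoming videos outrank deep chunks of the current video,
    # which is the intuition behind out-of-order pre-buffering.
    print(prebuffer_order()[:10])
```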

Today’s wireless networks are evolving rapidly, experiencing an unprecedented surge in traffic volume, radio density, and spectral efficiency demands. This thesis addresses the critical challenges arising from this evolution of next-generation (NextG) wireless networks, focusing on three primary objectives: achieving high data rates, ultra-low latency, and massive connectivity.

To meet these diverse and demanding requirements, this thesis poses a central question: Can we build a smarter radio environment, controlled and learned by software, capable of self-configuring in real time to meet different application needs? Current approaches to handling uncontrolled wireless signals are end-to-end, but communication endpoints are limited in their ability to shape the inherent propagation behavior. By focusing on changing the environment itself rather than the endpoints, this thesis seeks to enhance key aspects of modern wireless networks.

Millimeter-wave technology enables multi-Gbps data rates, but its high-frequency signals are vulnerable to blockage, limiting its practical use. This thesis presents two innovative solutions to overcome this challenge. mmWall is a programmable smart surface, installed on buildings and composed of over 4,000 metamaterial elements. It can steer signals through the surface to extend outdoor mmWave signals indoors or reflect them to bypass obstacles. Wall-Street is a vehicle-mounted smart surface designed to provide robust mmWave connectivity in high-mobility environments, ensuring reliable communication even in dynamic scenarios. Extending our smart radio concepts to ultra-reliable, low-latency satellite networks, we introduce Wall-E, a dual-band smart surface that mitigates signal blockage by relaying full-duplex satellite-to-ground links, and Monolith, a smart surface that boosts data rates for inter-satellite communication. To address the growing overhead in massive Internet of Things (IoT) networks, we propose CLCP, a machine learning technique that predicts the radio environment to reduce communication overhead. This AI-driven approach complements our programmable surfaces, forming a comprehensive smart radio solution.
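
As background on how such surfaces redirect signals, the sketch below applies the classic phase-gradient rule for steering a beam with an array of programmable elements; it is generic array physics rather than mmWall's or Wall-Street's actual control logic, and the element count, spacing, and carrier frequency are illustrative assumptions.

```python
# Illustrative sketch: the classic phase-gradient rule used to steer a beam
# with an array of programmable elements. This is the generic physics behind
# such surfaces, not mmWall's or Wall-Street's actual control logic.
import numpy as np

def element_phases(n_elements, spacing_m, freq_hz, steer_deg):
    """Per-element phase shifts (radians, wrapped to [0, 2*pi)) that redirect an
    incoming plane wave toward steer_deg off broadside."""
    c = 3e8
    wavelength = c / freq_hz
    k = 2 * np.pi / wavelength
    n = np.arange(n_elements)
    phases = -k * spacing_m * n * np.sin(np.deg2rad(steer_deg))
    return np.mod(phases, 2 * np.pi)

if __name__ == "__main__":
    # Example: 64 elements at half-wavelength spacing for a 28 GHz mmWave carrier,
    # steering the transmitted/reflected beam 30 degrees off broadside.
    f = 28e9
    d = 0.5 * 3e8 / f
    print(np.round(element_phases(64, d, f, 30.0)[:8], 2))
```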

Given the highly complex nature of real-world systems, conceptual models alone are insufficient to fully explain them. Our solutions are implemented in physical hardware prototypes, integrated with existing network protocols, and rigorously tested through experimentation. This thesis thus offers a concrete answer to the above central question, laying the foundation for software-controlled smart radio environments in NextG wireless networks.

Wireless networks are ubiquitous nowadays and play an increasingly important role in our everyday lives. Many emerging applications, including augmented reality, indoor navigation, and human tracking, rely heavily on Wi-Fi, thus requiring an even more sophisticated network. One key component for the success of these applications is accurate localization. While GPS serves the outdoor environment, indoor localization at sub-meter granularity remains challenging due to a number of factors, including strong wireless multipath reflections indoors and the burden of deploying and maintaining any additional location-service infrastructure. On the other hand, Wi-Fi technology has developed significantly in the last 15 years, evolving from 802.11b/a/g to the latest 802.11n and 802.11ac standards. Single-user multiple-input, multiple-output (SU-MIMO) technology has been adopted in 802.11n, while multi-user MIMO is introduced in 802.11ac to increase throughput. In Wi-Fi's development, one interesting trend is the increasing number of antennas attached to a single access point (AP). Another trend is the presence of frequency-agile radios and larger bandwidths in the latest 802.11n/ac standards. These opportunities can be leveraged to increase the accuracy of indoor wireless localization significantly, as demonstrated by the two systems proposed in this thesis: ArrayTrack employs multi-antenna APs to obtain angle-of-arrival (AoA) information and localize clients accurately indoors. It is the first indoor Wi-Fi localization system able to achieve below-half-meter median accuracy. An innovative multipath identification scheme is proposed to handle the challenging multipath conditions in indoor environments. ArrayTrack is robust in terms of signal-to-noise ratio, collisions, and device orientation. ArrayTrack does not require any offline training and its computational load is small, making it a great candidate for real-time location services. With six 8-antenna APs, ArrayTrack achieves a median error of 23 cm indoors in the presence of strong multipath reflections in a typical office environment. ToneTrack is a fine-grained indoor localization system employing a time-difference-of-arrival (TDoA) scheme. ToneTrack uses a novel channel combination algorithm to increase effective bandwidth without increasing the radio's sampling rate, yielding higher-resolution time-of-arrival (ToA) information. A new spectrum identification scheme is proposed to retrieve useful information from a ToA profile even when the overall profile is mostly inaccurate. The triangle inequality property is then applied to detect and discard APs whose direct path is completely blocked. With a combination of only three 20 MHz channels in the 2.4 GHz band, ToneTrack achieves below one meter median error, significantly outperforming traditional super-resolution ToA schemes.
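
For intuition on AoA-based localization, the sketch below recovers an angle of arrival from the phase difference between two antennas for a single dominant path; ArrayTrack's full processing handles multi-antenna arrays and multipath suppression on top of this textbook case, and the numbers used here are illustrative.

```python
# Illustrative sketch: recovering angle of arrival (AoA) from the phase
# difference between two antennas for a single dominant path. ArrayTrack uses
# multi-antenna arrays and multipath suppression on top of this basic idea;
# this is only the textbook two-antenna case.
import numpy as np

def aoa_from_phase(phase_diff_rad, spacing_m, freq_hz):
    """AoA (degrees from broadside) implied by the inter-antenna phase difference."""
    wavelength = 3e8 / freq_hz
    s = phase_diff_rad * wavelength / (2 * np.pi * spacing_m)
    return np.degrees(np.arcsin(np.clip(s, -1.0, 1.0)))

if __name__ == "__main__":
    # Example: 2.4 GHz Wi-Fi, half-wavelength antenna spacing, measured phase
    # difference of 1.1 radians between adjacent antennas.
    f = 2.4e9
    d = 0.5 * 3e8 / f
    print(f"AoA ~ {aoa_from_phase(1.1, d, f):.1f} degrees")
```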

A central design challenge for future generations of wireless networks is to meet users' ever-increasing demand for capacity, throughput, and connectivity. Recent advances in the design of wireless networks to this end, including the NextG efforts underway, call in particular for the use of Large and Massive multiple-input multiple-output (MIMO) antenna arrays to support many users near a base station. These techniques are coming to fruition, yielding significant performance gains by spatially multiplexing many information streams concurrently. To fully realize MIMO's gains, however, the system requires sophisticated signal processing to disentangle the mutually interfering streams from each other. Currently deployed linear filters have the advantage of low computational complexity, but suffer rapid throughput degradation as the number of parallel streams grows. Theoretically optimal Maximum Likelihood (ML) processing can significantly improve throughput over such linear filters, but quickly becomes infeasible due to its computational complexity and processing-time constraints. The base station's computational capacity is thus becoming one of the key limiting factors on performance gains in wireless networks. Quantum computing is a potential tool to address this computational challenge. It exploits unique information-processing capabilities based on quantum mechanics to perform fast calculations that are intractable by traditional digital methods. This dissertation presents four design directions for quantum-compute-enabled wireless systems to expedite ML processing in MIMO systems, which would unlock unprecedented levels of wireless performance: (1) quantum optimization on specialized hardware, (2) quantum-inspired computing on classical computing platforms, (3) hybrid classical-quantum computational structures, and (4) scalable and elastic parallel quantum optimization. We introduce our prototype systems (QuAMax, ParaMax, IoT-ResQ, X-ResQ), implemented on real-world analog quantum processors, experimentally demonstrating their substantial achievable performance gains in many aspects of wireless networks. As an initial guiding framework, this dissertation provides system design guidance with underlying principles and technical details, and discusses future research directions based on the current challenges and opportunities observed.
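
To illustrate the computational gap the dissertation targets, the sketch below contrasts a linear zero-forcing detector with an exhaustive ML search on a small BPSK link; the 2^N candidate count is what grows prohibitively with more streams. The code is a purely classical illustration with made-up dimensions, not the quantum implementations named above.

```python
# Illustrative sketch: a linear zero-forcing detector vs. exhaustive maximum-
# likelihood (ML) search for a small BPSK MIMO link, showing why ML's candidate
# count (2^N here, M^N in general) explodes with the number of streams. The
# quantum systems named above attack exactly this search; this sketch is purely
# classical and not their implementation.
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 6                                       # 6 spatial streams, BPSK
H = rng.normal(size=(N, N))
x = rng.choice([-1.0, 1.0], N)
y = H @ x + 0.5 * rng.normal(size=N)

# Linear zero-forcing: one pseudo-inverse, then per-stream slicing.
x_zf = np.sign(np.linalg.pinv(H) @ y)

# Exhaustive ML: score every one of the 2^N candidate symbol vectors.
cands = [np.array(c, dtype=float) for c in itertools.product([-1, 1], repeat=N)]
x_ml = min(cands, key=lambda s: np.sum((y - H @ s) ** 2))

print(f"candidates searched by ML: {len(cands)} (2^{N})")
print("ZF errors:", int(np.sum(x_zf != x)), " ML errors:", int(np.sum(x_ml != x)))
```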

Error correction codes are essential for reliability and capacity in wireless networks. By correcting errors in real time, they reduce re-transmissions, conserve bandwidth, and enhance network performance. However, these advantages come at the price of high decoding complexity and latency, which compels network designers to make sub-optimal deployment choices, such as adopting approximate decoding algorithms and limiting parallelism, bit precision, and iteration counts, sacrificing potential capacity and performance gains. Moreover, the ever-increasing user demand in wireless networks poses additional challenges in managing the power consumption, operational costs, and carbon footprint of base stations and terminals. This highlights the need for continued innovation in wireless network baseband architecture and implementation strategies.

This dissertation introduces quantum computing-based processing architectures for decoding error correction codes, offering new computational paradigms to address these challenges at scale. By harnessing the principles of quantum mechanics, we propose a transformative shift in how decoding is achieved, benefiting wireless performance and capacity, through the design and implementation of the following systems: (1) QBP, a quantum annealing decoder for LDPC codes; (2) HyPD, a hybrid classical-quantum annealing decoder for Polar codes; (3) QGateD, a quantum amplitude amplification decoder for generic XOR-based error correction codes; (4) FDeQ, a quantum gate decoder flexible to both LDPC and Polar codes; and (5) QAVP, a quantum annealing approach to vector perturbation precoding (a multi-user MIMO downlink baseband optimization problem). These systems collectively support the thesis that quantum computing is a promising approach for baseband processing, which warrants further justification from an economic and environmental-impact perspective. To address this and to make the case for quantum computing in the wireless industry, (6) the dissertation presents a comprehensive cost and carbon footprint analysis of quantum hardware, both quantitatively and qualitatively. This may be of interest to both NextG wireless networks and quantum architectures.
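
As a rough illustration of the kind of formulation annealing-based decoders build on, the sketch below phrases decoding of a small linear block code as an unconstrained energy minimization over bit strings and solves it by exhaustive classical search; the (7,4) Hamming code, penalty weight, and brute-force search are illustrative assumptions, not the QBP or HyPD formulations or their hardware embeddings.

```python
# Illustrative sketch: phrasing decoding of a small linear block code as an
# unconstrained energy minimization over bit strings, i.e. the general shape of
# problem that annealing-based decoders embed in hardware. The (7,4) Hamming
# code, penalty weight, and exhaustive classical search below are illustrative
# assumptions, not the QBP/HyPD formulations.
import itertools
import numpy as np

# Parity-check matrix of the (7,4) Hamming code.
Hmat = np.array([[1, 1, 0, 1, 1, 0, 0],
                 [1, 0, 1, 1, 0, 1, 0],
                 [0, 1, 1, 1, 0, 0, 1]])

def energy(bits, y_soft, penalty=4.0):
    """Distance to the received soft values plus a penalty per violated check."""
    bpsk = 1.0 - 2.0 * bits                          # 0 -> +1, 1 -> -1
    violated = np.sum((Hmat @ bits) % 2)
    return np.sum((y_soft - bpsk) ** 2) + penalty * violated

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    codeword = np.array([1, 0, 1, 1, 0, 1, 0])       # satisfies Hmat @ c = 0 (mod 2)
    y = (1.0 - 2.0 * codeword) + 0.6 * rng.normal(size=7)   # noisy BPSK observation
    cands = [np.array(b) for b in itertools.product([0, 1], repeat=7)]
    decoded = min(cands, key=lambda b: energy(b, y))
    print("decoded:", decoded, " valid:", not np.any((Hmat @ decoded) % 2))
```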