IEEE Military Communications Conference
30 October – 3 November 2023 // Boston, MA, USA
Communications Supporting Military Operations in a Contested Environment

Tutorials

Tutorials will be held on both Monday, October 30 (AM/PM) and Thursday, November 2, 2023 (afternoon only).

Monday, October 30 (AM)
TUT-01 Quantum Communications, Cryptography, and Sensing
TUT-02 Network Over-Watch (NOW) 
TUT-03 Intelligent Learning Algorithms: Using Artificial Intelligence and Machine Learning to Design Tomorrow’s Military Networks

Monday, October 30 (PM)
TUT-04 The Path Towards 6G Networks – Building Tomorrow’s Tactical Military Networks 
TUT-05 Enabling dynamic spectrum access, spectrum sharing, and Electromagnetic Spectrum Operations (EMSO): The IEEE 1900.5.2 Standard for Modeling Spectrum Consumption 
TUT-06 Explainable AI and Radio Frequency Machine Learning
TUT-SP1 Satellite Communications (SATCOM) Tutorial - Sponsored by MathWorks, included at no additional fee with Full Registration
TUT-SP2 Artificial Intelligence (AI) for Wireless Tutorial - Sponsored by MathWorks, included at no additional fee with Full Registration

Thursday, November 2 (PM)
TUT-07 5G-Based Integrated Sensing and Passive Radar Systems
TUT-08 Terahertz-band Communications: Myth, 6G, or 7G
TUT-09 5G NR Sidelink: From Release 16 to Release 18
TUT-10 Reconfigurable Intelligent Surfaces for 6G and Beyond: Fundamentals & Applications

Day:  Monday, October 30
Time:  8:30 - 12:35
Room:  Simmons/3rd Floor

TUT-01 Quantum Communications, Cryptography, and Sensing

Abstract: Applying quantum technology to revolutionize communications and networking is emerging as a new technological frontier. Quantum mechanics deals with the microscopic world, which behaves counterintuitively compared with the classical macroscopic world and must be understood through mathematical and physical reasoning. From a quantum engineering point of view, this tutorial first supplies comprehensive background knowledge in quantum mechanics and then explores, with both breadth and depth, the technology and engineering design of quantum communications, cryptography, and sensing. Both quantum-classic communications and quantum-entangled networking will be described. For practitioners of communication engineering, engineering aspects beyond quantum physics will be systematically introduced, particularly for quantum optical wireless communications and sensing, toward a successful compromise between quantum information science and quantum communication engineering.

Objectives and Motivation: In the presenter's view, quantum supremacy arises from the extreme parallelism of quantum computing and possible networking, and from the extreme security afforded by the quantum no-cloning theorem in quantum communication, computing, and cryptography. Given the tremendous attraction of quantum supremacy, deeply understanding the theory behind it is a major challenge for the engineering community, particularly for professionals in communication and networking engineering. It is not straightforward for communication researchers, professionals, and graduate students to enter the area because they often lack background in modern physics, quantum mechanics, and the mathematics of Hilbert spaces, while almost all papers and books are written by physicists, except those on quantum information theory. Furthermore, quantum information science in the open literature remains distant from quantum engineering design. Recognizing the practical needs of the engineering community over the past few years, this tutorial is specially tailored to fill this significant gap and bring quantum science and physics to quantum communication engineering in a smooth way.

Day:  Monday, October 30
Time:  8:30 - 12:35
Room:  Suffolk/3rd Floor

TUT-02 Network Over-Watch (NOW) 

This tutorial will guide attendees through the capabilities that provide secure communications over 5G networks for DoD personnel.

Relevance for Military Communications: Mission-critical entities, such as Department of Defense (DoD) entities, will use a mix of private 5G networks and commercially available 5G networks to accomplish their mission objectives. For private networks, the managing entity has full configuration control. However, to increase resiliency and operational effectiveness, the private network's devices must be able to reach beyond the boundary of the private network and effectively interoperate with, or operate through, a public operator network. When securely operating through a cooperating 5G network, Network OverWatch (NOW) is an approach that uses on-demand network slice orchestration and User Equipment Route Selection Policy (URSP) mechanisms to achieve mission success.

The technology behind NOW was demonstrated using enterprise private 5G network to public operator network interworking and includes key DoD-needed functionality such as:

  1. Tailored connectivity to applications in the private cloud for the enterprise (corresponds to a DoD application)
  2. A solution validated in a proof-of-concept 5G SA trial using end-to-end orchestration, including User Equipment Route Selection Policy (URSP)
  3. A new slice design tool providing integrated booking and configuration of network slices (corresponds to the DoD need for an on-demand secure slice with specified SLAs from the DoD device to the DoD application over a cooperating operator network)

See: Deutsche Telekom and Ericsson present first network slicing solution.

Network OverWatch (NOW) is feasible NOW! Specifically, a DoD private network device can securely operate through a cooperating operator's 5G network using on-demand slice orchestration and User Equipment Route Selection Policy (URSP).

The end-to-end communication characteristics of a network slice are associated with capabilities and Service-Level Agreement (SLA) requirements. For example, the DoD may order a slice with 10 Mbps on the downlink and 20 Mbps on the uplink with complete isolation (i.e., the core will not be shared) for extra security, for up to 1,000 UEs. To deliver on the SLA, the network slice is typically further subdivided by the operator into slice subnets for the radio access network (RAN), transport, and core, respectively. Each "slice" is an isolated end-to-end logical network precisely tailored and adjusted to the requirements of its particular use case, even in real time. This gives cooperating operators the flexibility to host a diverse range of service requirements for various use cases over a single network, adapting, adding, and configuring as needed.
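As an illustration only, the sketch below shows how such a slice order with SLA terms might be represented programmatically; the field names and the `order_slice` helper are hypothetical and do not correspond to any specific operator or 3GPP API.

```python
from dataclasses import dataclass

@dataclass
class SliceOrder:
    """Hypothetical representation of a network-slice order with SLA terms."""
    downlink_mbps: int        # guaranteed downlink throughput
    uplink_mbps: int          # guaranteed uplink throughput
    isolated_core: bool       # True = dedicated (non-shared) 5G core
    max_ues: int              # maximum number of user equipments served
    subnets: tuple = ("RAN", "transport", "core")  # slice subnets the operator manages

def order_slice(order: SliceOrder) -> dict:
    """Placeholder for a request to an operator's slice-ordering interface."""
    return {"status": "requested", "sla": order}

# Example matching the SLA described above (values are illustrative).
dod_slice = SliceOrder(downlink_mbps=10, uplink_mbps=20, isolated_core=True, max_ues=1000)
print(order_slice(dod_slice))
```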

The DoD has many applications needing different traffic profiles and priorities. In 5G, this differentiation can be realized by leveraging the 5G network slicing capability. As just one example, a drone with a high-definition (HD) pan, tilt, zoom (PTZ) camera has no fewer than four distinct traffic patterns, each with different performance requirements: command and control (C2) of the drone itself, C2 of the PTZ camera, the HD video feed, and any necessary telemetry data from the drone (e.g., altitude, position). How would one ensure that flight control takes precedence over HD video, so the drone does not fly into the terrain it is observing? Another example would be to request a dedicated, end-to-end virtual subnetwork (RAN slice) within a cooperating operator, ensuring that all application traffic from the requesting device is routed to that virtual subnetwork and that the subnetwork has a connected path through the cooperating operator only to the DoD's own private network 5G core and data network servers. These requests may be dynamic or pre-provisioned and would use industry-standard interfaces. As part of the mission planning process, the requested actions may include reserving a slice, activating the slice, advertising the slice to the device, having the device route application traffic to the slice, reconfiguring the slice based on performance measures, and ultimately deactivating the slice once the mission is complete. By leveraging these existing standardized tasks, the DoD accelerates time to implementation.
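For illustration, a minimal sketch of the mission-planning lifecycle just described is shown below; the state names and the `SliceLifecycle` class are hypothetical and are not taken from any 3GPP or operator specification.

```python
from enum import Enum, auto

class SliceState(Enum):
    RESERVED = auto()
    ACTIVE = auto()
    ADVERTISED = auto()      # URSP rules pushed so the device routes traffic to the slice
    RECONFIGURED = auto()
    DEACTIVATED = auto()

class SliceLifecycle:
    """Hypothetical wrapper around an operator's standardized slice-management interface."""
    def __init__(self, slice_id: str):
        self.slice_id = slice_id
        self.history = []

    def _transition(self, state: SliceState, note: str = ""):
        self.history.append((state, note))
        print(f"[{self.slice_id}] {state.name} {note}")

    def reserve(self):
        self._transition(SliceState.RESERVED)

    def activate(self):
        self._transition(SliceState.ACTIVE)

    def advertise(self):
        self._transition(SliceState.ADVERTISED, "(URSP update to device)")

    def reconfigure(self, kpi: str):
        self._transition(SliceState.RECONFIGURED, f"(triggered by {kpi})")

    def deactivate(self):
        self._transition(SliceState.DEACTIVATED, "(mission complete)")

# Mission-planning sequence described above.
mission_slice = SliceLifecycle("dod-mission-slice-01")
mission_slice.reserve()
mission_slice.activate()
mission_slice.advertise()
mission_slice.reconfigure(kpi="uplink latency above threshold")
mission_slice.deactivate()
```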

Network OverWatch (NOW) is also feasible using standard 3GPP public/private interworking architectures. Standardized architectures include:

  1. Multi-Operator Core Network (MOCN) – request a RAN slice. In this scenario, the RAN is owned by one operator, either public or private, and both parties use the same spectrum for their respective devices. The RAN broadcasts the PLMN IDs of both the DoD network and the cooperating network operator. The RAN is shared and the spectrum is shared; however, the DoD core manages DoD users and the cooperating operator's core manages the cooperating operator's users.
  2. Multi-Operator RAN (MORAN) – request a RAN slice. In this scenario, the RAN is operated by one operator, but each operator using the shared RAN brings its own spectrum. Each MORAN operator broadcasts its PLMN ID for its subscribers on its spectrum. The DoD could become a MORAN operator using DoD-owned spectrum. The RAN infrastructure is shared, but the spectrum is not.
  3. Private network partners – request an end-to-end (E2E) slice that includes RAN and core. The DoD device would need to be provisioned with the cooperating operator's PLMN ID and the DoD slice ID. The 5G RAN and core network are shared.

Extend device reach while maintaining control: The DoD would remain in control of UE/device and application placement/usage in a DoD slice provided by a cooperating operator network. The cooperating operator is assumed to expose and manage slice information, KPIs, and settlement information to its customers, e.g., the DoD, via standardized interfaces. A DoD network slice can be dedicated to one customer, e.g., the DoD, or shared by multiple tenants. A secure DoD network slice could be used by multiple DoD agencies under DoD control to obfuscate which traffic belongs to which DoD-related agency.

Note: For contested areas, these concepts still apply, as the DoD brings its own network resources, e.g., through deployable 5G and non-terrestrial networks.

Attend this session to understand the feasibility of Network OverWatch in private network to public network interworking. Cutting-edge technology applied in innovative scenarios is a cornerstone for accelerating the 5G to Next G ambition.

Day:  Monday, October 30
Time:  8:30 - 12:35
Room: Wellesley/3rd Floor


TUT-03 Intelligent Learning Algorithms: Using Artificial Intelligence and Machine Learning to Design Tomorrow’s Military Networks

Intelligent learning algorithms are an exciting research area receiving an enormous amount of interest from both academia and industry. Machine learning algorithms now appear in virtually all walks of life and are being used across our military communications community, leading to many technical breakthroughs. Over the past year, the professional community, and the public at large, have become extremely interested in large language models such as ChatGPT, as these tools provide a powerful opportunity to enhance our design, development, and analysis capabilities. The goal of this tutorial is to give attendees a strong familiarity with intelligent learning algorithms, including their strengths and weaknesses, and knowledge of how these algorithms can help our community design high-performance next-generation military networks. The tutorial will first provide an introduction to intelligent algorithms as well as a crash course on some of the theory behind them. It will then provide an overview of many of the key types of learning algorithms, including machine learning algorithms, genetic algorithms, bio-inspired algorithms, and deep learning, and discuss the emerging field of multi-agent learning algorithms. Particular attention will be paid to neural network deep learning techniques, with a deep dive on the subject of neural networks. The tutorial will then compare the strengths and weaknesses of the various types of algorithms, developing and presenting a detailed taxonomy of the mainstream forms of intelligent learning, and finally highlight a plethora of areas of communications and networking in which intelligent learning algorithms have played a key role in technology development, spanning all layers of the protocol stack.

Day:  Monday, October 30
Time:  14:10 - 17:30
Room: Simmons/3rd Floor


TUT-04 The Path Towards 6G Networks – Building Tomorrow’s Tactical Military Networks 

5G and beyond communications is an exciting research area receiving an enormous amount of interest from both academia and industry. While 6G is not yet well defined, work in this area is ongoing at a rapid pace and the future technology continues to solidify. The military domain is eager to adopt advanced cellular technologies. However, many myths, misconceptions, and general confusion surround the entire 5G and beyond landscape. The goal of this tutorial is to provide attendees with a strong familiarity with the cellular landscape, cutting through marketing hype, and to give them a clear idea of the state of the art in cellular communications and development. Ultimately, the goal is to provide attendees with an awareness and understanding of how these technologies can be useful in designing high-performance next-generation military networks. This tutorial will provide attendees with a strong familiarity with 1) the 5G cellular standards as defined in 3GPP Releases 15, 16, and 17, and 2) key IEEE 802.11 technologies that support 5G use cases, with a particular focus on IEEE 802.11ax, or "WiFi 6." The tutorial also examines key "Beyond 5G" research and standardization activities (Release 18 and beyond), identifying key technology trends that will likely make up the future 6G network architecture. Discussion will also include emerging WiFi 7 technologies, such as IEEE 802.11be.

Day:  Monday, October 30
Time:  14:10 - 17:30
Room: MIT/3rd Floor


TUT-05 Enabling dynamic spectrum access, spectrum sharing, and Electromagnetic Spectrum Operations (EMSO): The IEEE 1900.5.2 Standard for Modeling Spectrum Consumption 

The Electromagnetic Spectrum Operations (EMSO) doctrine changes how warfighters view the EMS: from a resource that is divvied up among systems to a space through which Spectrum Dependent Systems (SDSs) maneuver to remain resilient and to achieve spectrum superiority. Simultaneously, there is a push across the broader civilian EMS user community to enable dynamic sharing of spectrum among heterogeneous users. These changes demand an approach to capturing the use of the EMS by SDSs that differs from traditional spectrum management. Spectrum consumption models (SCMs), as defined by the IEEE 1900.5.2 standard, provide a means to fully support the planning processes of EMSO: defining maneuver spaces, giving EMSO orders, collecting and deconflicting EMSO plans, capturing SDS capabilities, enabling intuitive visualization of the EM operating environment, and developing algorithms to find optimal and advantageous opportunities in the EMS. Additionally, SCMs provide a means to enable dynamic spectrum access (DSA) and its management.

In the EMSO arena, warfighters can create policy as part of their planning, and SCMs can be used as a machine-readable way to capture and deliver that policy to subordinate organizations and to DSA systems for them to follow. SCMs can enable the realization of Electromagnetic Battle Management (EMBM). In the civilian arena, regulators and their partners, government and commercial, can use SCMs to dynamically manage and implement spectrum sharing frameworks and environments. SCMs can be used to facilitate negotiation and to define the spectrum sharing boundaries that are part of spectrum sharing agreements.

The IEEE 1900.5.2 Standard for Modeling Spectrum Consumption is poised to be a key component of planning, understanding, and executing Electromagnetic Spectrum Operations (EMSO). It enables systems and procedures for the management of Dynamic Spectrum Access.  The standard defines a data model for SCMs and the criteria to arbitrate compatibility (i.e., non-interference) among combinations of RF devices and/or systems that have expressed the boundaries of their spectrum use with SCMs.  SCMs built in accordance with IEEE 1900.5.2 allow spectrum users to describe their anticipated use of spectrum so that the various stakeholders can understand and resolve conflicts.  SCMs are designed to make compatibility computations tractable and they enable the creation of algorithms to optimize the use of spectrum across multiple users.  SCMs are machine readable and can be used to provide spectrum use policy to spectrum dependent systems (SDSs) for autonomous selection of RF channels.  The SCMs are also a means for SDSs to autonomously collaborate in the use of shared spectrum.
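As a rough illustration only (this is not the actual IEEE 1900.5.2 schema), the sketch below shows how an SCM-like description of a transmitter's spectrum use might be captured and a simple compatibility check performed; the class and field names are hypothetical, and real SCM arbitration also accounts for power, propagation, location, and underlay masks.

```python
from dataclasses import dataclass

@dataclass
class SpectrumUse:
    """Hypothetical, simplified stand-in for an SCM of one transmitter's spectrum use."""
    center_hz: float     # center frequency of intended use
    bandwidth_hz: float  # occupied bandwidth
    start: float         # start time (hours into the plan)
    end: float           # end time (hours into the plan)

def bands_overlap(a: SpectrumUse, b: SpectrumUse) -> bool:
    return abs(a.center_hz - b.center_hz) < (a.bandwidth_hz + b.bandwidth_hz) / 2

def times_overlap(a: SpectrumUse, b: SpectrumUse) -> bool:
    return a.start < b.end and b.start < a.end

def compatible(a: SpectrumUse, b: SpectrumUse) -> bool:
    """Crude compatibility arbitration: compatible if the two uses never overlap
    in both frequency and time."""
    return not (bands_overlap(a, b) and times_overlap(a, b))

radar = SpectrumUse(center_hz=3.55e9, bandwidth_hz=20e6, start=0, end=4)
comms = SpectrumUse(center_hz=3.56e9, bandwidth_hz=20e6, start=2, end=6)
print(compatible(radar, comms))  # False: the two uses overlap in both frequency and time
```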

By completing the tutorial, attendees will:

  • Understand the constructs of spectrum consumption modeling and how they are used to build SCMs of SDS use of spectrum
  • Understand the five roles of SCMs:
    • Configuration Options – models of the possible configurations and tuning of SDSs
    • Request – configuration options models with the addition of a planned operation
    • Consumption – the final configuration and tuning for an operation
    • Authorization – models of the spectrum available for use in planning
    • Constraints – models that provide boundaries that constrain use
  • Understand the fundamental methods of arbitrating compatibility among different SCMs
  • Understand how spectrum management is achieved via SCMs
  • Understand how SCMs facilitate spectrum sharing interactions between different entities/systems
  • Understand how EMSO planning is executed using SCMs 
  • Have awareness of the tools and code that support the building and use of the models

The intended audience for the tutorial is EMSO leaders and managers, engineers, regulators, wireless service operators (military and commercial), computer scientists, and researchers in dynamic spectrum access. Overall, audience members should have a good understanding of wireless communication theory, related to signal propagation, antennas, and link budget analysis. Past experience managing and/or configuring RF devices is helpful but not required.

Day:  Monday, October 30
Time:  14:10 - 17:30
Room: Wellesley/3rd Floor


TUT-06 Explainable AI and Radio Frequency Machine Learning

Artificial intelligence and machine learning (AI/ML) have demonstrated incredible capability in a wide range of applications including wireless communication systems.  State-of-the-art AI/ML approaches often use deep neural networks (DNNs), which are capable of modeling complex processes through their thousands, if not millions, of tunable parameters.  While DNNs have advanced predictive capability, the complexity of these models makes them difficult to interpret, which is why many consider these methods to be “black boxes.”  The inability to interpret a model can lead to a lack of trust, and ultimately a hesitancy to deploy AI-based systems in an operational setting, particularly for mission critical military applications. Additionally, Test and Evaluation (T&E) of AI systems utilizing DNNs is a difficult task.

The field of eXplainable AI (XAI) has arisen to mitigate these issues.  XAI techniques attempt to provide a user with insight into how a DNN produces a decision.  Attribution techniques are a subset of XAI that quantify the contribution of each of a model's inputs to its prediction. This tutorial will cover an overview of the XAI field, a subset of XAI techniques (primarily attribution techniques), and common Python libraries for implementation.  While investigating these XAI techniques, participants will learn about common problems in wireless communication that can be addressed using AI/ML and relevant DNN solutions.

The objectives of this tutorial are as follows:

  1. Introduction to machine learning applications in the wireless communications domain (RF AI/ML) and common solution techniques that leverage DNNs (examples include signal detection, modulation recognition, etc.). Both raw data and image-based DNNs will be considered.
  2. Outline high-level concepts surrounding XAI and provide a deeper dive into attribution techniques.
  3. Demonstration of XAI techniques and Python packages for easy implementation – Captum and Quantus.

The motivation behind this tutorial is to expose researchers and engineers in the wireless communications domain to XAI concepts and demonstrate how XAI can be integrated into the Test and Evaluation (T&E) process. XAI techniques can be used to better understand these models and improve trust in the system, i.e., confirm that the model is learning characteristics of a problem that align with a user’s prior knowledge.  At the end of this tutorial, participants will have a working knowledge of XAI and be provided with example Python code for generating data, training a DNN, and estimating the attributions using a variety of techniques. A background in wireless communications is not required.
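As a minimal illustration of the kind of attribution workflow described above (the toy model and synthetic I/Q input below are invented for this sketch; only the Captum Integrated Gradients API is real), one might attribute a classifier's prediction back to individual input samples like this:

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# Toy classifier over a raw I/Q burst of 128 complex samples (stored as 2 x 128 real values).
# This model is illustrative only; it is not from the tutorial.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(2 * 128, 64),
    nn.ReLU(),
    nn.Linear(64, 4),  # e.g., 4 hypothetical modulation classes
)
model.eval()

# Synthetic input standing in for a captured RF burst.
iq_burst = torch.randn(1, 2, 128, requires_grad=True)

# Integrated Gradients attributes the prediction for a chosen class back to each input sample.
ig = IntegratedGradients(model)
attributions = ig.attribute(iq_burst, target=0)  # target = class index being explained

print(attributions.shape)        # same shape as the input: (1, 2, 128)
print(attributions.abs().sum())  # overall attribution magnitude
```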

Day:  Monday, October 30
Time:  14:10 - 16:10
Room:  Suffolk/3rd Floor


TUT-SP1 Satellite Communications (SATCOM) Tutorial - Sponsored by MathWorks
 

Overview

In this hands-on tutorial, MathWorks product experts will walk you through a series of online exercises. These guided exercises will give you the opportunity to write and run your own code using Satellite Communications Toolbox and learn how, with minimal coding, you can use the toolbox to streamline your satellite-related workflows.

Agenda

  • Brief overview of Satellite Communications Toolbox
  • Hands-on exercises using MATLAB Online where you will:
    • Set up and launch a satellite scenario viewer
    • Compute and visualize the visibility access between a satellite and a ground station
    • Compute and visualize communications link closure between a satellite and a ground station

Important notes

  • To successfully participate in this tutorial, attendees should have previous experience calling MATLAB functions.
  • Since this tutorial uses MATLAB Online, attendees will need internet access to participate. You will get complimentary access to MATLAB Online as part of the registration process.
  • Attendees will need to spend around 5 minutes setting up their MATLAB Online account and accessing the files before the workshop.

Day:  Monday, October 30
Time:  16:10 - 18:10
Room:  Suffolk/3rd Floor


TUT-SP2 Artificial Intelligence (AI) for Wireless Tutorial - Sponsored by MathWorks

Overview

Artificial intelligence (AI) is rapidly becoming a critical component of many engineering systems and disciplines today. In the field of wireless, AI is being used to design and develop smarter ways to model physical layers, optimize the performance of wireless systems and networks, and address new 6G design challenges.

In this hands-on tutorial, you will write and run code entirely in the browser using MATLAB® Online™. You will learn how to apply principles of AI (machine learning, deep learning, domain-specific processing) to wireless communication workflows.

This interactive hands-on session will include the following:

  • Familiarize yourself with MATLAB Online and AI tools
  • Create and evaluate the components necessary to succeed in AI modeling, by implementing an example of Modulation Classification
  • Deep dive into an advanced, domain-specific application that showcases a complete workflow for accomplishing 5G Channel Estimation

MathWorks instructors and teaching assistants (TAs) will be available throughout the session to guide you. Please bring your laptop and install the Google Chrome browser beforehand.

Day:  Thursday, November 2
Time:  14:10 - 17:30
Room:  Fairfield/3rd Floor

TUT-07 5G-Based Integrated Sensing and Passive Radar Systems

This half-day tutorial aims to provide a comprehensive overview of the state of the art in how next-generation mobile broadband systems such as 5G New Radio (NR) can be enhanced to provide radio frequency (RF) sensing and radar tracking capabilities without changing the underlying waveform. The tutorial will commence by reviewing 5G/5G-Advanced NR waveform design and current positioning capabilities. We will then define information-theoretic and signal-processing-based performance limits for integrated sensing and communications (ISAC) and provide an overview of passive radar techniques that exploit broadband wireless transmissions as signals of opportunity for target tracking. We then introduce and evaluate 802.11 and 5G signal processing techniques for ISAC, including channel modeling and technical performance for indoor and outdoor use cases such as gesture recognition, perimeter security, and vehicle and drone tracking. The tutorial concludes with a look at ISAC design considerations for future 6G wireless systems in the IMT-2030 framework.
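To give a flavor of the passive-radar processing mentioned above (this sketch is illustrative and not taken from the tutorial material), the cross-ambiguity function correlates a reference copy of a broadcast waveform against the surveillance channel over candidate delays and Doppler shifts:

```python
import numpy as np

def cross_ambiguity(ref, surv, fs, max_delay, doppler_bins):
    """Illustrative cross-ambiguity surface for passive radar.
    ref: reference-channel samples (direct broadcast), surv: surveillance-channel samples."""
    n = np.arange(len(ref))
    caf = np.zeros((len(doppler_bins), max_delay))
    for i, fd in enumerate(doppler_bins):
        shifted = surv * np.exp(-2j * np.pi * fd * n / fs)  # remove candidate Doppler
        for tau in range(max_delay):
            caf[i, tau] = np.abs(np.vdot(ref[: len(ref) - tau], shifted[tau:]))
    return caf

rng = np.random.default_rng(0)
fs, N = 1e6, 4096
n = np.arange(N)
# Noise-like broadband "signal of opportunity" seen on the reference channel.
ref = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
# Surveillance channel: the same waveform reflected off a target, delayed by 25 samples,
# Doppler-shifted by +200 Hz, attenuated, and buried in receiver noise.
surv = 0.5 * np.roll(ref, 25) * np.exp(2j * np.pi * 200 * n / fs)
surv += 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

caf = cross_ambiguity(ref, surv, fs, max_delay=64, doppler_bins=np.arange(-500, 501, 50))
print(np.unravel_index(np.argmax(caf), caf.shape))  # expect (14, 25): +200 Hz Doppler, 25-sample delay
```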

This tutorial is timely for the following reasons:

  1. 5G wireless is the fastest-growing mobile broadband technology in history, with 1.5 billion consumer subscriptions projected by the end of 2023 and over 413 public network deployments in place or underway across the world.
  2. The US Department of Defense is actively exploring the use of 5G for various logistical, spectrum superiority, and training use cases through the '5G-to-Next G' initiative.
  3. RF sensing is currently being discussed as a potential feature for NR in Third Generation Partnership Project (3GPP) Release 19.
  4. The International Telecommunication Union (ITU) has recently designated ISAC as a key capability of future IMT-2030 systems (6G).

The tutorial is intended to provide the audience with a complete overview of the potential benefits, research challenges, implementation efforts and applications of enabling ISAC solutions for 5G and 6G technologies. Some familiarity with radar principles and 4G/5G wireless communications system design would be beneficial but not mandatory.

Day:  Thursday, November 2
Time:  14:10 - 17:30
Room:  Exeter/3rd Floor

TUT-08 Terahertz-band Communications: Myth, 6G, or 7G

For decades, the sub-terahertz and terahertz bands (broadly speaking, the electromagnetic spectrum between 100 GHz and 10 THz) were considered a no man's land, leveraged only by exotic sensing applications. Today, the same frequency range is envisioned by both academia and industry as a key enabler of ultrabroadband wireless communication and sensing systems and a critical asset for the upcoming sixth generation (6G) of wireless systems, spanning even toward 7G. What has drastically changed in the last ten years to lead to these conclusions?

This tutorial will answer this question and define a roadmap for this exciting research field. In brief, major material, device, and circuit research accomplishments in the last decade have rapidly closed the terahertz technology gap. With new experimental platforms at hand and an open mind to revisit some of the pre-conceived notions regarding the terahertz channel, many of the myths regarding the propagation of terahertz signals have been debunked.

Stemming from these two facts, today we can state with certainty that:

1. The very large available bandwidth at terahertz frequencies offers enormous potential to alleviate the spectrum scarcity problem and break the capacity limitation of existing wireless communication systems (see the back-of-the-envelope calculation after this list).

2. Consequently, terahertz communications are expected to support epoch-making wireless applications that demand reliable multi-terabits per second data rates, ranging from holographic communications, extended reality, and ultra-high-definition content streaming among mobile devices to wireless backhaul connectivity, covert communications, and even satellite communication networks.

3. Moreover, beyond communications, the THz band opens the door to new forms of wireless sensing beyond radar and localization, including air quality monitoring, climate change study, and even nano-bio sensing for transformative healthcare applications.
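As a back-of-the-envelope illustration of the first point (the bandwidth and SNR values below are arbitrary and chosen only to show the scaling), Shannon capacity C = B log2(1 + SNR) grows linearly with bandwidth, so contiguous terahertz-band allocations of tens to hundreds of gigahertz are what make terabit-per-second links conceivable:

```python
import numpy as np

def shannon_capacity_gbps(bandwidth_hz, snr_db):
    """C = B * log2(1 + SNR), returned in Gbit/s."""
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * np.log2(1 + snr) / 1e9

# Illustrative, arbitrary operating points: a 400 MHz 5G mmWave carrier
# versus contiguous terahertz-band allocations, all at a nominal 15 dB SNR.
for bw in [400e6, 10e9, 50e9, 200e9]:
    print(f"{bw/1e9:6.1f} GHz -> {shannon_capacity_gbps(bw, 15):8.1f} Gbit/s")
```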

This tutorial aims to provide an updated look at the field of terahertz communications, (i) explaining how some of the many envisioned problems have already been solved and (ii) highlighting the key critical challenges that remain open or have emerged due to unforeseen phenomena. After a high-level overview of the expected role of the terahertz band in 6G communications and sensing systems, the state of the art in terahertz device technologies will be reviewed, identifying the critical performance metrics that need to be considered when designing meaningful communication and sensing solutions. Similarly, the lessons learnt through both physics-based and data-driven channel modeling efforts will be summarized and utilized to drive the design of tailored communication and networking solutions. Then, a comprehensive survey of recent highly innovative solutions and open challenges will be provided, including those related to ultrabroadband physical layer solutions (e.g., waveform design and wavefront engineering), ultra-directional networking strategies (e.g., interference and coverage analysis, beam management, and multiple access), and integration of THz communications with other 6G enablers (e.g., intelligent reflecting surfaces, non-terrestrial networks, and machine learning).

Day:  Thursday, November 2
Time:  14:10 - 17:30
Room:  Dartmouth/3rd Floor


TUT-09 5G NR Sidelink: From Release 16 to Release 18

Sidelink is a 3GPP-standardized technology that provides direct UE-to-UE communication links, and it is gaining significant interest in many commercial, military, and governmental applications, including ultra-reliable low-latency communications (URLLC), enhanced mobile broadband (eMBB), vehicle-to-everything (V2X) networks, AR/VR technologies, and first responder networks. With rapidly expanding applications for direct peer-to-peer communication networks, sidelink could potentially disrupt the current base-station-centric communications paradigm. Because of the extensive interest in sidelink-based applications, 3GPP is enhancing sidelink to support higher data rates, lower latencies, and improved connectivity requirements. In Release 18 of the 3GPP specification work, the following sidelink technologies are currently under development: support for mmWave bands, including beamforming and beam maintenance; technologies for unlicensed bands that coexist fairly with incumbent technologies; operations supporting multiple carriers; protocols supporting relaying via a UE; and positioning and localization technologies for sidelink. The objective of this tutorial is to present an overview of sidelink technology from the Release 16 version to the Release 18 version, which is expected to be completed in mid-2024.

Day:  Thursday, November 2
Time:  14:10 - 17:30
Room:  Clarendon/3rd Floor


TUT-10 Reconfigurable Intelligent Surfaces for 6G and Beyond: Fundamentals & Applications

This tutorial offers a comprehensive exploration of reconfigurable intelligent surfaces (RIS) for 6G and beyond. RIS are passive, controllable arrays of small reflectors that direct electromagnetic energy toward or away from target nodes, thereby allowing better management of signals and interference in a wireless network. RIS constitutes a rapidly emerging field and a promising future component of wireless systems for 6G and beyond. As the demand for faster, more reliable, and energy-efficient wireless communication continues to grow, RIS introduces a new component solution for wireless networks, and a tutorial on the technology is a timely and essential resource that allows the audience to stay ahead in this rapidly evolving field. The interest in RIS is motivated by its promise of scalability and economic viability, as well as by its ability to control the wireless medium in a new and innovative manner that opens many doors for applications and better performance. Because there are no encoding/decoding operations or active circuits in its signal path, RIS allows cost-effective management of signals and interference without injecting additional power into the wireless network. The tutorial builds a strong foundation for this subject and is ideal for researchers as well as practicing engineers; it will enable its audience to plug into the rapidly expanding literature on RIS and is a useful tool for those who aim to become active in the area and make contributions. The tutorial introduces the RIS technology and its channel model, and outlines its most important and distinctive characteristics. The advantages brought about by RIS are highlighted, as well as the limitations and challenges that need to be addressed. In particular, a key challenge in RIS is the acquisition and use of channel state information, which is given prominent coverage in this tutorial. A comprehensive discussion of potential applications, as well as open problems, is also part of the tutorial.
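As a simple illustration of how an RIS "directs" energy (a toy narrowband model, not material from the tutorial), coherently aligning the per-element phase shifts to the cascaded channel turns many weak reflected paths into a strong combined one, with the well-known gain scaling of roughly N^2 in the number of elements:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128  # number of RIS elements

# Toy narrowband channels: transmitter -> RIS (h) and RIS -> receiver (g).
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
g = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

cascade = h * g  # per-element cascaded channel

# Random phases (uncontrolled surface) vs. phases matched to the cascaded channel.
theta_random = np.exp(1j * rng.uniform(0, 2 * np.pi, N))
theta_matched = np.exp(-1j * np.angle(cascade))

gain_random = np.abs(np.sum(cascade * theta_random)) ** 2
gain_matched = np.abs(np.sum(cascade * theta_matched)) ** 2  # ~ (sum of |h_n g_n|)^2

print(f"random phases:  {gain_random:8.1f}")
print(f"matched phases: {gain_matched:8.1f}  (~N^2 scaling with element count)")
```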
