title: 6G Wireless Systems: Vision, Requirements, Challenges, Insights, and Opportunities authors: Tataria, Harsh; Shafi, Mansoor; Molisch, Andreas F.; Dohler, Mischa; Sjoland, Henrik; Tufvesson, Fredrik date: 2020-08-07 Mobile communications have been undergoing a generational change every ten years or so. However, the time difference between the so-called "G's" is also decreasing. While fifth-generation (5G) systems are becoming a commercial reality, there is already significant interest in systems beyond 5G - which we refer to as the sixth-generation (6G) of wireless systems. In contrast to the many published papers on the topic, we take a top-down approach to 6G. We present a holistic discussion of 6G systems, beginning with the lifestyle and societal changes driving the need for next generation networks, to the technical requirements needed to enable 6G applications, through to the challenges, as well as possibilities for practically realizable system solutions across all layers of the Open Systems Interconnection stack. Since many of the 6G applications will need access to an order-of-magnitude more spectrum, utilization of frequencies between 100 GHz and 1 THz becomes of paramount importance. We comprehensively characterize the limitations that must be overcome to realize working systems in these bands, and provide a unique perspective on the physical, as well as higher layer, challenges relating to the design of next generation core networks, new modulation and coding methods, novel multiple access techniques, antenna arrays, wave propagation, radio-frequency transceiver design, as well as real-time signal processing. We rigorously discuss the fundamental changes required in the core networks of the future, such as the redesign or significant reduction of the transport architecture that serves as a major source of latency. While evaluating the strengths and weaknesses of key technologies, we differentiate what may be practically achievable over the next decade, relative to what is possible in theory. For each discussed system aspect, we present concrete research challenges. Enabled by high bandwidths and new applications in massive machine type communications (mMTC) and ultra reliable low-latency communications (uRLLC), International Mobile Telecommunications 2020 (IMT-2020) - often colloquially called the fifth-generation (5G) of wireless systems - heralded the emergence of enhanced multimedia applications [1]-[3]. However, as the next decade unfolds, even richer multimedia applications in the form of high-fidelity holograms and immersive reality, tactile/haptic-based communications, as well as the support of mission critical applications for connecting all things, are being discussed for consideration [2, 4]. To support such applications, even larger system bandwidths (than those seen in 5G) are required, along with new physical layer (PHY) techniques, as well as higher layer capabilities, which are not present today. Significant efforts are underway to characterize and understand wireless systems beyond 5G, which we refer to as the sixth-generation (6G) [4]-[8].
Research on 6G wireless systems is now the center of attention for a large number of journal and conference publications, keynote talks and panel discussions at flagship conferences/workshops, as well as in the working groups of standardization bodies, such as the International Telecommunication Union Telecommunication Standardization Sector (ITU-T) [4, 5, [8]-[12]]. For the majority of these studies, the scope of work ranges from characterizing potential 6G use cases, to identifying their requirements, and analyzing possible solutions - in particular for the PHY of the Open Systems Interconnection (OSI) stack. In order to understand what future systems will be capable of, we first provide details on the evolving requirements of daily life approaching the next decade, which will naturally drive the requirements for 6G. In what follows, we summarize the key drivers behind 6G systems, discuss the literature summarizing the 6G vision and performance metrics, and present the contributions of this paper. Following this, we present the organization of the remaining sections of the paper. According to the ITU-T in [8], the three most important driving characteristics linked to the next decade of lifestyle and societal changes, impacting the design and outlook of 6G networks, are: 1) High Fidelity Holographic Society, 2) Connectivity for All Things, and 3) Time Sensitive and Time Engineered Applications. In the sequel, we present our view of each disruptive change and connect its implications to wireless networks of the future. 1) High Fidelity Holographic Society: Video is increasingly becoming the mode of choice for communications today, and is evolving to augmented reality. Video resolution capability is increasing at a rapid rate; for instance, user equipment (UE) devices supporting 4K video require a data rate of 15.4 Mbps (per-UE) [1]. In parallel to this, the viewing time of UEs is also increasing, to the point where it is now the norm for end-users to watch complete television programs or sports events via live or on-demand streaming. As we enter the next decade, demand for such content is anticipated to grow at extreme rates [4, 11]. The ongoing COVID-19 pandemic is showing that video communication has enabled people, businesses, governments, medical professionals and their patients to remain in virtual contact, avoiding the need for travel while remaining socially, professionally, and commercially active. While educational institutions remain closed, online education is possible via video communication. At the time of writing this paper, premier conferences and workshops around the world were also held virtually using live video interfaces. We expect that many such developments will remain active, even in the post COVID-19 era. Holograms and multi-sense communications are the next frontier in this virtual mode of communication. In 2017, the renowned physicist Stephen Hawking gave a lecture to an audience in Hong Kong via a hologram, showcasing the growing potential of such a technology. Holograms are not just a technological gimmick or limited to entertainment; rather, they are a logical evolution of video communication providing a much richer user experience. Proof-of-concept trials of hologrammatic telepresence are already underway [13]. Once deployed, holographic telepresence will enable remote users to appear as a rendered local presence.
For instance, technicians performing remote troubleshooting and repairs, doctors performing remote surgeries, as well as improved remote education in classrooms could benefit from hologram renderings. The data transmission rates for holograms are very substantial (at least for today). Besides colour, depth, resolution, and frame rate, as is the case in video, holographic images will need transmission from multiple view points to account for tilts, angles and observer positions relative to the hologram. As an example, if a human body is mapped in tiles of dimensions 4"×4", then a 6'×20" person may need a transmission rate of 4.32 Tbps [7]. This is substantially more than what 5G systems are capable of providing. In addition, to consistently provide such high data rates, additional synchronization is required to coordinate transmission from the multiple view points, ensuring seamless content delivery and user experience. Some applications may need to combine holograms with data from other sources. This would enable data to be fed back to a rendered entity from a remote point. Combinations of tactile networks and holograms, especially if we are able to provide touch to the latter, could open further applications. While audio, video and holograms involve the senses of sight and hearing, communication involving all of the five senses is being considered. Smell and taste are considered lower senses, and are involved with feelings as well as emotions; thus digital experiences can be enriched via smells and tastes. Limited success has recently been achieved using the concept of a digital lollipop [14], a device that is inserted into the human mouth to monitor the tongue's papillae (taste sensors). In general, we believe that a variety of sensory experiences may get integrated with holograms. To this end, using holograms as the medium of communications, emotion-sensing wearable devices capable of monitoring our mental health, facilitating social interactions and improving our experience as users will become the building blocks of networks of the future [15]. 2) Connectivity for All Things: Using 5G as a platform, an order-of-magnitude (or even higher) increase in planned interconnectivity and its widespread use will be another defining characteristic of the future society. This will include infrastructure that is essential for the smooth functioning of society that we have become used to today, such as water supplies, agriculture, uninterrupted power, transport and logistics networks, etc. This creates the need to operate multiple network types, going well beyond the standard terrestrial networks of today. There are significant attempts to develop uninterrupted global broadband access via integration between terrestrial networks and the many planned satellite networks - especially for low earth orbit (LEO) satellites. Communications from moving platforms, such as Unmanned Aerial Vehicle (UAV)-based systems, are also required as many new applications are emerging. In addition to this, there is also a desire to explore life on other planets. Commercial flights to the moon are already being planned by companies such as SpaceX. Successful operation of such critical infrastructure brings the need for security beyond what is possible today. In addition to this, the increased reliability of the sensors monitoring the infrastructure is also essential to successfully migrate towards a truly connected society.
3) Time Sensitive and Time Engineered Applications: Humans and machines are both sensitive to delays in the delivery of information (albeit to varying degrees). Timeliness of information delivery will be critical for the vastly interconnected society of the future. New applications that intelligently interact with the network will demand guaranteed capacity and timeliness of arrivals. As we incorporate gadgets in our life, quick responses and real-time experiences are going to be increasingly relevant. In a network of a massive number of connected sensors which are the end points of communication, timeliness becomes critical and late arrival of information may even be catastrophic. Time sensitivity also has a deep impact on other modes of communications in the future, such as those relying on tactile and haptic control. Conventional internet networks are capable of providing audio and video facilities, which can be classified as non-haptic communication. However, the tactile internet [16, 17] will also provide a platform for touch and actuation in real-time. Due to fundamental system design and architectural limitations, current 5G systems are not able to completely virtualize a skill performed in another part of the world, and transport it to a place of choice, under the 1 ms latency limit of human reaction. This will be addressed in 6G systems with leaner network architectures and more advanced processing [17, 18]. With the above changes driving the need for 6G, we review the progress in the literature on 6G systems. We note that besides the studies referred to in the subsection below, there are many papers dealing with specific technologies at the PHY, media access control (MAC), and transport layers of the OSI stack. These papers will be reviewed (partly) in the related sections of this paper. Overall, we stress that since 6G encompasses a large part of ongoing communications research, any literature review is necessarily incomplete and can only provide important examples. By now, a considerable number of papers have explored possible applications and solutions for 6G systems. For instance, the authors of [19] take a look at potential 6G use cases, and provide a system-level perspective on 6G requirements, as well as presenting potential technologies that will be needed to meet the listed requirements. In parallel, the authors of [9, 20] analyze use cases above 100 GHz with a special focus on wave propagation properties at 140 GHz. The studies in [4, 11, [21]-[24]] give a flavour of the possible key performance indicators (KPIs) of 6G systems, and provide a summary of the enabling technologies needed to realize the KPIs, such as holographic radio (different from standard holograms), terahertz (THz) communications, intelligent reflecting surfaces (IRS) and orbital angular momentum (OAM). On a similar theme, the authors of [22, [25]-[28]] present the applications and enabling technologies for 6G research and development. A number of studies focusing on more specific technologies have also been published. For instance, the study in [29] proposes to explore new waveforms for the 90-200 GHz frequency bands that offer optimal performance under PHY impairments. The authors of [30] present a vision of providing an internet of bio-nano things using molecular communication.
The study in [31] gives an overview of the main architectures, challenges and techniques for efficient wireless powering of internet-of-things (IOT) networks in 6G, where a promising solution is proposed. Moreover, the authors in [32] consider the requirements, use cases and challenges to realize 6G systems, with a particular emphasis on artificial intelligence (AI)-based techniques for network management. The role of collaborative AI in 6G systems at the PHY and above layers is discussed in [33]. The trade-off between achieving 6G KPIs and the involved cost is presented in [34], where a vision for the implementation of 6G networks in India is given. The study in [35] covers a broad range of issues relating to taking advantage of THz frequency bands, and provides an extensive review of the various radio-frequency (RF) hardware challenges that must be overcome for systems to operate in the THz bands. Collectively, the 6G vision developed by the studies mentioned above and by the current paper is summarized in Fig. 1. While the aforementioned and other papers cover important aspects of 6G systems, the aim of the current paper is to provide a holistic top-down view of 6G system design. Starting from the technical capabilities needed to support the 6G applications, we discuss the new spectrum bands which present an opportunity for 6G systems. While a lot of bandwidth is available in these new bands, how to utilize it effectively remains a key challenge, which is discussed in depth. For instance, frequency bands at 100 GHz and above present formidable challenges in the development of hardware and surrounding system components, limiting the application areas where all of the spectrum can be utilized. We discuss the deployment scenarios where 6G systems will most likely be used, as well as the technical challenges that must be overcome to realize the development of such systems. This includes new modulation methods, waveforms and coding techniques, multiple access techniques, antenna arrays, RF transceivers, real-time signal processing, as well as wave propagation aspects. We note that these are all substantial challenges in the way of systems that can be realized and deployed. Nevertheless, addressing these challenges at the PHY is only a part of resolving the potential issues. As such, improvements in the network architecture are equally important. The present core network design is influenced - and encumbered - by historical legacies. For example, the sub-millisecond latency required by many of the new services cannot be handled by the present transport network architecture. To this end, flattening or significant reduction of the architecture is necessary to comply with 6G use case requirements. The basic fabric of the mobile internet - the Transmission Control Protocol/Internet Protocol (TCP/IP) - is not able to guarantee the quality of service (QOS) needed for many 6G applications, as it is in effect based on best effort services. These and many other aspects require a complete re-think of the network design, where the present transport networks will begin to disappear and be virtualized over existing fiber, as well as be isolated using modern software defined networking (SDN) and virtualization methodologies. At the same time, the core network functions will be packaged into a micro service architecture, and enabled on the fly. All of these topics and more are covered in the sections below.
For each aspect of 6G that is discussed in the paper, we present a detailed breakdown of the strengths and weaknesses of the presented concepts, technologies, or potential solutions. We differentiate what may be practically realizable, relative to what is theoretically possible. To the best of our knowledge, a holistic contribution of this type is missing from the literature. The remainder of the paper is organized as follows. A vision for 6G, a discussion of the seven most prominent use cases to be supported by 6G, as well as their technical requirements, are given in Sec. II. A summary table of the KPIs and a comparison with 4G and 5G systems is also presented. This is followed by a discussion of the new frequency bands and deployment scenarios in Sec. III. With the top-down approach, the fundamental changes in the core and transport networks supporting 6G applications are discussed in Sec. IV. Complementing this, a discussion of the new PHY techniques covering a wide range of topics, such as waveforms, modulation methods, multiple antenna techniques, and applications of AI and machine learning (ML), is contained in Sec. V. An overview of the wave propagation characteristics of 6G systems for different applications and scenarios is given in Sec. VI. The challenges in building radio transceivers and performing real-time signal processing for 6G, as well as solutions to overcome them, are described in Sec. VII. Finally, the conclusions are given in Sec. VIII. A comprehensive bibliography is provided for the reader to delve deeper. We now discuss the system requirements for 6G use cases. It is clear that the major applications and usage scenarios for 6G discussed above require instantaneous, extremely high speed wireless connectivity [7, 36, 37]. The system requirements for Network 2030 have recently been published by the ITU-T in [38]. Here we review these, as well as requirements published in the other sources quoted above. We categorize the requirements separately for each 6G use case in the subsections below. As discussed earlier, holographic displays are the next evolution in multimedia experience, delivering 3D images from one or multiple sources to one or multiple destinations, providing an immersive 3D experience for the end user. Interactive holographic capability in the network will require a combination of very high data rates and ultra low latency. The former arises because a hologram consists of multiple 3D images, while the latter is rooted in the fact that parallax is added so that the user can interact with the image, which also changes with the viewer's position. This is critical in providing an immersive 3D experience to the user [6]. The key system requirements for this type of communication are: 1) Data rates: The data rates required depend on how the hologram is constructed, as well as on the display type and the number of images which need to be synchronized. Data compression techniques may reduce the data rates needed for the transmission of holograms, but even with compression, holograms will require massive bandwidths. These vary from tens of Mbps [39] to 4.3 Tbps [7, 40] for a human-sized hologram using image-based methods of generating holograms. 2) Latency: Truly immersive scenarios require ultra low latency, else the user feels simulator sickness [40]. Moreover, if haptic capabilities are also added, then sub-millisecond latency is required [38, 41]. This is elaborated in Use Case 2 in the following subsection.
3) Synchronization: There are many scenarios where synchronization needs to be adhered to in holographic communications. As different senses may get integrated, the different sensor feeds may be sent over different paths or flows, and will require synchronization, as well as coordinated delivery. When streams involve data from multiple sources, such as video, audio, and tactile, precise/stringent inter-stream synchronization is required to ensure timely arrival of the packets. Coordinated delivery of the flows needs dependency objectives for time-based dependency, ordering dependency and QOS fate sharing. For all of this to happen, the network must have knowledge of the co-flows - something which is non-trivial. Another example is the case of a virtual orchestra, whereby members of the orchestra are in different locations, and their movements must be coordinated such that it seems as if the music is emanating from the same stage. Multiparty robotic communication via holograms is yet another example, where communication between a leader and a follower, or between multiple robotic agents, requires synchronization [42]. 4) Security: Requirements for this depend upon the application. If remote surgery is to be carried out, then the integrity and security of that application is absolutely vital, as any lapse could be life threatening. Coordinating the security of multiple co-flows is an additional challenge, as an attack on a single flow could compromise all other members of the flow. 5) Resilience: At the system level, resilience is about minimizing packet loss, jitter and latency. At the service level, relevant quality-of-experience metrics are availability and reliability. For holographic communication services, an un-recovered failure event could pose a significant loss of value to operators. Therefore, system (network) resilience is of paramount importance to maintain the high QOS needs of these services. 6) Computation: There are significant real-time computational challenges at each step of hologram generation and reception. While compression can reduce the bandwidth needs, it will heavily influence the latency incurred. To this end, there is an important trade-off between the level of compression, the computational bandwidth, and latency, which needs to be optimized. A discussion on this is contained in [42]. There are many applications that fall in this category [2]. Consider the following examples: • Robotic and Industrial Automation: We are at the cusp of witnessing a revolution in manufacturing stimulated by networks that facilitate communications between humans, as well as between humans and machines, in Cyber-Physical-Systems (CPS) [43]. This so-called industry 4.0 vision is enabling a plethora of new applications [44]. It requires communications between large connected systems without the need for human intervention. Remote industrial management is based on real-time management and control of industrial systems. Robotics will need real-time guaranteed control to avoid oscillatory movements. Advanced robotics scenarios in manufacturing need a maximum latency target in a communication link of 100 microseconds (µs), and round-trip reaction times of 1 millisecond (ms). Human operators can monitor the remote machines by virtual reality (VR) or holographic-type communications, and are aided by tactile sensors, which could also involve actuation and control via kinesthetic feedback.
• Autonomous Driving: Enabled by vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2I) communication and coordination, autonomous driving can result in a large reduction of road accidents and traffic jams. However, a latency on the order of a few ms will likely be needed for collision avoidance and remote driving. Thus, advanced driver assistance, platooning of vehicles, and fully automated driving are the key application areas that 6G aims to support and mature, with the first components to be implemented in Third Generation Partnership Project (3GPP) Release 16 [45]; see also a list of use cases by the 5G Automotive Association (5GAA) in [46]. Yet, since no fully functional autonomous vehicles exist, further requirements and applications are sure to emerge over the next decade within this area. • Health Care: Tele-diagnosis, remote surgery and tele-rehabilitation are just some of the many potential applications in healthcare. We have already witnessed an early form of this during the ongoing COVID-19 pandemic, whereby a huge number of medical consultations are held via video links. However, with the aid of advanced tele-diagnostic tools, medical expertise/consultation could be available anywhere and anytime, regardless of the location of the patient and the medical practitioner. Remote and robotic surgery is an application where a surgeon gets real-time audio-visual feeds of the patient that is being operated upon in a remote location. The surgeon then operates using real-time visual feeds and haptic information transmitted to/from the robot; this is already happening in some instances, see e.g., [47]. The tactile internet is at the core of such a collaboration. The technical requirements for haptic internet capability cannot be fully provided by current systems, as discussed in [18]. The key network requirements for these types of services are as follows [48]. 2) Latency: The human brain has different reaction times to various sensory inputs, ranging from 1 to 100 ms [16]. While it takes 10 ms to understand visual information, and up to 100 ms to decode audio signals, only 1 ms is required to receive a tactile signal [49]. Thus, the tactile internet requires end-to-end latency on the order of 1 ms [16], and sub-ms latency may be required for instantaneous haptic feedback, otherwise conflicts between the visual and other sensory systems could cause cyber sickness to the tactile users [2]. Robotics and other industrial machinery will also need sub-ms latencies. 3) Synchronization: Due to the fast reaction times of the human mind to tactile inputs, different such real-time inputs arising from different locations must be strictly synchronized. Similarly, as machine control might have fast reaction times, their inputs need to be tightly (sub-ms level) synchronized as well. 4) Security: For all of the above applications (from robotics to autonomous cars), we envisage security to be at the forefront of the potential issues, since an attack on, or failure of, particular system functionality could lead to life threatening situations. 5) Reliability: Some applications, such as cooperative autonomous driving and industrial automation, demand a level of reliability that wireless systems of today are not able to guarantee. Ultra reliable transmissions are assumed to have a success rate of "five nines", i.e., 99.999% [50]. Industrial IOT systems could require even higher reliability, such as 99.99999% [51], since loss of information could be catastrophic in some cases.
6) Prioritization: The network should be able to prioritize streams based on their criticality. Visual feeds may have many views with different priorities; yet the choice of priority may be left to the network operator. Mobile edge computing (MEC) will be deployed as part of 5G networks, and this architecture will continue towards 6G networks. When a client requests a low latency service, the network may direct this to the nearest edge computing site. For computation-intensive applications, and due to the need for load balancing, a multiplicity of edge computing sites may be involved, but the computing resources must be utilized in a coordinated manner. Augmented reality/virtual reality (AR/VR) rendering, autonomous driving and holographic type communications are all candidates for edge cloud coordination. The key network requirements for this are: computing awareness of the constituent edge facilities, joint network and computing resource scheduling (centralized or distributed), flexible addressing (every network node can become a resource provider), and fast routing and re-routing (traffic should be able to be routed or re-routed in response to load conditions). Fig. 2 demonstrates this vision via edge-to-edge coordination across local edge clouds of different network and service types, as well as edge coordination with the core cloud architecture. Access points in metro stations, shopping malls, and other public places may provide information shower kiosks [53]. The data rates for these information shower kiosks could be up to 1 Tbps. The kiosks will provide fibre-like speeds. They could also serve the backhaul needs of millimeter-wave (mmWave) small cells. Co-existence with contemporaneous cellular services, as well as security, seem to be the major issues requiring further attention in this direction. This use case can be extended to various scenarios that include real-time monitoring of buildings, cities, the environment, cars and transportation, roads, critical infrastructure, water and power, etc. Besides these use cases, the internet of bio-things, through smart wearable devices and intra-body communications achieved via implanted sensors, will drive the need for connectivity much beyond mMTC. The key network requirements for these use cases are: large aggregated data rates due to vast amounts of sensory data, high security and privacy (in particular when medical data is being transmitted), as well as possibly low latency when a fast intervention (e.g., for a heart attack) is required. As yet, no systems or models exist to assess these data needs. While on-chip, inter-chip, and inter-board communications nowadays are done through wired connections, those links are becoming bottlenecks when the data rates exceed 100-1000 Gbps. There have thus been proposals to employ either optical or THz wireless connections to replace wired links. The development of such "nanonetworks" constitutes another promising area for 6G. Important criteria for such networks - besides the data rate - are the energy efficiency (which needs to incorporate possible required receiver processing), reliability, as well as latency. Specific KPIs for nanonetworks depend on chip implementations and applications, which will become clearer as they are developed over the next decade. This use case presents a scenario that is based on internet access via the seamless integration of terrestrial and space networks.
The idea of providing internet from space using large constellations of LEO satellites has re-gained popularity in recent years (previous attempts, such as the Iridium project in the late 1990s, had failed). The study in [54] compares Telesat's, OneWeb's, and SpaceX's satellite systems. The key benefits of these are: ubiquitous internet access on a global scale, including on moving platforms (aeroplanes, ships, etc.), enriched internet paths due to the border gateway protocols across domains relative to the terrestrial internet, and ubiquitous edge caching as well as computing. The mobile devices for these integrated systems will be able to have satellite access without relying on ground-based infrastructure. The key network requirements for this capability are: (1) Flexible addressing and routing: with thousands of LEO satellites, there are new challenges for the terrestrial internet infrastructure to interact with the satellites. (2) Satellite bandwidth capability: the inter-satellite links and the terrestrial internet infrastructure in some domains could be a bottleneck for satellite capacity. (3) Admission control by satellites: when a satellite directly acts as an access point, each satellite needs knowledge about the traffic load in the space network to make admission control decisions. (4) Edge computing and storage: the realization of edge computing and storage will incur challenges on the satellite due to on-board limitations. Latency will also be a challenge, as the physical distance between the satellite and the end node will set a limit on the minimum delay introduced by the link. An example realization of space-terrestrial integrated networks is depicted in Fig. 3, where multiple services communicating with the satellite network and terrestrial networks are shown to seamlessly co-exist. Collectively, in view of the above, the key requirements for 6G systems may be summarized (in the style of the corresponding requirements for 5G systems) as [36, 55]: • Peak data rate: ≥1 Tbps, catering to holographic communication, tactile internet applications and extremely high rate information showers. This is at least 50× larger than that of 5G systems. • User experience data rate: At least 10× that of the corresponding value for 5G. • User plane latency: This is application dependent, yet its minimum should be a factor of 40× better than in 5G. • Mobility: It is expected that 6G systems will support mobility of up to 1000 km/h, to include mobility values encountered in dual-engine commercial aeroplanes. • Connection density per km²: Given the desire for 6G systems to support an internet-of-everything, the connection density could be 10× that of 5G. The above capabilities and more are summarized in Tab. I, relative to the corresponding values in 5G and 4G systems. Traditionally, new generations of wireless systems have exploited new spectrum in order to satisfy the increased demands for data rates. 5G systems are characterized to a significant degree by the use of the mmWave spectrum complemented by large antenna arrays. A further expansion to higher frequencies for 6G seems almost unavoidable. However, we note that not all 6G services will be suitable to be offered in the new bands. The existing bands for 4G and 5G will continue to be used and may be re-farmed for 6G. In this spirit, the spectrum from 100 GHz to 1 THz is being considered as a candidate for 6G systems.
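Before examining the individual bands, it is worth noting that the mobility target above interacts strongly with such high carrier frequencies. The following minimal sketch, which assumes a 140 GHz carrier in window W1 and uses Clarke's rule of thumb for the coherence time, gives an order-of-magnitude feel for the Doppler shifts a 6G air interface would have to tolerate; the numbers are purely illustrative.

```python
# Illustrative Doppler calculation for the 6G mobility target.
# The 140 GHz carrier and Clarke's coherence-time rule of thumb are assumptions.
C = 3e8                                  # speed of light (m/s)
f_c = 140e9                              # assumed carrier frequency in window W1 (Hz)
speed_kmh = 1000.0                       # mobility target from Tab. I (km/h)

v = speed_kmh / 3.6                      # speed in m/s
doppler_hz = v * f_c / C                 # maximum Doppler shift
coherence_time_s = 0.423 / doppler_hz    # Clarke's rule of thumb

print(f"Maximum Doppler shift : {doppler_hz / 1e3:.1f} kHz")        # ~129.6 kHz
print(f"Channel coherence time: {coherence_time_s * 1e6:.1f} us")   # ~3.3 us
```

Coherence times of only a few microseconds at these carriers are one of the motivations for revisiting waveform numerology and alternatives such as OTFS in Sec. V.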
Within the 100 GHz to 1 THz range, particular sub-bands have very high absorption (see Sec. VI for a discussion of the physical reasons) and are thus ill-suited for communication over more than a few meters. The spectrum windows with lower absorption losses, shown in Fig. 4, still represent a substantial amount of aggregated bandwidth [56]-[59]. Nevertheless, this spectrum is also used by various existing services. Consequently, not all of it will likely be made available by frequency regulators, and it will also not be allocated in a contiguous manner. In particular, over the range of 141.8 GHz to 275 GHz, there are various blocks containing existing services that have co-primary allocation status by the ITU. These services include fixed, mobile, radio astronomy, earth exploration satellite service (EESS) passive, space research passive, inter-satellite, radio navigation, radio navigation satellite, and mobile satellite systems. Amongst the above, the passive services are much more sensitive to interference, and their protection will require guard bands, limits on out-of-band emissions as well as on in-band transmit power, and restrictions on terrestrial beams (by controlling the power flux densities) and side lobes pointing upwards. All of these aspects are critical for the co-existence of terrestrial systems with space-based networks. The next World Radio Conference (WRC) in 2023 will consider the allocation of 231.5 GHz to 252 GHz to EESS passive systems. Parts of the spectrum beyond 257 GHz are also allocated to various other passive services. The authors in [35] expound upon the difficulties of co-existence between radio astronomy and wireless services in the THz bands. Despite all of the above, the amount of spectrum available represents a unique opportunity for 6G. The use of the above-mentioned frequency windows is dependent upon the specific use case; naturally, not all the windows will be suitable for all use cases. The first window of interest will be the one marked as W1 in Fig. 4, covering the frequency range from 140-350 GHz. This band is typically referred to as the sub-THz band, even though strictly speaking "high mmWaves" might be the more appropriate nomenclature. The two key advantages of this band are: 1) the existence of many tens of GHz of bandwidth that are currently lying unused; and 2) the ability to develop ultra massive multiple-input multiple-output (MIMO) antenna arrays within a reasonable form factor. The use of spectrum in higher windows is accompanied by a higher absorption loss. Though Fig. 4 is shown up to 1 THz, one can go even higher in frequency, up to 10 THz [35, 60], at the expense of even more formidable hardware realization challenges, so this use seems further away. From this point onward, a move to even higher frequency bands brings us to some familiar territory, namely that of free-space optical (including infrared) links, either through the use of laser diodes, or light emitting diodes (LEDs) commonly assumed for visible light communications (VLC). Both of these approaches have been explored for a number of years, but it is only recently that an integration into cellular and other wireless systems has increasingly become a realistic option. Besides the exploration and the use of new frequency ranges, an investigation into new deployments is necessary. While some applications of 5G will also continue to be deployed in the existing 5G bands, which over time may be re-farmed to 6G, we identify possible new deployment scenarios primarily motivated by the previously unexplored THz bands.
We note that there will naturally be many applications, such as Connectivity for Everything (see Use Case 5 in Sec. II), which will remain in the existing sub-1 GHz bands where a lot of the IOT deployments are happening. Another example is cellular V2X communication intended for autonomous driving, which will use a combination of microwave and mmWave bands [48]. 1) Hot Spot Deployments: This is a conventional application whereby extremely high data rate systems (such as those described in Use Case 4) could be deployed indoors or outdoors. MmWave and THz systems, e.g., in window W1, would be well suited for such scenarios. However, ubiquitous deployments will be uneconomical, as the coverage radius in outdoor environments is limited to about 100 m, and even less in indoor environments - this follows from both free-space pathloss (even with reasonably-sized antenna arrays) and molecular absorption [56, 62], see Tab. II. If more bandwidth is needed, we can aggregate more windows, though this might further shorten the feasible transmission distance. The authors of [56] propose a bandwidth vs. distance scheduling, whereby more bandwidth is available for a lower transmission distance (say, all the windows), and this progressively reduces to W1 for large distances. However, all of these link budgets only consider free-space pathloss. Obstructing objects, scattering, and other effects need to be taken into account for realistic deployment planning. 2) Industrial Networks: While 5G was innovative in introducing the concept of industry 4.0, we anticipate that 6G will take significant strides in transforming manufacturing and production processes. The maturity of industrial networks will depend on a successful adaptation of current and future radio access technologies to the key industry 4.0 and beyond use cases. Industrial networks are envisaged to be privatized, focusing on extreme reliability and ultra low latency. The key deployment use cases are: 1) communication between sensors and robots; 2) communications across multiple robots for coordination of tasks; and 3) communication between human factory operators and robots. Currently, in order to achieve the requirements for ultra high reliability, the majority of the commercial deployments are taking place between 3.4-3.8 GHz, where the propagation channel is relatively rich in terms of diffraction efficiency [63, 64]. Yet machines with massive connectivity in the 6G era will also demand high data rates alongside real-time control and AI, to be able to transmit and process high-definition visual data, enabling digital twins of machines and operations, as well as remote troubleshooting. To this end, we foresee the use of mmWave frequencies, in addition to bands below 6 GHz, for industrial networks over the next decade. Preliminary studies, such as the one in [65], are demonstrating the possibilities and challenges of integrating mmWave frequencies within industry 4.0 scenarios. 3) WPANs and WLANs: Another area of deployment is wireless personal area networks (WPANs) and wireless local area networks (WLANs). These could be between a laptop and an access point, between an information kiosk and a receiver [66], between AR/VR wearables and a modem, or between the "infostations" proposed in [67]. These are very short links, perhaps less than 0.5 m to 1 m for WPANs, and up to 30 m for WLANs. All windows may be suitable for this application, provided the link budget can meet the path loss when the higher windows are used and where appropriate implementation technologies exist.
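To make the range limitation and the "link budget can meet the path loss" condition above concrete, the sketch below evaluates a simplified line-of-sight link budget at 140 GHz (window W1). The transmit power, array gains, bandwidth and noise figure are assumed values chosen only for illustration, and molecular absorption, blockage and fading are ignored, so the resulting SNRs are optimistic.

```python
import math

# Simplified line-of-sight link budget in window W1 (around 140 GHz).
# All parameter values below are illustrative assumptions, not measured data.
C = 3e8                       # speed of light (m/s)
f_c = 140e9                   # carrier frequency (Hz)
tx_power_dbm = 10.0           # transmit power (assumed)
tx_gain_db = 25.0             # transmit array gain (assumed)
rx_gain_db = 25.0             # receive array gain (assumed)
bandwidth_hz = 2e9            # signal bandwidth (assumed)
noise_figure_db = 10.0        # receiver noise figure (assumed)

noise_dbm = -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db

for d in (10, 30, 100, 300):  # link distance in metres
    fspl_db = 20 * math.log10(4 * math.pi * d * f_c / C)   # free-space path loss
    rx_dbm = tx_power_dbm + tx_gain_db + rx_gain_db - fspl_db
    snr_db = rx_dbm - noise_dbm
    print(f"d = {d:3d} m: FSPL = {fspl_db:5.1f} dB, SNR = {snr_db:5.1f} dB")
```

With these assumed numbers, the SNR falls to around 10 dB at roughly 200 m, and adding molecular absorption, blockage and fading margins would shrink the usable range further, consistent with the ~100 m outdoor coverage figure quoted above.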
4) Nano Scale Communications: This is an entirely new area of deployment and consists of wireless networks on a chip for on-chip communications, as well as for wearable biosensing networks, internet-of-bio-things networks [68], etc. Here, all windows can be used, subject to hardware limitations. 5) Autonomous Vehicles and Smart Railway Networks: 6G could be used for information sharing between autonomous vehicles and for V2I [69]. However, there are doubts whether the complicated traffic conditions and the short distances due to the range limitation discussed earlier will make the THz bands (or even mmWaves) suitable for this application. Furthermore, high-speed adaptive links between antennas on train rooftops and infrastructure can be used for transmission of both safety-critical information and aggregate passenger data [70, 71]. Such extremely high rate links are well suited for THz, yet the high mobility creates strong sensitivity to beamforming errors and possible issues with the Doppler spread. While the speed of modern high-speed trains is almost constant, and thus beams can be steered into the right direction based on prediction, the required beamforming gain (and associated narrow beamwidth) makes the system sensitive to even small deviations from the predictions [72]. Furthermore, high-frequency systems can also be used for access between UEs and antennas in the cabins that aggregate the passenger data, similar to a (moving) hotspot. Keeping in mind the emerging 6G use cases, technical requirements, new frequency bands and key deployment scenarios, in the following section we discuss the changes required to the design of 6G radio and core network architectures. In order to cater for the next generation use cases, 6G will consolidate many of the disruptive approaches introduced by 5G. Notably, the 5G standardization effort provided the ground work to enable flexible topologies to be deployed, breaking the traditional centralized hierarchy that exists today. KPIs, such as latency, can be tailored to use cases thanks to innovative features like network slicing, control/user plane separation and MEC. The service-driven architecture, with atomized and largely API'ed software components, already allows for a much more open innovation community, thus helping to accelerate the pace of deployment. 6G will however introduce entirely novel paradigms. These will be novel features and capabilities; novel thinking towards the underlying transport infrastructure; and novel philosophies around the entire design process, which will hopefully accelerate design and deployment even further. These are discussed in the following. Concerning novel protocol and architecture approaches, the following will be of notable importance: 1) Super-Convergence: Non-3GPP-native wired and radio systems will form an integral part of the 6G ecosystem. In fact, many of the more disruptive changes discussed below will not be possible without an easier and more scalable convergence between different technology families. Emphasis will be on mutual or 3GPP-driven security and authentication of said converged network segments. As such, wireline and wireless technologies like WiFi, WiGig, Bluetooth and others will natively complement 6G, with the strong security and authentication methods of 3GPP used to secure the consolidated network.
Super-convergence will greatly aid traffic balancing, due to the ability to onboard and offload traffic between networks of different loads; it will also support resilience, since traffic delivery can be hedged between different technology families. 2) Non-IP Based Networking Protocols: Internet protocol version 6 (IPv6) is now decades old, with calls for standardization of entirely novel networking protocols growing. Indeed, the body of research on protocols beyond IP is rich, and several solutions are currently being investigated by the European Telecommunications Standards Institute (ETSI)'s Next Generation Protocols (NGP) Working Group as possible candidates for such a disruptive approach. With more than 50% of networking traffic originating in or terminating at the wireless edge, a solution which caters to the wireless sector is fully justified. Related to the above NGP efforts, information-centric networks (ICNs) are an active research area in the Internet Research Task Force (IRTF) and the Internet Engineering Task Force (IETF), and constitute a paradigm shift from networking as we know it today (i.e., TCP/IP based) [73]. ICN is a step toward the separation of content and its location identifier. Rather than IP addressing, content is addressed using an abstract naming convention. Different proposals exist today for the protocol realisation of ICN. It was considered in the ITU-T Focus Group (FG) on IMT-2020 [74] as a candidate for 5G. In fact, several proposals already exist to carry ICN traffic tunnelled through the mobile network, but such an approach defies the transparent and flat Internet topologies. A new ITU-T FG has been established to guide the requirements for the network of 2030 [8]. Furthermore, to bridge the latest developments in networking design and operational management, intent-based networking as well as intent-based service design have emerged. Intent-based networking is a lifecycle management approach for networking infrastructure, which will be central to 6G. It will require higher-level business and service policies to be taken into account; a resulting system configuration that leverages the end-to-end softwarized infrastructure; continuous monitoring of the network and service state; and a real-time optimization process able to adapt to any changes in the network/service state, thus ensuring that the intent is met. 4) 360-Cybersecurity & Privacy-by-Engineering-Design: While security has been taken very seriously in 5G from a protocol and architecture point of view, the underlying embedded code which embodies and executes the various system components has never been part of the standardization efforts. Most security vulnerabilities, however, have been due to poorly written code. Thus, future efforts will not only focus on a secure end-to-end solution, but will also encompass a top (architecture, protocols) - down (embedded software) approach, which we refer to as the 360-cybersecurity approach. Furthermore, whilst security-by-design is now a well understood design approach, privacy is still being solved at the "consent" level. Privacy-by-engineering-design will ensure that mechanisms are natively built into the protocols and architecture which would, e.g., prevent the forwarding of packets/information if it is not certified to be privacy-vetted. For instance, a security camera will only be allowed to stream video footage if certain privacy requirements are fulfilled at the networking level and possibly the contextual level, i.e., understanding who is in the picture and what privacy settings they have enabled.
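As a purely hypothetical illustration of the privacy-by-engineering-design principle (and of the security-camera example above), the sketch below shows how a network function might refuse to forward a media flow unless it carries a privacy-vetting attestation covering everyone detected in the content. The data structure, field names and policy logic are invented for illustration and do not correspond to any standardized protocol.

```python
# Hypothetical sketch: privacy vetting enforced at the forwarding layer.
from dataclasses import dataclass

@dataclass
class FlowMetadata:
    source_id: str
    content_type: str           # e.g., "video", "telemetry"
    privacy_attestation: bool   # set by an upstream vetting function (assumed)
    consent_scope: set          # identities that consented to being captured

def may_forward(flow: FlowMetadata, detected_identities: set) -> bool:
    """Forward only if the flow is vetted and every detected identity consented."""
    if not flow.privacy_attestation:
        return False
    return detected_identities.issubset(flow.consent_scope)

# Example: a security-camera stream with one non-consenting person in view.
camera_flow = FlowMetadata("cam-17", "video", True, {"alice", "bob"})
print(may_forward(camera_flow, {"alice"}))           # True  -> stream is forwarded
print(may_forward(camera_flow, {"alice", "carol"}))  # False -> stream is blocked
```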
5) Future-Proofing Emerging Technologies: A large swath of novel technologies and features is constantly appearing, the introduction of which into the telco architecture often takes decades. Examples of such technologies today are quantum technologies, distributed ledger technologies (DLT) and AI. Tomorrow, another set of technologies will appear. All of these ought to be embedded more quickly and efficiently, which is why 6G needs to cater for mechanisms allowing not-yet-invented technologies to be embedded into the overall functional architecture. The subsequent subsection lays out some possible approaches to achieving this. Here, we give some more details on the specific technology opportunities of quantum, DLT and AI. The exciting feature of quantum technologies is that they can be used to make the 6G infrastructure tamper-proof. They can be used for cryptographic key exchange, thus enabling a much more secure infrastructure. Furthermore, quantum computing promises to substantially accelerate the solution of hard optimization problems, thus allowing network optimization problems to be solved and executed much more quickly (if not in real time). DLTs enable data provenance, in that data, transactions, contracts, etc., are stored and distributed in an immutable way. This proves useful in a large multi-party system with little or no trust between the involved parties. Whilst DLTs rose to fame in the financial world with the emergence of Bitcoin, the same industry dynamic applies to telecoms, where different suppliers feed into the vendor eco-system, vendors feed into operators, and operators serve consumers. DLTs allow for a much more efficient execution of all these complex relationships. For instance, a vendor feature approved by one operator, with the approval stored on a given DLT, should make other operators trust the feature without the need for lengthy procurement processes. Another example is where consumers can create their own market place to trade data plans, or other assets, as part of the telco subscriber plan. Finally, AI has been used within telecoms for years, but mainly to optimize consumer facing issues, such as churn, or network related issues, such as the optimal base station (BS) antenna array tilt combined with the optimal transmission power policies. However, with the emergence of distributed and more atomized networks, novel forms of AI will be needed which can be executed in a distributed fashion. Furthermore, consumer-facing decisions will need to be explained, thus calling for Explainable AI (xAI) concepts which are able to satisfy stringent regulatory requirements. The underlying infrastructure, including the transport networks, will need to undergo substantial changes, as the amount of traffic to be carried in 6G networks will be orders-of-magnitude larger than what we will see in the next years with 5G networks. We expect the following fundamental changes: 1) Removal/Reduction of the Transport Network: Unknown to many, the transport network (and attached core network functionalities) is in fact a legacy artefact; we have it in 5G because we had it in 4G, we have it in 4G because we had it in 3G, and we have it in 3G because we had it in 2G; the reason it was introduced in 2G is that, back then, the internet was not able to provide the required QoS. However, today the transport fiber infrastructure is really well developed and there is no reason for operators to maintain their own private "LAN at national scale".
A complete rethink may thus give the opportunity for the cellular community to focus solely on the wireless edge (air interface + radio access network + control plane to support all), and simply use a sliced Internet fiber infrastructure to carry the cellular traffic. Whilst it requires some policy and operational changes, the technologies to support such a modus operandi are there. 2) Flattened Compute-Storage-Transport: A flattened transport-storage-compute paradigm will be enabled by a powerful 6G air interface and a complete re-think of the core and transport networks, as suggested above. A possible scenario is where transport is virtualized over existing fiber but isolated using modern SDN and virtualization methodologies. At the same time, the core network functions are packaged into a microservice architecture and enabled on the fly using containers or server-less compute architectures. To underpin novel gaming applications, we will also see a clearer split between central processing unit (CPU) and graphics processing unit (GPU) instruction sets, allowing each to be virtualized separately; for instance, the GPU instructions are handled locally on the phone whilst the CPU instructions are executed on a nearby virtual MEC. 3) Native Open Source Support: For economic and security reasons, but also reasons related to quicker innovation cycles and thus quicker time-to-market, open source will be an ever-growing constituent of the 6G eco-system. This is corroborated by recent announcements of tier-1 operators going to use open source not only for their core networks but also for parts of the radio access network. This presents an exciting opportunity for the entire communications and computer science community, as features can be contributed at scale. Furthermore, not only open source (input) but also open data (output) will be instrumental in unlocking the potential of 6G. Notably, many if not most design and operational decisions in 6G will be taken by some form of algorithm. Said algorithms need to be trained, which requires huge amounts of data. The telco ecosystem has been historically conservative in opening up operational data, such as the amount and type of traffic carried over various segments of the control and data planes. Automated mechanisms will need to be created in 6G which allow access to important data, whilst not compromising the security of the network nor the privacy of the customers. 4) AI-Native Design Enabling Human-Machine Teaming: Machine learning and AI have been part of 3GPP ever since the introduction of self-organizing networking (SON) in Release 8. However, the degrees of freedom, the high dynamics, and the high disaggregation of 6G networks, as well as more stringent policies, will almost certainly require a complete rethink of how AI is embedded into the telco eco-system. 6G is an exciting challenge for the AI community, as there is no global technology eco-system which has such stringent design requirements on spatial distribution, temporal low-latency and high data volumes. Emerging paradigms, such as distributed AI, novel forms of transfer learning, as well as ensemble techniques, need to natively fit the overall telecom architecture. Importantly, consumer-facing decisions taken by AI need to be compliant with various consumer-facing policies around the world, such as Article 13 of Europe's General Data Protection Regulation (GDPR).
This requires the disclosure of any "meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject". As a result, novel paradigms such as Explainable AI will need to natively sit within 6G. Furthermore, AI will be used in the design process and not only operationally. We may not see it with 6G, but future networks will be designed by AI. We envisage a future where advanced AI/ML is able to scrape telco-related innovation from the Internet, translate it into code, self-validate that code, implement it into a softwarized infrastructure, test it on beta users and roll it out globally, all in a few minutes rather than decades. It could potentially be the underpinning technology for a next generation industry platform, industry 5.0. All of the above will underpin novel design and operational paradigms, leading to an unprecedented human-machine teaming that leverages the strengths of both. 5) User-Centric Networks: The network-centric design of 2G and 3G has been superseded by a device-centric design in 4G and 5G. 6G has the opportunity to be user (citizen)-centric, in that it is societally aware and technologically adaptable, so that important societal needs or Black Swan events can be dealt with more efficiently and effectively. A fundamental change is vital, as today's networking infrastructures have become too fragmented and heterogeneous to meaningfully support societal challenges; examples of these shortcomings were laid bare during the ongoing COVID-19 crisis: a massive shift of networking resources from corporate premises to private homes was needed but unattainable; privacy concerns over tracing apps emerged but could not be dispelled, since privacy was not fundamentally embedded into the infrastructure but rather provided through T&C's; and a significant increase in security breaches was reported by various agencies around the world. The telco eco-system has been notoriously complacent in communicating the impact of new technologies on health and well-being. As a result, each new generation is being greeted with doom-laden news, which is helpful neither to consumers nor to the industry. 6G has the potential to revert this by spending considerable time analysing the impact of the frequency bands to be used on human health and well-being, with the findings well communicated. 6G will have a profound impact on the overall innovation cycle and the skills landscape of telecoms, providing a phenomenal opportunity for growth. All of the above is illustrated through a high-level architecture in Fig. 5, with the challenges and opportunities summarized in Tab. III. Keeping this in mind, we now review the novel PHY features of 6G systems which will complement the newly designed radio access and core networks. We begin the section by discussing the current progress and future directions of modulation, waveform and coding techniques essential for the next generation air interface design. This is followed by a detailed discussion on multiple antenna techniques, spanning ultra massive MIMO systems, distributed antenna systems, intelligent surface-assisted communications and orbital angular momentum (OAM)-based systems. We then discuss the state-of-the-art in multiple access techniques complementing the multiple antenna techniques. Motivated by THz frequencies, we analyze the realistic possibilities in free-space optical communications. Following this, we provide a discussion on the PHY applications requiring AI and ML.
We conclude the section by discussing the current state of affairs and practical possibilities in vehicular communications. For space reasons, we do not present other important topics such as dynamic spectrum sharing, dual connectivity, full-duplex communication, as well as integrated access and backhaul. Readers can refer to [75]-[77] for a discussion on these topics. A. Modulation, Waveforms, and Codes 1) Multicarrier Techniques: Over the past decade, orthogonal frequency-division multiplexing (OFDM) has by far become the most dominant modulation format. It is applied in the downlink for both 4G and 5G, while the uplink uses either discrete Fourier transform (DFT)-precoded OFDM (for 4G, and optionally for 5G) or conventional OFDM (5G). OFDM's popularity is rooted in two factors: 1) Its well-known information-theoretic optimality for the maximization of system capacity over frequency selective channels. 2) Backward compatibility - OFDM was chosen as the modulation method for 4G, and as a result has also been employed in 5G. While the trend in 5G has been the unification of modulation formats to OFDM for its three major use cases by adapting the numerology and frame structure, we anticipate that the increased heterogeneity of applications in 6G will bring a much wider range of modulation formats - in particular, those which are suitable for the various edge cases of 6G systems, such as massive access from IOT devices and Tbps directional links. Having said this, for some 6G applications, OFDM may still be retained due to backward compatibility. Nonetheless, it has long been pointed out that OFDM has a number of drawbacks arising in non-ideal situations, which motivates further research into either modified multicarrier systems, or other alternatives. The three key challenges of OFDM are: 1) Sensitivity to frequency dispersion, 2) Reduction of spectral efficiency due to the cyclic prefix that combats delay dispersion effects, and 3) High peak-to-average power ratio (PAPR). All of these effects become more critical at mmWave and THz frequencies, since frequency dispersion increases due to the higher Doppler shifts and phase noise; however, combating its effects by increasing the subcarrier spacing would reduce spectral efficiency due to the cyclic prefix (contrary to popular opinion, delay spreads do not decrease significantly with carrier frequency [78]). In particular, interference between the subcarriers of different UEs inevitably reduces the performance of OFDM. High PAPR drives the requirement for highly linear power amplifiers (PAs) and high resolution data converters, e.g., analog-to-digital (ADC) and digital-to-analog (DAC) converters. This proves to be highly problematic, since PAs need to operate with large power backoffs, sacrificing their efficiency, and the energy consumption of ADCs/DACs becomes too high. The ADC/DAC resolution scales with bandwidth, making their design increasingly difficult and expensive. To this end, investigations into modulation techniques which strike the right balance between capacity optimality and ADC/DAC resolution are required, keeping in mind the maximum admissible complexity in the equalization process [79]. The equalization methods could also include reconfigurable analog structures [80]. In this line, a promising method is given by temporally oversampled zero-crossing modulation, where information is encoded in the temporal distance between two zero crossings [81, 82]. As shown in Tab. 
IV, a number of other modulation methods have been introduced, which can be classified into orthogonal, bi-orthogonal and non-orthogonal categories. Each of these methods aims to fulfill a subset of the following three goals: 1) Enable a critically-sampled lattice, such that the symbols are centered in the time-frequency plane, leading to high spectral efficiency; 2) Achieve orthogonality in the complex domain to facilitate simple demodulation; and 3) Have pulses that are well-localized in the time-frequency plane. As shown by the Balian-Low theorem in Fourier analysis, the three conditions cannot be fulfilled at the same time. Tab. IV summarizes the various trade-offs and offers a means of comparing the capability of each method. Besides classical OFDM, other orthogonal techniques include null suffix OFDM, filtered multitone (FMT) [83], universal filtered multicarrier (UFMC) [84], lattice OFDM and staggered multitone filter bank multicarrier (FBMC) [85]. Bi-orthogonal methods include cyclic prefix OFDM, windowed OFDM [86], and bi-orthogonal frequency-division multiplexing (FDM) [87]. Non-orthogonal schemes, which need to eliminate inter-symbol interference via more complex receivers, include generalized FDM (GFDM) [88] and faster-than-Nyquist signalling [89]. In contrast to the above, a recently developed alternative is known as orthogonal time frequency space (OTFS) modulation [90]. OTFS performs quadrature amplitude modulation (QAM) not in the time-frequency domain, but rather in the delay-Doppler domain. This allows frequency dispersion to be exploited as a source of diversity. Furthermore, OTFS allows for a much more flexible and efficient multiplexing of UEs with different power delay profiles and Doppler spectra. While real-time prototypes for OTFS already exist, further investigations of efficient equalization architectures, in particular for multiple antenna systems, as well as other real-time implementation aspects, constitute an important research topic for the future. Yet another alternative to the above techniques is the use of non-coherent or differentially coherent detection. While non-coherent multiple antenna systems have been explored since the early 2000s following the seminal work of Hochwald and Marzetta [91], recent efforts have been devoted to developing suitable detection methods for systems with large antenna arrays. Nonetheless, further research into optimizing the trade-off between complexity and performance is required (see e.g., [92] and references therein). Since 6G is expected to provide a unified framework for even more diverse applications than 5G, modulation techniques that require extremely low energy consumption, such as those for massive connectivity via the IOT or the internet-of-bio-things, deserve further attention. Often in such applications, a remote "node" operates on energy harvesting, or must survive for years on a single battery charge. While theoretical investigations have shown flash signalling to be optimal [93], it is practically infeasible, since it requires high PAPR and precise synchronization. For such applications, new modulation methods that minimize the total energy consumption of the transmitter (for the uplink) and receiver (for the downlink) are required. Research in this line will include clock-free receivers, since the clock and clock distribution can constitute a significant "floor" in the overall energy consumption. 
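To make the delay-Doppler idea behind OTFS more concrete, the following minimal numpy sketch (our own illustration; the grid sizes and QPSK symbols are arbitrary and not tied to any standard numerology) maps symbols from a delay-Doppler grid to the time-frequency grid via an inverse symplectic finite Fourier transform (ISFFT) and verifies that the inverse transform recovers them, which is the core pre-processing step of OTFS before a conventional multicarrier modulator.

```python
import numpy as np

# Minimal OTFS mapping sketch: delay-Doppler <-> time-frequency via (I)SFFT.
# Grid sizes and symbols are illustrative, not tied to any standard numerology.
M, N = 32, 16                      # M delay bins (subcarriers), N Doppler bins (time slots)
rng = np.random.default_rng(0)

# Random QPSK symbols placed on the delay-Doppler grid
bits = rng.integers(0, 2, size=(M, N, 2))
x_dd = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

# ISFFT: delay-Doppler -> time-frequency (DFT along delay, IDFT along Doppler, unitary scaling)
X_tf = np.fft.fft(np.fft.ifft(x_dd, axis=1), axis=0) / np.sqrt(M / N)

# Each time-frequency column would then be sent with a multicarrier modulator (e.g., OFDM).
# Here we only verify that the inverse transform recovers the delay-Doppler symbols.
x_dd_hat = np.fft.ifft(np.fft.fft(X_tf, axis=1), axis=0) * np.sqrt(M / N)
print("max reconstruction error:", np.max(np.abs(x_dd - x_dd_hat)))
```

In a complete transceiver, the time-frequency samples would subsequently be passed through an OFDM-style modulator, and equalization would be performed back in the delay-Doppler domain, where the channel appears sparse and quasi-static even under high mobility.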
Returning to low-energy design, even a "sleep mode" requires significant energy if the clock needs to run to determine when the device should "wake up". Here, related ideas from molecular communications may prove to be useful [94], as may the use of passive backscatter communication [95], which helps improve the energy efficiency of devices. A major challenge for all modulation formats is the acquisition of channel state information (CSI) at the receiver and at the transmitter. In 5G systems, pilot signals are transmitted, which are then used for channel estimation at the receiver. For CSI acquisition at the transmitter, the system relies on either reciprocity or feedback from the receiver [96]. However, the pilot overhead can become prohibitive, particularly in systems employing a massive number of transmit and/or receive antennas. Hence, for 6G, better CSI acquisition schemes need to be determined. Promising research efforts in this area include adaptation of the pilot signal spacing in the time and frequency domains, exploitation of the limited angular spread of the channel [97, 98], and advanced signal processing methods for the reduction of pilot contamination [99, 100]. A related problem is the quantization and feedback of CSI. While reciprocity calibration and spatial/temporal extrapolation have shown promise [101]-[103], system-level imperfections limit the achievable gains, and further research needs to be conducted. Here, ML applications are worthy of exploration [104, 105]. 2) Advances in Coding: In addition to novel modulation and waveforms, new codes also need to be designed. This is particularly the case for applications which require short packets, such as in IOT systems. Low-density parity-check (LDPC) codes and polar codes with short block lengths have been employed in 5G systems for use in the traffic and control uplink/downlink channels [106]. The information-theoretic basis of the achievable packet error rate as a function of block length has been established in [107]. On the one hand, codes with short block lengths are less reliable, such that error-free transmission cannot easily be guaranteed [108]. An increase in the error probability may increase the need for automatic repeat request (ARQ) re-transmissions, which may not be suitable for time sensitive applications requiring ultra low latencies. On the other hand, codes with longer block lengths imply increased latency. To this end, the interplay between the minimum required block length and robustness against transmission errors needs to be optimized, keeping in mind the 6G KPIs listed in Tab. I. Furthermore, low energy applications are often not well suited to ARQ, since this requires leaving the device in a non-sleep mode for an extended period of time, leading to an increase in energy consumption. New coding strategies should encompass both forward error correction and novel iterative re-transmission/feedback mechanisms [109], as well as ML-based methods [110]. B. Multiple Antenna Techniques 1) Ultra Massive MIMO Systems: The use of large antenna arrays has been one of the defining features of 5G systems. We foresee this trend continuing in 6G systems, where the number of antenna elements will be scaled up by a further order-of-magnitude. The fundamental advantages of large antenna arrays have been discussed in overview papers for the past seven years [1], [112]-[115]. A number of new research topics have emerged for study, which could prove valuable for 6G research. 
Firstly, the question of the optimal beamforming architecture arises. For 5G deployments within the C-band, i.e., around 3.4-3.8 GHz, digital beamforming remains the choice of interest [116], due to its ability to provide a higher beamforming gain while utilizing the channel's spatial degrees-of-freedom [113]. In sharp contrast, most current commercial deployments at mmWave frequencies, i.e., around 24.5-29.5 GHz, use analog beamforming to explicitly steer the array gain in desired directions [116]. This is because digital beamforming at mmWave frequencies incurs high circuit complexity, energy consumption and cost of operation. Having said this, recent progress in high frequency electronics has facilitated digital beamforming for 64 antennas at 28 GHz, as shown in [117, 118]. In the future, closer investigations of fully digital implementations at mmWave frequencies are merited [119]. In addition, the compromise solution of hybrid beamforming, striking the right balance between processing in the analog and digital domains, has also received considerable attention [120]-[123]. In Sec. VII-C, we provide a more detailed discussion of the real-time processing and transceiver design trade-offs. Secondly, the impact of electrically ultra massive arrays arises as an important research direction. Most current massive MIMO implementations have limited electrical dimensions: for instance, a 256 element array might extend at most 8 wavelengths in one direction. However, as the dimensions increase even further, effects such as wavefront curvature due to scattering in the near-field of the array, shadowing differences across different parts of the array [124], and beam squinting due to the non-negligible run time of the signal across the array [125], start to become much more pronounced. All of these physical artifacts need to be taken into account in the design and implementation of beamforming architectures and signal processing algorithms at the transmitter and receiver. Algorithms which provide the right balance between run-time complexity, ease of real-time implementation and optimality in performance need to be investigated, such as spatial modulation - a lower complexity alternative to traditional multiple antenna methods, where the antenna indices are used to communicate part of the coded symbol [126]. More recently, various aspects of such methods have been investigated for channel estimation [127], differential implementation [128] and hybrid methods [129]. Spatial modulation has also drawn interest at very high frequencies, such as those used for visible light communication (VLC) [130, 131] (see also Sec. V-D). Progress in distributed antenna systems has also been tremendous during the past five years, see e.g., [132]-[139] and references therein. The concept of cell-free massive MIMO has been pointed out as a promising way to realize distributed antenna systems below 6 GHz which can scale to large physical areas. While the spectral and energy efficiency improvements brought by such systems are now well understood in theory, it remains to be seen whether the promised theoretical gains can be retained in practice for realistic scenarios with distances spanning up to hundreds of meters and variations in UE/scatterer mobility (see also Sec. VI-B). 
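As a rough, self-contained illustration of the co-located versus distributed trade-off discussed above, the toy Monte-Carlo sketch below computes uplink per-UE spectral efficiencies with maximum-ratio combining and perfect CSI; the log-distance path-loss model, transmit power, noise level and topology are assumptions made purely for illustration and are not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(1)

def per_ue_se(ap_xy, n_ant_per_ap, K=8, p=0.1, noise=1e-13, area=500.0, trials=200):
    """Uplink per-UE spectral efficiency with maximum-ratio combining and perfect CSI.
    ap_xy: (A, 2) access-point locations; n_ant_per_ap: antennas per AP.
    Path loss: assumed log-distance model (exponent 3.7) with Rayleigh small-scale fading."""
    A = ap_xy.shape[0]
    M = A * n_ant_per_ap
    se = []
    for _ in range(trials):
        ue_xy = rng.uniform(0, area, size=(K, 2))
        d = np.linalg.norm(ap_xy[:, None, :] - ue_xy[None, :, :], axis=2)   # (A, K)
        beta = 1e-3 * np.maximum(d, 1.0) ** (-3.7)                          # (A, K)
        beta = np.repeat(beta, n_ant_per_ap, axis=0)                        # (M, K)
        h = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
        g = np.sqrt(beta) * h                                               # channel matrix (M, K)
        gram = g.conj().T @ g                                               # (K, K)
        sig = p * np.abs(np.diag(gram)) ** 2
        intf = p * (np.sum(np.abs(gram) ** 2, axis=1) - np.abs(np.diag(gram)) ** 2)
        sinr = sig / (intf + noise * np.real(np.diag(gram)))
        se.append(np.log2(1 + sinr))
    return np.mean(se)

area = 500.0
# 64 antennas in total: one co-located array vs. 16 distributed APs with 4 antennas each
colocated = np.array([[area / 2, area / 2]])
grid = np.linspace(area / 8, 7 * area / 8, 4)
distributed = np.array([[x, y] for x in grid for y in grid])

print("co-located  per-UE SE [bit/s/Hz]:", round(per_ue_se(colocated, 64), 2))
print("distributed per-UE SE [bit/s/Hz]:", round(per_ue_se(distributed, 4), 2))
```

Under these assumptions the distributed layout tends to yield higher per-UE spectral efficiency, because the distance between a UE and its nearest antennas shrinks; whether such gains survive synchronization errors, fronthaul limits and imperfect CSI is exactly the open question raised above.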
Since UEs can communicate with multiple access points at the same time, a major research challenge for real-time implementations is to maintain synchronization between the many distributed access points, the UEs and the central processing unit. 2) Intelligent Surface-Assisted Communications: Another important development is that of large intelligent surfaces (LISs) [140, 141], which aim to have large physical apertures that are electromagnetically active. The surface can be seen as an ultra massive MIMO array (as described above) capable of performing fully digital processing [140, 142]. In the literature, many names for the same concept exist, such as re-configurable intelligent surfaces and holographic beamforming [142]. The inception of LISs led to the development of intelligent reflecting surfaces (IRSs) [24], [142]-[145], which are designed to quasi-passively reflect the incoming signals to an adaptable set of outgoing directions via tunable phase shifters, without any active down/up-conversion. A large number of papers are now appearing on both LISs and IRSs (see e.g., [21, 140-144, 146, 147] and references therein). In particular, for IRSs, a number of questions need further research, such as real-time steering and control of reflections, interference minimization and energy consumption optimization. Among the challenges whose solutions need to be researched are: • Advantages and drawbacks relative to active relays, as well as to non-reconfigurable passive reflector structures. A joint communications, electromagnetics and operational expenditure analysis needs to be carried out, which considers not only the enhancement of coverage (which is frequency-dependent), but also circuit-level implications on performance, the electromagnetic behavior of the array, and the actual cost of such deployments, such as the renting of space and ongoing maintenance. From this, guidelines for efficient deployment can be developed. • A detailed assessment of the reliability of such structures needs to be carried out, which includes an analysis of the impact of possible "pixel failures", i.e., elements in the array that do not operate due to cracks and/or other environmental factors, such as large variations in temperature, rain and wind. Potential solutions also need to take into account the impact of beam misalignment due to these factors. • The control protocols implementing efficient signalling between the BS and the surface, as well as between the UEs and the surface, need to be investigated in detail. In this line, important questions need to be answered, such as: how will the surface response be maintained under massive changes in radio traffic conditions due to, e.g., handovers? How will insertion of the surface influence the design of core networks? Furthermore, novel algorithms for re-calibration on-the-fly need to be developed, or the IRSs need to be designed a-priori to work without any calibration, i.e., purely based on online pilot tones. • The backplane complexity of the surface relative to its aperture also needs to be investigated in detail. Besides conventional spatial multiplexing, which is the fabric of existing multiple antenna systems, OAM [148] is an alternative spatial multiplexing method that has shown great potential for 6G systems. This technique imposes "twists" on the phases of the propagating beams, such that modes with different amounts of twist are orthogonal to each other. 
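Before discussing OAM further, the central mechanism behind the IRS gains debated above - co-phasing many weak reflected paths toward a UE - can be illustrated in a few lines; the narrowband single-antenna setup, random channel draws and 2-bit phase quantization below are our own simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 256                                   # number of IRS elements

# Narrowband channels: BS -> IRS (h), IRS -> UE (g), plus a weak direct BS -> UE path d.
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
g = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
d = 0.1 * (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)

def effective_gain(theta):
    """|d + sum_n g_n * exp(j*theta_n) * h_n|^2 for IRS phase shifts theta."""
    return np.abs(d + np.sum(g * np.exp(1j * theta) * h)) ** 2

# Ideal (continuous) phases align every reflected term with the direct path.
theta_opt = np.angle(d) - np.angle(g * h)
# 2-bit phase quantization, since tunable elements only offer a few discrete states.
theta_2bit = np.round(theta_opt / (np.pi / 2)) * (np.pi / 2)

print("random phases   :", effective_gain(rng.uniform(0, 2 * np.pi, N)))
print("2-bit quantized :", effective_gain(theta_2bit))
print("ideal co-phasing:", effective_gain(theta_opt))
```

The gap between the ideal and quantized cases is one reason why control signalling, calibration and phase-shifter resolution feature so prominently in the open questions listed above.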
The OAM modes can be easily separated through analog means, such as spiral phase plates [149]. OAM is especially suitable for line-of-sight (LOS) propagation, such as in data centers and for wireless backhaul, but is limited in range, since its underlying multiplexing principle only works in the radiating near-field of the antenna. Investigations on how to make such systems robust to practical impairments such as multipath and misalignment of orientation are critical to increase their practical utility. While some preliminary work has been done in that direction, e.g., on multipath propagation [150] and turbulence [151], further work is required. Since OAM performs better with electrically large antennas, which become physically small at higher frequencies, it is better suited for high mmWave and THz systems, and in particular for free-space optics applications, see e.g., [11, 152]. Multiple access techniques require a re-think in 6G, especially due to the integration of massive connectivity and extremely low energy applications. Current systems use carrier sense multiple access (CSMA) and/or non-contention access methods, such as orthogonal time-frequency division multiple access for cellular systems. However, these multiple access schemes do not scale well to scenarios where thousands of devices or more aim to access a single BS, each with a low duty cycle. Current work in this regime concentrates on spread-spectrum type approaches, such as long range (LoRa) communication, which results in low spectral efficiencies. Hence, new structures that allow for better scaling and possibly further reduce latency need to be studied [153]. Another direction for future research is the improvement of multiple access in the traditional high spectral efficiency regime. Here, non-orthogonal multiple access (NOMA) was originally intended to be part of 5G systems [154]-[156], yet was left out of the early releases due to the rush to finish the specifications. Another promising approach is known as rate splitting (RS) [157, 158]. RS splits UE messages into common and private parts, and encodes the common parts into one or several common streams, while encoding the private parts into separate streams. The streams are precoded using the available (perfect or imperfect) CSI at the transmitter, superposed and transmitted. All the receivers then decode the common stream(s), perform successive interference cancellation and decode their private streams. Each receiver reconstructs its original message from the part of its message embedded in the common stream(s) and its intended private stream. The key benefit of RS relative to other techniques is its ability to flexibly manage interference by allowing it to be partially decoded and partially treated as noise. We anticipate possible simplified versions of NOMA or RS to be in contention for 6G systems. In addition, 6G research should concentrate on how to further improve the performance up to the theoretical limits, while taking into account practical constraints on precoding and the amount of available CSI. More generally, free-space optical communications have great promise for extremely high data rate communications over small-to-medium distances, as long as LOS can be guaranteed. While some operation is also possible in non-LOS (NLOS) situations, the achievable data rates, and the required modulation as well as signal processing structures, can be quite different. To this end, further studies are required to identify architectures that provide the right complexity-cost-performance trade-off. 
We can generally distinguish between laser-based and light emitting diode (LED)-based techniques. The latter (a.k.a. VLC or LiFi) is mostly intended to exploit LEDs that already exist as lighting sources to also transmit information [159]. Furthermore, the optical transmission is intended for the downlink, while the uplink needs to be provided by traditional radio links [130]. This raises interesting challenges in the integration with 6G cellular and 6G WiFi, which need much more attention. Furthermore, the adaptation to mobility constitutes an important challenge. Laser-based systems allow much higher data rates, yet, having small beamwidths, they are mainly suitable for fixed wireless scenarios. Furthermore, they are extremely sensitive to blockage of the LOS paths, since no multipath diversity is available. Modulation and detection methods that are suitable in environments with fast variations of channel conditions also require further investigation. A comprehensive survey of AI and ML applications for 5G and beyond is given in [160]. For PHY research, ML techniques are currently being explored for a variety of tasks. Firstly, they can be used for symbol detection and/or decoding. While de-modulation/decoding in the presence of Gaussian noise or interference by classical means has been studied for many decades [161], and optimal solutions are available in many cases, ML could be useful in scenarios where either the interference/noise situation does not conform to the assumptions of the optimal theory, or where the optimal solutions are too complex. Given the recent trend, 6G will likely utilize even shorter codewords than 5G (where asymptotic Shannon theory does not hold) together with low-resolution hardware (which inherently introduces non-linearities that are difficult to handle with classical methods). Here, ML could play a major role, from symbol detection [162], to precoding [163], to beam selection [164], and antenna selection [165]. ML is generally very well suited to these PHY techniques, due to the large amount of training data that can be generated with comparatively little effort, and due to the "labeled data" (ground truth) being readily available. Another promising area for ML is the estimation and prediction of propagation channels. Previous generations, including 5G, have mostly exploited CSI at the receiver, while CSI at the transmitter was mostly based on coarsely quantized feedback of received signal quality and/or beam directions. In systems with even larger numbers of antenna elements, wider bandwidths, and a higher degree of time variation, the performance loss of these techniques is non-negligible. Here, ML may be a promising approach to overcome such limitations, see e.g., [166]. In particular, questions related to the best-suited ML algorithms under given conditions, the required amount of training data, the transferability of parameters to different environments, and improvements in explainability will be the major topics of research in the foreseeable future. Modern vehicles are equipped with up to 200 sensors, requiring much higher data rates [167]. Vehicles may also be equipped with video cameras, infrared cameras, automotive radars, light detection and ranging systems, as well as global positioning systems. These sensors and additional devices provide an opportunity to collaborate and share information in order to facilitate accurate and safer automated driving, particularly in congested scenarios. 
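Returning briefly to the ML-for-PHY discussion above, the point that labeled training data is essentially free at the PHY can be illustrated with a deliberately simple learned detector; the 16-QAM alphabet, the cubic compression used as a stand-in for PA non-linearity, and the nearest-centroid classifier are all our own toy choices, not schemes from the cited references.

```python
import numpy as np

rng = np.random.default_rng(3)

# 16-QAM alphabet and labels
levels = np.array([-3, -1, 1, 3]) / np.sqrt(10)
alphabet = np.array([a + 1j * b for a in levels for b in levels])

def channel(sym, snr_db=18):
    """Toy non-ideal link: mild third-order (PA-style) compression plus AWGN (assumed model)."""
    x = sym - 0.1 * (np.abs(sym) ** 2) * sym
    n = rng.standard_normal(sym.shape) + 1j * rng.standard_normal(sym.shape)
    return x + n * np.sqrt(10 ** (-snr_db / 10) / 2)

# Labeled training data is generated "for free" by transmitting known symbols.
train_labels = rng.integers(0, 16, size=20000)
train_rx = channel(alphabet[train_labels])
centroids = np.array([train_rx[train_labels == k].mean() for k in range(16)])

# Detection: nearest learned centroid vs. nearest ideal constellation point.
test_labels = rng.integers(0, 16, size=20000)
test_rx = channel(alphabet[test_labels])
det_learned = np.argmin(np.abs(test_rx[:, None] - centroids[None, :]), axis=1)
det_naive = np.argmin(np.abs(test_rx[:, None] - alphabet[None, :]), axis=1)

print("SER, nearest learned centroid:", np.mean(det_learned != test_labels))
print("SER, ideal constellation     :", np.mean(det_naive != test_labels))
```

Even this trivial learner outperforms detection against the ideal constellation once the hardware distortion biases the received points, which is the same motivation that drives the more sophisticated learned detectors cited above.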
Returning to vehicular communications, the raw aggregate data rate from the above sensors could be up to 1 Gbps, which is well beyond the capability of dedicated short range communication (DSRC) - the current protocol for connected vehicles [46]. Moving forward, we see the utilization of bands below 6 GHz for high reliability, and mmWave bands to achieve Gbps data rates [168, 169]. Fundamentally, some important research challenges that need attention are: 1) The lack of accurate wave propagation models; 2) Assessment of the impact of cars through car penetration loss and antenna arrangements; and 3) The lack of accurate modeling of channel non-stationarities. On the network side, we predict that the current 5G network architecture will not meet the latency needs of reliable autonomous driving until MEC is fully integrated. Besides the channel and network aspects, for V2X scenarios, a large number of PHY-related questions need to be investigated. In particular, the processing of the sensor data, including sensor fusion, will become a major bottleneck due to the combination of large amounts of data and tight processing deadlines. The optimal trade-offs between processing at the point of origin, at the BS (if involved), and at the end-point need to be determined, taking into account their relationship with a given level of traffic density, the amount of available infrastructure, as well as the real-time computational capabilities of the involved cars. We expect that much of this information fusion will occur via ML algorithms. With the high mobility of cars and blockages by intervening vehicles, beam management is another aspect which needs much more research. In particular, the beam adjustment mechanisms designed for 5G are often too slow in adapting to vehicular scenarios, calling for new methods. For V2X/V2I systems, the fast association/disassociation with the various road-side units may require a distributed antenna deployment (discussed further in Sec. VI-B from a propagation aspect), and its implications on the PHY need to be studied. Importantly, even if all of these research challenges can be addressed, it must be noted that the large number of old cars on the roads will limit the true gains of V2X/V2I systems until the late 2030s, when the majority of cars may have V2X/V2I capability. Combinations of DSRC, long-term evolution (LTE), cellular V2X and mmWaves offer a unique opportunity to simultaneously improve the reliability, data rates and intelligence of vehicular networks [63]. Since 6G services are expected to be planned over an extremely wide range of frequencies, we now review the propagation characteristics of the channels over which 6G systems will operate. The performance of 6G systems will ultimately be limited by the propagation channels they operate over. It is thus of vital importance to investigate the propagation characteristics relevant for 6G systems, in particular those that have not already been explored for earlier-generation systems. This section provides an overview of wave propagation mechanisms for sub-6 GHz, mmWave and THz frequencies. Across these frequencies, we characterize ultra massive MIMO channels, distributed antenna channels, V2V and V2I channels, industrial channels, UAV channels, and wearable channels. Other important topics such as full-duplex channels and device-to-device channels are omitted due to space reasons. Interested readers can refer to [1, 114], as well as references therein, for a more comprehensive overview. 
Moving to new frequency bands usually entails determination of the fundamental propagation processes. As has long been pointed out in wireless textbooks, using constant-gain antennas, the free-space pathloss increases as f^2, where f is the carrier frequency, whereas it decreases as f^2 when constant-area antennas are used at both link ends [170, 171]. As such, for a given form-factor, highly directional antennas can provide low free-space pathloss. This has driven the need for massive MIMO arrays at mmWave frequencies and ultra massive MIMO arrays at THz frequencies. In the mmWave bands, the atmosphere can become absorbing (depending on f), attenuating the received signal by a factor of exp(α_atm d), where d is the distance between the BS and UE. The attenuation coefficient, α_atm, is a function of f, as well as the atmospheric conditions, such as fog, rain, etc. [172]. As depicted in Fig. 4, atmospheric attenuation in the THz bands is much higher than in the mmWave bands. Notably, the only strong attenuation below 100 GHz is the oxygen line at 60 GHz, giving rise to a loss of approximately 10 dB/km; from 100-1000 GHz, multiple attenuation peaks exist that can exceed 100 dB/km. The physical origin of this absorption - a.k.a. molecular absorption - is that electromagnetic waves of specific frequencies excite air molecules, causing internal vibration, during which part of the energy driving the propagating wave is converted to kinetic energy and lost [58]. The authors in [58] provide several methods to predict molecular absorption coefficients, while a standardized model to predict attenuation by gases is presented in [61], where only absorption by oxygen and water vapour molecules is considered. The model in [58] is more general, as it allows the absorption to be computed using the more precise composition of the medium through which the waves propagate. For example, in the case of an office environment, the air consists of 78.1% nitrogen, 20.9% oxygen and 0.1-10% water vapour. The above discussion shows that band selection must be carefully aligned with the anticipated link distance between the BS and UEs. As explained by the Fresnel principle, the efficiency of diffraction is greatly reduced at mmWave, and even more so at THz frequencies, such that common objects cast sharp shadows [114]. On the other hand, diffuse scattering becomes highly relevant, since the roughness of surfaces (in terms of wavelengths) becomes considerable [173, 174]. At lower frequencies, it is common to assume that a plane wave incident on a rough surface results in a specularly reflected wave plus diffuse components scattered uniformly into all directions; at THz bands, there is a general lack of measurement-based validation of this concept [175, 176]. It is speculated that the amplitude of scattered paths may not be large enough to significantly contribute to the impulse response - an effect that is also observable at mmWave frequencies. Furthermore, attenuation by vegetation, as well as penetration losses in outdoor-to-indoor propagation, increase dramatically at mmWave frequencies [177, 178]. Several studies have been conducted to better understand the material dependence of propagation characteristics for bands below 100 GHz, see e.g., [179, 180]. However, relatively few such studies exist for the THz bands, where a full assessment of the reflection, transmission, and scattering coefficients of many building materials has been carried out in only a few papers [9, 181]. 
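To attach rough numbers to the frequency scaling and molecular absorption described above, the sketch below combines the free-space path loss between constant-gain antennas with an assumed specific attenuation per band; the attenuation values are coarse placeholders in the spirit of Fig. 4 and the models of [58, 61], not figures taken from this paper.

```python
import numpy as np

C = 3e8  # speed of light [m/s]

def fspl_db(f_hz, d_m):
    """Free-space path loss between isotropic (constant-gain) antennas."""
    return 20 * np.log10(4 * np.pi * d_m * f_hz / C)

def link_loss_db(f_hz, d_m, alpha_db_per_km):
    """FSPL plus an assumed atmospheric specific attenuation alpha [dB/km]."""
    return fspl_db(f_hz, d_m) + alpha_db_per_km * d_m / 1e3

# Illustrative carrier frequencies and rough specific-attenuation placeholders
cases = [
    (3.5e9,   0.01),   # sub-6 GHz
    (28e9,    0.1),    # mmWave
    (60e9,   10.0),    # oxygen absorption line (~10 dB/km, as noted in the text)
    (300e9, 100.0),    # a strongly absorbing THz region
]

for f, alpha in cases:
    for d in (10, 100, 1000):
        print(f"f = {f/1e9:6.1f} GHz, d = {d:5d} m: "
              f"loss = {link_loss_db(f, d, alpha):6.1f} dB")
```

The output illustrates the point made above: the distance-proportional absorption term is negligible for short links but becomes comparable to the spreading loss at km-scale distances in the strongly absorbing THz regions, so band selection has to be matched to the intended link distance.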
Specular reflections at a dielectric half-space (most commonly ground reflections) are frequency dependent so long as the dielectric constant is frequency dependent, while reflection at the dielectric layer, such as a building wall, depends on the electrical thickness of the wall, and thus on frequency. Having said this, it is not clear whether reflection coefficients increase or decrease with frequency. Conversely, power transmitted through objects decreases almost uniformly with frequency due to the presence of the skin effect in lossy media [114] . Last but not the least, Doppler shifts scale linearly with frequency, while the first Fresnel zone decreases with square root of the wavelength. For realistic simulations, all of these physical effects need to be incorporated into ray tracers and statistical models. Accurately accounting for the physical environmental features is a major challenge for ray tracing, as well as obtaining sufficiently high-resolution databases of the terrain. It is known already that standard databases only offer resolution on the order of a few meters, and thus do not show effects such as critical transitions from smooth windows to rough stucco material, for example. Keeping in mind the above discussions, there exist several standardized and non-standardized models for impulse response generation in the mmWave bands, see e.g., [1, 114, 178, [182] [183] [184] [185] [186] for a detailed taxonomy and model parameters. However, the same can not be said for THz bands due to largely unknown measured characteristics. There exist some recent investigations by the authors in [9, 181, [187] [188] [189] around 140 GHz and in the 275-325 GHz band, from which finite multipath component (MPC) models of the THz channels are derived. Notably, the authors in [181] propose a relatively detailed hybrid model for indoor channels combining spatial, temporal and frequency domains with parameters from 140-150 GHz and 275-325 GHz, respectively. Going forward, a lot more work is required to improve the lack of models for both high mmWaves and THz channels, with the main challenges being as follows: 1) Design and construction of suitable measurement equipment: even for mmWave channels, the construction of channel sounders with high directional resolution, large bandwidth, and high phase stability is very difficult, expensive and time consuming; the lack of available phased arrays and the low output power beyond 200 GHz make measurements even more difficult at those frequencies. Significant effort by the wave propagation community will be required to be able to perform large-scale measurements of static and dynamic channels. 2) Most current channel models are for very specific indoor scenarios, and the presence of a larger variety of environments, as well as different objects in the surroundings will require a mixed deterministic-stochastic modelling approach [181] . In order to characterize the stochastic part of the model, extensive measurements are required, which are currently missing, pointing to the large open gaps at THz frequencies. A summary of the key THz propagation characteristics and its impact on THz systems, as well as a comparison relative to lower bands is depicted in Tab. V. 6G systems will significantly evolve distributed BSs, in the form of either enhanced cloud RAN systems, coordinated multipoint transmission (CoMP, a.k.a. cooperative multipoint) or cell-free massive MIMO systems. 
As it currently stands, the majority of such deployments will be carried out in bands below 6 GHz. However, in order to complement high reliability with high data rates, we foresee the use of mmWave bands, where few investigations exist. For multiuser scenarios in either of the two bands, the joint channel conditions for multiple UEs need to be characterized. A greater challenge is the modelling of links from a single UE to multiple BSs. Much of the earlier work has concentrated on the correlation of shadowing between different links. More recent measurement campaigns have quantified the correlation of parameters such as angular spreads, delay spreads, and mean directions [190]. Typically, it is found that significant link correlation can exist even if the BSs are far away from each other; positive correlation can be found when the BSs are in the same direction from the UE. The correlation between BSs can be modeled through the concept of common clusters, i.e., clusters that interact with MPCs from different UEs, as shown in [191]. For instance, if these clusters are shadowed, the net received power, as well as the angular and temporal dispersion of multiple UEs, are affected simultaneously. This concept has been adopted in the design of the European Cooperation in Science and Technology (COST) 2100 channel model. From a measurement standpoint, several distributed massive MIMO channel measurement systems exist, such as the re-configurable setup at Katholieke Universiteit Leuven, Belgium, which supports 64 BS antennas designed for the 2.4-2.62 GHz and 3.4-3.6 GHz bands [192]. Another setup exists at Bristol University, UK, where a 128 element BS can be distributed at 2.6 GHz to serve 16 distributed single antenna UEs [193]. The authors at the Austrian Institute of Technology present an alternative system at 2.6 GHz, where 32 antennas are distributed in two groups of 16 antennas, drawing conclusions on the optimal placement of the arrays [194]. With the maturity of co-located and/or distributed massive MIMO systems, along with the emergence of LISs and IRSs, the number of radiating elements is foreseen to increase beyond what is conventional today [56, 62, 68], [140]-[142]. Ultra massive MIMO arrays are primarily envisioned to operate in the high mmWave and/or THz frequency bands, where potentially thousands of antenna elements can be integrated into small form factors [56, 62, 68]. The authors of [56, 62, 68] provide a taxonomy of ultra massive MIMO operation at THz frequencies using the array-of-subarrays concept. Since antenna arrays at high mmWave and/or THz bands become physically small, from a propagation viewpoint they do not provide additional insights beyond those already described in the mmWave and THz propagation section, i.e., Sec. VI-A. In contrast, bands below 6 GHz also provide interesting research opportunities for ultra massive MIMO channels [140, 141, 144, 195, 196] - though the deployment of such large arrays at these frequencies is challenging. As the number of antenna elements is increased, the total physical aperture of the radiating elements also increases. As this happens, conventional propagation theories and results exploiting the plane wave assumption start to break down. Fundamentally, the Fraunhofer distance, denoted by d_f, is given by d_f = 2D^2/λ, where D is the maximum dimension of the array and λ denotes the wavelength. 
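A quick numerical check of this distance for a few assumed aperture sizes and carrier frequencies shows how easily typical BS-UE distances fall inside the radiating near-field as arrays grow:

```python
# Fraunhofer distance d_f = 2 D^2 / lambda for a few assumed array sizes.
C = 3e8  # speed of light [m/s]

def fraunhofer_m(aperture_m, f_hz):
    lam = C / f_hz
    return 2 * aperture_m ** 2 / lam

cases = [
    (0.69, 3.5e9),   # an ~8-wavelength aperture at 3.5 GHz (e.g., a 256-element panel)
    (7.4,  2.6e9),   # a 7.4 m aperture at 2.6 GHz (similar to the virtual arrays discussed below)
    (0.30, 28e9),    # a 30 cm mmWave panel
    (0.03, 300e9),   # a 3 cm THz array
]

for D, f in cases:
    print(f"D = {D:5.2f} m, f = {f/1e9:6.1f} GHz -> d_f ~ {fraunhofer_m(D, f):8.1f} m")
```

For the physically large sub-6 GHz apertures, d_f reaches hundreds of meters, which is why the near-field effects described next cannot be ignored.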
An increasing D with a fixed λ implies that the UEs, as well as the scatterers, are increasingly likely to be within the Fresnel zone of the antennas - the zone which corresponds to the radiating near-field. This has some fundamental consequences on the overall propagation behavior. Firstly, spatial non-stationarities in the channel impulse responses start to appear over the size of the array, where different parts of the array "see" (partially) unique sets of scatterers and UEs [196]-[202]. As a consequence, the effects of wavefront curvature start to vary not only the phases of the MPCs, but also their amplitudes across the array. To this end, channel hardening and favorable propagation - two pillars of massive MIMO channels - start to lose their effect, leading to increased variability in the channel statistics. Secondly, any propagation model to/from ultra massive MIMO arrays needs to be directly linked to the physics of near-field propagation in order to compute the near-field channel impulse response. A detailed procedure to generate such a response is given in [140, 141, 144]. Several measurement-based studies have demonstrated the above effects quantitatively, see e.g., [195]-[197], [203]. The authors of [197, 203] show the effects of spatial non-stationarities from a 128 element virtual linear array (movement of a single element along a horizontal track) in outdoor environments at 2.6 GHz over a 50 MHz bandwidth. The array spanned 7.4 m with half-wavelength spacing between successive element positions, and was serving a single UE in LOS or NLOS propagation. The authors in [195, 196] report a similar measurement-based analysis of ultra massive MIMO channels, where a geometrical model is discussed to capture the effects of spatial non-stationarities. The discussed model is based on the massive MIMO extension of the COST 2100 model, which includes the concept of dynamic cluster appearance and disappearance, unique to both link ends, via separable scatterer visibility regions [204]. In a similar line, a discussion of the implications of IRSs is presented in [144], where the implications of large-scale fading variability are characterized from first principles. From a measurement perspective, the major limitation in characterizing propagation channels of such large dimensions is the extended measurement run time (true for switched and/or virtual arrays), during which the channel is assumed to remain quasi-static. Typically, it is expected that one measurement will take on the order of tens of minutes or longer (depending on the measurement bandwidth), limiting the potential measurement scenarios. Fully parallel measurements are not foreseen due to the high cost of up/down-conversion chains and the net energy consumption. Tremendous progress has been made in understanding the nature of wave propagation in industrial environments at both sub-6 GHz and mmWave frequencies (see e.g., [65], [205]-[207] for a taxonomy). Naturally, the typical industrial environment is unlike residential or other indoor environments, since the levels of mechanical and electrical noise, as well as interference, are high due to the broad operating temperatures, heavy machinery and ignition systems [65], [205]-[208]. 
Generally, industrial buildings are taller than ordinary office buildings and are sectioned into several working areas, between which there usually exist straight aisles for the transportation of materials or for human traffic. Modern factories usually have perimeter walls made of precast concrete or steel. The ceilings are often supported by metal trusses. Most industrial buildings have concrete floors that can support vehicles and heavy machinery. The object type, size, density, and distribution vary significantly across different environments, playing an important role in characterizing the channel [205]. The presence of random/periodic movements of workers, automated guided vehicles (AGVs) in the form of robots or trucks, overhead cranes, suspended equipment, or other objects will cause time-varying channel conditions. An example of a typical semiconductor plant environment at Robert Bosch in Reutlingen, Germany, is shown in Fig. 6. Here, one can observe an AGV interacting with a factory worker in a relatively narrow, straight pathway with semiconductor fabrication machinery on either side. The exterior surfaces of the machinery are smooth and metallic, made from polished steel. Low-level transport structures attached to the ceilings can also be observed just above the AGV. The arrowed labels A, B, C, and D denote positions where narrow corridors exist between tall machinery. Such corridors can introduce sharp shadows in the received signal which are difficult to overcome, particularly at mmWave frequencies. In order to achieve ultra high reliability, overcoming the shadowing effects by distributing the placement of transmit antennas seems to be crucial. Outside of such narrow corridors, one would expect the propagation channel to be dominated by a rich mixture of specular reflections, diffuse scattering and LOS propagation. A number of propagation measurements and models in various industrial settings have been reported. The authors in [207] characterize the large-scale parameters of the industrial channel at 2.37 and 5.4 GHz at the Siemens factory in Nuremberg, Germany. In LOS and NLOS conditions, the shadow fading decorrelation distances were approximately 15 m and 30 m, respectively - much larger than the corresponding values of 6 m and 10 m in the standardized 3GPP model [63]. The azimuth and elevation angle-of-departure (AOD) and angle-of-arrival (AOA) spreads did not show much difference relative to the 3GPP model. The study in [209] proposes a double-directional model with parameters tailored to 5 GHz from measured data. A detailed comparison between propagation characteristics at 3.7 GHz and 28 GHz over a bandwidth of 2 GHz is presented in [206], where LOS and NLOS pathloss exponents different to those seen in [207] are reported due to the environmental differences. No substantial difference in the delay spread is seen across the two bands. At 28 GHz, AOA information was extracted, and angular power profiles and RMS angular spreads were evaluated, showing an almost uniform AOA distribution across 360° in NLOS conditions. The characterized parameters agree with those standardized by the 3GPP. Many further investigations are required to understand the time-varying nature of industrial channels at both sub-6 GHz and mmWave frequencies, where not many results exist. For further discussions, the reader is referred to [63, 65], [205]-[209]. 
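The decorrelation distances quoted above can be made tangible with a short sketch that draws spatially correlated shadow fading along a measurement route using the common exponential correlation model; the 8 dB standard deviation, 1 m sampling and route length are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def shadow_fading_routes(route_m=400.0, step_m=1.0, sigma_db=8.0,
                         d_corr_m=30.0, n_routes=200):
    """Spatially correlated log-normal shadow fading along straight routes.
    Correlation model: R(d) = sigma^2 * exp(-d / d_corr) (exponential / Gudmundson-type)."""
    x = np.arange(0.0, route_m, step_m)
    dist = np.abs(x[:, None] - x[None, :])
    cov = sigma_db ** 2 * np.exp(-dist / d_corr_m)
    chol = np.linalg.cholesky(cov + 1e-9 * np.eye(len(x)))
    return x, chol @ rng.standard_normal((len(x), n_routes))

for d_corr in (6.0, 30.0):   # 3GPP-like value vs. the larger value reported in [207]
    x, sf = shadow_fading_routes(d_corr_m=d_corr)
    lag = 10   # 10 samples = 10 m at 1 m spacing
    emp = np.corrcoef(sf[:-lag].ravel(), sf[lag:].ravel())[0, 1]
    print(f"d_corr = {d_corr:4.1f} m: correlation at 10 m lag ~ {emp:.2f} "
          f"(model value: {np.exp(-lag / d_corr):.2f})")
```

Larger decorrelation distances, as reported for the industrial environment in [207], mean that a UE stays shadowed over longer stretches of its route, which strengthens the argument made above for distributing the transmit antennas.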
UAVs include small drones flying below the regular airspace -low altitude platforms, drones in the regular airspace and high altitude platforms in the stratosphere. Depending on how and where they are operated, the channel properties naturally differ [210] . In all cases, one should distinguish the Air-to-Ground (AG) channel and the Air-to-Air (AA) channel. There are a number of recent survey papers for UAV operation below 6 GHz at low altitudes, see e.g., [210] [211] [212] . Typically, the AA channel behaves as a free-space channel with very limited scattering and fading [210] . Given proper alignment, the use of higher frequencies and even free-space optics are well supported [213] . For the AG channel, there is typically more scattering in general, especially at lower frequencies. Often, reflection at the dielectric half-space is strong, giving rise to a two-path fluctuating behavior of the channel. For ground stations located close to the ground level, shadow fading arises as a major limitation, especially at mmWave and above frequencies [214] . Small-scale fading in AG channels usually follows the Ricean distribution with K-factors in excess of 12 dB. The AG channel can exhibit significant rates of change, with higher order Doppler shifts. In addition to the path loss, the airframe of the UAV can introduce significant shadowing, when the body of the aircraft may obstruct the LOS path. The 3GPP has a study of LTE support for UAVs [215] . Here a channel model is provided for system-level simulations catering to three environments: rural macrocell, urban macrocell, and urban microcell, respectively. For mmWave UAV channels, the literature is more scarce, especially with respect to empirical studies. The authors in [216] analyze 60 GHz UAV-based communication with ray-tracing approach where a detailed description of the environment is achieved by a photogrammetric approach. With an accurate and detailed description of the environment and proper calibration, raytracing methods are able to provide accurate predictions of the expected channel behavior in this use case [216] . UAVs are also explored to provide cellular coverage in remote areas via high altitude platforms. The authors of [217] gives an overview of propagation properties of high altitude platforms. In June 2020, Loon and Telkom in Kenya launched their first commercial service providing 4G services from a set of balloons circling in the stratosphere at an approximate altitude of 20 km. This is in stark contrast to LEO or geostationary satellites operating from altitudes of 300-1200 km and 36000 km, respectively. This is important because of the latency induced. The propagation delay for two-way communication is in the order of 0.1 ms rather than in the 2-8 ms range for LEO satellites or 240 ms for geostationary satellites. To this end, such platforms have the possibility to support real-time services with tight latency requirements. The behavior of V2V and V2I channels below 6 GHz is well investigated and understood. The authors of [218] give an overview of important characteristics and considerations for sub-6 GHz V2V communication. Six important propagation characteristics are: 1) The channel can not be seen as wide sense stationary with uncorrelated scattering; the statistics both in terms of time correlation and frequency correlation change over time [219] . 2) High Doppler spreads may occur due to the high relative movements from transmitter to the receiver. 
In certain cases, up to 4× higher Doppler spread is experienced compared to a conventional cellular scenario with a stationary BS [220] . 3) In a highway scenario, the channel is often sparse with a few dominant MPCs. V2V channels in urban scenarios tend to be much richer in its multipath structure [221] . 4) MPCs (especially in urban settings) tend to have a limited lifetime with frequent deaths and births [222] . 5) Blocking of the LOS by other vehicles tend to have significant impact of the path loss. The median loss by an obstructing truck was reported to be 12-13 dB in [223] . 6) The influence of the antenna position and antenna pattern should not be underestimated [218] . They affect not only the path loss, but also the statistics of the channel parameters. When going up in frequency, it can be expected that those properties not only remain, but become even more exaggerated. The authors of [224, 225] give an up-to-date overview of mmWave V2V channel properties. It is noteworthy that there is a lack of measurement results for mmWave vehicular channels, and most conclusions are drawn from stationary measurements. For both below and above 6 GHz, 3GPP TR 37.885 [226] presents a standardized V2V channel model for system simulations, that is based on the tapped delay line principle. Above 6 GHz, it is assumed that the simulated bandwidth is 200 MHz with an aggregated bandwidth of up to 1 GHz. For 6G, one of the main use cases is cooperative perception, where raw sensor data from, e.g., camera and radars is shared between vehicles. The anticipated data rates for such applications are up to 1 Gbps calling for use of the wider bandwidths available at mmWave frequencies. One of the few dynamic mmWave measurement campaign for a V2I scenario is presented in [227] . For a highway scenario, with vehicle mobility of 100 km/h, the Doppler spread experienced for a carrier frequency of 28 GHz was up to 10 kHz. As a rough estimate, this gives a worst-case coherence time as low as 100 µs, which is extremely small for conventional pilot-based OFDM transmission. The study in [228] analyzed the sparsity of the 60 GHz V2I channel. It was concluded that the sparsity in the delay-Doppler domain holds true also in the measured urban street crossing scenario, and that a single cluster with a specific delay Doppler characteristics was dominating, hence enabling compensation of the delay and Doppler shifts and being suitable for OTFS type of modulation. The authors of [229] analyzed the influence of a realistic antenna mount near the vehicle headlights. The measured antenna pattern showed similar irregularities as seen at sub-6 GHz, with excess path loss typically ranging from 10 to 25 dB depending on the AOA, and more pronounced variations from 74-84 GHz in contrast to 26-33 GHz. In [224] , the influence of LOS was discussed. With directional antennas, the channel can be modelled with two-paths at the measured frequencies of 38, 60 and 76 GHz. Blocking the LOS results in excess losses in the range of 5-30 dB depending on the particular scenario and frequency, i.e., in the same range as reported for sub-6 GHz V2V communication. The blockage of the LOS also results in sudden increases in the angular spread and delay spread, again affecting the channel statistics. For other types of channels, in particular the ones experienced in railway systems, we refer the reader to discussions in [71] . Wearable devices are important in healthcare systems, robotics, immersive video applications. 
Thus far, there are no standardized models for body area networks, though many studies have been reported, see e.g., [230]-[234]. The existing measurements can be categorised as narrowband (with 300 kHz-1 MHz measurement bandwidths) at sub-1 GHz and 2 GHz frequencies, and ultra wideband, with a measurement bandwidth of 499 MHz, in the C-band and at 6-10 GHz. Here, one of the most extensive studies is by the authors in [235], which takes into account 60 human subjects. Models for large-scale and small-scale fading are provided, yet the models given are specific to the measured body locations (i.e., where the sensors are placed), antenna types and frequency bands, making them difficult to generalize to other bands and locations. This seems to be a major challenge requiring much further work. Continuing the top-down look at 6G systems, the following section evaluates the design challenges in real-time signal processing and RF front-end architectures, and describes possible solutions to realize working systems across a wide range of frequencies. The section begins with a discussion of the implications of increasing carrier frequencies. VII. REAL-TIME PROCESSING AND RF TRANSCEIVER DESIGN: CHALLENGES, POSSIBILITIES, AND SOLUTIONS While the operating bandwidths of some of the windows in Tab. II span tens of GHz, building a radio with a single carrier over the entire bandwidth is almost impossible, especially if one wants to maintain equally high performance and energy efficiency across the band by retaining the linearity of the RF front-end circuits. In recognition of this, even for 5G systems in the mmWave bands, the maximum permissible carrier bandwidth is 400 MHz. Along similar lines, close-proximity services even in the THz bands are being considered for a maximum bandwidth of 1 GHz [236]. This is rather astonishing, since the move to mmWave and THz frequency bands was driven in the first place by the fact that orders-of-magnitude more bandwidth could be leveraged relative to canonical systems. Current commercial equipment at mmWave frequencies aggregates 4 carriers, each 100 MHz wide. Relative to a 100 MHz carrier, the noise floor of a receiver using 1 GHz bandwidth will be 10 dB higher. As such, in practice, the bandwidth of a single carrier could be limited to 100 MHz, yet higher bandwidths can be obtained by aggregating component carriers. Following this line of thought, if a 10 GHz bandwidth is desired, one has to aggregate 100 such carriers. A direct consequence of this is that the radio hardware has to remain calibrated across the 100 carriers - something which poses a tremendous challenge at such high frequencies, particularly as the effects of phase noise start to dominate. With such wide bandwidths, the radio performance at the lower end of the band can be expected to be entirely different from that at the upper end of the band. To this end, the maximum number of carriers, and in turn the maximum operable bandwidth, will be a compromise based on the ability to obtain antenna-integrated RF circuits and to respect effective isotropic radiated power limits for safety. We note that this is a significant design challenge. It is clear that the high electromagnetic losses in the THz frequency bands pose a tremendous research and engineering challenge. Realistically, it is difficult to imagine (some) 6G services operating beyond window W1, between 140-350 GHz in Fig. 4. Here, the free-space loss at a nominal link distance of 10 m is well in excess of 100 dB. 
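The bandwidth, noise-floor and free-space-loss numbers quoted above are easy to reproduce; the sketch below uses the standard thermal-noise expression with an assumed 10 dB receiver noise figure and the isotropic free-space path loss, both of which are our own illustrative parameter choices.

```python
import numpy as np

def noise_floor_dbm(bandwidth_hz, noise_figure_db=10.0):
    """Thermal noise floor: -174 dBm/Hz + 10*log10(B) + NF (NF is an assumed value)."""
    return -174.0 + 10.0 * np.log10(bandwidth_hz) + noise_figure_db

def fspl_db(f_hz, d_m):
    """Free-space path loss between isotropic antennas."""
    return 20.0 * np.log10(4.0 * np.pi * d_m * f_hz / 3e8)

carrier_bw = 100e6
for total_bw in (100e6, 400e6, 1e9, 10e9):
    n_carriers = int(np.ceil(total_bw / carrier_bw))
    print(f"B = {total_bw/1e9:5.2f} GHz: noise floor = {noise_floor_dbm(total_bw):6.1f} dBm, "
          f"100 MHz component carriers needed = {n_carriers}")

# Free-space loss at a 10 m link, within window W1 (140-350 GHz) and beyond it
for f in (140e9, 350e9, 500e9, 1000e9):
    print(f"f = {f/1e9:6.0f} GHz, d = 10 m: FSPL = {fspl_db(f, 10):5.1f} dB")
```

Each ten-fold increase in aggregated bandwidth raises the noise floor by 10 dB and multiplies the number of 100 MHz component carriers that must be kept mutually calibrated, while the spreading loss alone at 10 m exceeds 100 dB once the carrier moves toward and beyond the upper edge of window W1, before any molecular absorption or implementation losses are added.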
A direct consequence of this is limited cell range - a trend which is already emerging in 5G systems in the form of network densification. To overcome this issue, ultra massive MIMO systems have been proposed in the THz literature, envisaged to close the link budget by integrating a very large number of elements into minuscule footprints to increase the link distance [237]. This is critical for the earlier mentioned 6G use cases requiring Tbps connectivity. Ultimately, the energy consumption, along with the exact type of beamforming architecture, will put a practical constraint on the realizable number of elements at the BS and UE link ends. To meet the target of up to Tbps connectivity, three-dimensional spatial beamforming will be critical. The complete three-dimensional nature of the propagation channel is not utilized even in 5G systems at mmWave frequencies, where analog beamforming is mostly implemented in commercial products with multiple antenna panels (with or without shared fronthauling), each being able to form one beam towards a pre-defined direction. On the other hand, progress in RF circuits has been tremendous, enabling radio transceivers with fully digital beamforming for bands below 6 GHz and, more recently, for mmWave bands from 24.5-29.5 GHz [117, 118]. Nevertheless, implementing fully digital beamforming at THz frequencies is a formidable task, with an order-of-magnitude higher complexity relative to the mmWave bands. It should not be taken for granted that, in a "matter of time", RF electronics will mature and we will be able to realize digital beamforming even at THz frequencies. As for 5G systems, in the short-to-medium term, phased array implementations performing analog or hybrid beamforming seem most likely. Unlike for microwave and mmWave frequencies, for the THz bands the phased array processing architecture needs to be redesigned due to the complexities in antenna fabrication, high speed/high power mixed signal components, RF interconnects and heat dissipation. The most common type of antenna implementation, microstrip patch elements, does not operate efficiently at THz frequencies due to the high dielectric and conductor losses at the RF substrate level. As such, phased arrays fabricated with nano-materials, such as graphene, have been extensively discussed as a way to build miniature plasmonic antennas with dynamic operational modes to reap the benefits of spatial multiplexing and beamforming [56, 57]. On the other hand, metamaterial-based antennas, hypersurfaces, and RF front-end solutions are also emerging as key technologies [11, 56]. To increase the beamforming gain, the concept of metasurface lenses has been introduced; such a lens acts as an RF power splitting, phase shifting, and power combining network applied to the radiated signal of an antenna array [11]. Such a structure has the potential to replace conventional RF power splitting, phase shifting and power combining circuits, which are complex and power hungry, with a relatively cheap passive device (in the form of a lens), yielding significant savings in circuit complexity and energy consumption [238]. A more detailed discussion of such technologies is given in [11, 56]. From a real-time processing viewpoint, the major challenge at both mmWave and THz frequencies is the dynamic control and management of the RF interconnects of the array elements and the associated beamforming networks. 
While this problem is already present in the mmWave bands, the challenge is even greater at THz frequencies due to the shorter channel coherence times (for a fixed Doppler spread), higher phase noise, and larger number of antenna elements. Even with hybrid beamforming, fully-connected architectures, which require a dedicated phase shifter per RF signal path, will be cost prohibitive; to manage the processing complexity as well as the cost, a design based on the array-of-subarrays principle must be leveraged [11, 56]. Here, a subset of antennas is accessible to one specific RF chain, while at baseband a digital processing module is implemented (in both structures) to control the data streams and manage interference among users. Low-resolution ADCs and DACs must also be exploited to manage the cost and implementation complexity of the transceivers. For the THz bands, further discussion is given in the following subsection, while for the mmWave bands further details can be found in [239]. To assess when Tbps rates may become achievable, we consider a toy example. For the sake of argument, we assume perfect CSI and ideal transceiver architectures at both the BS and UE sides, where 4096 elements are employed at the BS and 16 elements at the UE, arranged in uniform planar arrays (UPAs) of 64×64 and 4×4 elements, respectively. For both UPAs, the horizontal spacing was set to 0.5λ and the vertical spacing to 0.7λ, with an example per-element pattern from [63]. The arrays were driven across two separate bandwidths, 140-141 GHz and 140-240 GHz, over a link distance of 15 m. For both bandwidths, the noise floors were computed using the classical expression in [170, 171]. The propagation channel impulse responses were obtained from the model in [181]. Figure 7 demonstrates the single-user MIMO capacity cumulative distribution functions (CDFs) at SNR = 10 dB and SNR = 3 dB. As seen from the top subfigure, with a bandwidth of 100 GHz at 10 dB SNR, a peak capacity of 1 Tbps is achievable in theory (indicated on the figure with a green diamond) under the assumptions mentioned above. An almost constant loss in capacity is observable across all CDF values when the operating SNR is reduced from 10 dB to 3 dB. A comparison at the same SNR levels with a bandwidth of 1 GHz yields less than a 100× capacity difference, since bandwidth enters the capacity as a pre-log factor, while the lower noise floor at 1 GHz (relative to 100 GHz) improves the SNR only inside the logarithm. It is noteworthy, therefore, that the bandwidth term plays a much more prominent role in the capacity predictions than the improved SNR. With this in mind, one can readily ask how such high capacities can be achieved under realistic CSI and transceiver architecture constraints, given the aforementioned difficulties in real-time operation. If we would like to operate a system with a common constellation, is it practically feasible to achieve forward link SNRs on the order of 10 dB? Would the modulation and coding gains be able to maintain such high SNRs over long time periods? Large bandwidths are indeed available at THz frequencies, but can we utilize these bandwidths with realizable beamforming architectures? These are all major research questions that need to be answered.
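To make the structure of such capacity evaluations concrete, the following Monte Carlo sketch evaluates C = B·log2 det(I + (SNR/Nt)·HH^H) with equal per-antenna power. It is only a qualitative stand-in for Fig. 7: we substitute an i.i.d. Rayleigh channel for the measured-channel model of [181] and ignore the element patterns of [63], so the absolute numbers come out optimistic (i.i.d. fading makes all 16 receive-side streams usable, unlike a sparse THz channel), but the dominant pre-log role of the bandwidth and the comparatively mild effect of dropping from 10 dB to 3 dB SNR are directly visible.

```python
import numpy as np

def su_mimo_capacity_samples(bw_hz, snr_db, n_rx=16, n_tx=4096,
                             trials=200, seed=0):
    """Samples of C = B * log2 det(I + (SNR/Nt) * H H^H) with equal
    per-antenna transmit power and an i.i.d. Rayleigh channel H
    (an assumed stand-in for the measured THz channel model)."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    caps = np.empty(trials)
    for t in range(trials):
        h = (rng.standard_normal((n_rx, n_tx)) +
             1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2.0)
        gram = np.eye(n_rx) + (snr / n_tx) * (h @ h.conj().T)
        _, logdet = np.linalg.slogdet(gram)      # natural-log determinant
        caps[t] = bw_hz * logdet / np.log(2.0)   # bits per second
    return caps

for bw_hz, label in ((100e9, "100 GHz"), (1e9, "1 GHz")):
    for snr_db in (10.0, 3.0):
        c = su_mimo_capacity_samples(bw_hz, snr_db)
        print(f"B = {label:>7}, SNR = {snr_db:4.1f} dB: "
              f"median capacity {np.median(c) / 1e12:.3f} Tbps")
```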
In the context of multiuser systems, as a simple approximation, the per-UE capacity R can be thought of as R ≈ (B/K) × L × SE, where B is the bandwidth, L is the number of MIMO layers, K is the total number of UEs, and SE is the instantaneous spectral efficiency given by SE ≈ log2(1 + SINR), where SINR denotes the signal-to-interference-plus-noise ratio of a given UE. For instance, with B = 10 GHz shared among K = 10 UEs, L = 4 layers, and an SINR of 10 dB (SE ≈ 3.46 bps/Hz), R ≈ 13.8 Gbps per UE. Now, to increase the capacity, we need to increase B, L, and the SINR [200]. Increasing B is certainly possible in the THz bands, yet the power density decreases with increasing bandwidth. Increasing the number of MIMO layers requires ultra massive MIMO arrays at both ends, yet these can only be exploited fully if the propagation channel supports a reasonable rank - something which is largely unknown from the sparsely explored THz literature (except for studies such as [181]). Ultra high dimensional arrays will result in extremely directive transmit beams, which will reduce interference, yet the increasing bandwidth will also raise the noise floor (as mentioned previously). Finally, network densification will decrease the number of competing users K, but it will also increase the network operational expenditure and the BS coordination overheads. Going forward, all of these factors must be carefully weighed in THz research. For sub-6 GHz and mmWave frequencies, a typical BS transceiver architecture is depicted in Fig. 8 [240], where an amalgamation of radio-over-fiber and active integrated antennas is utilized. To avoid cluttering the figure, only one radiating element is shown. The up-conversion and down-conversion processes are controlled in real time via the depicted control modules and the RF circulator. The transmitter and receiver, denoted TX and RX in the figure, perform the mixing and de-mixing operations. For transmission and reception, a two-stage cascaded amplifier sequence provides additional power gain. Additional filtering and control circuits, which are critical to the transceiver operation, are also shown. While such architectures can be realized at sub-6 GHz and mmWave frequencies thanks to the progress in RF circuits, the same cannot be said for the THz bands. Using the THz band will impose major challenges on the transceiver hardware design. First and foremost, operating at such high frequencies places stringent requirements on the semiconductor technology. Even with state-of-the-art technology, the frequency of operation will approach, or in extreme cases even exceed, the maximum frequency at which the semiconductor can provide power gain, f_max. The achievable receiver noise figure as well as the transmitter efficiency will then be severely degraded compared to operation at lower frequencies. To maximize the high-frequency gain, the technology must use scaled-down feature sizes, requiring low supply voltages for reliability and thereby reducing the achievable transmitter output power. Combined with the degraded receiver noise figure, the reduced antenna aperture, and the wide signal bandwidth, this naturally results in very short link distances, unless an ultra massive number of elements is combined coherently with sharp beamforming. Thousands to tens of thousands of antenna elements may be required for THz BSs. As an example, operating at 500 GHz with ten thousand antenna elements brings the size of the required array down to just 3 cm × 3 cm, with the elements spaced half a wavelength apart, i.e., 0.3 mm.
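The quoted array dimensions follow from simple geometry; the short sketch below (assuming a square 100 × 100 layout with exactly half-wavelength pitch, as in the example above) reproduces the 0.3 mm element spacing and the roughly 3 cm × 3 cm aperture.

```python
import math

c = 3e8                  # speed of light, m/s
f = 500e9                # carrier frequency, Hz
n_elements = 10_000      # square 100 x 100 layout assumed

pitch_m = (c / f) / 2.0                    # half-wavelength element spacing
side_elems = int(math.isqrt(n_elements))   # 100 elements per side
aperture_m = side_elems * pitch_m          # approximate footprint, one pitch per element

print(f"element pitch : {pitch_m * 1e3:.2f} mm")              # 0.30 mm
print(f"array aperture: {aperture_m * 1e2:.1f} cm per side")  # ~3.0 cm
```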
Fig. 8. Illustration of a typical BS transceiver architecture for sub-6 GHz and mmWave frequencies with radio-over-fiber and active integrated antenna elements. To avoid ambiguity, only one radiating element is shown. The figure is reproduced from [240]. The terms IF, PA, LNA, and MCU denote intermediate frequency, power amplifier, low-noise amplifier, and microcontroller unit, respectively.

The RF electronics must have the same footprint as the array, in order to minimize the length of the THz interconnects - itself a major research challenge. Each chip must then feature multiple transceivers. For instance, a 3 mm × 3 mm chip can host 100 transceivers, and 100 such chips would be needed for the ten-thousand-element array. The antennas may be implemented on or off chip; on-chip antennas generally have lower efficiency, yet they eliminate the loss in the chip-to-carrier interfaces. In addition, heat dissipation becomes a major problem. Since THz transceivers will have low efficiency and the area available for heat dissipation will be very small, thermal management is critical: if each transceiver consumes 100 mW, the total power consumption of the array becomes 1 kW, which has major implications, e.g., the system may not be able to remain continuously active. If heat dissipation becomes too problematic, sparser arrays may have to be considered, e.g., using compressive sensing-based array thinning principles with more than half-wavelength element spacing [241]. However, this would cause side lobes that need to be managed, which in turn may pose constraints on spectrum sharing with existing or adjacent services. To create, e.g., 10000 transceivers with a high level of integration, a silicon-based technology must be used. While silicon metal-oxide-semiconductor field-effect transistors (MOSFETs) are predicted to have reached their peak speed, and will actually degrade with further scaling, silicon-germanium (SiGe) bipolar transistors are predicted to reach an f_max of close to 2 THz within a 5 nm unit cell [242]. In such a technology, amplifiers and oscillators up to about 1 THz could be realized with high performance and integration. With today's silicon technology, however, 500 GHz amplifiers and oscillators cannot be realized, and to operate at such frequencies, non-linear frequency multiplication is necessary. A transmitter based on a frequency multiplier, or a receiver with a sub-harmonic mixer, will, however, not reach attractive performance. Currently, a better option may be to use indium phosphide (InP) technology for the highest-frequency parts, combined with silicon complementary metal-oxide-semiconductor (CMOS) baseband circuitry. Amplifiers and mixers at 800 GHz have been demonstrated in 25 nm InP high-electron-mobility transistor (HEMT) technology with an f_max of 1.5 THz [243]. When 5 nm SiGe technology becomes available, the level of integration will be higher, resulting in reduced production costs. We believe this to be a must for implementations of ultra massive MIMO arrays. Another important challenge is the generation of coherent and low-noise local oscillator (LO) signals for ten thousand or more transceivers. Generating a central 500 GHz signal to be distributed to all transceivers, perhaps 100, on a chip seems impractical, as the buffers would consume very large power. As such, a more distributed solution with local phase-locked loops (PLLs) is more appealing, since a lower-frequency reference can then be distributed over the chip [244].
The phase noise of different PLLs will then be uncorrelated, so that when the signals are combined, the resulting beams exhibit low phase noise. On the other hand, doing so reduces the achievable notch depth when forming nulls, limiting the performance of multiple simultaneous beams [245]. To this end, there is a trade-off in choosing the number of PLLs. Nonetheless, given the high power required for LO signal distribution, a large number of PLLs seems favorable. This is further motivated by the difficulty of reaching a high resonator energy in a single oscillator at such high frequencies, making it attractive to increase the total energy by increasing the number of oscillators in the system. Using a large number of PLLs also provides LO beamforming possibilities, as the PLL phase can be accurately controlled [244]. Regardless of the LO architecture, another challenge is the frequency tuning of the oscillators, since the quality factor of variable reactances (varactors) is inversely proportional to the operating frequency. As such, at THz frequencies other tuning mechanisms should be investigated, such as resistive tuning [246]. All of these challenges call for substantial research efforts in this important direction, and must be overcome to realize the systems envisioned for 6G networks.

VIII. CONCLUDING REMARKS

To the best of our knowledge, this paper is the first to take a holistic top-down approach in describing 6G systems. The paper begins by presenting a vision for 6G, followed by a detailed breakdown of the next generation use cases, such as high-fidelity holographic communications, immersive reality, the tactile internet, a vastly interconnected society, and space-integrated communications. For each use case, we present a breakdown of its technical requirements. This is followed by a discussion of the potential deployment scenarios in which 6G systems will likely operate. A rigorous discussion is presented of the research challenges and possible solutions that must be addressed, from applications, to the design of the next generation core networks, down to the PHY. Unlike other studies, we differentiate between what is theoretically possible and what may be practically achievable for each aspect of the system. After a lengthy analysis dissecting many system components, as well as exploring possible solutions, we conclude that an exciting future lies ahead. The road to overcoming the challenges given in this paper is full of obstacles, yet we provide enough insights to begin research towards the many promising open directions. This will in turn serve as motivation for research approaching the next decade.

REFERENCES

5G: A tutorial overview of standards, trials, challenges, deployment, and practice 5G-enabled tactile internet What will 5G be?
6G wireless networks: Vision, requirements, architecture, and key technologies The roadmap to 6G: AI empowered wireless networks Network 2030: Market drivers and prospects Network 2030: A blueprint of technology, applications, and market drivers toward the year 2030 Wireless communications and applications above 100 GHz: Opportunities and challenges for 6G and beyond Special issue on 6G mobile networks: Emerging technologies and applications 6G: The next hyper-connected experience for all 6G wireless communication systems: Applications, requirements, technologies, challenges, and research directions Telehuman 2.0: A cylindrical light field teleconferencing system for life-size 3D human telepresence Digitally stimulating the sensation of taste through electrical and thermal stimulation Emotion sensing for mobile computing The tactile internet: Applications and challenges Realizing the tactile internet: Haptic communications over next generation 5G cellular networks Toward haptic communications over the 5G tactile internet Toward 6G networks: Use cases and technologies Comparing radio propagation channels between 28 and 140 GHz bands in a shopping mall European Conference on Antennas and Propagation (EuCAP) Enabling technologies for 6G future wireless communications: Opportunities and challenges A speculative study on 6G 6G: Opening new horizons for integration of comfort, security and intelligence The road to 6G: Ten physical layer challenges for communications engineers A prospective look: Key enabling technologies, applications and open research topics in 6G networks Vision, requirements, and technology trend of 6G: How to tackle the challenges of system coverage, capacity, user data-rate and movement speed Potential key technologies for 6G mobile communications A survey on terahertz communications Sub-THz spectrum as enabler for 6G wireless communications up to 1 Tbit/s," 6G Wireless Summit Integration of molecular communications into future generation wireless networks Massive wireless energy transfer: Enabling sustainable IoT towards 6G era Learning-driven wireless communications, towards 6G 6G: the wireless communications network for collaborative and AI applications 6G-next gen mobile wireless communication approach Present and future of terahertz communications Radio access networking challenges towards 2030 Future networks 2030: Architecture and requirements Focus group on technologies for network 2030: Representative use cases and key network requirements Holoportation systems Toward truly immersive holographic-type communication: Challenges and solutions Direct finger manipulation of 3D object image with ultrasound haptic feedback A systematic review of multilateral teleoperation systems Wireless requirements and challenges in industry 4.0 How virtualization, decentralization and network building change the manufacturing landscape: An industry 4.0 perspective Study on NR vehicle-to-everything (V2X) White paper on CV2-X use cases: Methodology, examples and service level requirements A remote surgery experiment between Japan and Thailand over internet using a low latency CODEC system Vehicle-to-everything (V2X) communication in 5G and beyond Toward interconnected virtual reality: Opportunities, challenges, and enablers Availability indication as key enabler for ultra-reliable communication in 5G Enabling virtual reality for the tactile internet: Hurdles and opportunities Optimal control of wireless computing networks Last meter indoor terahertz wireless access: Performance insights 
and implementation roadmap A technical comparison of three low earth orbit satellite constellation systems to provide global broadband 2410-0: Minimum requirements related to technical performance for IMT-2020 radio interface(s) Combating the distance problem in the millimeter wave and terahertz frequency bands Terahertz band: Next frontier for wireless communications Channel modeling and capacity analysis for electromagnetic wireless nanonetworks in the terahertz band Propagation modeling for wireless communications in the terahertz band Imaging through the atmosphere at terahertz frequencies ITU-R P.676-12, Attenuation by atmospheric gasses and related effects Channel modeling and capacity analysis for electromagnetic wireless nanonetworks in the terahertz band Study on channel model for frequencies from 0.5 to 100 GHz 5G E2E technology to support vertical uRLLC requirements 5G ultra-reliable low-latency communication for factory automation at millimetre wave bands Prototype of KIOSK data downloading system at 300 GHz: Design, technical feasibility, and results Infostations: A new system model for data and messaging services Terahertz band: Next frontier for wireless communications Breaking the blockage for big data transmission: Gigabit road communication in autonomous vehicles On millimeter wave and THz mobile radio channel for smart rail mobility 5G key technologies for smart railways Enabling gigabit services for IEEE 802.11ad-capable high-speed train networks A survey of mobile information-centric networking: Research issues and challenges Chairman's report FG-IMT2020 Dynamic spectrum access with reinforcement learning for unlicensed access in 5G and beyond Integrated access and backhaul 5G evolution: 3GPP releases 16 & 17 overview Outdoor wideband channel measurements and modeling in the 3âȂŞ18 GHz band Achievable rate with 1-bit quantization and oversampling using continuous phase modulation-based sequences All-analog adaptive equalizer for coherent data center interconnects Zero crossing modulation for communication with temporally oversampled 1-bit quantization Channel estimation in broadband millimeter wave MIMO systems with few-bit ADCs Filtered multitone modulation for very high-speed digital subscriber lines Universal-filtered multi-carrier technique for wireless systems beyond LTE OFDM versus filter bank multicarrier Windowed OFDM for small-cell 5G uplink Nonorthogonal pulseshapes for multicarrier communications in doubly dispersive channels GFDM-generalized frequency division multiplexing Multistream faster than Nyquist signaling Orthogonal time frequency space (OTFS) modulation for millimeter-wave communications systems Unitary space-time modulation for multiple-antenna communications in Rayleigh flat fading Non-coherent multiuser massive MIMO-OFDM with differential modulation Spectral efficiency in the wideband regime The clock-free asynchronous receiver design for molecular timing channels in diffusion-based molecular communications Ambient backscatter communications: A contemporary survey Scaling up MIMO: Opportunities and challenges with very large arrays Joint spatial division and multiplexing-the large-scale array regime Joint spatial division and multiplexing for mm-Wave channels Blind pilot decontamination Toward massive MIMO 2.0: Understanding spatial correlation, interference suppression, and pilot contamination Scalable synchronization and reciprocity calibration for distributed multiuser MIMO Reciprocity calibration for massive MIMO: Proposal, modeling, and 
validation Performance analysis of channel extrapolation in FDD massive MIMO systems Artificial intelligence-enabled cellular networks: A critical path to beyond-5G and 6G Towards practical FDD massive MIMO: CSI extrapolation driven by deep learning and actual channel measurements Asilomar Conference on Signals, Systems, and Computers Systematic polar coding Iterative decoding threshold analysis for LDPC convolutional codes Code design for short blocks: A survey Convolutional codes for iterative decoding On deep learning-based channel decoding Turbo decoding as an instance of pearl's belief propagation algorithm Foundations of MIMO Communication Fundamentals of Massive MIMO Microwave vs. millimeter-wave propagation channels: Key differences and impact on 5G cellular systems Massive MIMO networks: Spectral, energy, and hardware efficiency Real-time deployment aspects of C-band and millimeter-wave 5G-NR systems Digital beamforming-based massive mimo transceiver for 5G millimeter-wave communications Impact of spatially consistent channels on digital beamforming for millimeter-wave systems Design of energy-and cost-efficient massive MIMO arrays Variable-phase-shift-based rf-baseband codesign for MIMO antenna selection Hybrid beamforming for massive MIMO: A survey On the impact of spillover losses in 28 GHz rotman lens arrays for 5G applications Uplink interference analysis with RF switching for lens-based millimeter-wave systems Massive MIMO performance evaluation based on measured propagation data Spatial-and frequencywideband effects in millimeter-wave massive MIMO systems Spatial modulation for generalized MIMO: Challenges, opportunities, and implementation Channel estimation for spatial modulation Differential spatial modulation 50 years of permutation, spatial and index modulation: From classic RF to visible light communications and data storage Spatial modulation for generalized MIMO: Challenges, opportunities, and implementation Angle diversity receiver in LiFi cellular networks On the performance of cell-free massive MIMO in Ricean fading Cell-free massive MIMO versus small cells Energy efficiency of massive MIMO: Cell-free vs. cellular Distributed multicell-MISO precoding using the layered virtual SINR framework Performance analysis of macrodiversity MIMO systems with MMSE and ZF receivers in flat Rayleigh fading Analytical handle for ZF reception in distributed massive MIMO Decentralized massive MIMO systems: Is there anything to be discussed? 
Channel hardening and favorable propagation in cell-free massive MIMO with stochastic geometry Beyond massive MIMO: The potential of positioning with large intelligent surfaces Beyond massive MIMO: The potential of data transmission with large intelligent surfaces Real-time implementation aspects of large intelligent surfaces Using intelligent reflecting surfaces for rank improvement in MIMO communications Intelligent reflecting surfaces: Physics, propagation, and pathloss modeling Keynote talk #2: 6G wireless: Wireless networks empowered by reconfigurable intelligent surfaces A survey of intelligent reflecting surfaces (IRSs): Towards 6G wireless communication networks Reconfigurableintelligent-surface empowered 6G wireless communications: Challenges and opportunities High-capacity millimetre-wave communications with orbital angular momentum multiplexing Ultralow reflectivity spiral phase plate for generation of millimeterwave OAM beam Multipath effects in millimetre-wave wireless communication using orbital angular momentum multiplexing The new purity and capacity models for the OAM-mmwave communication systems under atmospheric turbulence Optical communications using orbital angular momentum beams Massive nonorthogonal multiple access for cellular IoT: Potentials and limitations Non-orthogonal multiple access (NOMA) for cellular future radio access Nonorthogonal multiple access for 5G: Solutions, challenges, opportunities, and future research trends Application of non-orthogonal multiple access in LTE and 5G networks Rate splitting for MIMO wireless networks: A promising phy-layer strategy for LTE evolution A rate splitting strategy for massive MIMO with imperfect CSIT Line-of-sight visible light communication system design and demonstration Artificial intelligence enabled wireless networking for 5G and beyond: Recent advances and future challenges On the achievable throughput of a multiantenna gaussian broadcast channel Deep MIMO detection Deep learning for direct hybrid precoding in millimeter wave massive MIMO systems 5G MIMO data for machine learning: Application to beam-selection using deep learning Transmit antenna selection in MIMO wiretap channels: A machine learning approach Machine learning for wireless communication channel modeling: An overview Millimeter-wave vehicular communication to support massive automotive sensing The impact of beamwidth on temporal channel variation in vehicular channels and its implications Mmwave vehicle-to-infrastructure communication: Analysis of urban microcellular networks Wireless Communications Wireless Communications: Principles and Practice Centimeter and millimeter wave attenuation and brightness temperature due to atmospheric oxygen and water vapor Diffuse scattering from rough surfaces in THz communication channels A modified Beckmann-Kirchhoff scattering model for slightly rough surfaces at terahertz frequencies A novel ray-tracing algorithm for non-specular diffuse scattered rays at terahertz frequencies Measurement, simulation, and characterization of trainto-infrastructure inside-station channel at the terahertz band 28 GHz foliage propagation channel measurements Standardization of propagation models: 800 MHz to 100 GHz -A historical perspective The development of the new ITU-R model for building entry loss Outdoor-to-indoor propagation channel measurements at 28 GHz Stochastic modeling of THz indoor radio channels Real-time millimeter-wave MIMO channel sounder for dynamic directional measurements Wideband millimeter-wave 
propagation measurements and channel models for future wireless communication system design 28 GHz millimeter-wave ultrawideband small-scale fading models in wireless channels Radio channel sounding campaigns in EU H2020 mmMAGIC project for 5G channel modeling Indoor wireless channel properties at millimeter wave and sub-terahertz frequencies Double directional channel measurements for tHz communications in an urban environment Propagation measurement system and approach at 140 GHz-moving to 6G and above 100 GHz Channel sounding techniques for applications in thz communications: A first correlation based channel sounder for ultra-wideband dynamic channel measurements at 300 GHz Correlation properties of large scale parameters from 2.66 GHz multi-site macro cell measurements Multi-link MIMO channel modeling using geometrybased approach Massive MIMO in sub-6 GHz and mmWave: Physical, practical, and use-case differences A distributed massive MIMO testbed to assess real-world performance and feasibility Empirical and simulated performance evaluation of distributed massive MIMO Non-stationarities in extra-large scale massive MIMO Linear receivers in nonstationary massive MIMO channels with visibility regions Massive MIMO channelsmeasurements and models Impact of line-of-sight and unequal spatial correlation on uplink MU-MIMO systems On the general analysis of coordinated regularized zero-forcing precoding: An application to two-tier small-cell networks Analysis of multiuser cellular systems over heterogeneous channels Revisiting MMSE combining for massive MIMO over heterogeneous propagation channels Spatial correlation variability in multiuser systems Massive MIMO in real propagation environments: Do all antennas contribute equally? Massive MIMO extensions to the COST 2100 channel model: Modeling and validation Propagation channel characteristics of industrial wireless sensor networks Directional wideband channel measurements at 28 Ghz in an industrial environment Industrial indoor measurements from 2-6 GHz for the 3GPP-NR and QuaDRiGa channel model Radio channel quality in industrial wireless sensor networks Deriving an empirical channel model for wireless industrial indoor communications A survey of air-to-ground propagation channel modeling for unmanned aerial vehicles A survey of channel modeling for uav communications A comprehensive survey on uav communication channel modeling Channel modeling and parameter optimization for hovering UAV-based free-space optical links Outage analysis for millimeterwave fronthaul link of UAV-aided wireless networks Study on enhanced LTE support for aerial vehicles Characterization of radio links at 60 GHz using simple geometrical and highly accurate 3-D models Airborne communication networks: A survey Vehicular channel characterization and its implications for wireless system design and performance Delay and doppler spreads of nonstationary vehicular channels for safety-relevant scenarios A geometry-based stochastic mimo model for vehicle-to-vehicle communications The COST IRACON geometry-based stochastic channel model for vehicle-tovehicle communication in intersections Tracking of wideband multipath components in a vehicular communication scenario Impact of a truck as an obstacle on vehicle-to-vehicle communications in rural and highway scenarios Multi-band vehicle-to-vehicle channel characterization in the presence of vehicle blockage Propagation channels of 5G millimeter-wave vehicle-to-vehicle communications: Recent advances and future challenges Study 
on evaluation methodology of new vehicleto-everything (V2X) use cases for LTE and NR (Release 15) 28 GHz doppler measurements in high-speed expressway environments Sparsity in the delay-doppler domain for measured 60 GHz vehicle-to-infrastructure communication channels Investigating the V2V millimeter-wave channel near a vehicular headlight in an engine bay Antennas and propagation for on-body communication systems An on-body channel model for UWB body area communications for various postures Characterization of onbody communication channel and energy efficient topology design for wireless body area networks The influence of the wearable antenna type on the on-body channel modeling at 2.45 ghz A study of hand gesture recognition with wireless channel modeling by using wearable devices Impact of body mass index on ultrawideband MIMO BAN channels âȂŤ Measurements and statistical model 2417-0: Technical and operational characteristics of landmobile service applications in the frequency range 275-450 GHz Combating the distance problem in the millimeter wave and terahertz frequency bands Constant-r lens beamformer for low-complexity millimeter-wave hybrid MIMO Architecture and circuit choices for 5G millimeter-wave beamforming transceivers 2334-0 -passive and active antenna systems for base stations of IMT systems Machine learning-aided design of thinned antenna arrays for optimized network level performance SiGe HBT technology: Future trends and TCAD-based roadmap 850 GHz receiver and transmitter frontends using InP HEMT A 60 GHz receiver front-end with PLL based phase controlled LO generation for phased-arrays Optimizing the LO distribution architecture of mm-Wave massive MIMO receivers An 82-107.6 GHz integer-N ADPLL employing a DCO with split transformer and dual-path switchedcapacitor ladder and a clock-skew-sampling delta-sigma TDC