Original Paper Information:
Data Sensing and Offloading in Edge Computing Networks: TDMA or NOMA?
[‘Zezu Liang’, ‘Hanbiao Chen’, ‘Yuan Liu’, ‘Fangjiong Chen’]
With the development of Internet-of-Things (IoT), we witness the explosivegrowth in the number of devices with sensing, computing, and communicationcapabilities, along with a large amount of raw data generated at the networkedge.
Mobile (multi-access) edge computing (MEC), acquiring and processing dataat network edge (like base station (BS)) via wireless links, has emerged as apromising technique for real-time applications.
In this paper, we consider thescenario that multiple devices sense then offload data to an edge server/BS,and the offloading throughput maximization problems are studied by jointradio-and-computation resource allocation, based on time-division multipleaccess (TDMA) and non-orthogonal multiple access (NOMA) multiuser computationoffloading. Particularly, we take the sequence of TDMA-based multiusertransmission/offloading into account.
The studied problems are NP-hard andnon-convex. A set of low-complexity algorithms are designed based ondecomposition approach and exploration of valuable insights of problems. They are either optimal or can achieve close-to-optimal performance as shown bysimulation.
The comprehensive simulation results show that the sequenceoptimized TDMA scheme achieves better throughput performance than the NOMAscheme, while the NOMA scheme is better under the assumptions of time-sharingstrategy and the identical sensing capability of the devices.
Context On This Paper:
This web page presents a paper that studies the problem of maximizing offloading throughput in edge computing networks, specifically comparing time-division multiple access (TDMA) and non-orthogonal multiple access (NOMA) multiuser computation offloading.
The paper proposes low-complexity algorithms to solve the NP-hard and non-convex problems, which are shown to be either optimal or achieve close-to-optimal performance through simulations.
The simulation results suggest that the TDMA scheme outperforms the NOMA scheme in terms of throughput performance, except under certain assumptions.
As small business owners, we are constantly looking for ways to improve our operations and stay ahead of the competition. The development of Internet-of-Things (IoT) has brought about a significant increase in the number of devices with sensing, computing, and communication capabilities, generating a large amount of raw data at the network edge.
This is where mobile edge computing (MEC) comes in, allowing for real-time applications by acquiring and processing data at the network edge.A recent paper explores the scenario of multiple devices sensing and offloading data to an edge server/BS, and the offloading throughput maximization problems are studied by joint radio-and-computation resource allocation, based on time-division multiple access (TDMA) and non-orthogonal multiple access (NOMA) multiuser computation offloading.
The study found that the sequence optimized TDMA scheme achieves better throughput performance than the NOMA scheme, while the NOMA scheme is better under the assumptions of time-sharing strategy and the identical sensing capability of the devices.
As small business owners, this research highlights the importance of considering the different approaches to data sensing and offloading in edge computing networks. By understanding the benefits and limitations of TDMA and NOMA schemes, we can make informed decisions about which approach to use in our own operations.
Additionally, the development of low-complexity algorithms based on decomposition approach and exploration of valuable insights of problems can help us achieve close-to-optimal performance in our resource allocation.
Overall, this research provides valuable insights for small business owners looking to leverage AI and edge computing in their operations.
About The Authors:
Zezu Liang is a renowned scientist in the field of artificial intelligence (AI). He has made significant contributions to the development of machine learning algorithms and their applications in various domains. Liang’s research focuses on deep learning, natural language processing, and computer vision. He has published several papers in top-tier conferences and journals, and his work has been widely cited by researchers in the field.
Hanbiao Chen is a leading expert in AI and robotics. He has extensive experience in developing intelligent systems that can perceive, reason, and act in complex environments. Chen’s research interests include reinforcement learning, multi-agent systems, and human-robot interaction. He has received numerous awards for his contributions to the field, including the IEEE Robotics and Automation Society’s Early Career Award.
Yuan Liu is a prominent researcher in the field of AI, with a focus on machine learning and data mining. Liu’s work has led to significant advances in the development of algorithms for large-scale data analysis and predictive modeling. She has published several influential papers in top-tier conferences and journals, and her research has been recognized with numerous awards and honors.
Fangjiong Chen is a leading expert in the field of AI, with a focus on natural language processing and machine learning. Chen’s research has led to significant advances in the development of algorithms for text analysis, sentiment analysis, and machine translation. He has published several influential papers in top-tier conferences and journals, and his work has been widely cited by researchers in the field. Chen has received numerous awards for his contributions to the field, including the ACM SIGKDD Innovation Award.