-
Research Article
Computer Vision-based Prediction and Mathematical Optimization of 5G Wireless Cellular Network Parameters
Chikezie Kennedy Kalu*
Issue:
Volume 12, Issue 2, December 2025
Pages:
55-71
Received:
6 June 2025
Accepted:
23 June 2025
Published:
19 July 2025
Abstract: Objective - To investigate, analyze, and, where needed, optimize selected 5G mobile wireless network parameters (signal-to-interference-plus-noise ratio (SINR) and throughput as measures of network performance) and interference conditions in the presence of building obstacles, using a novel approach that combines signal data and visual data in wireless communications. Methods - Using a sample of 200 real-life 5G outdoor micro-cell test data points and urban building image datasets from validated open-source data stores, experimental, investigative, and comparative analyses were carried out by combining signal data and visual data with a hybrid deep-learning CNN-based model (the high-performance CNN, HP CNN), together with analytical and mathematical optimization algorithms. The key idea is to leverage camera imagery and computer vision to predict and analyze network parameters such as SINR, throughput, and the amount of interference in the presence of signal obstacles, which typically attenuate received signals aperiodically. Obstacle-related losses were also analyzed, and network parameter optimization was demonstrated. Results - The predictive analyses of SINR and throughput in the presence of obstacles (concrete buildings) were carried out successfully using the hybrid HP CNN model, which showed excellent efficiency, requiring fewer resources and working with image datasets from a different environment. The analytical and predictive analyses of a representation of user interference (I/PG) in the presence of obstacles were also carried out successfully, and a new OPL algorithm was proposed for important user obstacle penetration losses. Additionally, the SINR was mathematically optimized with respect to minimal interference, demonstrating an effective tool with which engineers and network designers can analytically tune and manage network performance in subsystems and systems more efficiently. Conclusions - This work, together with diverse related efforts, leaves no doubt that this novel hybrid intelligent approach offers great possibilities for modern wireless communications and associated technologies, now and in the future, and that it is a key step toward autonomous, more efficient network performance management and AI-driven management of network parameters, attenuation, and interference.
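The two performance measures the abstract names, SINR and throughput, are related by standard link-budget formulas. The sketch below illustrates that relationship only; the numeric values and the Shannon-capacity bound for throughput are illustrative assumptions, not results or methods from the paper.

```python
import math

def sinr_db(signal_mw, interference_mw, noise_mw):
    """SINR = S / (I + N), converted from linear power ratio to dB."""
    return 10 * math.log10(signal_mw / (interference_mw + noise_mw))

def shannon_throughput_bps(bandwidth_hz, sinr_linear):
    """Shannon capacity upper bound: C = B * log2(1 + SINR)."""
    return bandwidth_hz * math.log2(1 + sinr_linear)

# Illustrative link-budget values (milliwatts), not taken from the paper
s_db = sinr_db(1.0, 0.05, 0.01)                    # ~12.2 dB
cap = shannon_throughput_bps(100e6, 1.0 / 0.06)    # ~414 Mbit/s over 100 MHz
```

Minimizing the interference term in the denominator is what raises SINR, which is the intuition behind the interference-referenced optimization the abstract describes.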
-
Research Article
Robust Radar-driven Gesture Recognition for Contactless Human-computer Interaction Using Support Vector Machine and Signal Feature Optimization
Issue:
Volume 12, Issue 2, December 2025
Pages:
72-80
Received:
15 July 2025
Accepted:
24 July 2025
Published:
8 August 2025
Abstract: Radar-based gesture recognition has emerged as a reliable alternative to vision-based systems for human-computer interaction, especially in environments with low illumination, occlusion, or privacy constraints. This study explores the implementation of a radar-based gesture recognition system using advanced signal processing and machine learning techniques to classify dynamic hand movements with high precision. The central challenge addressed involves extracting discriminative features from radar signals and developing robust classifiers capable of performing effectively under real-world conditions. The proposed approach includes preprocessing radar data through bandpass filtering (5-50 Hz) and normalization, followed by the extraction of key features such as signal energy, mean Doppler shift (7.6-7.9 Hz), and spectral centroid. A Support Vector Machine (SVM) classifier with a radial basis function (RBF) kernel is employed and optimized for gesture classification. Comparative analysis reveals that the SVM model outperforms the K-nearest neighbors (KNN) method, achieving a classification accuracy of 86% and an F1-score of 0.89, compared to 82% accuracy and a 0.84 F1-score obtained with KNN. These results demonstrate the effectiveness of radar-based systems in detecting and classifying hand gestures accurately, achieving up to 97.3% accuracy in controlled environments. Unlike traditional camera-based systems, radar maintains functionality in poor lighting and occluded conditions while preserving user privacy by avoiding optical recordings. The system also offers low power consumption and real-time processing capabilities, making it suitable for deployment in privacy-sensitive and resource-constrained applications. This work confirms radar’s potential in fine-grained gesture interpretation and aligns with prior studies in crowd tracking and digit recognition, where similar performance metrics were observed.
The integration of radar sensing with machine learning offers a promising path toward more secure, responsive, and environment-agnostic interaction systems.
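The three features the abstract names (signal energy, mean Doppler shift, spectral centroid) can all be computed from a radar return's spectrum. The sketch below is a minimal NumPy illustration of that feature-extraction step; the sampling rate, the synthetic 7.8 Hz test tone, and the peak-bin approximation of mean Doppler are assumptions for demonstration, not the paper's actual pipeline.

```python
import numpy as np

def extract_features(signal, fs):
    """Energy, approximate mean Doppler shift, and spectral centroid
    of a (already bandpass-filtered) radar return."""
    energy = np.sum(signal ** 2)
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    centroid = np.sum(freqs * spec) / np.sum(spec)
    # Mean Doppler approximated here as the peak-power frequency bin
    mean_doppler = freqs[np.argmax(spec)]
    return np.array([energy, mean_doppler, centroid])

# Synthetic 7.8 Hz tone (echoing the paper's 7.6-7.9 Hz range), fs = 100 Hz
fs = 100
t = np.arange(0, 2, 1.0 / fs)
feats = extract_features(np.sin(2 * np.pi * 7.8 * t), fs)
```

Feature vectors like `feats` would then be fed to the RBF-kernel SVM (e.g. scikit-learn's `SVC(kernel="rbf")`) for gesture classification.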
-
Research Article
Lightweight Blockchain Framework for Securing Internet of Things Payment Systems
Issue:
Volume 12, Issue 2, December 2025
Pages:
81-92
Received:
14 August 2025
Accepted:
26 August 2025
Published:
15 September 2025
Abstract: The integration of Internet of Things (IoT) devices into modern payment systems has introduced innovative functionalities, but also significant security and performance challenges. IoT devices, such as smart sensors, wearables, and automated vending machines, are typically resource-constrained yet handle sensitive financial transactions that demand robust security mechanisms. Conventional cryptographic solutions are often unsuitable for these environments due to their high computational and memory requirements. This paper presents the design of a lightweight blockchain-based model to secure IoT payment systems by leveraging the Ethereum blockchain and AES-128 encryption. The blockchain token is encrypted with AES-128 to add a layer of security before being stored in a database. The model employs a decentralised digital ledger to record and validate transactions without a central authority; transactions are grouped into blocks, each linked to its predecessor through cryptographic hashes. The resulting chain forms an immutable record that enhances transparency and security, and the distributed nature of blockchain networks, wherein multiple participants validate each transaction, minimises the risk of fraudulent activities while ensuring consensus through predefined protocols. Analysis of results from the implementation established minimal computational overhead alongside robust security, making the model particularly beneficial where the scalability of decentralised systems is required alongside heightened security protocols.
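The hash-linked block structure the abstract describes can be sketched in a few lines. This is a generic illustration of cryptographic chaining using SHA-256 and stdlib tools, not the paper's Ethereum/AES-128 implementation; all field names are hypothetical.

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Build a block whose hash covers its contents and its
    predecessor's hash, making the chain tamper-evident."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block([], "0" * 64)
nxt = make_block([{"from": "sensor-1", "amount": 5}], genesis["hash"])
```

Altering any transaction in `genesis` would change its hash and break the `prev_hash` linkage in `nxt`, which is the immutability property the abstract relies on; in the paper's design the stored token would additionally be AES-128-encrypted.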
-
Research Article
Slice-Specific Machine Learning Models for Intrusion Detection in 5G Telecommunication Networks
Issue:
Volume 12, Issue 2, December 2025
Pages:
93-118
Received:
26 September 2025
Accepted:
9 October 2025
Published:
26 November 2025
DOI:
10.11648/j.wcmc.20251202.14
Abstract: The security challenges introduced by 5G network slicing demand tailored intrusion detection systems (IDS). Traditional IDS and intrusion detection and prevention system (IDPS) frameworks, built for static network configurations, are inadequate for the dynamic and heterogeneous nature of 5G networks. To address this gap, this study develops and evaluates slice-specific machine learning models to enhance intrusion detection across different 5G slices, namely: enhanced Mobile Broadband (eMBB), massive Machine-Type Communication (mMTC), and Ultra-Reliable Low-Latency Communication (URLLC). Random Forest, Support Vector Machine (SVM), and Long Short-Term Memory (LSTM) models were applied to publicly available datasets representing each slice. These models are assessed based on their accuracy, precision, recall, F1-score, area under the receiver operating characteristic curve (AUC-ROC), confusion matrix, and execution time. The results reveal that the LSTM model achieved the highest accuracy and AUC-ROC scores for the eMBB and mMTC slices, making it suitable for applications where detection accuracy is critical despite higher computational demands. In contrast, Random Forest demonstrated superior computational efficiency, making it the preferred model for the latency-sensitive URLLC slice, where real-time detection is essential. While the SVM model performed well in terms of accuracy, its high computational cost renders it less practical for real-time applications, particularly in URLLC environments. This research provides insights for enhancing 5G network security through the deployment of slice-specific machine learning models, thereby addressing the critical need for adaptable and efficient IDS frameworks.
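The evaluation metrics the abstract lists derive from confusion-matrix counts, and the slice-specific recommendation amounts to a per-slice model dispatch. The sketch below shows both; the dispatch table paraphrases the abstract's findings, and the metric values in the comments are illustrative, not the study's reported numbers.

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision and recall,
    computed from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Per-slice model choice paraphrasing the abstract's conclusions
SLICE_MODEL = {
    "eMBB": "LSTM",           # highest accuracy / AUC-ROC
    "mMTC": "LSTM",           # highest accuracy / AUC-ROC
    "URLLC": "RandomForest",  # fastest; suits real-time detection
}

# Illustrative counts: 80 true positives, 10 false positives, 10 false negatives
f1 = f1_score(80, 10, 10)
```

A real deployment would select `SLICE_MODEL[slice_id]` at slice instantiation, trading detection accuracy against the execution-time budget of that slice.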