Volume 9, Issue 3


This issue presents eight research papers on emerging technologies across several fields, showing how new tools are being applied to solve practical problems. The contributions apply machine learning to credit card fraud detection, short-term solar power forecasting, and precision agriculture, and also address viewport-adaptive 360-degree video streaming, the health effects of PFAS chemicals, trustworthy deployment of predictive models on blockchain networks, and the performance of hybrid cryptosystems for data security. Together, these studies illustrate how artificial intelligence, unmanned aerial vehicles, and distributed networks are reshaping practice in finance, energy, agriculture, health, and information security, and they open doors for further improvement.

Editorial

Front Cover

Adv. Sci. Technol. Eng. Syst. J. 9(3), i-i, (2024);

Editorial Board

Adv. Sci. Technol. Eng. Syst. J. 9(3), ii-iii, (2024);

Editorial

Adv. Sci. Technol. Eng. Syst. J. 9(3), iv-v, (2024);

Table of Contents

Adv. Sci. Technol. Eng. Syst. J. 9(3), vi-vi, (2024);

Articles

Adaptive Heterogeneous Ensemble Learning Model for Credit Card Fraud Detection

Tinofirei Museba, Koenraad Vanhoof

Adv. Sci. Technol. Eng. Syst. J. 9(3), 1-11 (2024);


The proliferation of internet economies has given businesses manifold advantages, as they can now incorporate the latest innovations into their operations, thereby enhancing ease of doing business. Financial institutions, for instance, have leveraged this growth to expand credit card usage. However, it also exposes clients to cybercrime, as fraudsters continually find ways to breach security measures and access customers’ confidential information, which they then use to make fraudulent credit card transactions. As a result, financial institutions incur losses amounting to billions of United States dollars. To avert such losses, it is important to design efficient credit card fraud detection algorithms capable of generating accurate alerts. Recently, machine learning algorithms such as ensemble classifiers have emerged as the most effective and efficient tools for assisting fraud investigators. However, several factors hinder the financial sector from designing machine learning algorithms that can efficiently and effectively detect credit card fraud. These include the non-stationarity of the data, which gives rise to concept drift; extremely imbalanced class distributions; and scant information on transactions flagged by fraud investigators, since confidentiality regulations make public data difficult to access. In this article, the authors design and assess a credit card fraud detection system that can adapt to changes in the data distribution and generate accurate alerts.
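As an illustration of the kind of adaptive ensemble the abstract describes, the following minimal Python sketch down-weights base classifiers that misclassify recent transactions, so the vote tracks concept drift. The class, its toy members, and the decay factor are illustrative assumptions, not the authors' model.

```python
# Illustrative weighted-majority ensemble that adapts to concept drift by
# decaying the weights of members that misclassify recent transactions.
# A generic sketch, not the model proposed in the paper.

class AdaptiveEnsemble:
    def __init__(self, members, beta=0.5):
        self.members = list(members)          # heterogeneous base classifiers
        self.weights = [1.0] * len(members)   # one weight per member
        self.beta = beta                      # penalty factor for mistakes

    def predict(self, x):
        # Weighted vote over {0: legitimate, 1: fraud}.
        votes = {0: 0.0, 1: 0.0}
        for w, clf in zip(self.weights, self.members):
            votes[clf(x)] += w
        return max(votes, key=votes.get)

    def update(self, x, y_true):
        # Once the true label arrives, down-weight members that erred;
        # under drift, recently wrong members quickly lose influence.
        for i, clf in enumerate(self.members):
            if clf(x) != y_true:
                self.weights[i] *= self.beta

# Toy members: threshold rules on transaction amount (hypothetical).
members = [lambda x: int(x > 100), lambda x: int(x > 500), lambda x: 0]
ens = AdaptiveEnsemble(members)
print(ens.predict(600))  # prints 1: the two rules that fire outvote the third
```

Decaying weights multiplicatively is the classic weighted-majority update; a production system would also retrain or replace persistently poor members.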


Evaluation of Various Deep Learning Models for Short-Term Solar Forecasting in the Arctic using a Distributed Sensor Network

Henry Toal, Michelle Wilber, Getu Hailu, Arghya Kusum Das

Adv. Sci. Technol. Eng. Syst. J. 9(3), 12-28 (2024);


The solar photovoltaic (PV) power generation industry has experienced substantial, ongoing growth over the past decades as a clean, cost-effective energy source. As electric grids use ever-larger proportions of solar PV, the technology’s inherent variability, primarily due to clouds, poses a challenge to maintaining grid stability. This is especially true for the geographically dense, electrically isolated grids common in rural locations, which must maintain substantial reserve generation capacity to account for sudden swings in PV power production. Short-term solar PV forecasting emerges as a solution, allowing excess generation to be kept offline until needed, reducing fuel costs and emissions. Recent studies have utilized networks of light sensors deployed around PV arrays to preemptively detect incoming fluctuations in light levels caused by clouds. This research examines the potential of such a sensor network for short-term forecasting at a 575-kW solar PV array in the Arctic community of Kotzebue, Alaska. Data from sensors deployed around the array were transformed into a forecast at a 2-minute time horizon using either long short-term memory (LSTM) or gated recurrent unit (GRU) networks as base models, augmented with various combinations of 1-dimensional convolutional (Conv1D) and fully connected (Dense) layers. These models were evaluated using a novel combination of statistical and event-based error metrics, including Precision, Recall, and Fβ. GRU-based models generally outperformed their LSTM-based counterparts on statistical error metrics while showing lower relative event-based forecasting ability. This research demonstrates the potential efficacy of a novel combination of LSTM/GRU-based deep learning models and a distributed sensor network for forecasting the power generation of an actual solar PV array. Performance across the eight evaluated model combinations was mostly comparable to similar methods in the literature and is expected to improve with additional training data.
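The event-based metrics named above (Precision, Recall, Fβ) have standard definitions that can be computed from binary event sequences, as in the sketch below. The encoding of a "ramp event" per window is an assumption here; the paper's exact event-matching procedure may differ.

```python
def event_metrics(pred_events, true_events, beta=1.0):
    """Precision, recall, and F-beta for binary event sequences
    (e.g. 1 = a cloud-induced ramp predicted/observed in a window)."""
    tp = sum(1 for p, t in zip(pred_events, true_events) if p and t)
    fp = sum(1 for p, t in zip(pred_events, true_events) if p and not t)
    fn = sum(1 for p, t in zip(pred_events, true_events) if not p and t)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return precision, recall, 0.0
    b2 = beta * beta  # beta < 1 favours precision, beta > 1 favours recall
    f_beta = (1 + b2) * precision * recall / (b2 * precision + recall)
    return precision, recall, f_beta

# Hypothetical 5-window comparison of predicted vs. observed ramps.
p, r, f = event_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1], beta=0.5)
```

For this toy input, precision and recall are both 2/3 (two true positives, one false positive, one false negative).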


This research extends work presented at ISEEIE 2023, which dealt with Time-Series Clustering (TSC) of Vegetation Indices (VIs) for paddy rice. Its novel contributions are the visualization of growth changes before and after additional fertilization, analysis of the appropriate amount of additional fertilizer, and optimization of the monitoring period to minimize the number of monitoring days and thereby reduce the workload of Unmanned Aerial Vehicle (UAV) operation. To visualize growth changes before and after fertilization, a UAV was used to obtain VIs for each mesh and construct time-series data over the monitoring period, on which TSC was then performed. The clustering results showed that NDVI and NDRE increased with additional fertilizer, making it possible to visualize the fertilizer effects. To analyze the appropriate amount of fertilizer, the amount applied was varied across paddy fields (2.8, 3.5, and 4.2 g/m²). In a field experiment, both the TSC results and the crop estimates from unit-acreage sampling revealed no difference in yield among the fields, indicating that the paddy field with the least fertilizer (2.8 g/m²) is optimal. It was estimated that this would reduce nitrate nitrogen, which is harmful to soil and the human body, by 0.070 mg/L. In addition, to optimize the monitoring period, the importance of each independent variable output by a Random Forest (RF) was used to find a subset of monitoring dates. For every VI there is a period, determined by the range of effective accumulated temperature, during which the clustering result does not change even if the number of monitoring dates is reduced (the period could be reduced by 30 to 40 days, which is particularly important for three of the vegetation indices). These results show that the proposed technologies can help reduce fertilizer costs, excessive fertilization, and environmental impacts, and promote the use of UAVs.
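For reference, the two vegetation indices discussed above are simple band ratios; the sketch below shows their standard definitions. The reflectance values are hypothetical, and the paper's UAV processing chain is of course more involved.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index, more sensitive to canopy
    chlorophyll in dense vegetation such as late-stage paddy rice."""
    return (nir - red_edge) / (nir + red_edge)

# Per-mesh reflectance values would come from UAV multispectral imagery;
# these numbers are made up for illustration.
print(round(ndvi(0.45, 0.05), 3))  # prints 0.8
```

Both indices lie in [-1, 1]; tracking them per mesh over time yields exactly the kind of series the study clusters.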

Solar Photovoltaic Power Output Forecasting using Deep Learning Models: A Case Study of Zagtouli PV Power Plant

Sami Florent Palm, Sianou Ezéckiel Houénafa, Zourkalaïni Boubakar, Sebastian Waita, Thomas Nyachoti Nyangonda, Ahmed Chebak

Adv. Sci. Technol. Eng. Syst. J. 9(3), 41-48 (2024);


Forecasting solar PV power output holds significant importance in energy management, particularly due to the intermittent nature of solar irradiation. Most forecasting studies currently employ statistical methods; however, deep learning models have the potential for better forecasting. This study utilises Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and hybrid LSTM-GRU deep learning techniques to analyse, train, validate, and test data from the Zagtouli Solar Photovoltaic (PV) plant located in Ouagadougou (latitude: 12.30702°, longitude: 1.63548°), Burkina Faso. Three evaluation metrics were used: Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and the coefficient of determination (R²). The RMSE criterion gave 10.799 (LSTM), 11.695 (GRU), and 10.629 (LSTM-GRU), making the LSTM-GRU model the best under RMSE. The MAE evaluation yielded 2.09, 2.1, and 2.0 for the LSTM, GRU, and LSTM-GRU models respectively, again showing the LSTM-GRU model to be superior. The R² criterion similarly showed the LSTM-GRU model to be best, with 0.999 compared to 0.998 for LSTM and 0.997 for GRU. It is evident that the hybrid LSTM-GRU model exhibits superior predictive capability compared to the other two models and has the potential to reliably predict solar PV power output. It is therefore recommended that the authorities in charge of the solar PV plant in Ouagadougou consider switching to the deep learning LSTM-GRU model.
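The three evaluation metrics used in this study have standard definitions, sketched below in plain Python for reference; the sample values in the test are illustrative, not the paper's data.

```python
import math

def rmse(y_true, y_pred):
    """Root Mean Square Error: penalises large deviations quadratically."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def mae(y_true, y_pred):
    """Mean Absolute Error: average magnitude of the forecast error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 minus residual over total variance."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

An R² near 1, as reported for all three models, means the residual variance is a tiny fraction of the total variance in the measured power output.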

Efficient Deep Learning-Based Viewport Estimation for 360-Degree Video Streaming

Nguyen Viet Hung, Tran Thanh Lam, Tran Thanh Binh, Alan Marshal, Truong Thu Huong

Adv. Sci. Technol. Eng. Syst. J. 9(3), 49-61 (2024);


While virtual reality is becoming more popular, 360-degree video transmission over the Internet remains challenging because of the bandwidth it requires. Viewport Adaptive Streaming (VAS) was proposed to reduce the network capacity demand of 360-degree video by transmitting lower-quality video for the parts of the scene that are not in the current viewport. Understanding how to forecast future user viewing behavior is therefore a crucial VAS concern. This study presents a new deep learning-based method for predicting the typical view for VAS systems, termed Head Eye Movement oriented Viewport Estimation based on Deep Learning (HEVEL). The proposed model seeks to enhance the comprehension of visual-attention dynamics by combining information from two modalities. Through rigorous experimental evaluations, we illustrate the efficacy of our approach versus existing models across a range of attention-based tasks. Specifically, its viewport prediction performance is shown to outperform four reference methods in terms of precision, RMSE, and MAE.
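As background on how VAS saves bandwidth, the hypothetical sketch below maps a predicted viewport centre to the set of equirectangular tiles that must be streamed at full quality; everything else can be fetched at lower quality. The 8×4 tile grid and 90° field of view are illustrative assumptions, not parameters from the paper.

```python
def viewport_tiles(yaw_deg, pitch_deg, cols=8, rows=4, fov_deg=90):
    """Return the set of (col, row) equirectangular tiles overlapped by a
    square viewport centred at (yaw, pitch); remaining tiles may be
    streamed at reduced quality."""
    tile_w = 360 / cols   # tile width in degrees of yaw
    tile_h = 180 / rows   # tile height in degrees of pitch
    half = fov_deg / 2
    tiles = set()
    for c in range(cols):
        for r in range(rows):
            # Tile centre: yaw in [0, 360), pitch in [-90, 90].
            cy = (c + 0.5) * tile_w
            cp = (r + 0.5) * tile_h - 90
            # Yaw distance wraps around the 360-degree seam.
            dy = min(abs(cy - yaw_deg), 360 - abs(cy - yaw_deg))
            if dy <= half + tile_w / 2 and abs(cp - pitch_deg) <= half + tile_h / 2:
                tiles.add((c, r))
    return tiles
```

With these defaults, a viewport at (0°, 0°) needs 16 of the 32 tiles; a better viewport prediction lets the client shrink that high-quality set without risking visible quality drops when the head moves.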

Leveraging Machine Learning for a Comprehensive Assessment of PFAS Nephrotoxicity

Anirudh Mazumder, Kapil Panda

Adv. Sci. Technol. Eng. Syst. J. 9(3), 62-71 (2024);


Polyfluoroalkyl substances (PFAS) are persistent chemicals that accumulate in the body and the environment. Although recent studies have indicated that PFAS may disrupt kidney function, the underlying mechanisms and overall effects on the organ remain unclear. This study therefore aims to elucidate the impact of PFAS on kidney health using machine learning techniques. Utilizing a dataset containing PFAS chemical features and kidney parameters, dimensionality reduction and clustering were performed to identify patterns. Machine learning models, including an XGBoost classifier, an XGBoost regressor, and a Random Forest regressor, were then developed to predict kidney type from PFAS descriptors, estimate PFAS accumulation in the body, and predict the ratio of glomerular surface area to proximal tubule volume, which indicates kidney filtration efficiency. The kidney type classifier achieved 100% accuracy, confirming that PFAS exposure alters kidney morphology. The PFAS accumulation model attained an R² of 1.00, providing a tool to identify at-risk individuals. The ratio prediction model also reached an R² of 1.00, offering insight into PFAS effects on kidney function. Furthermore, feature-importance analyses identified the most influential PFAS descriptors and anatomical variables, demonstrating discernible links between PFAS and kidney health and adding biological significance. Overall, this study contributes to current findings on the effects of PFAS while offering machine learning as a useful tool for similar studies.
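As an illustration of the clustering step mentioned above, the following self-contained k-means sketch groups descriptor vectors; the paper's actual pipeline (its dimensionality-reduction method, clustering algorithm, and number of clusters) may differ, and the data here are made up.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means over tuples of floats (e.g. chemical-descriptor
    vectors). Generic sketch of the clustering step, not the paper's code."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        # Recompute centroids; keep the old one if a cluster emptied out.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated hypothetical descriptor groups.
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(data, 2)
```

On this toy input the algorithm recovers the two obvious groups of three points each.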

Deploying Trusted and Immutable Predictive Models on a Public Blockchain Network

Anirudh Mazumder, Kapil Panda

Adv. Sci. Technol. Eng. Syst. J. 9(3), 72-83 (2024);


Machine learning-based predictive models often face challenges, particularly biases and a lack of trust in their predictions, when deployed by individual agents. Establishing a robust deployment methodology that supports validating the accuracy and fairness of these models is a critical endeavor. In this paper, we introduce a novel approach to deploying predictive models, such as pre-trained neural network models, on a public blockchain network using smart contracts. In our approach, smart contracts are encoded as self-executing protocols that store the parameters of the predictive models. We develop efficient algorithms for uploading and retrieving model parameters from smart contracts on a public blockchain, thereby ensuring the trustworthiness and immutability of the stored models and making them available for testing and validation by all peers in the network. In addition, users can rate and comment on the models, and these ratings are permanently recorded on the blockchain. To demonstrate the effectiveness of our approach, we present a case study on storing vehicle price prediction models and review comments. Our experimental results show that deploying predictive models on a public blockchain network provides a proficient and reliable way to ensure model security, immutability, and transparency.
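One way to make a stored model verifiable, in the spirit of the approach above, is to publish a deterministic digest of its parameters; any peer who downloads the model can recompute the digest and compare. The sketch below uses SHA-256 over a canonical JSON serialization and is an illustration only, not the paper's smart-contract code; the parameter names are hypothetical.

```python
import hashlib
import json

def fingerprint_model(params):
    """Deterministically serialize model parameters and hash them.
    sort_keys and fixed separators make the digest independent of dict
    insertion order, so all peers compute the same fingerprint."""
    blob = json.dumps(params, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(blob).hexdigest()

# Hypothetical weights for a tiny pre-trained model.
weights = {"layer1": [[0.1, -0.2], [0.3, 0.4]], "bias": [0.0, 0.5]}
digest = fingerprint_model(weights)
```

Recording `digest` on-chain pins the published model: any later change to even one weight produces a different fingerprint.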

Automated Performance Analysis E-services by AES-Based Hybrid Cryptosystems with RSA, ElGamal, and ECC

Rebwar Khalid Muhammed, Kamaran Hama Ali Farj, Jaza Faiq Gul-Mohammed, Tara Nawzad Ahmad Al Attar, Shaida Jumaah Saydah, Dlsoz Abdalkarim Rashid

Adv. Sci. Technol. Eng. Syst. J. 9(3), 84-91 (2024);


Network security has recently become a central topic in the security community, with encryption and decryption playing an important role in protecting information systems (IS). Securing shared data is required across Internet services, health data, and cloud computing, where data volumes grow significantly every millisecond. This paper presents a performance analysis, in terms of encryption time, decryption time, and throughput, of three hybrid encryption schemes: Hybrid AES-RSA, Hybrid AES-ECC, and Hybrid AES-ElGamal. Encryption and decryption times are measured in milliseconds and expressed as throughput. The results show clear distinctions among the schemes’ capabilities. Hybrid AES-RSA emerges as the fastest scheme for both encryption and decryption, with superior throughput, whereas Hybrid AES-ECC and Hybrid AES-ElGamal exhibit slower processing times, making them more suitable for scenarios where performance is not the primary concern. The choice between these schemes should consider not only performance but also the security requirements of the specific application. All tests were implemented in the Java programming language, and the results confirm that Hybrid AES-RSA offers the best performance in milliseconds; it is therefore proposed as the security scheme for the automated recruitment system.
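The measurement methodology (encryption time, decryption time, and throughput) can be sketched as a small benchmarking harness. The paper's benchmarks were written in Java; this Python sketch only illustrates the approach, and the byte-wise XOR stand-in cipher exists purely to exercise the harness, not as a real cryptosystem.

```python
import time

def benchmark(encrypt, decrypt, payload, runs=5):
    """Measure mean encryption/decryption time (ms) and throughput (KB/s)
    for a cipher given as a pair of callables. Methodology sketch only."""
    enc_ms = dec_ms = 0.0
    for _ in range(runs):
        t0 = time.perf_counter()
        ct = encrypt(payload)
        t1 = time.perf_counter()
        pt = decrypt(ct)
        t2 = time.perf_counter()
        assert pt == payload, "round-trip failed"
        enc_ms += (t1 - t0) * 1000
        dec_ms += (t2 - t1) * 1000
    enc_ms /= runs
    dec_ms /= runs
    kb = len(payload) / 1024
    return {
        "encrypt_ms": enc_ms,
        "decrypt_ms": dec_ms,
        "throughput_kb_s": kb / ((enc_ms + dec_ms) / 1000),
    }

# Stand-in cipher (XOR is its own inverse); a real comparison would plug in
# AES-RSA, AES-ECC, and AES-ElGamal hybrid implementations here.
key = 0x5A
xor = lambda data: bytes(b ^ key for b in data)
stats = benchmark(xor, xor, b"x" * 4096)
```

Averaging over several runs and verifying the round-trip on every run, as above, keeps timing noise and silent corruption out of the reported milliseconds.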

Special Issues

Special Issue on Computing, Engineering and Multidisciplinary Sciences
Guest Editors: Prof. Wang Xiu Ying
Deadline: 30 April 2025

Special Issue on AI-empowered Smart Grid Technologies and EVs
Guest Editors: Dr. Aparna Kumari, Mr. Riaz Khan
Deadline: 30 November 2024

Special Issue on Innovation in Computing, Engineering Science & Technology
Guest Editors: Prof. Wang Xiu Ying
Deadline: 15 October 2024