Articles
Accelerating Decision-Making in Transport Emergency with Artificial Intelligence
Alexander Raikov
Adv. Sci. Technol. Eng. Syst. J. 5(6), 520-530 (2020);
Media: Presentation File
The paper addresses speeding up meetings held in a networked environment during rescue work in a transport emergency. Several groups of representatives of various services, along with observers, participate in those meetings. The number of wrong decisions tends to increase because remote participants cannot understand each other quickly. The meetings must therefore be held efficiently to avoid wrong decisions, including wrong medical diagnoses for injuries. The ultimate goals are to protect the health and lives of the injured. The artificial intelligence (AI), big data analysis, and deep learning methods suggested in this paper for decision-making support have a cognitive character, i.e., they try to take into account the thoughts and emotions of participants. The author's convergent approach ensures the purposefulness and sustainability of decision-making by transforming divergent decision-making processes into convergent ones. The approach is based on the inverse problem-solving method in topological spaces, genetic algorithms, control thermodynamics, and the idea of creating cognitive semantics for AI models with quantum mechanics methods. It gives meeting participants a list of decision-making rules that accelerate the achievement of consensus. Examples of the rules are: goals have to be arranged as a 3-level tree and ordered by importance; semantic interpretations of computer models' factors and of their connections must be separated; rescue resources must be represented as a finite number of separate components; and so on. The approach also exploits traditional technical tools of augmented reality, virtual collaboration, and situational awareness. It has been used repeatedly to build socioeconomic and manufacturing sectoral strategies and is currently being adapted for emergencies.
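One of the decision-making rules quoted in the abstract above is that goals be arranged as a 3-level tree ordered by importance. A minimal sketch of that rule follows; the goal names, weights, and the `ordered_levels` helper are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of the "3-level goal tree, ordered by importance" rule.
# All goal names and importance weights below are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    importance: float            # higher value = more important
    subgoals: list = field(default_factory=list)

def ordered_levels(root, max_depth=3):
    """Return the goal tree level by level, each level sorted by importance."""
    levels, frontier = [], [root]
    for _ in range(max_depth):
        if not frontier:
            break
        levels.append(sorted(frontier, key=lambda g: -g.importance))
        frontier = [s for g in frontier for s in g.subgoals]
    return levels

root = Goal("Save lives", 1.0, [
    Goal("Evacuate injured", 0.9, [Goal("Triage on site", 0.8)]),
    Goal("Restore traffic", 0.5, [Goal("Clear debris", 0.4)]),
])
for depth, level in enumerate(ordered_levels(root)):
    print(depth, [g.name for g in level])
```

Limiting the tree to three levels keeps the shared goal model small enough for remote participants to agree on quickly, which is the point of the rule.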
Determinism of Replicated Distributed Systems–A Timing Analysis of the Data Passing Process
Adriano A. Santos, António Ferreira da Silva, António P. Magalhães, Mário de Sousa
Adv. Sci. Technol. Eng. Syst. J. 5(6), 531-537 (2020);
Media: Presentation File
Fault-tolerant applications are created by replicating software or hardware components in a distributed system. Communications with the distributed/replicated system are normally carried out over an Ethernet network, ensuring atomic multicast properties. However, there are situations in which it is not possible to guarantee that the replicas process the same data set in the same order. Such an occurrence leads to inconsistency in the data sets produced by the replicas; that is, the determinism of the applications is not guaranteed.
To avoid these inconsistencies, a set of Function Blocks has been proposed which, taking advantage of the inherent properties of Ethernet, can guarantee the synchronism and determinism of real-time applications. This paper presents this set of Function Blocks, focusing on the development of reliable real-time distributed systems. It demonstrates that the developed Function Blocks can guarantee the determinism of the replicas and, as such, that the messages sent are processed in the same order and according to the time at which they were made available.
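The core idea above — replicas stay consistent when they process the same messages in the same order — can be sketched in a few lines. This is not the paper's Function Blocks; it is a toy model in which each replica sorts buffered messages by a total order of (timestamp, sender) before applying them.

```python
# Toy sketch (not the paper's Function Blocks): replicas that apply messages
# in a shared total order end in identical states, whatever the network order.

def deterministic_order(messages):
    # (timestamp, sender) is a total order that is identical on every replica
    return sorted(messages, key=lambda m: (m["t"], m["sender"]))

def apply_all(state, messages):
    for m in deterministic_order(messages):
        state.append(m["data"])
    return state

# Two replicas receive the same messages in different network orders...
rx_a = [{"t": 2, "sender": 1, "data": "y"}, {"t": 1, "sender": 0, "data": "x"}]
rx_b = [{"t": 1, "sender": 0, "data": "x"}, {"t": 2, "sender": 1, "data": "y"}]
# ...yet converge to identical states.
print(apply_all([], rx_a) == apply_all([], rx_b))  # True
```

In the real system the ordering information would come from the Ethernet multicast properties the paper exploits, not from a sort over a local buffer.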
Variation Between DDC and SCAMSMA for Clustering of Wireless MultipathWaves in Indoor and Semi-Urban Channel Scenarios
Jojo Blanza, Lawrence Materum
Adv. Sci. Technol. Eng. Syst. J. 5(6), 538-543 (2020);
Media: Presentation File
The performance of Simultaneous Clustering and Model Selection Matrix Affinity (SCAMSMA) and Deep Divergence-Based Clustering (DDC) in clustering wireless multipaths generated by the COST 2100 channel model (C2CM) is compared. Enhancing the accuracy of multipath clustering is an open area of research which the clustering approaches try to improve. The Jaccard index is used as the clustering validity metric of the clustering approaches. The results of the clustering approaches are compared using the analysis of variance (ANOVA) toolbox of MATLAB and displayed using box plots. Results show that the cluster-wise Jaccard index differs between SCAMSMA and DDC for indoor scenarios, while the membership-wise Jaccard index does not. On the other hand, the cluster-wise Jaccard index does not differ between the clustering approaches for semi-urban scenarios, while the membership-wise Jaccard index differs slightly. Based on accuracy, the clustering approaches can be used in indoor scenarios.
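The Jaccard index used above as the clustering validity metric is the ratio of the intersection to the union of a reference cluster and an estimated cluster. A minimal sketch, with illustrative sets of multipath-component IDs:

```python
# Jaccard index: |A ∩ B| / |A ∪ B|, in [0, 1]; 1 means a perfect match.
# The multipath-component ID sets below are illustrative, not C2CM data.

def jaccard(reference, estimated):
    reference, estimated = set(reference), set(estimated)
    union = reference | estimated
    return len(reference & estimated) / len(union) if union else 1.0

true_cluster = {1, 2, 3, 4}    # reference multipath components
found_cluster = {2, 3, 4, 5}   # components assigned by a clustering approach
print(jaccard(true_cluster, found_cluster))  # 0.6 (3 shared / 5 in the union)
```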
Interface for Visualization of Wireless Propagation Multipath Clustering Outcomes
Jojo Blanza, Lawrence Materum
Adv. Sci. Technol. Eng. Syst. J. 5(6), 544-549 (2020);
Media: Presentation File
A graphical user interface (GUI) is presented to visualize the multipaths generated by the COST 2100 channel model (C2CM) and the results of clustering the wireless propagation multipaths using Modified Simultaneous Clustering and Model Selection (MSCAMSMA). The usual practice of authors is to show their data and results using figures, tables, and graphs, which are sufficient to present their studies. However, this manner of displaying data and results is static; that is, the user cannot examine them to further analyze the relationships of the different variables involved in the study. This paper goes a step further by showing the data and results dynamically, so that users can manipulate them according to their needs.
The Effect of Different Starches in the Environmental and Mechanical Properties of Starch Blended Bioplastics
Adriana C. Neves, Tew Ming, Marta Mroczkowska, David Culliton
Adv. Sci. Technol. Eng. Syst. J. 5(6), 550-554 (2020);
Media: Presentation File
The problem of plastic pollution worldwide has led to interest in the research and development of thermoset starch-protein blend bioplastics as a possible alternative to single-use, non-recyclable plastics. However, these bioplastics lack the physico-chemical characteristics that would make them useful replacements for currently used plastics. This work assesses what differences in mechanical properties and environmental impact can be seen in thermoset starch-protein blend bioplastics when different starches are used in their formulation. Rice, kuzu, corn, wheat, and potato starches were used to generate bioplastics, which were tested in terms of colour, lightness, roughness, chemical composition, moisture content, water solubility, and soil toxicity when degraded. Characteristics such as chemical composition, colour, and moisture content did not change significantly with the different starches; however, changes were identified in characteristics such as lightness, roughness, and water solubility. Moreover, none of the bioplastics proved toxic to the soil when degraded; they even promoted the growth of the plant species tested. It was possible to conclude that the use of different starches in the formulation of thermoset starch-protein blend bioplastics allows the generation of bioplastics with different characteristics. This broadens the applicability of these bioplastics and consequently increases their positive impact on the environment.
Proposal of a New Descriptive-Correlational Model of Population Lifestyle Analysis and Disease Diagnosis
Selene Tamayo Castro, Kristian Aldapa Salcido, Linda García Rodríguez
Adv. Sci. Technol. Eng. Syst. J. 5(6), 555-560 (2020);
Media: Presentation File
This document proposes a new methodology for lifestyle analysis and disease diagnosis in young academics, using a strategic planning and disruptive innovation approach. Its objective is to consider and study a new form of treatment to improve the quality of life and health of students in parallel with their development. A quantitative methodology with a descriptive-correlational scope made it possible to develop a diagnosis and analysis, which revealed a strong relationship between the variables. On this basis, a preliminary engineering-based model for improving lifestyles was developed around the Deming cycle and the S-curve of innovation, hand in hand with technology and motivation.
sharpniZer: A C# Static Code Analysis Tool for Mission Critical Systems
Arooba Shahoor, Rida Shaukat, Sumaira Sultan Minhas, Hina Awan, Kashif Saghar
Adv. Sci. Technol. Eng. Syst. J. 5(6), 561-570 (2020);
Media: Presentation File
Until recent years, code quality was not given due significance as long as the system produced accurate results. Taking into account the implications of recent losses in critical systems, developers have started using static code analysis tools to assess the quality of source code. Static code analysis is conducted before the system is sent into production. The analysis aims to identify hidden defects and complex code structures that degrade code quality or are likely to cause malfunctions during execution. To address this line of work, this paper presents a static code analyzer for C#, named sharpniZer.
The key purpose of this tool is to verify the compliance of source code written in C# with a target set of rules defined for analysis, as per the industry standards accepted particularly for the development of mission-critical systems. sharpniZer efficiently identifies the lines of source code of probable concern in the categories of design rules, usage rules, maintainability rules, inefficient code, complexity, object model and API rules, logical rules, exceptions, incomplete code, and naming conventions. Each violation encountered in the source code is ranked by severity level: critical, major, or minor. The tool should prove worthwhile, especially if utilized in critical systems.
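The rule-plus-severity scheme described above can be illustrated with a toy line scanner. This is not sharpniZer itself; the two rules, their regexes, and the severity assignments are assumptions made only for the sketch.

```python
# Toy illustration (not sharpniZer) of static-analysis rules that scan C#
# source lines and rank each violation by severity. Rules are assumptions.
import re

RULES = [
    # (rule name, regex applied per line, severity)
    ("EmptyCatchBlock", re.compile(r"catch\s*(\(.*\))?\s*\{\s*\}"), "critical"),
    ("PublicField",     re.compile(r"\bpublic\s+(?:int|string|bool)\s+\w+\s*;"), "minor"),
]

def analyze(source):
    """Return (line number, rule name, severity) for every violation found."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, pattern, severity in RULES:
            if pattern.search(line):
                findings.append((lineno, name, severity))
    return findings

code = 'public int counter;\ntry { Run(); } catch (Exception e) { }'
print(analyze(code))
```

A production analyzer such as the one in the paper works on a parsed syntax tree rather than regexes over raw lines, which is what makes categories like complexity and object-model rules checkable at all.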
Experimental Study on Mechanical Behavior of Polypropylene-based Blends with Talc Fillers
Pham Thi Hong Nga, Van-Thuc Nguyen
Adv. Sci. Technol. Eng. Syst. J. 5(6), 571-576 (2020);
Media: Presentation File
In this report, polymeric composites made from polypropylene (PP) and talc powder were studied. Talc powder at 10%, 20%, and 30% by weight was used to create samples. The samples were examined by tensile and flexural tests according to ASTM D638 and ASTM D790, and their surface morphology was investigated by scanning electron microscopy (SEM). Increasing the talc powder portion led to a decrease in both tensile and flexural strength. Among the talc-filled samples, the 10% talc sample presented the highest tensile strength, 25.91 MPa, and the highest flexural strength, 47.99 MPa, while the 30% talc sample showed the lowest of both. SEM analysis indicated the existence of talc plates and of porosity between the talc plates and the PP substrate compared to the neat PP sample, with higher porosity in samples of higher filler content. Moreover, increasing the talc powder portion resulted in a higher chance of brittle fracture. The research indicated and explained the effect of talc powder on the characteristics of PP.
Ontology-based Data Management Tool for Studying Radon Concentration
Felix Fernandez-Pena, Alex Maigua-Quinteros, Pilar Urrutia-Urrutia, Diana Coello-Fiallos
Adv. Sci. Technol. Eng. Syst. J. 5(6), 577-583 (2020);
Media: Presentation File
Interpreting data to solve information needs demands an understanding of the semantics of raw data. In this paper, an ontology-driven management of data on the presence of radon gas in soil is presented. The main contribution of this research lies in the formal definition of the semantics of the presence of radon in the dimensions of space and time. As a result, a dynamic data-view generator for the observation of radon measurements was deployed without actual software programming, but with the proper adjustment and instantiation of ViewOnto, a web ontology created for the formal description of data views. An experimental assessment of the proposal was carried out with data obtained from stone aggregate mines and civil constructions in Tungurahua. The usability of the resulting data management system was successfully validated with the criteria of usability experts and researchers with experience in radon measurement.
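The key move above — generating data views from a declarative description instead of programming them — can be sketched with a plain triple store. This is a toy model, not the ViewOnto ontology: the triples, property names, and the `generate_view` helper are illustrative assumptions.

```python
# Toy sketch of ontology-driven views: the view is declared as a list of
# dimension properties, and rows are derived from subject-predicate-object
# triples. Property names and measurements below are illustrative.

triples = [
    ("m1", "hasLocation", "mineA"), ("m1", "hasYear", 2019), ("m1", "radonBqm3", 120),
    ("m2", "hasLocation", "siteB"), ("m2", "hasYear", 2020), ("m2", "radonBqm3", 95),
]

def generate_view(triples, dimensions):
    """Build view rows from a declarative list of dimension properties."""
    subjects = sorted({s for s, _, _ in triples})
    index = {(s, p): o for s, p, o in triples}
    return [[index.get((s, p)) for p in dimensions] for s in subjects]

# A "space and time" view of radon measurements, declared as data, not code:
print(generate_view(triples, ["hasLocation", "hasYear", "radonBqm3"]))
```

Changing the view then means editing the `dimensions` list (in the real system, an ontology instance), not rewriting software, which is the property the paper claims.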
Strategic Plan for the Achievement of the Competitiveness of Small Companies with Respect to Large Ones
Alan Guadalupe Ochoa Navarro, Juan De Dios Cota Apodaca, Dario Fuentes Guevara
Adv. Sci. Technol. Eng. Syst. J. 5(6), 584-587 (2020);
Media: Presentation File
The main objective of this research is to analyze the impact that implementing a strategic plan would have on the competitiveness of the micro, small, and medium enterprises (MSMEs) dedicated to bottling, marketing, and distributing water in the city of Los Mochis. It is intended to offer these companies a tool that allows them to compete against the large transnational companies that currently dominate the market. A mixed methodology was used, with a descriptive-correlational scope and a non-experimental cross-sectional design, in which a population of 21 MSMEs was analyzed through the application of a measurement instrument to quantify their current levels of competitiveness. The results of the investigation showed low competitiveness in these businesses and exposed similarities among their deficiencies, highlighting mismanagement and ignorance of the economic environment in which they operate.
“Traffic Congestion Triangle” Based on More than One-Month Real Traffic Big Data Analysis in India
Tsutomu Tsuboi
Adv. Sci. Technol. Eng. Syst. J. 5(6), 588-593 (2020);
Media: Presentation File
This research describes a new traffic congestion analysis method, the "Traffic Congestion Triangle," based on more than one month of real traffic big data analysis in India. The location of the research is Ahmedabad, in the Gujarat state of India, one of the country's typical economically growing cities. Traffic congestion has become a more serious issue in most developing countries, causing negative impacts such as environmental destruction, unnecessary fuel consumption, health problems, economic losses, and fatalities from traffic accidents. There are therefore many transportation challenges under projects focused on "smart city" development. Traffic analysis and theory have been developed since the 1950s and have helped many countries; however, many conditions remain unknown in the traffic analysis of developing countries.
This research, started in 2015, has made it possible to obtain real traffic flow data. This manuscript describes not only typical traffic flow characteristics, such as daily traffic volume, but also introduces unique traffic congestion characteristics by using "occupancy," whose daily trend takes a triangular shape under congestion.
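"Occupancy" in traffic-sensor analysis is conventionally the fraction of a measurement interval during which a detector is covered by vehicles. A minimal sketch of that standard computation follows, with illustrative numbers rather than the Ahmedabad data set:

```python
# Standard detector-occupancy computation: the percentage of an interval
# during which the sensor reported a vehicle present. Numbers are illustrative.

def occupancy(on_times_s, interval_s):
    """Percent of the interval the detector was occupied by vehicles."""
    return 100.0 * sum(on_times_s) / interval_s

# Durations (seconds) that individual vehicles covered the detector
# within one 60-second measurement interval:
per_vehicle = [1.2, 0.8, 2.0, 1.0]
print(round(occupancy(per_vehicle, 60.0), 1))  # 8.3 (percent)
```

Plotting such occupancy values over a day is what yields the daily trend whose triangular shape the paper names the "Traffic Congestion Triangle".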
Effective Application of Information System for Purchase Process Optimization
Pearl Keitemoge, Daniel Tetteh Narh
Adv. Sci. Technol. Eng. Syst. J. 5(6), 594-605 (2020);
Media: Presentation File
This paper focuses comprehensively on the information systems (IS) currently being used for purchase process optimization. It elaborates four crucial areas of purchasing and supply management: identifying the processes involved in purchasing; how information systems are implemented and used to enhance the supply process, including the key features that constitute enterprise resource planning, the purchasing database, and electronic communication between customer and supplier; how an Enterprise Resource Planning (ERP) system facilitates and improves effective communication in managing the supplier selection process; and, lastly, the future requirements of an electronic purchasing system that will enable transparency, visibility, and accessibility of information across the overall supply process. Together, these parameters enable effective optimization of the purchase process. In today's era of accessible Internet, the most thriving firms are those in step with the fast pace of technological revolution. With more high-end devices at customers' fingertips, customers and many business firms now prefer to execute transactions on their portable devices. Available technologies such as ERP systems have helped firms remain relevant in the market.
Cognitive Cybernetics in the Foresight of Globalitarianism
Zdenko Balaž, Krystian Wawrzynek
Adv. Sci. Technol. Eng. Syst. J. 5(6), 718-723 (2020);
Media: Presentation File
This paper presents the results of research conducted with the help of cognitive cybernetics on the "mass" factor from the theory of totalitarianism. Using an expert system model, "big data" analysis sought to discover knowledge for assessing the future status of digital social connectivity. Originally developed models and methods of cognitive and computational research, processing "eminent texts" by "convolution" against the theoretical background of many works on past totalitarianisms, recognize the same characteristics in emerging globalitarianism. Using search correlation, the algorithms confirmed the suspicion that intelligent interactive technologies change the human psychophysical structure through the digital social network of globalitarianism. Applying new intelligent interactive technologies without being familiar with their deeper impact could plausibly make people accept them as part of themselves. People's digital obsession with the Internet in the "global village" increasingly resembles the blind obedience of the sympathizers and followers gathered around the grand leaders of past totalitarianisms. There is a visible correlation between such technological-integration engagement and the loss of social intelligence, leading to the conclusion that future interactions between people and intelligent technologies will turn history into an implosion of current events. History today is already considerably accelerated by its own mass digital integration and interaction, turning it into an illusion. There is a danger that globalitarianism, as an elusive end of history, will escape cyclical time due to unavoidable repetitiveness.
Ultra Wide Band-based Control of Emulated Autonomous Vehicles for Collision Avoidance in a Four-Way Intersection
Jashandeep Bhuller, Paolo Dela Peña, Vladimir Christian Ocampo II, Julio Simeon, Lawrence Materum
Adv. Sci. Technol. Eng. Syst. J. 5(6), 724-730 (2020);
Media: Presentation File
Traditional crossroads use traffic lights, but these may cause delays due to their stop-and-wait process. Moreover, autonomous vehicles are being made, although they are not yet mass-produced. This work presents the design and implementation of an agile system that communicates with and coordinates autonomous vehicles entering an intersection by adjusting their speeds and maneuvers to avoid collisions, allowing them to weave and pass through intersection traffic without coming to a complete stop, thereby increasing the throughput of the intersection and reducing congestion. A centralized computer implements the agile communication and controls the system based on a first-come-first-served policy. The hardware implementation emulates four prototype vehicles, modeled as radio-controlled cars integrated with ultra wide band (UWB) localization technology. The cars were used to simulate collision-free passage through a typical four-way intersection. Pozyx UWB modules were chosen for tracking the vehicles due to their compatibility with Arduino, as well as their numerous other functionalities and their accuracy relative to other indoor-positioning systems. Results show that Pozyx could track the autonomous vehicles. The study resulted in a system that can control self-driving cars through the intersection with key measures, and the characterization of Pozyx in this hardware implementation was observed.
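The first-come-first-served policy used by the centralized computer can be sketched as a scheduler: each vehicle requests the intersection, and each is granted entry no earlier than when the previous vehicle has cleared it. The fixed clearance interval and the arrival times below are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch of a first-come-first-served intersection scheduler.
# clearance_s (time a car occupies the intersection) is an assumed constant.

def schedule_fcfs(requests, clearance_s=2.0):
    """requests: list of (vehicle_id, arrival_time_s). Returns entry times."""
    granted, free_at = {}, 0.0
    for vid, arrival in sorted(requests, key=lambda r: r[1]):
        entry = max(arrival, free_at)   # wait if the intersection is busy
        granted[vid] = entry
        free_at = entry + clearance_s   # intersection busy until cleared
    return granted

print(schedule_fcfs([("car2", 1.0), ("car1", 0.5), ("car3", 1.2)]))
# car1 enters at 0.5 s; car2 waits until 2.5 s; car3 until 4.5 s
```

The real system does better than this conservative one-car-at-a-time sketch by adjusting speeds and maneuvers so that non-conflicting paths can weave through simultaneously.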
Minimizing Collisions of Self-Driving Cars by a Control System Using Predetermined Two-Dimensional Grid Localization
Jashandeep Bhuller, Paolo Dela Peña, Vladimir Christian Ocampo II, Julio Simeon, Lawrence Materum
Adv. Sci. Technol. Eng. Syst. J. 5(6), 731-737 (2020);
Media: Presentation File
Most crossroads use traffic lights to coordinate human-driven vehicles by conveying to the drivers when to stop and when to proceed in an organized fashion. However, the stop-and-wait nature of traffic lights causes slower mobility, leading to congestion. This paper presents a localization scheme based on a two-dimensional (2D) flat-geometry adaptation for collision avoidance among autonomous vehicles simultaneously crossing an intersection. Using a first-come-first-served policy with a tile retrieval process, the collision-reduction performance was evaluated through a unit test. Four self-driving prototype cars were built with the desired functionality of controlling speed and maneuver. Through the predetermined 2D grid locales, simulations indicate that the system can control the self-driving cars through a crossroad with zero or minimal collisions, using key measures such as time spent in the intersection, average car speeds, and collision count.
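The 2D-grid idea above can be sketched as tile reservation: the intersection is divided into tiles, and a car may advance only if no other car holds the tiles on its path for the same time step. The grid coordinates and the `reserve` helper are illustrative assumptions, not the paper's tile retrieval process.

```python
# Toy sketch of 2D-grid tile reservation for intersection collision avoidance.
# A (step, tile) pair can be held by at most one car at a time.

def reserve(reservations, car, step, tiles):
    """Reserve (step, tile) pairs for a car; refuse if any tile conflicts."""
    for tile in tiles:
        if reservations.get((step, tile), car) != car:
            return False                      # another car holds this tile
    for tile in tiles:
        reservations[(step, tile)] = car
    return True

res = {}
print(reserve(res, "A", 0, [(1, 1), (1, 2)]))  # True: tiles were free
print(reserve(res, "B", 0, [(1, 2)]))          # False: (1, 2) is held by A
print(reserve(res, "B", 1, [(1, 2)]))          # True: free again at next step
```

Refusing the whole request on any single conflict is what guarantees that two cars never occupy the same tile at the same time, which is the collision-avoidance invariant.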
Comparison of Support Vector Machine-Based Equalizer and Code-Aided Expectation Maximization on Fiber Optic Nonlinearity Compensation Using a Proposed BER Normalized by Power and Distance Index
Mark Renier M. Bailon, Lawrence Materum
Adv. Sci. Technol. Eng. Syst. J. 5(6), 738-743 (2020);
Media: Presentation File
Advances in optimizing optical fiber communications have been on the rise in recent years due to the increasing demand for larger data bandwidths and better overall efficiency. Coherent optics has been the focus of much research due to its ability to transport greater amounts of information, offer better flexibility in network implementations, and support different baud rates and modulation techniques, enabling fiber-optic lines to provide faster speeds to end users. Recent literature has looked into further developing digital signal processing techniques, while other work has focused on fiber material optimization. Machine learning is another area of research that has gained traction due to such demands. This survey discusses support vector machine (SVM) and code-aided expectation-maximization (CAEM) techniques and how they compensate for nonlinearity in coherent fiber-optic communications. The study focuses mainly on how these techniques affect the performance of the transmissions in which they are implemented and how they compensate for fiber-optic nonlinearity, whether through the reduction of bit error rates (BERs), improvements in the quality factor, or a suggested index based on BER, power, and distance. Collating the results on the basis of this distinctive index, SVM is preferable for mid-range-haul transmissions, while CAEM is preferable for longer hauls.
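The exact definition of the proposed BER-power-distance index is the paper's own; purely as a hypothetical stand-in, a comparison index of this family could divide BER by the product of launch power and distance, so that equal BERs achieved over harder links score better (lower). All numbers below are illustrative.

```python
# HYPOTHETICAL normalization (not the paper's formula): divide BER by
# (launch power × distance) so the same BER over a longer, harder link
# yields a lower (better) index value. Numbers are illustrative only.

def normalized_ber(ber, power_mw, distance_km):
    return ber / (power_mw * distance_km)

svm  = normalized_ber(1e-3, power_mw=2.0, distance_km=400)    # mid-range haul
caem = normalized_ber(1e-3, power_mw=2.0, distance_km=2000)   # long haul
print(svm > caem)  # True: equal BER over a longer haul scores better
```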