This article explores the opportunities and challenges of integrating cloud, fog, and edge computing in the context of digital transformation. The analysis shows that combining these technologies optimizes big data processing, improves system adaptability, and strengthens information security. Special attention is given to hybrid architectures that combine the advantages of centralized and decentralized approaches. Practical aspects are addressed, such as the use of the ENIGMA simulator for modeling scalable infrastructures and the EC-CC architecture for smart grids and IoT systems. The role of specialized frameworks in optimizing routing and improving infrastructure reliability is also highlighted. The integration of these technologies drives advances in key industries, including energy, healthcare, and the Internet of Things, despite persistent data security challenges.
Keywords: cloud computing, fog computing, edge computing, hybrid architectures, Internet of Things, digital transformation, big data, decentralized systems, computing integration, distributed computing, data security, resource optimization, data transfer speed
The paper analyzes existing effective technologies for waste recycling and utilization. The authors consider various approaches used in international practice for recycling production and consumption waste. The possibilities of applying effective waste recycling and disposal technologies, and the costs required to implement them under the conditions of an industrial enterprise, are assessed. The types and volumes of waste that can be recycled or permanently disposed of are considered, and their carbon footprint parameters are calculated using a materials management model. A statistical regression analysis of data on the production, processing, disposal, and incineration of polyethylene waste, municipal solid waste, and paper was carried out. The principles of building a system for reducing technogenic risks and managing production and consumption waste were determined.
Keywords: waste processing, waste disposal, carbon footprint, carbon footprint calculation methods, man-made risk management system, hazardous impact factors, industrial waste management
This article is devoted to a comparative analysis of methods for extracting knowledge from texts for building ontologies. Various extraction approaches are reviewed, including lexical, statistical, machine learning, and deep learning methods, as well as ontology-oriented methods. As a result of the study, recommendations are formulated for choosing the most effective methods depending on the specifics of the task and the type of data being processed.
Keywords: ontology, knowledge extraction, text classification, named entities, machine learning, semantic analysis, model
This study presents a comparative analysis of machine learning models used for driver classification based on microelectromechanical system (MEMS) sensor data. The research uses the open UAH-DriveSet dataset, which includes over 500 minutes of driving data annotated with aggressive driving events such as sudden braking, sharp turns, and rapid acceleration. The models evaluated include gradient boosting algorithms, a recurrent neural network, and a convolutional neural network. Special attention is given to the impact of data segmentation parameters, specifically window size and overlap, on classification performance using the sliding window method. The effectiveness of each model was assessed with classification metrics such as accuracy, precision, and F1 score. The results show that the gradient boosting model LightGBM outperforms the other models in accuracy and F1 score, while the long short-term memory model performs well on time-series data but requires larger datasets for better generalization. The convolutional neural network, while effective at identifying short-term patterns, struggled with class imbalance. This research provides valuable insights into selecting appropriate machine learning models for driver behavior classification and offers directions for future work on intelligent systems using MEMS sensor data.
Keywords: driver behavior analysis, microelectromechanical system sensors, machine learning, aggressive driving, gradient boosting, recurrent neural networks, convolutional neural networks, sliding window, driver classification
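For illustration, a minimal Python sketch of the sliding-window segmentation described in the abstract above; the window size, overlap, and per-window statistical features are illustrative assumptions, not the study's settings:

```python
import numpy as np

def sliding_windows(signal, window_size, overlap):
    """Split a (n_samples, n_channels) sensor array into overlapping windows."""
    step = window_size - overlap
    windows = []
    for start in range(0, len(signal) - window_size + 1, step):
        windows.append(signal[start:start + window_size])
    return np.stack(windows)

# Synthetic stand-in for a 3-axis accelerometer stream.
stream = np.random.randn(1000, 3)
X = sliding_windows(stream, window_size=50, overlap=25)  # shape: (39, 50, 3)

# Simple per-window statistical features (mean and std per axis) that could
# feed a gradient-boosting classifier such as LightGBM.
features = np.hstack([X.mean(axis=1), X.std(axis=1)])    # shape: (39, 6)
print(features.shape)
```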
The results of fire dynamics simulation based on the FDS software kernel are a large volume of data describing the dynamics of various parameters in the space of the studied object. Solving research problems with these data may require quite complex processing that goes beyond the functionality of existing software solutions. The article is devoted to a method for increasing the efficiency of processing numerical fire dynamics simulation results by automating the relevant operations. The functional model of the developed technology and its main stages are described. The proposed method was tested on the problem of forming initial data arrays at high spatial and temporal resolution for the subsequent study of the heating of enclosing tunnel structures in case of fire. Graphs of gas temperature as a function of the coordinate at various points under the roof of the tunnel structure are presented, as well as temperature fields in a vertical section of the structure in the plane passing through the fire source at different times. A comparative analysis showed that automated processing of the calculation results is several orders of magnitude faster than methods relying on the functionality of existing software designed to view fire dynamics simulation output.
Keywords: fire dynamics simulation, automation, data processing, tunnel structures, mathematical model, FDS
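As a hedged illustration of automating FDS post-processing: FDS writes device readings to *_devc.csv files with a units row above the column names, which a script can load and plot directly. The file and device names below are hypothetical:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical FDS device output file; *_devc.csv files carry a units row
# followed by the column-name row, hence skiprows=1.
df = pd.read_csv("tunnel_devc.csv", skiprows=1)

# Plot gas temperature recorded by thermocouples under the tunnel roof.
# Device names ("TC_10", "TC_20", ...) are assumptions for illustration.
for name in ["TC_10", "TC_20", "TC_30"]:
    plt.plot(df["Time"], df[name], label=name)
plt.xlabel("Time, s")
plt.ylabel("Temperature, °C")
plt.legend()
plt.show()
```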
The article discusses current issues in the design of a smart home wireless local area network based on splitter-repeater modules. Special attention is paid to wired and wireless hub and switch modules. A comparison of the characteristics of PLC and FBT splitter-repeaters is also presented. Particular emphasis is placed on the network topology and its main components.
Keywords: wireless network, topology, data, transmission, power, traffic, packet, failures, adapter, cable, connection
The article explores the actor model as implemented in the Elixir programming language, which builds upon the principles of the Erlang language. The actor model is an approach to parallel programming in which independent entities, called actors, communicate through asynchronous messages. The article details the main concepts of Elixir, such as pattern matching, data immutability, types and collections, and mechanisms for working with actors. Special attention is paid to the practical aspects of creating and managing actors, their interaction, and their maintenance. The article will be valuable for researchers and developers interested in parallel programming and functional programming languages.
Keywords: actor model, elixir, parallel programming, pattern matching, data immutability, processes, messages, mailbox, state, recursion, asynchrony, distributed systems, functional programming, fault tolerance, scalability
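Since the article's examples are in Elixir, the following is only a language-neutral sketch of the actor pattern itself, written in Python for consistency with the other examples here: private state, a mailbox, and a message loop processing asynchronous messages.

```python
import queue
import threading

class CounterActor:
    """A minimal actor: private state, a mailbox, and a message loop.
    In Elixir this role is played by a process with send/receive;
    this Python sketch only illustrates the pattern."""

    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0  # state is touched only by the actor's own thread
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while True:
            msg, reply_to = self.mailbox.get()
            if msg == "increment":
                self.count += 1
            elif msg == "get":
                reply_to.put(self.count)

    def send(self, msg, reply_to=None):
        self.mailbox.put((msg, reply_to))  # asynchronous: returns immediately

actor = CounterActor()
actor.send("increment")
actor.send("increment")
answer = queue.Queue()
actor.send("get", reply_to=answer)
print(answer.get())  # 2
```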
The transition from scheduled maintenance and repair of equipment to maintenance based on its actual technical state requires new data analysis methods based on machine learning. Modern data collection systems, such as unmanned robotic systems, can generate large volumes of image data in various spectral bands. The growing data volume raises the task of automating its processing and analysis to identify defects in high-voltage equipment. This article analyzes the specifics of applying computer vision algorithms to infrared images of high-voltage equipment at power plants and substations and presents an analysis method that can be used to build intelligent decision support systems for technical diagnostics of equipment. The proposed method combines deterministic algorithms and machine learning: classical computer vision algorithms are applied for preliminary data processing to extract significant features, and unsupervised machine learning models are applied to recognize images of equipment in an optimized feature space. Image segmentation using a density-based spatial clustering algorithm that accounts for outliers makes it possible to detect and group image fragments with statistically close distributions of line orientations. Such fragments characterize particular structural elements of the equipment. The article describes an algorithm implementing the proposed method using the example of detecting defects in current transformers and presents a visualization of its intermediate steps.
Keywords: diversification of management, production diversification, financial and economic purposes of a diversification, technological purposes of ensuring flexibility of production
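A minimal sketch of the clustering step described above, assuming DBSCAN-style density-based clustering over per-patch orientation features; the feature values are synthetic stand-ins for the article's actual pipeline:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Describe image patches by position and dominant line orientation, then
# group statistically similar patches with density-based clustering;
# outliers receive the label -1.
rng = np.random.default_rng(0)

# Hypothetical per-patch features: (x, y, mean orientation in radians).
patches = np.vstack([
    rng.normal([10, 10, 0.1], 0.5, size=(40, 3)),  # one structural element
    rng.normal([40, 12, 1.4], 0.5, size=(40, 3)),  # another element
    rng.uniform(0, 50, size=(5, 3)),               # isolated outliers
])

labels = DBSCAN(eps=2.0, min_samples=5).fit_predict(patches)
print("clusters:", set(labels))  # e.g. {0, 1, -1}; -1 marks outliers
```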
The load on data centers grows many times over each year, driven by the steadily increasing number of Internet users. Users access various resources and sources through search engines and services. Installing equipment that processes telecommunications traffic faster requires significant financial outlay and can also substantially increase data center downtime due to possible problems during routine maintenance. It is more expedient to focus resources on improving the software rather than the hardware. The article presents an algorithm that can reduce the load on telecommunications equipment by restricting the search to a specific subject area and by exploiting features of natural language and the way words, sentences, and texts are formed in it. It is proposed to analyze the query by building a prefix tree and clustering, as well as by calculating the probability of occurrence of the desired word based on the three-sigma rule and Zipf's law.
Keywords: Three Sigma Rule, Zipf's Law, Clusters, Language Analysis, Morphemes, Prefix Tree, Probability Distribution
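A minimal sketch of the prefix-tree idea, assuming completion candidates are ranked by relative corpus frequency; combining such scores with Zipf's law and a three-sigma cut-off, as the article proposes, would sit on top of this structure:

```python
from collections import defaultdict

class TrieNode:
    def __init__(self):
        self.children = defaultdict(TrieNode)
        self.count = 0  # frequency of the word ending at this node

def insert(root, word, count=1):
    node = root
    for ch in word:
        node = node.children[ch]
    node.count += count

def completions(root, prefix):
    """Return (word, probability) pairs for all words continuing the prefix."""
    node = root
    for ch in prefix:
        if ch not in node.children:
            return []
        node = node.children[ch]
    out = []
    def walk(n, suffix):
        if n.count:
            out.append((prefix + suffix, n.count))
        for ch, child in n.children.items():
            walk(child, suffix + ch)
    walk(node, "")
    total = sum(c for _, c in out)
    return sorted(((w, c / total) for w, c in out), key=lambda p: -p[1])

root = TrieNode()
for word, freq in [("search", 50), ("searching", 20), ("seal", 5)]:
    insert(root, word, freq)
print(completions(root, "sea"))  # search ~0.67, searching ~0.27, seal ~0.07
```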
The article presents existing methods for reducing the dimensionality of data used to train machine learning models for natural language. The concepts of text vectorization and word-form embedding are introduced. The text classification task is formulated, the stages of classifier training are described, and a classifying neural network is designed. A series of experiments is conducted to determine the effect of reducing the dimensionality of word-form embeddings on the quality of text classification. The evaluation results of the trained classifiers are compared.
Keywords: natural language processing, vectorization, word-form embedding, text classification, data dimensionality reduction, classifier
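A minimal sketch of the core experimental step, assuming PCA as the dimensionality reduction technique (the article may use others) and random vectors in place of real pretrained embeddings:

```python
import numpy as np
from sklearn.decomposition import PCA

# Reduce the dimensionality of word-form embeddings before classification.
# The 300-dimensional random matrix stands in for real embeddings.
rng = np.random.default_rng(42)
embeddings = rng.normal(size=(10000, 300))  # vocabulary x embedding dim

pca = PCA(n_components=64)
reduced = pca.fit_transform(embeddings)     # vocabulary x 64

print(reduced.shape)
print("explained variance:", pca.explained_variance_ratio_.sum())
```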
The paper discusses the use of the M/M/n queueing model to analyze the performance of cloud storage systems. Simulations are performed to identify the impact of system parameters on average latency, blocking probability, and throughput. The results demonstrate how optimizing the number of servers and the service rate can improve system performance and minimize latency. The relevance of the study stems from the need to improve the performance of cloud solutions amid growing data volumes and increasing load on storage systems.
Keywords: cloud storage, queueing theory, M/M/n model, Python, modeling, performance analysis
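For illustration, the standard M/M/n waiting-probability (Erlang C) and mean-latency formulas can be computed directly; the traffic figures below are illustrative, and the article's blocking probability would additionally require a finite-capacity variant of the model:

```python
from math import factorial

def erlang_c(n, lam, mu):
    """Probability that an arriving request must wait in an M/M/n queue."""
    a = lam / mu            # offered load (Erlangs)
    rho = a / n             # server utilization, must be < 1 for stability
    if rho >= 1:
        raise ValueError("unstable system: lambda >= n * mu")
    top = (a ** n / factorial(n)) / (1 - rho)
    bottom = sum(a ** k / factorial(k) for k in range(n)) + top
    return top / bottom

def mean_latency(n, lam, mu):
    """Mean time in system: queueing delay plus service time."""
    wq = erlang_c(n, lam, mu) / (n * mu - lam)
    return wq + 1 / mu

# Example: 120 requests/s arriving at n servers, each serving 25 requests/s.
for n in (5, 6, 8):
    print(n, round(mean_latency(n, lam=120, mu=25), 4))
```

Sweeping n or mu this way reproduces the qualitative effect the simulations examine: adding servers or raising the service rate sharply reduces waiting once utilization drops below saturation.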
This paper considers the problem of detecting and classifying surface objects in low-visibility conditions such as rain and fog. The focus is on the application of state-of-the-art deep learning algorithms, in particular the YOLO architecture, to improve detection accuracy and speed. The introduction to the problem discusses the limitations imposed by visibility degradation, changes in the apparent shape and size of objects depending on the viewing angle, and the lack of training data. The paper also presents the use of the discrete wavelet transform to improve image quality and increase the robustness of such systems to adverse conditions. Experimental results show that the proposed algorithm achieves high accuracy and speed, making it suitable for drone video monitoring systems.
Keywords: YOLO, wavelet transform, overwater objects, drones, low visibility conditions, Fourier transforms, Haar
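A minimal sketch of wavelet-based image preprocessing, assuming a single-level 2-D Haar transform with soft thresholding of the detail sub-bands; the threshold and decomposition level are illustrative, not the paper's settings:

```python
import numpy as np
import pywt

# Single-level 2-D Haar decomposition, attenuation of the noisy detail
# sub-bands, and reconstruction.
image = np.random.rand(256, 256)  # stands in for a rain/fog-degraded frame

cA, (cH, cV, cD) = pywt.dwt2(image, "haar")

# Soft-threshold the detail coefficients to suppress high-frequency noise.
threshold = 0.1
cH, cV, cD = (pywt.threshold(c, threshold, mode="soft") for c in (cH, cV, cD))

denoised = pywt.idwt2((cA, (cH, cV, cD)), "haar")
print(denoised.shape)  # (256, 256)
```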
The paper describes the extreme filtering method and the author's approaches to adapting it to real-time operation: frame-by-frame processing and the method with value loading. Solutions that can be used to implement these approaches on real devices are then presented. The first solution uses the Multiprocessing library for Python. The second involves creating a client-server application and sending asynchronous POST requests to implement frame-by-frame signal processing. The third is also based on a client-server application, but uses the WebSocket protocol instead of HTTP. The results are then presented, and conclusions are drawn about the suitability of the author's approaches and solutions for real devices. The solution based on the WebSocket protocol is of particular interest: it suits both the frame-by-frame processing method and the method with value loading. All the approaches proposed by the author are shown to be workable, as confirmed by the timing measurements and the agreement of the graphs.
Keywords: extreme filtering, frame-by-frame signal processing method, method with value loading, Multiprocessing, HTTP, WebSocket, REST, JSON, Python, microcontrollers, single-board computers
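A minimal sketch of the WebSocket variant, assuming the third-party websockets package; the per-frame processing is a placeholder (a mean), not the extreme filtering algorithm itself:

```python
import asyncio
import json

import websockets  # pip install websockets

async def handler(ws):
    # The client streams signal frames; the server processes each frame
    # and returns the result over the same connection.
    async for message in ws:
        frame = json.loads(message)        # one frame of samples
        result = sum(frame) / len(frame)   # placeholder per-frame processing
        await ws.send(json.dumps(result))

async def main():
    async with websockets.serve(handler, "localhost", 8765):
        async with websockets.connect("ws://localhost:8765") as client:
            await client.send(json.dumps([1.0, 2.0, 3.0, 4.0]))
            print(await client.recv())  # "2.5"

asyncio.run(main())
```

Unlike repeated HTTP POSTs, the persistent WebSocket connection avoids per-frame connection overhead, which is why this variant suits frame-by-frame processing.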
The main content of production diversification as an activity of business entities is considered: the acquisition of operating enterprises, the establishment of new enterprises, the redistribution of investments in the interests of the organization, and the development of new production on available floor space. The most important organizational and economic goals of diversification management are presented in relation to the innovative activity of an industrial enterprise.
Keywords: software systems, visualization, data, graphic systems, parts, models, diagrams, drawings
The article is devoted to a code designer developed for the Scilab environment and intended to automate the creation of software modules. The program generates Scilab code through an intuitive interface, providing users with tools for working with variables, loops, graphs, system analysis, and user-defined functions. The designer makes it possible to write Scilab programs without knowing a programming language.
Keywords: Scilab, code designer, programming automation, code generation, visual programming
Linear feedback shift registers (LFSR) and the pseudo-random maximum-length sequences (m-sequences) they generate are widely used in mathematical modeling, cryptography, radar, and communications. Their wide adoption is due to their special properties, such as their correlation properties. An interesting property of these sequences, rarely discussed in the recent scientific literature, is the possibility of forming quasi-orthogonal matrices on their basis. This paper studies methods for generating quasi-orthogonal matrices from m-sequences. An existing method based on cyclically shifting the m-sequence and adding a border to the resulting cyclic matrix is analyzed. An alternative method is proposed, based on the relationship between m-sequences and quasi-orthogonal Mersenne and Hadamard matrices, which allows generating cyclic quasi-orthogonal matrices of symmetric structure without a border. A comparative analysis of the correlation properties of the matrices obtained by both methods and of the original m-sequences is performed. It is shown that the proposed method inherits the correlation properties of m-sequences, provides more efficient storage, and is potentially better suited for privacy applications.
Keywords: orthogonal matrices, quasi-orthogonal matrices, Hadamard matrices, m-sequences
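For illustration, a Python sketch of the underlying construction: an LFSR generates an m-sequence whose cyclic shifts form a circulant matrix with constant off-peak autocorrelation; the degree-4 polynomial is an illustrative choice:

```python
import numpy as np

def lfsr_m_sequence(taps, n_bits, length):
    """Generate an m-sequence from a Fibonacci LFSR with the given taps."""
    state = [1] * n_bits                 # any nonzero initial state
    out = []
    for _ in range(length):
        out.append(state[-1])
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]     # XOR of tapped stages (1-indexed)
        state = [feedback] + state[:-1]
    return out

# Degree-4 primitive polynomial x^4 + x^3 + 1 -> taps (4, 3), period 2^4 - 1 = 15.
seq = lfsr_m_sequence(taps=(4, 3), n_bits=4, length=15)

# Cyclic (circulant) matrix of the +/-1-mapped sequence; rows are cyclic shifts.
s = np.array([1 if b else -1 for b in seq])
C = np.stack([np.roll(s, k) for k in range(len(s))])

# The off-peak cyclic autocorrelation of an m-sequence is constant (-1),
# which gives such matrices their near-orthogonal structure: C @ C.T = 16*I - J.
print(C @ C.T)
```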
The article considers options for visual programming of information support tools for software and information complexes used in training UAV operators. The main criteria for systematically organizing a set of components for program code reuse are identified. An example of an unmanned payload carrier in various representative forms of visualization is given. The labor intensity of developing the specified software and information implementations for the same unmanned robotic object is compared with the normative labor intensity. Variants of content for the same material part of the considered device are examined for various aspects of training specialists in the management and operation of UAVs. The principle of systematizing components by ordering the complexity of presentation and software implementation is shown.
Keywords: risk forecasting, information support, training of unmanned aircraft systems operators, labor intensity assessment
The paper presents a method for quantitative assessment of zigzag trajectories of vehicles, which makes it possible to identify potentially dangerous driver behavior. The algorithm analyzes changes in direction between trajectory segments and includes data preprocessing steps: merging closely spaced points and simplifying the trajectory using a modified Ramer-Douglas-Peucker algorithm. Experiments on a balanced dataset (20 trajectories) confirmed the effectiveness of the method: accuracy 0.8, recall 1.0, F1-score 0.833. The developed approach can be applied in traffic monitoring, accident prevention, and hazardous driving detection systems. Further research aims to improve accuracy and adapt the method to real-world conditions.
Keywords: trajectory, trajectory analysis, zigzag, trajectory simplification, Ramer-Douglas-Peucker algorithm, YOLO, object detection
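A minimal sketch of the two core steps, assuming the standard Ramer-Douglas-Peucker algorithm (the paper uses a modified version) and a simple heading-change count with an assumed 45° threshold:

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker simplification of a polyline."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    def dist(p):
        # Perpendicular distance of a point to the chord between the endpoints.
        num = abs((y2 - y1) * p[0] - (x2 - x1) * p[1] + x2 * y1 - y2 * x1)
        return num / math.hypot(x2 - x1, y2 - y1)
    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax <= epsilon:
        return [points[0], points[-1]]
    return rdp(points[:idx + 1], epsilon)[:-1] + rdp(points[idx:], epsilon)

def zigzag_score(points, angle_threshold=math.radians(45)):
    """Count sharp direction changes between consecutive segments."""
    turns = 0
    for a, b, c in zip(points, points[1:], points[2:]):
        h1 = math.atan2(b[1] - a[1], b[0] - a[0])
        h2 = math.atan2(c[1] - b[1], c[0] - b[0])
        d = abs((h2 - h1 + math.pi) % (2 * math.pi) - math.pi)
        if d > angle_threshold:
            turns += 1
    return turns

track = [(0, 0), (1, 1), (2, -1), (3, 1), (4, -1), (5, 0)]
simplified = rdp(track, epsilon=0.3)
print(zigzag_score(simplified))  # several sharp turns -> flagged as zigzag
```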
A class of mathematical methods for code division of channels has been developed based on pairs of orthogonal encoding and decoding matrices whose components are polynomials and integers. The principles of constructing schemes for code channel combining on the transmitting side and arithmetic code channel division on the receiving side of a communication system are presented, along with examples of such schemes. The proposed approach will significantly simplify the design of encoding and decoding devices used in space and satellite communication systems.
Keywords: telecommunications systems, telecommunications devices, multiplexing, code division of channels, matrix analysis, encoding matrices, synthesis method, orthogonal matrices, integers
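For illustration, the encode-sum-decode principle behind code division can be sketched with a classic Walsh-Hadamard orthogonal matrix; the article's polynomial/integer matrix pairs are a different construction, and this only shows the shared underlying idea:

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(8)                    # 8x8 orthogonal matrix, H @ H.T = 8*I

symbols = np.array([1, -1, 1, 1, -1, 1, -1, -1])  # one symbol per channel

# Transmit side: each channel is spread by its own code row, then all
# channels are summed into one superposition on the shared medium.
combined = symbols @ H

# Receive side: correlate with the code rows and normalize; orthogonality
# separates the channels exactly.
recovered = combined @ H.T // 8
print(recovered)                   # [ 1 -1  1  1 -1  1 -1 -1]
```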
A method is proposed for the cascade connection of encoding and decoding devices to implement code division of channels. It is shown that increasing the number of cascade levels significantly simplifies their implementation and reduces the number of operations performed. In this case, the number of subscriber pairs that can simultaneously exchange information equals the minimum order of the encoding and decoding devices in the system. The proposed approach will significantly simplify the design of encoding and decoding devices used in space and satellite communication systems.
Keywords: telecommunications systems, telecommunications devices, multiplexing, code division of channels, orthogonal matrices, integers, cascaded connection
Regression analysis, based on statistical data processed by dedicated methods, is an effective tool for studying and forecasting the number of employees in structural units. In this paper, based on statistical information on 81 regional offices of the Social Fund of Russia, a regression analysis of the staffing of information protection divisions was carried out, taking into account the total area and population of the regions. It is shown that a number of offices are understaffed while some, on the contrary, are overstaffed.
Keywords: information protection, regression model, adequacy criteria, forecasting, staffing analysis, information protection units
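A minimal sketch of the regression setting, with synthetic placeholder data in place of the Social Fund statistics; offices whose actual staffing falls well below the regression prediction would be flagged as understaffed:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-ins for the 81 regional offices: predict staffing of
# information protection divisions from regional area and population.
rng = np.random.default_rng(1)
area = rng.uniform(10, 2500, size=81)          # thousand km^2 (synthetic)
population = rng.uniform(0.1, 12.0, size=81)   # million people (synthetic)
staff = 2 + 0.002 * area + 1.5 * population + rng.normal(0, 1, size=81)

X = np.column_stack([area, population])
model = LinearRegression().fit(X, staff)

residuals = staff - model.predict(X)
# Large negative residuals mark candidates for "understaffed" offices.
print(model.coef_, model.intercept_)
print("worst shortfall:", residuals.min())
```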
The effectiveness of advanced pavement defect detection algorithms is considered depending on the data collection devices used, such as cameras, GPR, LiDAR and IMU sensors installed in smartphones. Rational use of these hardware and software tools will allow utilities to identify and eliminate road surface defects in a timely manner, thereby improving road safety.
Keywords: transportation sector, pavement defects, mobile road laboratories, neural network algorithms, computer vision
A Simulink model is considered that allows calculating transient processes of objects described by their step response for any type of input action. An algorithm is described for an S-function that performs calculations using the Duhamel integral. Owing to the features of the S-function, it can store values from the previous step of the Simulink model calculation. This allows the input signal to be decomposed into step components, storing the time of occurrence and value of each step. For each increment of the input signal, the S-function calculates the response by scaling the step response; at each calculation step, the sum of these responses is found. The S-function includes a procedure for freeing memory once the end point of the step response is reached for a given step, so the memory required for the calculation does not grow above a certain limit and, in general, does not depend on the length of model time. The S-function uses matrix operations rather than loops, so the model computes quite quickly. The article presents the results of calculations and gives recommendations for setting model parameters. A conclusion is drawn about the applicability of the model to calculating dynamic modes.
Keywords: simulation modeling, Simulink, step response, step function, S-function, Duhamel integral
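A minimal sketch of the computational idea outside Simulink: decompose the input into step increments and superpose scaled, shifted copies of the step response (a discrete Duhamel sum); the first-order step response is an illustrative choice, and the S-function's memory management is omitted:

```python
import numpy as np

dt = 0.01
t = np.arange(0, 5, dt)
h = 1 - np.exp(-t / 0.5)            # step response of a first-order lag, T = 0.5 s

u = np.where(t < 2, 1.0, 0.5)       # arbitrary piecewise-constant input
du = np.diff(u, prepend=0.0)        # step increments of the input

# Each nonzero increment contributes a scaled, time-shifted step response;
# the output is the running superposition of these contributions.
y = np.zeros_like(t)
for k, dval in enumerate(du):
    if dval != 0.0:
        y[k:] += dval * h[:len(t) - k]

print(y[int(1 / dt)], y[-1])        # response at t = 1 s and at the end
```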
The development and research of devices that speed up the interaction between various modules (for example, telemetry and remote control systems) and, more broadly, the hybrid communication systems of a digital city, which encompass the variety of systems used in an intelligent building, is an urgent problem. One such device presented in the article is the developed optimal multi-frequency modem. In addition to the developed modem, the article surveys similar devices and systems developed by both Russian and foreign researchers. The authors show that the proposed modem provides a gain in spectral and energy efficiency compared with its analogues. The proposed approach can be used to organize high-speed data transmission over frequency-limited communication channels based on new wired technologies of the digital subscriber line standard, as well as in wireless systems.
Keywords: telemetry and remote control system, intelligent building, digital city hybrid communications system, modem, multi-frequency modulation, digital subscriber line, optimal finite signal, modulator, demodulator, wireless communication system
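As a generic illustration of multicarrier transmission (not the authors' optimal finite-signal modem), an OFDM-style modulator and demodulator can be sketched with an IFFT/FFT pair:

```python
import numpy as np

rng = np.random.default_rng(7)
n_carriers, cp = 64, 16

bits = rng.integers(0, 2, size=(2, n_carriers))
symbols = (2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)  # QPSK, one per subcarrier

# Modulator: the IFFT turns frequency-domain symbols into a time-domain block;
# a cyclic prefix guards against inter-symbol interference.
block = np.fft.ifft(symbols)
tx = np.concatenate([block[-cp:], block])

# Channel: additive noise only, for simplicity.
rx = tx + 0.01 * (rng.normal(size=tx.shape) + 1j * rng.normal(size=tx.shape))

# Demodulator: drop the prefix, FFT back, and slice the QPSK decisions.
est = np.fft.fft(rx[cp:])
bits_hat = np.stack([(est.real > 0).astype(int), (est.imag > 0).astype(int)])
print("bit errors:", int(np.sum(bits_hat != bits)))
```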
This study describes approaches to automating full-text keyword search in the field of patent information. Automating search by key phrases (n-grams) is a significantly more difficult task than searching by individual words; in addition, it requires morphological and syntactic analysis of the text. To achieve this goal, the following tasks were solved: (a) the full-text search systems Apache Solr, Elasticsearch, and ClickHouse were analyzed; (b) the architectures and basic capabilities of each system were compared; (c) search results were obtained in Apache Solr, Elasticsearch, and ClickHouse on the same dataset. The following conclusions were drawn: (a) all the systems considered perform full-text keyword search; (b) Apache Solr shows the highest performance and offers very convenient functions; (c) Elasticsearch has a fast and powerful architecture; (d) ClickHouse has a high data processing speed.
Keywords: search, keyphrases, patent, Apache Solr, Elasticsearch, ClickHouse
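A minimal sketch of keyphrase retrieval in one of the compared systems, assuming the Elasticsearch Python client (8.x) and a local node; the index name, field, and document are hypothetical:

```python
from elasticsearch import Elasticsearch  # pip install elasticsearch

es = Elasticsearch("http://localhost:9200")

# Index a hypothetical patent abstract.
es.index(index="patents", id="1", document={
    "abstract": "A method for training a convolutional neural network ..."
})
es.indices.refresh(index="patents")

# A match_phrase query keeps the words of the key phrase adjacent and in
# order, which is the behavior n-gram (keyphrase) search requires.
resp = es.search(index="patents", query={
    "match_phrase": {"abstract": "convolutional neural network"}
})
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_score"])
```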