
  • Investigation of the influence of wind on the flight path in a Simulink model of a lightweight aircraft

    A Simulink model of a lightweight aircraft from the Aerospace Blockset package is studied; it includes a system model of the aircraft, an environment model, a model of pilot inputs, and a visualization block. The structure of the flight model is considered, and the environment and wind models are described in detail: they consist of blocks for physical terrain features, wind models, an atmospheric model, and a gravity model, each parameterized by altitude. The Wind Shear Model block calculates the wind shear magnitude as a function of altitude and the measured wind speed. The Discrete Wind Gust Model block determines the resulting wind speed as a function of distance traveled, gust amplitude, and gust length (see the sketch below). The turbulence equations comply with the MIL-F-8785C specification, which describes turbulence as a random process defined by velocity spectra. Simulation results are presented that show how the flight trajectory changes under the various wind influences specified in the wind speed gradient block.

    Keywords: modeling, airplane flight, Simulink, Aerospace Blockset, crosswind, turbulence, turbulence equations, gravity model, motion trajectory
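
    A minimal Python sketch of the two wind inputs mentioned above, assuming the standard MIL-F-8785C forms (the logarithmic wind shear profile and the 1-cosine discrete gust); parameter names and values here are illustrative and are not taken from the article:

```python
import numpy as np

def wind_shear(h_ft: float, w20_fts: float, z0: float = 0.15) -> float:
    """Logarithmic wind-shear magnitude vs. altitude (h in ft, measured wind at 20 ft in ft/s)."""
    if h_ft <= 3.0:
        return 0.0
    return w20_fts * np.log(h_ft / z0) / np.log(20.0 / z0)

def discrete_gust(x: float, v_m: float, d_m: float) -> float:
    """1-cosine gust speed vs. distance travelled into the gust (amplitude v_m, length d_m)."""
    if x < 0.0:
        return 0.0
    if x > d_m:
        return v_m
    return 0.5 * v_m * (1.0 - np.cos(np.pi * x / d_m))
```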

  • Optimizing the database-based deduplication process

    Modern computing systems process huge flows of information, and unstructured, continuously arriving data cannot be handled as is: specific tasks must be identified and the information prepared for processing. One such preparatory step is deduplication. This article discusses possible optimizations of a database-based method for removing duplicates (a simplified query is sketched below).

    Keywords: deduplication, database, field, string, text data, query, software, unstructured data
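
    A minimal sketch of one common database-level deduplication approach, keeping a single row per duplicated field value; the table and column names are hypothetical and the article's own optimizations may differ:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, text_field TEXT)")
conn.executemany("INSERT INTO documents (text_field) VALUES (?)",
                 [("alpha",), ("beta",), ("alpha",), ("beta",), ("gamma",)])

# Keep the row with the smallest id for every duplicated value of text_field
conn.execute("""
    DELETE FROM documents
    WHERE id NOT IN (SELECT MIN(id) FROM documents GROUP BY text_field)
""")
print(conn.execute("SELECT text_field FROM documents ORDER BY id").fetchall())
# -> [('alpha',), ('beta',), ('gamma',)]
```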

  • Modelling of cargo transportation parameters based on benchmark analysis of the transport companies’ market

    The paper examines current issues of modeling and forecasting market parameters of transport companies that carry industrial enterprises' goods, such as cost, time, speed, and volumes of delivery of finished products to consumers, and also assesses the potential of transport companies to provide the required quantity and quality of transport and logistics services. The aim of the study is to determine the area of reliable forecasts of transportation indicators for each interval value of the cargo delivery leg (distance), taking into account the company's market share. The time parameters of cargo transportation were modeled with allowance for road transportation conditions and the time of year. For the modeling procedures, the required statistical basis of travel time and route distance was formed from data provided by specialized applications for analyzing transport and logistics indicators of freight vehicles. A family of forecast curves was obtained for various variants of forecast models of speed and travel time, as well as for interval values of delivery distances, across the initial set of transport and logistics companies (an interval-wise fitting sketch is given below).

    Keywords: statistical forecasting, transportation efficiency, benchmark models, tariffs for cargo transportation, piecewise linear approximation, areas of reliable forecasts, cargo transportation parameters, benchmark analysis, transport company market
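
    A minimal sketch (with synthetic data) of the kind of interval-wise piecewise linear fitting mentioned above: delivery distances are split into intervals and a separate linear trend of travel time is fitted on each one; interval boundaries and variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
distance_km = rng.uniform(50, 1500, size=300)                  # delivery leg length
time_h = 1.5 + distance_km / 60 + rng.normal(0, 1.0, 300)      # synthetic travel time

bins = [50, 300, 700, 1500]                                    # illustrative distance intervals
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (distance_km >= lo) & (distance_km < hi)
    slope, intercept = np.polyfit(distance_km[mask], time_h[mask], deg=1)
    print(f"{lo}-{hi} km: time = {intercept:.2f} + {slope:.4f} * distance")
```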

  • Models and application of neural network post-recognition image interpreters

    The paper introduces the concept of image “post-interpretation” and proposes a post-recognition interpreter model for its algorithmic implementation. The recognition results of the initial images entering the recognition system are treated as post-images, and an artificial neural network is used as the post-recognizer. To assess the effectiveness of the model, an “expediency criterion” is proposed, and numerical examples illustrate its use in high-risk image recognition and interpretation systems. Preliminary results of experimental testing of a speech command recognition model as part of an interactive operator's manual for performing various tasks are presented, together with an assessment of its effectiveness.

    Keywords: intelligent data processing system, image interpretation, recognition reliability, decision-making criterion, artificial neural network

  • Analysis of standard models of titanium oxide-based memristors for use in artificial intelligence systems

    The article discusses standard models of titanium dioxide-based memristors. A memristor (memory resistor) exhibits a nonlinear resistance characteristic in which the charge acts as the state variable. Memristors can be used to create new types of electronic devices with high energy efficiency and performance, to build machines that can learn and adapt to changing environmental conditions, and in many practical applications: data storage (binary and multilevel), switches in logic circuits, and plastic components in neuromorphic artificial intelligence systems based on nanoelectronic components. It is shown that when a voltage is applied, the charged ions begin to drift and the boundary between the two regions of the device shifts. When a sinusoidal alternating voltage of a given frequency is applied to the memristor, the current-voltage (volt-ampere) characteristic takes the shape of a Lissajous-like figure centered at the origin (see the sketch below).

    Keywords: memristor, model, voltage characteristic, nonlinearity
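
    A minimal Python sketch of one standard model of this kind, the linear ion drift (HP) memristor model, driven by a sinusoidal voltage; parameter values are illustrative and are not taken from the article:

```python
import numpy as np

# Illustrative parameters of the linear ion drift (HP) model
R_on, R_off = 100.0, 16e3        # low/high resistance, ohm
D = 10e-9                        # device thickness, m
mu_v = 1e-14                     # dopant mobility, m^2/(V*s)
w = 0.5 * D                      # initial position of the doped/undoped boundary

V0, f, dt = 1.0, 1.0, 1e-5       # drive amplitude (V), frequency (Hz), time step (s)
t = np.arange(0.0, 2.0 / f, dt)
v = V0 * np.sin(2 * np.pi * f * t)
i = np.empty_like(t)

for k in range(t.size):
    M = R_on * (w / D) + R_off * (1.0 - w / D)   # memristance set by the boundary position
    i[k] = v[k] / M
    w += mu_v * (R_on / D) * i[k] * dt           # ion drift shifts the boundary
    w = min(max(w, 0.0), D)                      # keep the boundary inside the device
# Plotting v against i gives the pinched hysteresis (Lissajous-like) I-V loop
```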

  • Development of a data indexing system for the production, economic and labor sectors of the penitentiary system

    The development of business analytics, decision-making and resource planning systems is one of the most important components of almost any enterprise, and the enterprises and production facilities of the penitentiary system are no exception. The paper examines the problem of relating the existing databases to the statistical reporting forms of the production, economic and labor sectors of the penitentiary system. It is established that indirectly interrelated parameters are quite difficult to compare because of differing data recording systems and approved statistical forms. One of the first steps toward solving this problem could be the introduction of a generalized data indexing system. The paper discusses data indexing systems, the construction of their hierarchical structures, and the possibility of practical application using SQL. Examples of implementation using ORM technology and the Python language are considered (a brief sketch is given below).

    Keywords: databases, indexing, ORM, SQL, Python, manufacturing sector, economic indicators, penitentiary system
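
    A minimal sketch in the spirit of the ORM examples mentioned above, declaring a hypothetical indexed table with SQLAlchemy; the entity, column and index names are illustrative and do not come from the article:

```python
from sqlalchemy import Column, Float, Index, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class ProductionRecord(Base):
    """Hypothetical record of a production-sector indicator."""
    __tablename__ = "production_records"
    id = Column(Integer, primary_key=True)
    facility_code = Column(String(16), nullable=False)
    indicator_code = Column(String(32), nullable=False)    # generalized index key
    period = Column(String(7), nullable=False)              # e.g. "2024-03"
    value = Column(Float)
    __table_args__ = (
        Index("ix_indicator_period", "indicator_code", "period"),
    )

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
with Session(engine) as session:
    session.add(ProductionRecord(facility_code="F-01",
                                 indicator_code="OUTPUT.TOTAL",
                                 period="2024-03", value=125.0))
    session.commit()
```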

  • Empirical analysis of the predictive properties of the continuous form of the maximum consistency method

    The article studies the possibility of using the continuous form of the maximum consistency method for constructing regression models to forecast air transport passenger turnover in the Russian Federation. The method under study is compared with classical methods of regression analysis: least squares and least moduli. To assess the predictive properties of the methods, the average relative forecast error and the continuous form of the criterion of consistency of behavior between the calculated and actual values of the dependent variable are used (the error criterion is sketched below). The analysis leads to the conclusion that the method under study can be used to solve forecasting problems.

    Keywords: least squares method, continuous form of the maximum consistency method, modeling, passenger turnover, air transport, adequacy criteria
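
    A minimal sketch of the average relative forecast error used above as an adequacy measure (the continuous consistency criterion itself is specific to the method and is not reproduced here); the numbers are synthetic:

```python
import numpy as np

def average_relative_error(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Average relative forecast error, in percent."""
    return 100.0 * np.mean(np.abs(actual - predicted) / np.abs(actual))

actual = np.array([102.0, 98.5, 110.2, 95.4])      # synthetic passenger turnover values
predicted = np.array([100.1, 99.0, 108.0, 97.0])
print(round(average_relative_error(actual, predicted), 2), "%")
```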

  • Comparative analysis of the effectiveness of software tools for splitting videos into frames using the example of the field of road surface quality assessment

    Roads play an important role in the life of almost everyone, and surface quality is the most significant characteristic of a roadway. Many systems exist to evaluate it, including systems that analyze the road surface from video streams. The video is split into frames, and the resulting images are used to assess road quality directly. Frame splitting in such systems relies on specialized software tools, and a detailed analysis is needed to understand how effective a particular tool is. In this article, OpenCV, MoviePy and FFMpeg are selected for analysis. The research material is a two-minute video of a road surface with a frame rate of 29.97 frames/s in MP4 format. The average time to extract one frame from the video is used as the efficiency indicator (a timing sketch is given below). For each of the three tools, five experiments were conducted in which the frame size in pixels was successively doubled: 40000, 80000, 160000, 320000, 640000. For every program the average frame retrieval time grows linearly, O(n), with resolution; however, FFMpeg has the lowest absolute times and the lowest growth rate, and is therefore the most effective of the compared tools (OpenCV, MoviePy).

    Keywords: comparison, analysis, effectiveness, software tool, library, program, video splitting, frame size, resolution, road surface
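
    A minimal sketch of the timing measurement described above, using OpenCV as an example; the file name is hypothetical and the article's benchmarking procedure may differ in detail:

```python
import time
import cv2  # opencv-python

def mean_frame_time(path: str) -> float:
    """Average time (s) to decode one frame with OpenCV; a rough benchmark sketch."""
    cap = cv2.VideoCapture(path)
    frames, start = 0, time.perf_counter()
    while True:
        ok, _frame = cap.read()
        if not ok:
            break
        frames += 1
    cap.release()
    return (time.perf_counter() - start) / max(frames, 1)

print(mean_frame_time("road_surface.mp4"))  # hypothetical file name
```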

  • IT infrastructure monitoring systems based on Big Data methods

    The article examines a new class of IT infrastructure monitoring systems that has been actively emerging over the last decade, whose key feature is the extensive use of methods and techniques for working with big data. Depending on market positioning, such systems are known under names like AIOps, observability platform, all-in-one monitoring, and umbrella monitoring. In their review of existing foreign and domestic commercial solutions, the authors focus on the use of big data methods in them. Based on the review, a classification of such products is proposed, which helps to organize the existing diversity and to select the system best suited to an organization's tasks in monitoring an increasingly complex IT infrastructure. The relevance of the study stems from the absence of a classification of these systems, owing to their relative novelty and pronounced practical orientation.

    Keywords: monitoring system, IT infrastructure, observability platform, AIOps, big data, machine learning

  • Development of algorithms for processing time series when working with statistical reporting forms of the production sector of the penitentiary system

    To date, the penitentiary system of the Russian Federation has accumulated quite extensive databases for the production sector. The collected data are time series. However, when studying the mutual distributions of parameters, a number of problems arise. The main one is that different parameters are recorded under different accounting schemes: in some cases values are accumulated cumulatively over the year, in others actual per-period values are recorded (in other words, some series are trending while others are seasonal or cyclical). Accounting periods also differ: monthly, quarterly, or annual. Thus, at first glance, related parameters are almost impossible to compare. The paper proposes a number of algorithms to solve this problem (a simplified transformation is sketched below). The aim of the work was to develop new algorithms that allow trend and seasonal time series to be compared, using the production sector of the penitentiary system as an example. The objectives of the study are: classifying the parameters recorded as seasonal and as trend time series; developing algorithms for their comparison; and studying the applicability of the results obtained.

    Keywords: algorithm, data processing, Python, time series, penitentiary system, manufacturing sector
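
    A minimal pandas sketch (with synthetic numbers) of two such transformations: converting a cumulative year-to-date series into actual per-month values, and spreading a quarterly series onto a monthly grid for comparison; the article's algorithms are more elaborate:

```python
import pandas as pd

idx = pd.period_range("2023-01", "2023-12", freq="M")

# Hypothetical cumulative (year-to-date) monthly series -> actual per-month values
cumulative = pd.Series(range(10, 130, 10), index=idx, dtype=float)
actual = cumulative.diff().fillna(cumulative.iloc[0])

# Hypothetical quarterly series spread evenly onto the same monthly grid
quarterly = pd.Series([30.0, 36.0, 42.0, 45.0],
                      index=pd.period_range("2023Q1", "2023Q4", freq="Q"))
monthly_from_q = pd.Series([quarterly[m.asfreq("Q")] / 3 for m in idx], index=idx)

print(pd.DataFrame({"actual": actual, "from_quarterly": monthly_from_q}))
```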

  • Development of a retiling microservice in the Python programming language

    In the modern world, it is increasingly necessary to process geographic information in a variety of forms. This paper discusses the concept of a "tile", its purpose and features, as well as retiling, the process of creating and updating tiles. This technology helps increase the efficiency of modern cartographic services by reducing map loading times. The main stages of developing a microservice that implements the retiling logic are presented sequentially (a simplified endpoint is sketched below). The main data provider is the OpenStreetMap (OSM) open source project; its spatial data set is a core OSM product and contains up-to-date geographic data and information from around the world. The technology stack is based on the Python language, supplemented with specialized modules for working with tiles and a library for implementing a simple, high-quality API.

    Keywords: Python, tile, retiling, OpenStreetMap, microservice, Flask-RESTX, mercantile
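
    A minimal sketch of such a microservice built on the stack named in the keywords (Flask-RESTX and mercantile); the route, response format and retiling logic here are illustrative only and do not reproduce the article's implementation:

```python
import mercantile
from flask import Flask
from flask_restx import Api, Resource

app = Flask(__name__)
api = Api(app, title="Retiling service (sketch)")

@api.route("/retile/<int:z>/<int:x>/<int:y>")
class Retile(Resource):
    def get(self, z: int, x: int, y: int):
        """Return the four child tiles (zoom z+1) covering the given tile."""
        tile = mercantile.Tile(x=x, y=y, z=z)
        children = mercantile.children(tile)
        return {"children": [{"z": t.z, "x": t.x, "y": t.y,
                              "bounds": mercantile.bounds(t)._asdict()}
                             for t in children]}

if __name__ == "__main__":
    app.run(debug=True)
```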

  • Migration of variable services from proprietary to open popular software

    The article discusses one possible way to migrate variable services from proprietary software to a freely available and popular solution, as well as ways to improve the configuration structure and eliminate problem areas.

    Keywords: variables, configuration, service, Octopus, Git, Vault, migration

  • On the quality of training of partially connected neural networks based on decision roots under conditions of limited data

    The quality of training of partially connected neural networks based on decision roots is discussed. Using limited data on patients with clinically diagnosed Alzheimer's disease and conditionally healthy patients as an example, a decision root and the corresponding neural network structure are found by preprocessing the data. The results of training a partially connected artificial neural network of this type are demonstrated for the first time; they yield a network with a level of accuracy acceptable for practical use in supporting medical decision making, in the considered example for the diagnosis of Alzheimer's disease.

    Keywords: neural networks, complex assessment mechanisms, decision roots, criteria trees, convolution matrices, data preprocessing

  • Development of a mathematical model and a software package for automating scientific research in the field of financial industry news analysis

    The article is devoted to the development of a mathematical model and a software package designed to automate scientific research in the field of financial industry news analysis. The authors propose an approach based on graph theory methods to identify the most significant scientific hypotheses, the methods used, and the qualitative and quantitative results obtained by the scientific community in this field. The proposed model and software package automate the research survey process, making its analysis more effective (a citation-ranking sketch is given below). The results can be useful both to professional participants in financial markets and to the academic community, since identifying the most cited and fundamental works is the starting point of any scientific work.

    Keywords: software package, modeling, graph theory, news streams, Russian stock market, stocks, citation graph
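
    A minimal sketch (with hypothetical papers) of ranking works in a citation graph, one common graph-theoretic way to surface the most cited and fundamental publications; the article's actual model may use different graph measures:

```python
import networkx as nx

G = nx.DiGraph()
# Edge A -> B means "A cites B" (hypothetical papers)
G.add_edges_from([
    ("paper_2023_a", "paper_2019_x"),
    ("paper_2023_b", "paper_2019_x"),
    ("paper_2023_b", "paper_2021_y"),
    ("paper_2021_y", "paper_2019_x"),
])
scores = nx.pagerank(G, alpha=0.85)
for paper, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{paper}: {score:.3f}")
```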

  • A website for debugging robots' artificial intelligence technologies

    The article presents the state of the art of websites for designing robots with artificial intelligence. The image of a modern technical website-book as a place for developing artificial intelligence applications is considered, and the possibility of executing algorithms from the page to connect robots with real and virtual objects is shown.

    Keywords: mathematical network, technical website-book, artificial intelligence, algorithms executed on the website-book, network development of robots

  • Developing a Piecewise Linear Regression Model for a Steel Company Using Continuous Form of Maximum Consistency Method

    The paper presents a brief overview of publications describing the experience of using mathematical modeling methods to solve various problems. A multivariate piecewise linear regression model of a steel company was built using the continuous form of the maximum consistency method. To assess the adequacy of the model, the following criteria were used: the average relative error of approximation, the continuous criterion of consistency of behavior, and the sum of absolute approximation errors. It is concluded that the resulting model is sufficiently accurate and can be used for forecasting.

    Keywords: mathematical modeling, piecewise linear regression, least modulus method, continuous form of maximum consistency method, steel company

  • On image masking as the basis for building a visual cryptography scheme

    The features of an (m, m) visual cryptography scheme are considered that differs from existing ones in how the shadow images (shares) of an image containing a secret are formed. The proposed approach is based not on decomposing the secret image into shares, but on their step-by-step transformation by multiplication by orthogonal Hadamard matrices (see the sketch below). The images obtained at each transformation of a share are noise-resistant in the data transmission channel.

    Keywords: image with a secret, image decomposition, image transformation, orthogonal Hadamard matrices, two-way matrix multiplication, noise-resistant image encoding
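
    A minimal sketch of the two-sided multiplication by orthogonal Hadamard matrices mentioned above, applied to a random stand-in image block; the article's actual share-formation scheme is not reproduced here:

```python
import numpy as np
from scipy.linalg import hadamard

n = 8                                    # block size, must be a power of two
H = hadamard(n).astype(float)
block = np.random.randint(0, 256, size=(n, n)).astype(float)  # stand-in "secret" block

share = H @ block @ H.T                  # forward two-way multiplication
restored = (H.T @ share @ H) / (n * n)   # inverse, since H @ H.T = n * I

assert np.allclose(restored, block)
```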

  • Modeling the probabilistic characteristics of blocking requests for access to radio resources of a wireless network

    Fifth-generation networks are of great interest for various studies, and Network Slicing is one of the most important technologies for the efficient use of their resources. The main purpose of the work is to model the probabilistic characteristics of blocking requests for access to the radio resources of a wireless network. The main task is to analyze one implementation option of a two-service model of a wireless radio access scheme with two slices and BG traffic. The dependence of the request blocking probability on the arrival intensity of requests of various types was considered, and it turned out that the blocking probability for a type i request has the form of an exponential function. According to the results of the analysis, request blocking occurs predictably, given the nature of the incoming traffic, and no significant drawbacks of the considered model have been found so far. The developed model is of interest for future, deeper and longer-term research, for example using simulation modeling with the choice of optimal network parameters (a classical single-service baseline is sketched below).

    Keywords: queuing system, 5G, two-service queuing system, resource allocation, Network Slicing, elastic traffic, minimum guaranteed bitrate
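
    For orientation only, a minimal sketch of the classical single-service Erlang B blocking probability, showing how blocking grows with the offered load; the article's two-slice model with BG traffic is more general and is not reproduced here:

```python
def erlang_b(offered_load: float, channels: int) -> float:
    """Blocking probability for Poisson arrivals offered to `channels` servers (Erlang B)."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Blocking probability grows with the offered load (arrival intensity x holding time)
for load in (5.0, 10.0, 15.0):
    print(load, round(erlang_b(load, channels=10), 4))
```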

  • An error correction algorithm in the modular residue class code that increases the fault tolerance of OFDM systems

    One way to increase the efficiency of low-orbit satellite Internet under destructive influences is to use OFDM systems that support the frequency hopping mode. The effectiveness of countering interference generated by electronic warfare (EW) assets is largely determined by the algorithm for selecting operating frequencies. In this paper, it is proposed to implement the operating-frequency selection block on the basis of the SPN cipher "Grasshopper" (Kuznyechik), which provides high resistance to determination of the operating frequency by an EW station. However, in the event of faults and failures in such a unit, the transmitter and receiver operating in the microwave band will not be able to establish data transmission. To solve this problem, the article proposes using polynomial modular residue class codes. However, analysis of the known error correction algorithms for such codes shows that they cannot be used to increase the reliability of the SPN-based frequency selection unit.

    Keywords: OFDM systems supporting frequency hopping, pseudorandom number generation methods, Grasshopper SPN cipher, polynomial modular residue class codes, error correction algorithm

  • Using fuzzy cognitive maps to solve the problem of municipal development

    In the context of rapid urbanization, modeling the processes of sustainable urban development has attracted considerable attention from researchers. This paper presents a study of fuzzy cognitive maps (FCMs) as an interdisciplinary model for simulating urban development processes. It highlights the versatility of FCMs in integrating expertise and quantifying the impact of the indicators that shape urban space, from infrastructure and housing to environmental sustainability and community well-being. The study combines an extensive literature review with expert opinions to create and refine a cognitive map tailored to municipal development. The methodology formulates a systematic approach to selecting concepts, assigning weights, and validating the model (an inference sketch is given below). Through collaboration with cross-disciplinary experts, the study confirms the value of FCMs for identifying cascading effects in decision making when shaping urban development strategies. Recognizing the limitations of expert methods and the fuzzy nature of the data, the article argues for the effectiveness of FCMs not only in identifying but also in addressing emerging urbanization problems. Ultimately, the article contributes a nuanced perspective to strategic planning discourse by advocating the use of FCMs as a management decision support tool that can assist policymakers in achieving a sustainable and equitable urban future.

    Keywords: fuzzy cognitive maps, urban development, urban planning, sustainable urbanization, expert systems, social well-being
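
    A minimal sketch of fuzzy cognitive map inference; the concepts, weights and update rule below are illustrative (a commonly used sigmoid update with self-memory), not the article's validated map:

```python
import numpy as np

concepts = ["infrastructure", "housing", "environment", "well-being"]
# W[i, j]: influence of concept i on concept j, in [-1, 1] (expert-assigned, hypothetical)
W = np.array([
    [0.0, 0.6, -0.3, 0.4],
    [0.2, 0.0, -0.2, 0.5],
    [0.0, 0.0,  0.0, 0.6],
    [0.1, 0.2,  0.1, 0.0],
])
A = np.array([0.8, 0.5, 0.4, 0.3])       # initial activation levels

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
for _ in range(20):                       # iterate until the map settles
    A = sigmoid(A @ W + A)                # each concept is driven by weighted influences plus itself
print(dict(zip(concepts, np.round(A, 3))))
```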

  • Programming the robot controller to implement the technological process of laser cutting

    Stepper motors are often used in automated laser cutting systems. The control circuit of a stepper motor requires a special electronic device, a driver, which receives logic signals as input and changes the current in the motor windings to provide the required motion parameters. This study evaluated three stepper motor drivers (PLDS880, OSM-42RA, OSM-88RA) to determine the feasibility of their use. To control the system, software was written and connected to the controller via a link board. For each driver, in different modes, optimal parameters were selected (initial speed, final speed and acceleration), that is, parameters providing carriage movement without stalling over ten passes with minimum travel time (see the sketch below). The results of the experiments are presented in the form of tables.

    Keywords: laser, laser cutting, automation, technological process, stepper motor, performance, driver, controller, control circuit, optimal parameters
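
    A minimal sketch (assuming a trapezoidal speed profile, with illustrative values) of how the three selected parameters determine carriage travel time; the article's experimental selection procedure is not reproduced here:

```python
from math import sqrt

def travel_time(distance: float, v_start: float, v_max: float, accel: float) -> float:
    """Travel time under a trapezoidal speed profile (distance in steps, speeds in steps/s)."""
    ramp_time = (v_max - v_start) / accel
    ramp_dist = 0.5 * (v_start + v_max) * ramp_time
    if 2.0 * ramp_dist >= distance:
        # v_max is never reached: solve v_start*t + accel*t^2/2 = distance/2 for the ramp time
        t = (-v_start + sqrt(v_start**2 + accel * distance)) / accel
        return 2.0 * t
    return 2.0 * ramp_time + (distance - 2.0 * ramp_dist) / v_max

print(travel_time(20000, v_start=200, v_max=4000, accel=8000))  # illustrative parameter values
```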

  • Forecasting and managing traffic of telecommunication systems using artificial intelligence systems

    This paper reviews and analyzes various time series forecasting models using data collected from mobile IoT devices, focusing on models that describe traffic behavior in telecommunication systems. Forecasting methods such as exponential smoothing, linear regression, the autoregressive integrated moving average (ARIMA), and N-BEATS, which uses fully connected neural network layers to forecast univariate time series, are covered. The article briefly describes the features of each model, examines their training process, and compares their training quality (a fitting sketch is given below). Based on the data analysis, it was found that for the UDP protocol the ARIMA model trains best, for the TCP protocol linear regression, and for the HTTPS protocol ARIMA.

    Keywords: telecommunication systems, traffic analysis, forecasting models, QoS, artificial intelligence, linear regression, ARIMA, Theta, N-BEATS
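
    A minimal sketch (synthetic data) of fitting an ARIMA model to a univariate traffic series and producing a short forecast with statsmodels; the order and the data are illustrative only:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
traffic = pd.Series(
    100 + np.cumsum(rng.normal(0, 5, size=200)),           # stand-in traffic volume series
    index=pd.date_range("2024-01-01", periods=200, freq="D"),
)

model = ARIMA(traffic, order=(2, 1, 1)).fit()               # order chosen only for illustration
print(model.forecast(steps=14))                             # two-week forecast
```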

  • Implementation of neural network models for predicting performance in a smart greenhouse

    This article explores the introduction and implementation of neural network models in agriculture, with an emphasis on their use in smart greenhouses. Smart greenhouses are innovative systems for controlling the microclimate and other factors affecting plant growth. Using neural networks trained on soil moisture, temperature, illumination and other parameters, future indicators can be predicted with high accuracy. The article discusses the stages of data collection and preparation, the training of the neural networks, and the practical implementation of this approach (a small sketch is given below). The results highlight the prospects of neural networks in the agricultural sector and their role in optimizing plant growth and increasing the productivity of agricultural enterprises.

    Keywords: neural network, predicting indicators, smart greenhouse, artificial intelligence, data modeling, microclimate
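
    A minimal sketch (synthetic data) of predicting a greenhouse indicator from sensor readings with a small fully connected network; the target variable, feature ranges and network size below are illustrative assumptions, not the article's setup:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform([20, 15, 200], [80, 35, 800], size=(500, 3))   # moisture %, temp C, illumination lux
y = 0.05 * X[:, 0] + 0.1 * X[:, 1] + 0.002 * X[:, 2] + rng.normal(0, 0.2, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```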

  • Refinement of the regression multifactor model of water level in the Iya River (Eastern Siberia)

    The paper presents a refined regression model of water level dynamics in the Siberian river Iya. Its right-hand side includes six natural factors (the number of days with precipitation in the Sayan Mountains, average daytime and nighttime temperatures for the month, the amount of precipitation, snow depth, and average atmospheric pressure for the month), taken with a time lag, as well as a specially constructed seasonal variable (see the sketch below). The high adequacy of the model is indicated by the values of the coefficient of multiple determination, Fisher's criterion, and the average relative error of approximation. The constructed model can be used effectively to solve a wide range of forecasting problems.

    Keywords: regression model, river water level, lag time, seasonal variable, forecast
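
    A minimal sketch (synthetic data) of building lagged predictors and a seasonal dummy for such a water-level regression; the factor names, lags and season months below are illustrative assumptions, not the article's specification:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "level": rng.normal(300, 40, 48),            # water level, cm (synthetic)
    "precip_days": rng.integers(0, 20, 48),
    "snow_depth": rng.normal(30, 10, 48),
}, index=pd.period_range("2020-01", periods=48, freq="M"))

df["precip_days_lag1"] = df["precip_days"].shift(1)              # one-month delay
df["snow_depth_lag2"] = df["snow_depth"].shift(2)
df["flood_season"] = df.index.month.isin([5, 6, 7]).astype(int)  # seasonal dummy

data = df.dropna()
X = sm.add_constant(data[["precip_days_lag1", "snow_depth_lag2", "flood_season"]])
model = sm.OLS(data["level"], X).fit()
print(model.rsquared, model.fvalue)              # determination and Fisher criteria
```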

  • Analysis of U-Net-Attention and SegGPT neural networks in the problem of crack segmentation in road surface images

    This paper examines and compares two neural networks, U-Net-Attention and SegGPT, which use different attention mechanisms to find relationships between parts of the input and output data. U-Net-Attention is a U-Net with a dual-layer attention mechanism, an efficient neural network for image segmentation: it has an encoder and a decoder joined by skip connections that carry feature maps past the hidden layers, conveying information about their local properties, and the added attention layer strengthens the search for the image features of interest (a sketch of such a gate is given below). The SegGPT model is based on the Vision Transformer architecture and also uses an attention mechanism. Both models focus attention on the important aspects of a problem and can be effective in a variety of tasks. In this work, we compared them on segmenting cracks in road surface images with the aim of further classifying the condition of the road surface as a whole. The paper also analyzes and draws conclusions about the possibilities of using transformer architectures for a wide range of problems.

    Keywords: machine learning, Transformer neural networks, U-Net-Attention, SegGPT, roadway condition analysis, computer vision
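
    A minimal PyTorch sketch (assumed form) of the additive attention gate commonly used in Attention U-Net to re-weight skip-connection features; channel sizes and the equal spatial resolution of the two inputs are illustrative assumptions, and SegGPT's attention is not shown:

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    def __init__(self, g_ch: int, x_ch: int, inter_ch: int):
        super().__init__()
        self.w_g = nn.Conv2d(g_ch, inter_ch, kernel_size=1)
        self.w_x = nn.Conv2d(x_ch, inter_ch, kernel_size=1)
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1)

    def forward(self, g: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # g: decoder (gating) features, x: encoder skip features, same spatial size assumed here
        att = torch.sigmoid(self.psi(torch.relu(self.w_g(g) + self.w_x(x))))
        return x * att                        # suppress irrelevant regions of the skip path

gate = AttentionGate(g_ch=64, x_ch=32, inter_ch=16)
out = gate(torch.randn(1, 64, 56, 56), torch.randn(1, 32, 56, 56))
print(out.shape)                              # torch.Size([1, 32, 56, 56])
```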