An example of simulation modeling of the operation of a hydraulic system. Simulation Modeling: Creating Terms

When creating a methodology for simulation modeling, I needed to settle the terminology. The problem was that the conventional terms were not suitable for describing the statistical data collected during simulation. The terms “process” and “process instance” were unacceptable because I could not work within Aristotle's paradigm: that paradigm does not fit the mathematical apparatus I used. At the same time, the practical application of the methodology was simple: modeling and simulation of business objects for the purpose of making management decisions. The program created a virtual object whose description consisted of descriptions of scenarios and their interactions. The scenarios were run inside the program, and resources and their interactions were modeled.

Let me remind you that:

Simulation modeling is a method for studying objects in which the object under study is replaced by a simulating object. Experiments are carried out on the simulating object (without resorting to experiments on the real object), and as a result information about the object under study is obtained. The simulating object is an information object.

The purpose of simulation modeling is to obtain approximate knowledge about some parameter of an object without directly measuring its values. This is necessary if and only if measurement is impossible or costs more than simulation. To study this parameter, we can use other known parameters of the object together with a model of its design. Assuming that the design model describes the object accurately enough, it is assumed that the statistical distributions of the parameter's values obtained during simulation will, to one degree or another, coincide with the distribution of the parameter's values for the real object.

It is clear that the apparatus used here is mathematical statistics, and mathematical statistics does not use the terms “instance” and “type”: it works with objects and sets. As a result, to write the methodology I was forced to use the logical paradigm on which the ISO 15926 standard is based. Its foundation is the presence of objects, classes, and classes of classes.

Example definitions:

Operation

Event


The figure shows the relationships between entities: events are grouped into event classes. An event class is described by an object of the “Events” directory. Events of one class are depicted on process diagrams using graphic elements. Based on the “Events” directory object, the simulation engine creates simulating events.

Process

  1. Simulated process: a sequence of simulated operations. It is convenient to describe this sequence in the form of a Gantt chart. The description contains events, for example “start of process” and “end of process”.
  2. Simulating process: an object created to simulate the process being modeled. This object is created in the computer's memory as the simulation runs.
  3. Class of simulated processes: a set of simulated processes combined according to some characteristic. The most common grouping is of processes that share a common model. A process diagram made in any modeling notation can be used as the model: Process, Procedure, EPC, BPMN.
  4. Class of simulating processes: a set of simulating processes created within the simulation to simulate activity.
  5. Process (as an object in the directory): an object of the “Processes” directory.
  6. Process (process diagram): a model of the processes of one class, made in the form of a diagram. Simulating processes are created on the basis of this model.

Conclusion

Thank you for your attention. I sincerely hope that my experience will be useful to those who wish to distinguish between the objects described above. The problem with the current state of the industry is that entities named by a single term cease to be distinguished in the minds of analysts. I have tried to give an example of how one can think and how one can introduce terms to distinguish different entities. I hope the reading was interesting.

Simulation modeling is a powerful tool for studying the behavior of real systems. Simulation modeling methods allow you to collect the necessary information about the behavior of a system by creating its computer model. This information is then used to design the system.

The purpose of simulation modeling is to reproduce the behavior of the system under study based on the results of the analysis of the most significant relationships between its elements in the subject area in order to conduct various experiments.

Simulation modeling allows you to simulate the behavior of a system over time. Moreover, the advantage is that time in the model can be controlled: slowed down in the case of fast processes and accelerated for modeling systems with slow variability. It is possible to imitate the behavior of those objects with which real experiments are expensive, impossible or dangerous.

Simulation modeling is used when:

1. It is expensive or impossible to experiment on a real object.

2. It is impossible to build an analytical model: the system involves time, cause-and-effect relationships, nonlinearities, and stochastic (random) variables.

3. It is necessary to simulate the behavior of the system over time.

Imitation, as a method for solving non-trivial problems, received its initial development in connection with the creation of computers in the 1950s - 1960s.

There are two types of imitation:

1. Monte Carlo method (statistical test method);

2. Simulation modeling method (statistical modeling).

Currently, there are three main directions in simulation modeling:

1. Agent-based modeling is a relatively new (1990s-2000s) direction in simulation modeling used to study decentralized systems whose dynamics are determined not by global rules and laws (as in other modeling paradigms) but, on the contrary, emerge from the individual activity of the group's members.

The goal of agent-based models is to gain an understanding of these global rules, the general behavior of the system, based on assumptions about the individual, private behavior of its individual active objects and the interaction of these objects in the system. An agent is a certain entity that has activity, autonomous behavior, can make decisions in accordance with a certain set of rules, interact with the environment, and also change independently.

2. Discrete-event modeling is an approach to modeling that proposes to abstract from the continuous nature of events and consider only the main events of the simulated system, such as “waiting”, “order processing”, “moving with cargo”, “unloading” and others. Discrete event modeling is the most developed and has a huge range of applications - from logistics and queuing systems to transport and production systems. This type of modeling is most suitable for modeling production processes.


3. System dynamics is a modeling paradigm in which graphical diagrams of causal relationships and of the global influence of some parameters on others over time are constructed for the system under study, and the model created on the basis of these diagrams is then simulated on a computer. In fact, this type of modeling, more than any other paradigm, helps one grasp the cause-and-effect relationships between objects and phenomena. Using system dynamics, models are built of business processes, city development, production, population dynamics, ecology, and epidemic development.

Basic concepts of model building

Simulation modeling is based on reproducing, using computers, the process of system functioning unfolded over time, taking into account interaction with the external environment.

The basis of any simulation model (IM) is:

· development of a model of the system under study based on private simulation models (modules) of subsystems united by their interactions into a single whole;

· selection of informative (integrative) characteristics of an object, methods of obtaining and analyzing them;

· building a model of the impact of the external environment on the system in the form of a set of simulation models of external influencing factors;

· choosing a method for studying a simulation model in accordance with methods for planning simulation experiments (IE).

Conventionally, a simulation model can be represented as a set of interacting blocks implemented in software (or hardware).

The figure shows the structure of the simulation model. The block for simulating external influences (ESI) generates implementations of random or deterministic processes that simulate the influence of the external environment on an object. The results processing unit (RPB) is designed to obtain informative characteristics of the object under study. The information necessary for this comes from the block of the mathematical model of the object (BMO). The control unit (BUIM) implements a method for studying a simulation model; its main purpose is to automate the process of conducting IE.

The purpose of simulation modeling is to construct an IM of an object and carry out IE on it to study the patterns of functioning and behavior, taking into account given restrictions and target functions under conditions of simulation and interaction with the external environment.

Principles and methods for constructing simulation models

The process of functioning of a complex system can be considered as a change in its states, described by its phase variables

Z1(t), Z2(t), …, Zn(t) in n-dimensional space.

The task of simulation modeling is to obtain the trajectory of the system under consideration in the n-dimensional space (Z1, Z2, …, Zn), as well as to calculate certain indicators that depend on the system's output signals and characterize its properties.

In this case, the “movement” of the system is understood in a general sense - as any change occurring in it.

There are two known principles for constructing a process model for the functioning of systems:

1. The Δt principle for deterministic systems

Let us assume that the initial state of the system corresponds to the values Z1(t0), Z2(t0), …, Zn(t0). The Δt principle involves transforming the system model into such a form that the values of Z1, Z2, …, Zn at the time t1 = t0 + Δt can be calculated from the initial values, at the time t2 = t1 + Δt from the values at the previous step, and so on for each i-th step (Δt = const, i = 1, …, M).

For systems where randomness is the determining factor, the Δt principle is as follows:

1. At the first step (t1 = t0 + Δt), the conditional probability distribution of the random vector, denoted (Z1, Z2, …, Zn), is determined. The condition is that the initial state of the system corresponds to the starting point of the trajectory.

2. The coordinates of the trajectory point at t1 = t0 + Δt are computed as the coordinates of a random vector drawn from the distribution found at the previous step.

3. The conditional distribution of the vector at the second step (t2 = t1 + Δt) is found, given the values obtained at the first step, and so on, until ti = t0 + iΔt reaches the final value tM = t0 + MΔt.

The Δt principle is universal and applicable to a wide class of systems. Its disadvantage is that it is uneconomical in terms of machine time.
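As a small illustration (in Python for concreteness; the function names and the toy dynamics are invented for this sketch, not taken from the text), the Δt principle for a deterministic system is simply a fixed-step recurrence in which the state at t(i+1) is computed from the state at t(i):

```python
def simulate_dt(z0, f, dt, steps):
    """Fixed-step (delta-t) principle: the state at t_(i+1) is computed
    from the state at t_i, here with the simplest Euler-type update."""
    z = list(z0)
    trajectory = [tuple(z)]
    for _ in range(steps):
        rates = f(z)                                   # dz/dt at the current state
        z = [zi + dt * ri for zi, ri in zip(z, rates)]
        trajectory.append(tuple(z))
    return trajectory

# Toy system (invented for the sketch): exponential decay dz/dt = -z
traj = simulate_dt([1.0], lambda z: [-z[0]], dt=0.1, steps=10)
```

The uneconomical side of the principle is visible here: every step costs a full evaluation of the model, even on intervals where nothing interesting happens.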

2. The principle of special states (δz principle).

When considering certain types of systems, two kinds of states can be distinguished:

1. Normal states, in which the system spends most of the time, while the Zi(t) (i = 1, …, n) change smoothly;

2. Special states, characteristic of the system at certain moments in time, at which the state of the system changes abruptly.

The principle of special states differs from the Δt principle in that the time step is not constant: it is a random value computed from information about the previous special state.

Examples of systems that have special states are queuing systems. Special states appear when requests are received, when channels are released, etc.
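A minimal sketch of the special-states principle for such a queuing system (Python; the arrival instants, the constant service time, and the function name are invented for illustration): the model clock jumps directly from one special state to the next, with no fixed-step scan of time.

```python
def special_states(arrival_times, service_time):
    """Sketch of the special-states (delta-z) principle for a single-channel
    queue: time advances only at special states (a request arrives, or the
    channel is released), never by a fixed dt."""
    busy_until = 0.0                        # moment the channel is next released
    waits = []
    for t in sorted(arrival_times):
        start = max(t, busy_until)          # service begins at a special state
        waits.append(start - t)             # time the request spent waiting
        busy_until = start + service_time   # next "channel released" state
    return waits

# Invented arrival instants and a constant service time
waits = special_states([0.0, 1.0, 1.5], service_time=2.0)
```

Between two special states nothing is computed at all, which is exactly why this principle is more economical than the Δt scan for such systems.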

Basic methods of simulation modeling

The main methods of simulation modeling are the analytical method, the statistical modeling method, and the combined (analytical-statistical) method.

The analytical method is used to simulate processes mainly for small, simple systems with no randomness factor. The name is conventional, since the method combines the ability to simulate a process whose model is obtained as an analytically closed-form solution or as a solution obtained by methods of computational mathematics.

The statistical modeling method was originally developed as the statistical testing (Monte Carlo) method. This is a numerical method for obtaining estimates of probabilistic characteristics that coincide with solutions of analytical problems (for example, solving equations or calculating a definite integral). Subsequently the method came to be used for simulating processes in systems that contain a source of randomness or are subject to random influences, and it became known as the statistical modeling method.
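The definite-integral example mentioned above can be sketched in a few lines (Python; the function name and the particular integrand are chosen for this sketch): average the integrand at random points and scale by the interval length.

```python
import random

def monte_carlo_integral(f, a, b, n, seed=0):
    """Statistical-test (Monte Carlo) estimate of a definite integral:
    average f over n uniform random points in [a, b], times (b - a)."""
    rng = random.Random(seed)               # seeded for reproducibility
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Integral of x^2 over [0, 1]; the exact value is 1/3
estimate = monte_carlo_integral(lambda x: x * x, 0.0, 1.0, 100_000)
```

The estimate converges slowly (error of order 1/sqrt(n)), which is why the method is valued less for such textbook integrals than for systems with built-in randomness, where analytical solutions are unavailable.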

The combined method (analytical-statistical) allows you to combine the advantages of analytical and statistical modeling methods. It is used in the case of developing a model consisting of various modules representing a set of both statistical and analytical models that interact as a single whole. Moreover, the set of modules can include not only modules corresponding to dynamic models, but also modules corresponding to static mathematical models.

Self-test questions

1. Define what an optimization mathematical model is.

2. What can optimization models be used for?

3. Determine the features of simulation modeling.

4. Characterize the method of statistical modeling.

5. What is a “black box” model, a composition model, a structure model, a “white box” model?

Simulation models

A simulation model reproduces the behavior of a complex system of interacting elements. Simulation modeling is characterized by the presence of the following circumstances (all or some of them simultaneously):

  • the object of modeling is a complex heterogeneous system;
  • the simulated system contains factors of random behavior;
  • it is required to obtain a description of a process developing over time;
  • it is fundamentally impossible to obtain simulation results without using a computer.

The state of each element of the simulated system is described by a set of parameters stored in computer memory in the form of tables. The interactions of the system's elements are described algorithmically. Modeling is carried out in a step-by-step mode; at each step, the values of the system's parameters change. The program implementing the simulation model records the changes in the system's state, producing the required parameter values in the form of tables by time step or in the sequence of events occurring in the system. To visualize the results, graphical representation is often used, including animation.

Deterministic Modeling

A simulation model is based on imitating a real process. For example, when modeling the change (dynamics) in the number of microorganisms in a colony, one can consider many individual objects and follow the fate of each of them, setting certain conditions for survival, reproduction, and so on. These conditions are usually specified verbally: for example, after a certain period of time a microorganism divides into two, and after another (longer) period it dies. The fulfillment of these conditions is implemented algorithmically in the model.
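The verbally specified rules above translate directly into an algorithm. A minimal sketch (Python; the ages, thresholds, and function name are invented for the sketch, not taken from the text):

```python
def colony_step(ages, divide_age=2, death_age=5):
    """One time step of the colony imitation: each organism ages by one
    interval; on reaching divide_age it splits into two newborns; on
    reaching death_age it dies. All thresholds are illustrative."""
    next_gen = []
    for age in ages:
        age += 1
        if age >= death_age:
            continue                      # the organism dies
        if age == divide_age:
            next_gen.extend([0, 0])       # division into two newborn organisms
        else:
            next_gen.append(age)
    return next_gen

population = [0, 1]                       # two organisms of different ages
for _ in range(3):
    population = colony_step(population)
```

Running such a step repeatedly and recording the population size at each step gives exactly the kind of dynamics the paragraph describes, with every individual tracked explicitly.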

Another example: modeling the movement of molecules in a gas, when each molecule is represented as a ball with a certain direction and speed of movement. The interaction of two molecules or a molecule with the wall of a vessel occurs according to the laws of absolutely elastic collision and is easily described algorithmically. The integral (general, averaged) characteristics of the system are obtained at the level of statistical processing of modeling results.

Such a computer experiment in fact claims to reproduce a full-scale experiment. To the question "Why do this?" one can answer: simulation modeling makes it possible to isolate, "in pure form", the consequences of the hypotheses embedded in our ideas about micro-events (i.e., events at the level of system elements), freeing them from the inevitable influence of other factors present in a full-scale experiment, factors we may not even suspect. If such modeling also includes elements of a mathematical description of processes at the micro level, and if the researcher does not set the task of finding a strategy for controlling the results (for example, controlling the size of a colony of microorganisms), then the difference between a simulation model and a mathematical (descriptive) one turns out to be rather conditional.

The above examples of simulation models (the evolution of a colony of microorganisms, the movement of molecules in a gas) lead to a deterministic description of systems. They lack elements of probability and randomness of events in the simulated systems. Let us now consider an example of modeling a system that does possess these qualities.

Models of random processes

Who has not stood in line and impatiently wondered whether he would manage to make a purchase (or pay rent, ride a carousel, etc.) in the time available to him? Or, trying to call a helpline and hearing busy signals several times in a row, you get nervous and wonder whether you will get through or not. From such "simple" problems, at the beginning of the 20th century, a new branch of mathematics was born: queuing theory, which uses the apparatus of probability theory and mathematical statistics, differential equations, and numerical methods. It later turned out that this theory has numerous applications in economics, military affairs, production organization, biology and ecology, etc.

Computer modeling, implemented in the form of the statistical testing method (Monte Carlo method), plays an important role in solving queuing problems. The capabilities of analytical methods for solving real-life queuing problems are very limited, while the statistical testing method is universal and relatively simple.

Let's consider the simplest problem of this class. There is a store with one seller, which customers enter at random times. If the seller is free, he begins serving the buyer immediately; if the seller is busy, or several buyers arrive at once, a queue forms. There are many other similar situations:

  • repair area for motor vehicles and buses that have left the line due to a breakdown;
  • emergency room and patients who came for an appointment due to injury (i.e. without an appointment system);
  • a telephone exchange with one entrance (or one telephone operator) and subscribers who, when the entrance is busy, are put in a queue (such a system is sometimes
    practiced);
  • a local network server and personal computers at the workplace that send a message to a server capable of receiving and processing no more than one message at a time.

The process of customers coming to the store is a random process. The time intervals between the arrivals of any consecutive pair of buyers are independent random variables distributed according to some law, which can only be established through numerous observations (or some plausible version of it is taken for modeling). The second random process in this problem, which is in no way connected with the first, is the duration of service for each customer.

The purpose of modeling systems of this type is to obtain answers to a number of questions. A relatively simple question: what is the average time one will have to stand in the queue, for given distribution laws of the random variables above? A harder question: what is the distribution of waiting times for service in the queue? An equally hard question: at what ratios of the parameters of the input distributions does a crisis occur, in which the turn of a newly arrived buyer will never be reached? When you think about this seemingly simple task, the possible questions multiply.

In general terms, the modeling method looks like this. The mathematical formulas used are the distribution laws of the initial random variables; the numerical constants used are the empirical parameters included in these formulas. No equations that would be used in an analytical study of this problem are solved. Instead, the queue is played out by computer programs that generate random numbers with the given distribution laws. Statistical processing is then carried out on the set of obtained values of the quantities determined by the modeling goals. For example, the optimal number of sellers is found for different periods of store opening hours, one that ensures the absence of queues. The mathematical apparatus used here belongs to the methods of mathematical statistics.
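The playing-out of such a queue can be sketched very compactly (Python; the exponential distributions, the parameter values, and the function name are invented for this sketch, since the text deliberately leaves the laws unspecified):

```python
import random

def simulate_store(n_customers, mean_gap, mean_service, seed=1):
    """Play out the one-seller queue: random inter-arrival gaps and random
    service times (exponential here; any other law could be substituted),
    then return the average waiting time."""
    rng = random.Random(seed)
    arrival = 0.0            # arrival instant of the current customer
    free_at = 0.0            # instant the seller becomes free
    total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(1.0 / mean_gap)   # next customer arrives
        start = max(arrival, free_at)                # wait if the seller is busy
        total_wait += start - arrival
        free_at = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_customers

avg_wait = simulate_store(10_000, mean_gap=2.0, mean_service=1.0)
```

No equation is solved anywhere: the average waiting time emerges from statistical processing of the played-out queue, exactly as the paragraph describes.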

Another example of simulation modeling is described in the article "Modeling Ecological Systems and Processes": one of many models of the predator-prey system. Individuals of the species that are in these relationships move according to certain rules containing elements of chance; predators eat prey; both reproduce; and so on. Such a model contains no mathematical formulas, but it does require statistical processing of the results.

Example of a deterministic algorithm simulation model

Let's consider a simulation model of the evolution of a population of living organisms, known as "Life", which is easy to implement in any programming language.

To construct the game algorithm, consider a square field of n + 1 columns and rows, numbered from 0 to n. For convenience, we define the outermost boundary columns and rows as a "dead zone"; they play only an auxiliary role.

For any internal cell of the field with coordinates (i, j), 8 neighbors can be defined. If a cell is "live", we paint it over; if it is "dead", it is left empty.

Let's set the rules of the game. If cell (i,j) is “live” and is surrounded by more than three “live” cells, it dies (from overcrowding). A “living” cell also dies if there are less than two “living” cells in its environment (from loneliness). A “dead” cell comes to life if three “live” cells appear around it.

For convenience, we introduce a two-dimensional array A whose elements take the value 0 if the corresponding cell is empty and 1 if the cell is "live". Then the algorithm for determining the state of the cell with coordinates (i, j) in the next generation can be written as follows:

B[i, j] := A[i, j];
S := A[i-1, j-1] + A[i-1, j] + A[i-1, j+1] + A[i, j-1]
   + A[i, j+1] + A[i+1, j-1] + A[i+1, j] + A[i+1, j+1];
If (A[i, j] = 1) And ((S > 3) Or (S < 2)) Then B[i, j] := 0;
If (A[i, j] = 0) And (S = 3) Then B[i, j] := 1;

Here the array B defines the state of the field at the next step. The above holds for all internal cells, from i = 1 to n - 1 and j = 1 to n - 1. Subsequent generations are determined similarly; one only needs to carry out the reassignment:

For I := 1 To N - 1 Do
  For J := 1 To N - 1 Do
    A[I, J] := B[I, J];

It is more convenient to display the field status on the display screen not in a matrix form, but in a graphical form.
All that remains is to define a procedure for setting the initial configuration of the playing field. For a random initial state of the cells, the following algorithm is suitable:

For I := 1 To K Do
Begin
  K1 := Random(N - 1) + 1;
  K2 := Random(N - 1) + 1;
  A[K1, K2] := 1   { mark K random internal cells as "live" }
End;

It is more interesting for the user to set the initial configuration himself, which is easy to implement. As a result of experiments with this model, one can find, for example, stable settlements of living organisms that never die, remaining unchanged or changing their configuration periodically. An absolutely unstable settlement (perishing by the second generation) is the "cross".
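For readers who prefer a complete, runnable version of the fragments above, here is the same step rule in Python (the function name and the choice of the 2x2 "block" as a test configuration are ours; the block is a well-known stable settlement of the kind the text mentions):

```python
def life_step(grid):
    """One generation of "Life" on a square 0/1 grid; the outermost rows and
    columns are the auxiliary "dead zone" and are never updated."""
    n = len(grid)
    nxt = [row[:] for row in grid]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            s = sum(grid[i + di][j + dj]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0))          # the 8 neighbours
            if grid[i][j] == 1 and (s > 3 or s < 2):
                nxt[i][j] = 0        # dies from overcrowding or loneliness
            elif grid[i][j] == 0 and s == 3:
                nxt[i][j] = 1        # a dead cell comes to life
    return nxt

# A 2x2 "block": a stable settlement that survives unchanged forever
block = [[0] * 5 for _ in range(5)]
block[1][1] = block[1][2] = block[2][1] = block[2][2] = 1
```

Applying `life_step` to the block returns the block itself, which is exactly the kind of experimental observation the paragraph invites.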

In a basic computer science course, students can implement the Life simulation model as part of the Introduction to Programming section. A more thorough mastery of simulation modeling can occur in high school in a specialized or elective course in computer science. This option will be discussed below.

The beginning of the study is a lecture on simulation modeling of random processes. In Russian schools, the concepts of probability theory and mathematical statistics are just beginning to be introduced into mathematics courses, and the teacher should be prepared to make an introduction to this material, which is essential for the formation of a worldview and mathematical culture. We emphasize that we are talking about an elementary introduction to the range of concepts being discussed; this can be done in 1-2 hours.

Then we discuss technical issues related to the computer generation of sequences of random numbers with a given distribution law. Here we can rely on the fact that every universal programming language has a built-in generator of random numbers uniformly distributed on the interval from 0 to 1. At this stage it is inappropriate to go into the complex question of how such a generator is implemented. Based on the existing generator, we show how to construct:

a) a generator of uniformly distributed random numbers on an arbitrary segment [a, b];

b) a random number generator for almost any distribution law (for example, using the intuitively clear "selection-rejection" method).
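Both constructions a) and b) fit in a few lines (Python; the triangular density used as an illustration and the function names are our own choices, not from the text):

```python
import random

rng = random.Random(123)   # the built-in uniform [0, 1) generator, seeded

def uniform_on(a, b):
    """a) uniformly distributed random numbers on an arbitrary segment [a, b]."""
    return a + (b - a) * rng.random()

def rejection_sample(density, a, b, density_max):
    """b) the "selection-rejection" method: throw points uniformly into the
    rectangle [a, b] x [0, density_max] and accept x when the point falls
    under the density curve."""
    while True:
        x = uniform_on(a, b)
        if rng.random() * density_max <= density(x):
            return x

# Illustration: triangular density f(x) = 2x on [0, 1] (maximum value 2)
sample = [rejection_sample(lambda x: 2.0 * x, 0.0, 1.0, 2.0)
          for _ in range(1000)]
```

The rejection method needs only the ability to evaluate the density and a bound on its maximum, which is why it works for "almost any" law.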

It is advisable to begin the consideration of the queuing problem described above with the history of solving such problems (Erlang's problem of servicing requests at a telephone exchange). Then comes the simplest problem, which can be formulated using the example of forming and examining a queue in a store with one seller. Note that at the first stage of modeling the input random variables can be assumed uniformly distributed, which, although unrealistic, removes a number of difficulties (the generator built into the programming language can be used directly).

We draw students' attention to the questions posed first when modeling systems of this type. First of all, this is the calculation of average values (mathematical expectations) of certain random variables. For example, what is the average time one has to wait in line at the counter? Or: find the average time the seller spends waiting for a buyer.

The teacher's task, in particular, is to explain that sample means are themselves random variables; in another sample of the same size they will take different values (with large sample sizes, not too different from each other). Further options are possible: with a more prepared audience, one can show how to estimate the confidence intervals that contain the mathematical expectations of the corresponding random variables at given confidence probabilities (using methods known from mathematical statistics, without attempting to justify them). With a less prepared audience, one can limit oneself to a purely empirical statement: if in several samples of equal size the average values coincide to a certain decimal place, then that digit is most likely correct. If the simulation fails to achieve the desired accuracy, the sample size should be increased.
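The confidence-interval construction mentioned for the more prepared audience can be sketched as follows (Python; the exponential sample standing in for simulated waiting times, and all names, are invented for the sketch; the normal approximation is the standard one, stated here without justification):

```python
import math
import random

def mean_with_ci(sample, z=1.96):
    """Sample mean plus an approximate 95% confidence half-width
    (normal approximation with the usual unbiased variance estimate)."""
    n = len(sample)
    m = sum(sample) / n
    var = sum((x - m) ** 2 for x in sample) / (n - 1)
    return m, z * math.sqrt(var / n)

# Stand-in for simulated waiting times (exponential with mean 1, invented)
rng = random.Random(42)
waits = [rng.expovariate(1.0) for _ in range(2000)]
m, half = mean_with_ci(waits)
```

The half-width shrinks like 1/sqrt(n), which matches the empirical advice above: to get one more reliable decimal digit, the sample size must grow roughly a hundredfold.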

For an even more mathematically prepared audience, one can pose the question: what is the distribution of the random variables produced by statistical modeling, given the distributions of the random variables that serve as its input parameters? Since presenting the corresponding mathematical theory is impossible in this case, we should limit ourselves to empirical techniques: constructing histograms of the final distributions and comparing them with several typical distribution functions.

After mastering the initial skills of such modeling, we move on to a more realistic model in which the input flows of random events are distributed, for example, according to a Poisson law. This requires students to additionally master a method for generating sequences of random numbers with the specified distribution law.
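One standard way to generate such a Poisson flow of arrivals (a sketch in Python; the intensity, horizon, and function name are invented for illustration) is to draw exponential inter-arrival gaps from the uniform generator by the inverse-transform formula:

```python
import math
import random

def poisson_flow(rate, horizon, seed=7):
    """Arrival instants of a Poisson flow of intensity `rate` on (0, horizon]:
    inter-arrival gaps are exponential, produced from the built-in uniform
    generator by the inverse-transform formula -ln(1 - U) / rate."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += -math.log(1.0 - rng.random()) / rate   # exponential gap
        if t > horizon:
            return times
        times.append(t)

arrivals = poisson_flow(rate=2.0, horizon=100.0)    # rate and horizon invented
```

The expected number of arrivals on the horizon is rate x horizon, so students can check their generator empirically before plugging it into the queue model.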

In the problem considered, as in any more complex problem about queues, a critical situation may arise when the queue grows without limit with time. Modeling the approach to a critical situation as one of the parameters increases is an interesting research task for the most prepared students.

Using the queue problem as an example, several new concepts and skills are practiced at once:

  • concepts of random processes;
  • concepts and basic skills of simulation modeling;
  • construction of optimization simulation models;
  • building multi-criteria models (by solving problems about the most rational customer service in combination with the interests of
    store owner).

Exercise:

    1. Draw up a diagram of key concepts;
    2. Select practical tasks with solutions for basic and specialized computer science courses.

A model is an abstract description of a system whose level of detail is determined by the researcher. A person decides whether a given element of the system is essential and, therefore, whether it will be included in the system description. This decision is made with regard to the purpose underlying the model's development. The success of modeling depends on how well the researcher can identify the essential elements and the relationships between them.

A system is viewed as consisting of many interrelated elements combined to perform a specific function. The definition of a system is largely subjective: it depends not only on the purpose of the model but also on who exactly defines the system.

So, the modeling process begins with defining the goal of developing the model, on the basis of which the system boundaries and the required level of detail of the simulated processes are established. The chosen level of detail should make it possible to abstract from those aspects of a real system's functioning that are not precisely defined due to lack of information. In addition, the system description must include the criteria for the system's effectiveness and the alternative solutions to be evaluated, which can be considered part of the model or its inputs. Evaluations of alternative solutions against the given performance criteria are treated as model outputs. Typically, evaluating alternatives requires changes to the system description and, therefore, restructuring of the model, so in practice model building is an iterative process. Once recommendations can be made based on the assessments of the alternatives, implementation of the modeling results can begin. The recommendations should clearly state both the main decisions and the conditions for their implementation.

Simulation modeling (in a broad sense) is the process of constructing a model of a real system and conducting experiments on this model in order either to understand the behavior of the system or to evaluate (within the imposed constraints) various strategies that ensure the functioning of this system.

Simulation modeling (in a narrow sense) is a representation of the dynamic behavior of a system by moving it from one state to another in accordance with well-known operating rules (algorithms).

So, to create a simulation model, it is necessary to identify and describe the state of the system and the algorithms (rules) for changing it. This is then written in terms of some modeling tool (algorithmic language, specialized language) and processed on a computer.

A simulation model (IM) is a logical-mathematical description of a system that can be used during experiments on a digital computer.

An IM can be used to design, analyze, and evaluate the functioning of systems. Machine experiments are carried out with the IM, which allow conclusions to be drawn about the behavior of the system:

· without constructing it, if it is a system being designed;

· without interfering with its functioning, if it is an existing system, experimentation with which is impossible or undesirable (high costs, danger);

· without destroying the system, if the purpose of the experiment is to determine the impact on it.

The process of forming a simulation model can be briefly represented as follows (Fig.2):

Fig.2. Scheme for forming a simulation model

Conclusion: IM is characterized by the reproduction of phenomena described by a formalized process scheme, preserving their logical structure, their sequence in time, and sometimes their physical content.

Simulation modeling (IM) on a computer is widely used in the study and control of complex discrete systems (CDS) and the processes occurring in them. Such systems include economic and industrial facilities, seaports, airports, oil and gas pumping complexes, irrigation systems, software for complex control systems, computer networks and many others. The widespread use of IM is explained by the fact that the size of the problems being solved and the poor formalizability of complex systems do not allow the use of strict optimization methods.

By imitation we will understand a numerical method of conducting computer experiments with mathematical models that describe the behavior of complex systems over a long period of time.

A simulation experiment is a reproduction of a process occurring in the CDS over a long period (a minute, a month, a year, etc.), which usually takes several seconds or minutes of computer time. However, there are problems for which so many calculations must be carried out during modeling (as a rule, problems related to control systems, simulation support for optimal decision making, the development of effective control strategies, etc.) that the IM runs slower than the real system. Therefore, the ability to simulate a long period of CDS operation in a short time is not the most important thing that simulation provides.

Simulation capabilities:

1. Machine experiments are carried out with the IM, which allow us to draw conclusions about the behavior of the system:

· without its construction, if it is a designed system;

· without interfering with its functioning, if it is an existing system, experimentation with which is impossible or undesirable (expensive, dangerous);

· without its destruction, if the purpose of the experiment is to determine the maximum impact on the system.

2. Experimentally explore complex interactions within the system and understand the logic of its functioning.

3. Study the impact of external and internal random disturbances.

4. Investigate the degree of influence of system parameters on performance indicators.

5. Test new management and decision-making strategies in operational control.

6. Predict and plan the functioning of the system in the future.

7. Conduct staff training.

The basis of the simulation experiment is the model of the simulated system.

IM was developed to model complex stochastic systems - discrete, continuous, combined.

Modeling means that successive moments in time are specified and the state of the model is calculated by the computer sequentially at each of these moments. To do this, it is necessary to set a rule (algorithm) for the transition of the model from one state to the next, that is, a transformation Z_i → Z_{i+1},

where Z_i is the state of the model at the i-th moment in time, which is a vector.

Let us introduce into consideration:

X_i – the vector of the state of the external environment (model input) at the i-th moment of time;

U_i – the control vector at the i-th moment of time.

Then the IM is determined by specifying an operator Φ with the help of which the state of the model at the next moment in time can be determined from the state at the current moment, the control vector and the state of the external environment.

Let us write this transformation in recurrent form:

Z_{i+1} = Φ(Z_i, X_i, U_i).

The operator Φ, with its structure and parameters, defines a simulation model of a complex system.

An important advantage of IM is the ability to take into account uncontrolled factors of the modeled object, which form a vector W_i. Then we have:

Z_{i+1} = Φ(Z_i, X_i, U_i, W_i).
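The recurrent scheme described above can be sketched in Python; the specific transition operator, controller and noise distribution below are illustrative assumptions, not part of the original text:

```python
import random

# next state = Phi(state, environment input, control, uncontrolled factors)

def phi(z, x, u, w):
    """Transition operator: a simple linear update, for illustration."""
    return 0.9 * z + x + u + w

def controller(z, target=10.0):
    """Control system algorithm: proportional control toward a target."""
    return 0.1 * (target - z)

def run(n_steps, z0=0.0, seed=42):
    rng = random.Random(seed)
    z = z0
    trajectory = [z]
    for _ in range(n_steps):
        x = 1.0                    # environment input (constant here)
        u = controller(z)          # control
        w = rng.gauss(0.0, 0.1)    # uncontrolled random factor
        z = phi(z, x, u, w)
        trajectory.append(z)
    return trajectory

traj = run(200)
print(round(traj[-1], 1))  # the state settles near the target, up to noise
```

With these particular choices the closed loop contracts toward the target value, so the trajectory stabilizes near 10 despite the random disturbances.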

Simulation model is a logical-mathematical description of a system that can be used during experiments on a computer.

Fig.3. Composition of the IM of a complex system

Returning to the problem of simulation modeling of a complex system, let us conditionally distinguish within the IM: a model of the controlled object, a model of the control system, and a model of internal random disturbances (Fig.3).

The inputs of the controlled object model are divided into controlled inputs (controls) and uncontrolled inputs (disturbances). The latter are generated by random number generators according to a given distribution law. The controls, in turn, are the output of the control system model, and the disturbances are the output of the random number generators (the model of internal disturbances).

The control system model implements the control algorithm.

Simulation allows you to study the behavior of a simulated object over a long period of time – dynamic simulation. In this case, as mentioned above, the index i is interpreted as the number of a moment in time. In addition, the behavior of the system at a certain point in time can be studied – static simulation – in which case i is treated as a state number.

With dynamic simulation, time can change with a constant or a variable step (Fig.4):

Fig.4. Dynamic simulation

Here g_i are the moments of events in the CDS, g*_i are the moments of events in dynamic simulation with a constant step, and g'_i are the moments of events with a variable step.

With a constant step, the implementation is simpler, but the accuracy is lower, and there may be "empty" (that is, superfluous) time points at which the state of the model is calculated even though no event occurs.

With a variable step, time moves from event to event. This reproduces the process more accurately and avoids unnecessary calculations, but it is more difficult to implement.
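The two time-advance schemes can be contrasted in a short Python sketch; the event times and step size are illustrative assumptions:

```python
# Constant-step versus next-event time advance.

event_times = [0.7, 1.3, 1.35, 4.0]   # moments g_i when the system changes

def fixed_step_moments(horizon, dt):
    """Constant step: the clock visits every multiple of dt, even 'empty' ones."""
    n = int(horizon / dt)
    return [round(k * dt, 10) for k in range(n + 1)]

def next_event_moments(events):
    """Variable step: the clock jumps directly from event to event."""
    return sorted(events)

grid = fixed_step_moments(4.0, 0.5)      # 9 moments, most with no event
jumps = next_event_moments(event_times)  # exactly the 4 event moments
print(len(grid), len(jumps))
```

Note also that the constant-step grid misses the close pair of events at 1.3 and 1.35 unless dt is made very small, which is exactly the accuracy/cost trade-off described above.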

Basic conclusions from the above:

1. IM is a numerical method and should be used when other methods cannot be applied. For complex systems it is currently the main research method.

2. Imitation is an experiment, which means that the theory of experiment design and of processing experimental results must be applied when conducting it.

3. The more accurately the behavior of the modeled object must be described, the more accurate the model has to be. The more accurate the model, the more complex it is and the more computer resources and research time it requires. Therefore, a compromise must be sought between the accuracy of the model and its simplicity.

Examples of tasks to be solved: analysis of system designs at various stages, analysis of existing systems, use in control systems, use in optimization systems, etc.

Simulation model – a description of a system and its behavior that can be implemented and studied during computer experiments.

Simulation modeling is most often used to describe the properties of a large system, provided that the behavior of its constituent objects is very simple and clearly formulated. The mathematical description is then reduced to the level of statistical processing of the modeling results when finding the macroscopic characteristics of the system. Such a computer experiment actually claims to reproduce a full-scale experiment. Simulation modeling is a special case of mathematical modeling. There is a class of objects for which, for various reasons, analytical models have not been developed, or no method for solving the resulting model exists. In this case, the mathematical model is replaced by a simulator or simulation model. Simulation modeling allows you to test hypotheses and explore the influence of various factors and parameters.

Simulation modeling is a method that allows you to build models that describe processes as they would occur in reality.

Such a model can be "run" over time both for a single test and for a given set of tests. In this case, the results will be determined by the random nature of the processes. From these data, fairly stable statistics can be obtained. Experimenting with a model is called imitation.

Imitation – comprehension of the essence of a phenomenon without experiments on the real object.

Imitation as a method for solving nontrivial problems received its initial development in connection with the creation of computers in the 1950s–1960s. Types of imitation: the Monte Carlo method (the statistical test method); the simulation method (statistical modeling).
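As a minimal example of the Monte Carlo (statistical test) method, the following Python sketch estimates pi by random sampling; the sample size and seed are arbitrary choices:

```python
import random

# Estimate pi by sampling random points in the unit square and counting
# how many fall inside the quarter circle of radius 1.

def estimate_pi(n_samples, seed=0):
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * hits / n_samples

print(estimate_pi(100_000))  # close to 3.14, within statistical error
```

The accuracy improves only as the square root of the sample size, which is typical of statistical test methods.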

The demand for simulation modeling arises when: 1) experimenting on the real object is expensive or impossible; 2) an analytical model cannot be built because the system involves time, causal relationships, consequences, nonlinearities and random variables; 3) the behavior of the system over time must be simulated.

Purpose of Simulation Modeling – reproduction of the behavior of the system under study based on the results of an analysis of the most significant relationships between its elements (development of a simulator of the subject area under study for conducting various experiments).

Types of simulation modeling.

Agent-based modeling – a relatively new (1990s–2000s) direction in simulation modeling, used to study decentralized systems whose dynamics are determined not by global rules and laws (as in other modeling paradigms) but, on the contrary, by the individual activity of group members: the global rules and laws emerge as its result. The goal of agent-based models is to gain an understanding of these global rules and of the overall behavior of the system, based on assumptions about the individual, private behavior of its active objects and their interactions in the system. An agent is an entity that has activity and autonomous behavior; it can make decisions according to a certain set of rules, interact with the environment, and change on its own.
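A toy Python sketch of the agent-based idea (the pairwise-averaging rule and all parameters are illustrative assumptions): no global law is programmed, yet a global consensus emerges from purely local interactions:

```python
import random

# Each agent only averages its opinion with a randomly met neighbor.
# No agent ever sees the whole group, yet the group converges.

def run_agents(n_agents=50, n_meetings=5000, seed=3):
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]
    for _ in range(n_meetings):
        a = rng.randrange(n_agents)
        b = rng.randrange(n_agents)
        mid = (opinions[a] + opinions[b]) / 2.0   # local interaction only
        opinions[a] = opinions[b] = mid
    return opinions

final = run_agents()
spread = max(final) - min(final)
print(spread < 0.01)  # opinions converge to a near-consensus
```

The emergent consensus value is not written anywhere in the rules; it is a global property produced by the individual behavior of the agents, which is exactly the point of the paradigm.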

Discrete event simulation – an approach to modeling that proposes to abstract from the continuous nature of events and consider only the main events of the simulated system, such as "waiting", "order processing", "moving with cargo", "unloading", etc. Discrete-event modeling is the most developed and has a huge range of applications – from logistics and queuing systems to transport and production systems. This type of modeling is most suitable for modeling production processes. Founded by Geoffrey Gordon in the 1960s.
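A minimal Python sketch of discrete-event simulation in this spirit: a single-server queue advanced from event to event via an event list (the rates, horizon and seed are illustrative assumptions):

```python
import heapq
import random

# The model clock jumps between "arrival" and "departure" events kept
# in a priority queue ordered by event time.

def simulate_queue(horizon, arrival_rate=1.0, service_rate=1.5, seed=7):
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    queue_len, served = 0, 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            queue_len += 1
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
            if queue_len == 1:   # server was idle, start service immediately
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
        else:                    # departure: a customer finishes service
            queue_len -= 1
            served += 1
            if queue_len > 0:
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
    return served

print(simulate_queue(1000.0))  # roughly arrival_rate * horizon customers served
```

Between events nothing is computed, which is precisely the variable-step time advance discussed earlier.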

System dynamics – for the system under study, graphical diagrams of causal relationships and of the global influence of some parameters on others over time are constructed, and the model created on the basis of these diagrams is then simulated on a computer. In essence, this type of modeling, more than any other paradigm, helps one understand the essence of what is happening and identify cause-and-effect relationships between objects and phenomena. Using system dynamics, models of business processes, city development, production, population dynamics, ecology and epidemic development are built. The method was founded by Jay Forrester in the 1950s.
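The stock-and-flow idea of system dynamics can be sketched in Python as one stock (population) changed by two flows, integrated with Euler steps; all rates here are illustrative assumptions:

```python
# One stock (population) with an inflow (births) and an outflow (deaths),
# stepped forward in time with simple Euler integration.

def simulate_population(years, pop=1000.0, birth_rate=0.03,
                        death_rate=0.01, dt=1.0):
    trajectory = [pop]
    steps = int(years / dt)
    for _ in range(steps):
        births = birth_rate * pop   # inflow into the stock
        deaths = death_rate * pop   # outflow from the stock
        pop += (births - deaths) * dt
        trajectory.append(pop)
    return trajectory

traj = simulate_population(50)
print(round(traj[-1]))  # exponential growth at a net rate of about 2% per year
```

Real system-dynamics models add feedback loops between several stocks; the single-loop sketch above only shows the basic integration mechanism.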

Some areas of application of simulation modeling: business processes, combat, population dynamics, traffic, IT infrastructure, project management, ecosystems. Popular computer simulation systems: AnyLogic, Aimsun, Arena, eM-Plant, Powersim, GPSS.

Simulation modeling allows you to simulate the behavior of a system over time. Moreover, time in the model can be controlled: slowed down for fast processes and accelerated for modeling systems with slow variability. It is possible to imitate the behavior of objects with which real experiments are expensive, impossible or dangerous.
