What is a computer experiment? Computer experiment and computer simulation. The computer experiment in physics

In the definition presented above, the term "experiment" has a dual meaning. On the one hand, in a computer experiment, just as in a real one, one studies the responses of a system to changes in its parameters or to external influences. Temperature, density and composition are typical parameters, while the influences are most often realized through mechanical, electric or magnetic fields. The only difference is that in a real experiment the experimenter deals with the actual system, whereas a computer experiment examines the behavior of a mathematical model of a real object. On the other hand, the ability to obtain rigorous results for well-defined models makes it possible to use the computer experiment as an independent source of information for testing the predictions of analytical theories; in this capacity, simulation results serve as the same kind of reference standard as experimental data.

It follows that two very different approaches to setting up a computer experiment are possible; the choice between them is dictated by the nature of the problem being solved and, in turn, determines the choice of model description.

First, calculations by the MD or MC methods can pursue purely utilitarian goals related to predicting the properties of a specific real system and comparing them with a physical experiment. In this case it becomes possible to make interesting predictions and to carry out studies under extreme conditions, for example at ultrahigh pressures or temperatures, when a real experiment is impossible for various reasons or would require excessive material costs. Computer simulation is often the only way to obtain the most detailed ("microscopic") information about the behavior of a complex molecular system. This was demonstrated particularly clearly by dynamic numerical experiments with various biosystems: globular proteins in the native state, DNA and RNA fragments, and lipid membranes. In a number of cases the data obtained forced a revision, or a significant change, of previously existing ideas about the structure and functioning of these objects. It should be borne in mind, however, that such calculations employ various kinds of valence and non-valence potentials that only approximate the true interactions of atoms, and it is this circumstance that ultimately determines the degree of correspondence between the model and reality. Initially the inverse problem is solved: the potentials are calibrated against the available experimental data, and only then are these potentials used to obtain more detailed information about the system. Sometimes the parameters of interatomic interactions can, in principle, be found from quantum chemical calculations performed for simpler model compounds. When modeling by the MD or MC methods, a molecule is treated not as a set of electrons and nuclei obeying the laws of quantum mechanics, but as a system of bound classical particles, that is, atoms. Such a model is called a mechanical model of a molecule.
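To make the notion of a "mechanical model of a molecule" concrete, here is a minimal sketch (in Python, with purely illustrative functional forms and parameter values, not those of any real force field) of a potential energy written as a sum of valence (bonded) and non-valence (non-bonded) terms:

```python
import numpy as np

def bond_energy(r, r0=1.0, k=100.0):
    """Harmonic valence (bonded) term: penalizes deviation of a bond length r from r0."""
    return 0.5 * k * (r - r0) ** 2

def lj_energy(r, epsilon=0.2, sigma=1.0):
    """Lennard-Jones non-valence (non-bonded) term between two atoms a distance r apart."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

def total_energy(coords, bonds):
    """Total classical energy of a 'mechanical molecule': bonded terms over the bond list
    plus non-bonded terms over all atom pairs that are not directly bonded."""
    coords = np.asarray(coords, dtype=float)
    n = len(coords)
    bonded = set(tuple(sorted(b)) for b in bonds)
    e = 0.0
    for i, j in bonds:
        e += bond_energy(np.linalg.norm(coords[i] - coords[j]))
    for i in range(n):
        for j in range(i + 1, n):
            if (i, j) not in bonded:
                e += lj_energy(np.linalg.norm(coords[i] - coords[j]))
    return e

# Example: a bent three-atom "molecule" (atoms 0-1 and 1-2 are bonded)
coords = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.5, 0.9, 0.0)]
print(total_energy(coords, bonds=[(0, 1), (1, 2)]))
```

Within such a mechanical model, an MD or MC calculation simply explores configurations of the atoms governed by an energy function of this general kind.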

The goal of the other approach to setting up a computer experiment may be to understand the general (universal, or model-invariant) patterns in the behavior of the system under study, that is, patterns determined only by the most typical features of a given class of objects and not by the details of the chemical structure of a particular compound. In other words, in this case the computer experiment aims at establishing functional relationships rather than at calculating numerical parameters. This ideology is most clearly present in the scaling theory of polymers. From this point of view, computer modeling acts as a theoretical tool that first of all allows one to check the conclusions of existing analytical methods of the theory or to supplement their predictions. The interaction between analytical theory and computer experiment can be very fruitful when both approaches manage to use identical models. The most striking example of such generalized models of polymer molecules is the so-called lattice model. Many theoretical constructions have been based on it, in particular those related to the classical and, in a sense, central problem of polymer physical chemistry: the effect of volume interactions on the conformation and, accordingly, on the properties of a flexible polymer chain. Volume interactions are usually understood as the short-range repulsive forces that arise between units distant along the chain when they approach each other in space as a result of random bending of the macromolecule. In the lattice model a real chain is represented as a broken trajectory that passes through the sites of a regular lattice of a given type: cubic, tetrahedral, and so on. The occupied lattice sites correspond to polymer units (monomers), and the segments connecting them to the chemical bonds in the backbone of the macromolecule. The prohibition of self-intersections of the trajectory (in other words, the impossibility of two or more monomers occupying the same lattice site simultaneously) models the volume interactions (Fig. 1). That is, if the MC method is used and a randomly selected unit, when displaced, falls onto an already occupied site, the new conformation is discarded and is not taken into account in the calculation of the system parameters of interest. Different arrangements of the chain on the lattice correspond to different conformations of the polymer chain, and the required characteristics, for example the end-to-end distance R of the chain, are averaged over them.
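As an illustration of the rejection rule just described, the following minimal sketch (illustrative only, not a complete or ergodic SAW sampler) performs single Monte Carlo end-moves of a chain on a square lattice and discards a move if the target site is already occupied:

```python
import random

# Chain on a square lattice: list of (x, y) sites visited by successive monomers.
chain = [(i, 0) for i in range(10)]          # initial straight conformation, N = 10
occupied = set(chain)                        # fast lookup for the excluded-volume check
NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def end_move(chain, occupied):
    """One MC trial: move a chain end to a random site adjacent to its neighbour.
    The move is rejected (the conformation is discarded) if the target site is occupied."""
    end = random.choice([0, len(chain) - 1])
    anchor = chain[1] if end == 0 else chain[-2]      # monomer the end stays bonded to
    dx, dy = random.choice(NEIGHBOURS)
    new_site = (anchor[0] + dx, anchor[1] + dy)
    if new_site in occupied:                          # volume interaction: site already taken
        return False                                  # reject: keep the old conformation
    occupied.discard(chain[end])
    occupied.add(new_site)
    chain[end] = new_site
    return True

accepted = sum(end_move(chain, occupied) for _ in range(10_000))
print("acceptance ratio:", accepted / 10_000)
```

Averages of quantities such as the squared end-to-end distance are then accumulated over the accepted conformations.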

The study of such a model makes it possible to understand how volume interactions affect the dependence of the root-mean-square end-to-end distance ⟨R²⟩^(1/2) on the number of units in the chain N. This quantity, which determines the average size of the polymer coil, plays the main role in various theoretical constructions and can be measured experimentally; however, there is still no exact analytical formula for its dependence on N in the presence of volume interactions. It is also possible to introduce an additional energy of attraction between those pairs of units that occupy neighboring lattice sites. By varying this energy in a computer experiment one can, in particular, investigate an interesting phenomenon known as the "coil-globule" transition, in which, owing to intramolecular attractive forces, an unfolded polymer coil collapses into a compact structure, a globule resembling a microscopic liquid drop. Understanding the details of this transition is important for developing the most general ideas about the course of the biological evolution that led to the emergence of globular proteins.
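For orientation, the scaling behavior that such lattice calculations probe is usually summarized as follows (standard textbook results quoted here for reference, not derived in this text): for an ideal chain without volume interactions ⟨R²⟩^(1/2) ∝ N^(1/2), while for a self-avoiding chain in three dimensions ⟨R²⟩^(1/2) ∝ N^ν with ν ≈ 0.588; the Flory mean-field estimate ν = 3/5 is close to this numerical value.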

There are various modifications of lattice models, for example those in which the bond lengths between units are not fixed but can vary within a certain interval, with only the prohibition of chain self-intersections guaranteed; this is how the widely used "fluctuating bond" model is arranged. What all lattice models have in common, however, is that they are discrete: the number of possible conformations of such a system is always finite (although it can be astronomically large even for a relatively small number of units in the chain). Discrete models have very high computational efficiency but, as a rule, can only be investigated by the Monte Carlo method.

In some cases continuous generalized models of polymers are used, which can change their conformation in a continuous manner. The simplest example is a chain made up of a given number N of solid beads connected in series by rigid or elastic links. Such systems can be studied both by the Monte Carlo method and by the molecular dynamics method.
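A minimal sketch of how such a bead chain could be advanced by the molecular dynamics method, assuming harmonic springs between successive beads and neglecting non-bonded (excluded-volume) forces for brevity; all parameters are illustrative:

```python
import numpy as np

N, k_spring, r0, dt, mass = 10, 50.0, 1.0, 0.005, 1.0

pos = np.cumsum(np.ones((N, 3)) * [r0, 0.0, 0.0], axis=0)  # straight initial chain
vel = 0.1 * np.random.randn(N, 3)                           # small random "thermal" velocities

def spring_forces(pos):
    """Forces from harmonic springs connecting successive beads of the chain."""
    f = np.zeros_like(pos)
    d = pos[1:] - pos[:-1]                        # bond vectors
    r = np.linalg.norm(d, axis=1, keepdims=True)
    fb = k_spring * (r - r0) * d / r              # force on the 'left' bead of each bond
    f[:-1] += fb
    f[1:] -= fb
    return f

# velocity Verlet integration
f = spring_forces(pos)
for step in range(1000):
    pos += vel * dt + 0.5 * (f / mass) * dt ** 2
    f_new = spring_forces(pos)
    vel += 0.5 * (f + f_new) / mass * dt
    f = f_new

print("end-to-end distance:", np.linalg.norm(pos[-1] - pos[0]))
```

The same bead-chain energy could equally well be sampled by Monte Carlo trial displacements instead of integrating the equations of motion.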


LECTURE

Topic: Computer experiment. Analysis of simulation results

To give life to new design developments, introduce new technical solutions into production, or test new ideas, an experiment is needed. An experiment is a trial performed with an object or a model: certain actions are carried out, and one determines how the experimental sample reacts to them. At school you conduct experiments in biology, chemistry, physics and geography lessons; at enterprises, experiments are carried out when testing new product samples. Usually a specially created setup is used for this purpose, making it possible to conduct the experiment under laboratory conditions, or the real product itself is subjected to all kinds of tests (a full-scale experiment). To study, for example, the operational properties of a unit or assembly, it is placed in a thermostat, frozen in special chambers, tested on vibration stands, dropped, and so on. It is not so bad if it is a new watch or a vacuum cleaner: the loss upon destruction is small. But what if it is an airplane or a rocket? Laboratory and full-scale experiments require large material costs and much time, yet their value is nevertheless very great.

With the development of computer technology a new, unique research method has appeared: the computer experiment. In many cases computer model studies have come to help, and sometimes even to replace, experimental samples and test benches. The stage of conducting a computer experiment includes two steps: drawing up an experiment plan and conducting the study.

Experiment plan. The experiment plan should clearly reflect the sequence of work with the model. The first point of such a plan is always testing of the model. Testing is the process of checking the correctness of the constructed model. A test is a set of initial data that makes it possible to determine whether the model has been built correctly. To be confident in the correctness of the simulation results obtained, it is necessary to:

    check the developed algorithm for building the model;
    make sure that the constructed model correctly reflects those properties of the original that were taken into account in the simulation.
To check the correctness of the model-building algorithm, a test set of initial data is used for which the final result is known in advance or has been determined in some other way. For example, if calculation formulas are used in the modeling, several variants of initial data should be selected and calculated "by hand". These are test tasks. When the model is built, you test it with the same inputs and compare the simulation results with the conclusions obtained by calculation. If the results match, the algorithm has been developed correctly; if not, the cause of the discrepancy must be found and eliminated. Test data may not reflect a real situation at all and may carry no semantic content. However, the results obtained during testing may prompt you to change the original informational or sign model, primarily in that part of it where the semantic content is embedded.

To make sure that the constructed model reflects the properties of the original that were taken into account in the simulation, a test example with real source data should be selected.

Conducting research. After testing, when you are confident in the correctness of the constructed model, you can proceed directly to conducting the study. The plan should include an experiment or a series of experiments that meet the objectives of the simulation. Each experiment must be accompanied by an interpretation of its results, which serves as the basis for analyzing the modeling results and making decisions. The scheme for preparing and conducting a computer experiment is shown in Figure 11.7.

Fig. 11.7. Scheme of a computer experiment (blocks: model testing, experiment plan, conducting research, analysis of the results).
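Returning to the testing step described above, here is a minimal illustration: a toy model (an invented compound-interest formula, standing in for a real model) is first checked against a test case whose answer has been computed by hand, and only then applied to the "real" initial data:

```python
def deposit_model(principal, rate, years):
    """Toy model: growth of a deposit with annual compounding."""
    return principal * (1 + rate) ** years

# Test task: the result is known in advance from a hand calculation: 1000 * 1.1^2 = 1210
assert abs(deposit_model(1000, 0.10, 2) - 1210.0) < 1e-9, "model fails the test case"

# Only after the test passes is the model applied to the real initial data
print(deposit_model(25_000, 0.07, 10))
```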

Analysis of simulation results

The ultimate goal of modeling is to make a decision, which should be developed on the basis of a comprehensive analysis of the modeling results. This stage is decisive: either you continue the study or you finish it. Figure 11.2 shows that the results-analysis stage cannot exist on its own. The conclusions obtained often lead to an additional series of experiments, and sometimes to a change in the problem itself. The basis for developing a solution is the results of testing and of the experiments. If the results do not correspond to the goals of the task, it means that mistakes were made at the previous stages. This may be an incorrect statement of the problem, an overly simplified construction of the information model, an unsuccessful choice of modeling method or environment, or a violation of technological procedures when building the model. If such errors are found, the model must be adjusted, that is, one returns to one of the previous stages. The process is repeated until the results of the experiment meet the objectives of the simulation. The main thing to remember is that a detected error is also a result. As the proverb says, you learn from your mistakes. The great Russian poet A. S. Pushkin wrote about this as well:

Oh, how many wonderful discoveries
Are being prepared for us by the spirit of enlightenment,
And experience, the son of difficult mistakes,
And genius, friend of paradoxes,
And chance, god the inventor...

Control questions and tasks

    What are the two main types of problem statements in modeling?
    In the well-known "Problem Book" by G. Oster, there is the following problem:
The evil witch, working tirelessly, turns 30 princesses into caterpillars a day. How many days will it take her to turn 810 princesses into caterpillars? How many princesses per day will have to be turned into caterpillars to cope with the work in 15 days? Which question can be attributed to the type of "what will happen if ...", and which one - to the type of "how to do so that ..."?
    List the most well-known goals of modeling. Formalize the playful problem from G. Oster's "Problem Book":
From two booths located 27 km from one another, two pugnacious dogs jumped out towards each other at the same time. The first runs at a speed of 4 km/h, and the second at 5 km/h. How soon will the fight begin? Homework: §11.4, 11.5.

The modern computer has many uses. Among them, as you know, its capabilities as a means of automating information processes are of particular importance. No less significant, however, are its capabilities as a tool for conducting experimental work and analyzing its results.

The computational experiment has long been known in science. Remember the discovery of the planet Neptune "at the tip of the pen". Often the results of scientific research are considered reliable only if they can be presented in the form of mathematical models and confirmed by mathematical calculations. Moreover, this applies not only to physics or engineering design, but also to sociology, linguistics and marketing, traditionally humanities fields far from mathematics.

A computational experiment is a theoretical method of cognition. A development of this method is numerical simulation, a relatively new scientific method that has become widespread with the advent of computers.

Numerical simulation is widely used both in practice and in scientific research.

Example. Without building mathematical models and performing various calculations on constantly changing data coming from measuring instruments, the operation of automatic production lines, autopilots, tracking stations and automatic diagnostic systems would be impossible. Moreover, to ensure the reliability of such systems, the calculations must be carried out in real time, and their errors amount to mere millionths of a percent.

Example. A modern astronomer can more often be seen not at the eyepiece of a telescope but in front of a computer display, and this applies not only to the theorist but to the observer as well. Astronomy is an unusual science: as a rule, it cannot experiment directly with the objects of its research. Astronomers can only "peep at" and "eavesdrop on" various kinds of radiation (electromagnetic, gravitational, neutrino or cosmic-ray fluxes). This means that one has to learn to extract the maximum information from observations and to reproduce them in calculations in order to test the hypotheses that describe these observations. The applications of computers in astronomy, as in other sciences, are extremely diverse: they include the automation of observations and the processing of their results (astronomers see images not in the eyepiece but on a monitor connected to special devices), as well as work with large catalogues (of stars, spectral analyses, chemical compounds, and so on).

Example. Everyone knows the expression "a storm in a teacup". To study in detail such a complex hydrodynamic process as a storm, complex numerical simulation methods have to be involved. That is why powerful computers are located in large hydrometeorological centers: "the storm is played out" inside the processor chip.


Even if the calculations you are doing are not very complex but need to be repeated a million times, it is better to write a program once and let the computer repeat it as many times as necessary (the limitation, of course, being the speed of the computer).

Numerical simulation can be an independent research method when only the values of certain indicators are of interest (for example, the cost of production or the integral spectrum of a galaxy), but more often it acts as one of the means of constructing computer models in the broader sense of the term.

Historically, the first work on computer modeling was associated with physics, where numerical simulation was used to solve a whole class of problems in hydraulics, filtration, heat conduction and heat transfer, solid mechanics, and so on. Modeling consisted mainly in solving complex nonlinear problems of mathematical physics and was, in essence, mathematical modeling. The success of mathematical modeling in physics contributed to its spread to problems of chemistry, electric power engineering and biology, and the modeling schemes did not differ too much from one another. The complexity of the problems solved by modeling was limited only by the power of the available computers. This type of modeling is widespread at the present time, and in the course of the development of numerical simulation entire libraries of subroutines and functions have been accumulated that facilitate its application and expand its possibilities. And yet, at present, the concept of "computer modeling" is usually associated not with the fundamental natural-science disciplines but primarily with the systems analysis of complex systems from the standpoint of cybernetics (that is, from the standpoint of control, self-control and self-organization). Computer modeling is now widely used in biology, macroeconomics, the creation of automated control systems, and so on.

Example. Recall Piaget's experiment described in the previous section. It could, of course, have been carried out not with real objects but with an animated image on a display screen. But then, the movement of the toys could also have been filmed on ordinary film and shown on television. Is it appropriate to call the use of a computer in this case a computer simulation?


Example. A model of the flight of a body thrown vertically upward or at an angle to the horizon is, for example, a graph of the height of the body as a function of time. You can build it:

a) on a piece of paper, point by point;

b) in a graphical editor, using the same points;

c) using a business graphics program, for example in a spreadsheet;

d) by writing a program that not only displays the drawn flight trajectory but also allows you to set different initial data (angle of inclination, initial speed).

Why would you not call option b) a computer model, while options c) and d) fully deserve this name?
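A hedged sketch of what option d) might look like: a short program (with hypothetical parameter names) that tabulates the height of the body for any initial speed and launch angle supplied by the user, using the standard projectile formulas and neglecting air resistance:

```python
import math

def flight_height(v0, angle_deg, t):
    """Height of a body thrown with speed v0 at angle_deg to the horizon, t seconds after launch."""
    g = 9.81
    vy = v0 * math.sin(math.radians(angle_deg))
    return vy * t - 0.5 * g * t ** 2

def flight_table(v0, angle_deg, steps=10):
    """Print a height-vs-time table from launch until the body returns to the ground."""
    g = 9.81
    t_total = 2 * v0 * math.sin(math.radians(angle_deg)) / g
    for i in range(steps + 1):
        t = t_total * i / steps
        print(f"t = {t:5.2f} s   h = {flight_height(v0, angle_deg, t):6.2f} m")

flight_table(v0=20.0, angle_deg=45.0)   # different initial data can be supplied here
```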

A computer model is often understood as a program (or a program plus a special device) that imitates the characteristics and behavior of a particular object. The result of executing this program is also called a computer model.

In the specialized literature, the term "computer model" is more strictly defined as follows:

A conditional image of an object or of some system of objects (processes, phenomena), described by means of interconnected computer tables, flowcharts, diagrams, graphs, drawings, animation fragments, hypertexts and so on, and displaying the structure of the object (its elements and the relationships between them). Computer models of this kind are called structural-functional models;

A separate program or a set of programs that, using a sequence of calculations and a graphical display of their results, reproduces (simulates) the processes of the object's functioning under the influence of various, usually random, factors. Such models are called simulation models.

Computer models can be simple or complex. You created simple models many times when you learned programming or built your database. In 3D graphics systems, expert systems, automated control systems, very complex computer models are built and used.


Example. The idea of constructing a model of human activity with the help of a computer is not new, and it is hard to find a field of activity in which attempts have not been made to implement it. Expert systems (ES) are computer programs that simulate the actions of a human expert solving problems in some subject area on the basis of accumulated knowledge that makes up the knowledge base. Expert systems address the problem of modeling mental activity. Because of the complexity of the models, the development of an ES usually takes several years.

Modern expert systems, in addition to a knowledge base, also have a base of precedents: for example, the results of surveys of real people together with information about the subsequent success or failure of their activities. For example, the precedent base of the NYPD expert system contains 786,000 people, and that of the "Hobby" Center (personnel policy at an enterprise) 512,000 people; according to the specialists of this center, the ES they developed began to work with the expected accuracy only when the base exceeded 200,000 people, and it took 6 years to create it.

Example. Progress in computer graphics has moved from wireframe images of three-dimensional models with simple halftone shading to modern realistic pictures that are works of art in themselves. This has been the result of success in defining the modeling environment ever more precisely. Transparency, reflection, shadows, lighting models and surface properties are some of the areas in which research teams are working hard, constantly coming up with new algorithms to create increasingly realistic artificial images. Today these methods are also used to create high-quality animation.

The practical needs of computer modeling pose challenges for the developers of computer hardware. That is, the method actively influences not only the emergence of ever new programs but also the development of the technical means themselves.

Example. Computer holography was first discussed in the 1980s. In computer-aided design systems and in geographic information systems it would be useful to be able not only to see the object of interest in three-dimensional form but also to present it as a hologram that can be rotated, tilted and looked inside. To create holographic images useful in real applications, holographic displays with a gigantic number of pixels, up to a billion, are needed. Such work is now being actively carried out. Simultaneously with the development of the holographic display, work is in full swing on the creation of a three-dimensional workstation based on a principle called "substitution of reality". Behind this term is the idea of making wide use of all the natural and intuitive methods that a person employs when interacting with natural (material-energy) models, while at the same time comprehensively improving and developing them using the unique capabilities of digital systems. It is assumed, for example, that it will be possible to manipulate and interact with computer holograms in real time using gestures and touch.

Computer modelling has the following advantages:

It provides visual clarity;

It is accessible and easy to use.

The main advantage of computer simulation is that it allows one not only to observe but also to predict the result of an experiment under special conditions. Thanks to this possibility, the method has found application in biology, chemistry, sociology, ecology, physics, economics and many other fields of knowledge.


Computer modeling is widely used in teaching. With the help of special programs one can see models of phenomena of the microworld and of the world of astronomical dimensions, phenomena of nuclear and quantum physics, the development of plants, and the transformation of substances in chemical reactions.

The training of specialists in many professions, especially air traffic controllers, pilots, and operators of nuclear and other power plants, is carried out with the help of computer-controlled simulators that reproduce real situations, including emergencies.

Laboratory work can be carried out on a computer when the necessary instruments and devices are not available, or when solving a problem requires the use of complex mathematical methods and labor-intensive calculations.

Computer modeling makes it possible to "bring to life" the physical, chemical, biological and social laws being studied and to carry out a series of experiments with the model. But do not forget that all these experiments are of a very conditional nature, and their cognitive value is also very conditional.

Example. Before the practical use of the nuclear decay reaction, nuclear physicists simply did not know about the dangers of radiation, and only the first mass application of these "achievements" (Hiroshima and Nagasaki) clearly showed how dangerous radiation is to humans. Had nuclear physics begun with nuclear power stations, humanity would not have learned about the dangers of radiation for a long time. Likewise, the achievement of chemists at the beginning of the last century, the powerful pesticide DDT, was long considered absolutely safe for humans.

In the context of powerful modern technologies and the wide replication and thoughtless use of erroneous software products, such seemingly highly specialized questions as the adequacy of a computer model to reality can acquire significance for all of humanity.

A computer experiment is a tool for studying the regularities of a model, not natural or social phenomena themselves.

Therefore a full-scale experiment should always be carried out alongside the computer experiment, so that the researcher, by comparing their results, can assess the quality of the corresponding model and the depth of our understanding of the essence of natural phenomena.


Do not forget that physics, biology, astronomy and computer science are sciences about the real world, not about virtual reality.

In scientific research, both fundamental and practically oriented (applied), the computer often acts as an essential tool of experimental work.

A computer experiment is most often associated with:

complex mathematical calculations (numerical modeling);

the construction and study of visual and/or dynamic models (computer modeling).

A computer model is understood as a program (or a program in conjunction with a special device) that imitates the characteristics and behavior of a particular object, as well as the result of executing this program in the form of graphic images (static or dynamic), numerical values, tables, and so on.

There are structural-functional and simulation computer models.

A structural-functional computer model is a conditional image of an object or of some system of objects (processes, phenomena), described by means of interconnected computer tables, flowcharts, diagrams, graphs, drawings, animation fragments, hypertexts and so on, and displaying the structure of the object or its behavior.

A simulation computer model is a separate program or software package that allows, using a sequence of calculations and a graphical display of their results, to reproduce (simulate) the processes of an object's functioning under the influence of various random factors.
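To make this definition concrete, the following minimal sketch reproduces, minute by minute, the functioning of a single service desk under the influence of random factors (random customer arrivals and random service times); all numbers are invented for illustration:

```python
import random

random.seed(1)

def simulate_queue(hours=8, arrival_prob=0.4, max_service=20):
    """Minute-by-minute simulation of a single service desk.
    Each minute a customer arrives with probability arrival_prob;
    service takes a random number of minutes up to max_service."""
    queue, busy_until, served, waited = 0, 0, 0, 0
    for minute in range(hours * 60):
        if random.random() < arrival_prob:        # random factor: a customer arrives
            queue += 1
        if minute >= busy_until and queue > 0:    # desk is free, take the next customer
            queue -= 1
            served += 1
            busy_until = minute + random.randint(1, max_service)  # random factor: service time
        waited += queue
    return served, waited / (hours * 60)

served, avg_queue = simulate_queue()
print(f"customers served: {served}, average queue length: {avg_queue:.1f}")
```

Running such a model many times with different random seeds gives the statistical picture of the object's functioning that the definition refers to.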

Computer modeling is a method for solving the problem of analyzing or synthesizing a system (most often a complex system) based on the use of its computer model.


Advantages of computer simulation are that it:

It allows one not only to observe but also to predict the result of an experiment under special conditions;

It makes it possible to model and study phenomena predicted by any theory;

It is environmentally safe and poses no danger to nature or humans;

It provides visual clarity;

It is accessible and easy to use.

The computer modeling method has found application in biology, chemistry, sociology, ecology, physics, economics, linguistics, jurisprudence and many other fields of knowledge.

Computer modeling is widely used in education, training and retraining of specialists:

For a visual representation of models of phenomena of the microworld and of the world of astronomical dimensions;

To simulate processes occurring in the world of animate and inanimate nature;

To simulate real situations in the management of complex systems, including emergencies;

For laboratory work when the necessary instruments and devices are not available;

For solving problems that require the use of complex mathematical methods and labor-intensive calculations.

It is important to remember that it is not objective reality that is modeled on a computer, but our theoretical ideas about it. The objects of computer modeling are mathematical and other scientific models, not real objects, processes or phenomena.

A computer experiment is a tool for studying the regularities of a model, not natural or social phenomena themselves.

The criterion of the fidelity of any result of computer simulation has been and remains the full-scale (physical, chemical, social) experiment. In scientific and practical research a computer experiment can only accompany a full-scale one, so that the researcher, by comparing their results, can assess the quality of the model and the depth of our ideas about the essence of natural phenomena.

It is important to remember that physics, biology, astronomy, economics, computer science are sciences about the real world, and not about
virtual reality.

Task 1

A letter written in a text editor and sent by e-mail is unlikely to be called a computer model.

Text editors often allow you to create not only ordinary documents (letters, notes, reports) but also document templates, which contain fixed information that the user cannot change, data fields that the user fills in, and fields in which calculations are performed on the basis of the entered data. Can such a template be considered a computer model? If so, what is the object of modeling in this case, and what is the purpose of creating such a model?

Task 2

You know that before you create a database, you first need to build a data model. You also know that an algorithm is a model of activity.

Both data models and algorithms are most often developed with computer implementation in mind. Can we say that at some point they become a computer model, and if so, when does this happen?

Note. Check your answer against the definition of "computer model".

Task 3

Describe the stages of building a computer model using the example of developing a program that simulates some physical phenomenon.

Task 4

Give examples of when computer simulation has brought real benefits and when it has led to undesirable consequences. Prepare a report on this topic.


Computer modelling is the basis for representing knowledge in computers. To generate new information, computer modeling uses any information that can be processed by a computer. Progress in modeling is associated with the development of computer modeling systems, while progress in information technology is associated with accumulating the experience of modeling on a computer and with creating banks of models, methods and software systems that make it possible to assemble new models from the models in a bank.

A kind of computer simulation is the computational experiment, that is, an experiment carried out by a researcher on the system or process under study by means of an experimental tool: the computer, a computing environment, a technology.

The computational experiment is becoming a new tool, a method of scientific knowledge, a new technology also due to the growing need to move from the study of linear mathematical models of systems (for which research methods and theory are well known or developed) to the study of complex and nonlinear mathematical models of systems (the analysis of which is much more difficult). Roughly speaking, our knowledge about the surrounding world is linear, and the processes in the surrounding world are non-linear.

A computational experiment allows you to find new patterns, test hypotheses, visualize the course of events, etc.

To give life to new design developments, to introduce new technical solutions into production or to test new ideas, an experiment is needed. In the recent past, such an experiment could be carried out either in laboratory conditions on installations specially created for it, or in nature, that is, on a real sample of the product, subjecting it to all sorts of tests.

With the development of computer technology, a new unique research method has appeared - a computer experiment. A computer experiment includes a certain sequence of work with a model, a set of purposeful user actions on a computer model.

Stage 4. Analysis of simulation results.

The final goal of modeling is to make a decision, which should be developed on the basis of a comprehensive analysis of the results obtained. This stage is decisive: either you continue the study or you finish it. Perhaps the expected result is known in advance; then the obtained and expected results must be compared, and if they match, a decision can be made.

The basis for developing a solution is the results of testing and of the experiments. If the results do not correspond to the goals of the task, it means that mistakes were made at the previous stages. This may be an overly simplified construction of the information model, an unsuccessful choice of modeling method or environment, or a violation of technological procedures when building the model. If such errors are found, the model must be adjusted, that is, one returns to one of the previous stages. The process is repeated until the results of the experiment meet the goals of the modeling. The main thing to remember is that a detected error is also a result. As the proverb says, you learn from your mistakes.

Simulation programs

ANSYS is a universal finite element method (FEM) analysis software system which has existed and evolved over the past 30 years and is quite popular among specialists in the field of computer-aided engineering (CAE). It provides FE solutions of linear and nonlinear, stationary and non-stationary spatial problems of the mechanics of deformable solids and structural mechanics (including non-stationary, geometrically and physically nonlinear problems of contact interaction of structural elements), problems of fluid and gas mechanics, heat conduction and heat transfer, electrodynamics, acoustics, and the mechanics of coupled fields. In some industries, modeling and analysis make it possible to avoid costly and lengthy development cycles of the "design - manufacture - test" kind. The system is based on the Parasolid geometric kernel.

AnyLogic is software for the simulation modeling of complex systems and processes, developed by the Russian company XJ Technologies. The program provides a graphical user environment and allows the Java language to be used for model development.

AnyLogic models can be based on any of the major simulation modeling paradigms: discrete-event simulation, system dynamics, and agent-based modeling.

System dynamics and discrete-event (process-oriented) modeling, by which we mean any development of the ideas of GPSS, are traditional, well-established approaches; agent-based modeling is relatively new. System dynamics operates mainly with processes that are continuous in time, while discrete-event and agent-based modeling deal with discrete ones.

System dynamics and discrete-event modeling have historically been taught to completely different groups of students: managers, production engineers, and control-system development engineers. As a result, three different, almost non-overlapping communities have emerged that almost never communicate with each other.

Until recently, agent-based modeling was a strictly academic field. However, the growing demand for global optimization on the part of business has forced leading analysts to pay attention to agent-based modeling and to its combination with traditional approaches in order to obtain a more complete picture of the interaction of complex processes of various natures. This gave rise to a demand for software platforms that allow different approaches to be integrated.

Now let us consider the simulation modeling approaches on a scale of abstraction level. System dynamics, by replacing individual objects with their aggregates, assumes the highest level of abstraction. Discrete-event simulation works in the low and medium range. As for agent-based modeling, it can be applied at almost any level and on any scale: agents can represent pedestrians, cars or robots in physical space, a customer or a salesperson at the middle level, or competing companies at a high level.

When developing models in AnyLogic, you can use concepts and tools from several modeling methods: for example, in an agent-based model you can use the methods of system dynamics to represent changes in the state of the environment, or in a continuous model of a dynamic system you can take discrete events into account. For example, supply chain management using simulation modeling requires the supply chain participants to be described by agents: manufacturers, sellers, consumers, a network of warehouses. At the same time, production is described within the framework of discrete-event (process-oriented) modeling, where the product or its parts are the transactions being processed and the machines, trains and stackers are resources. The deliveries themselves are represented by discrete events, while the demand for goods can be described by a continuous system-dynamics diagram. The ability to mix approaches allows real-life processes to be described, rather than forcing the process to fit the available mathematical apparatus.

LabVIEW (Laboratory Virtual Instrumentation Engineering Workbench) is a development environment and a platform for executing programs created in the graphical programming language "G" of National Instruments (USA). The first version of LabVIEW was released in 1986 for the Apple Macintosh; there are currently versions for UNIX, GNU/Linux, macOS and others, with the most developed and popular versions being those for Microsoft Windows.

LabVIEW is used in data acquisition and processing systems, as well as for controlling technical objects and technological processes. Ideologically, LabVIEW is very close to SCADA systems, but unlike them it is oriented more toward solving problems in the field of automated scientific research (ASNI) than in the field of industrial process control systems (APCS).

MATLAB (short for "matrix laboratory") is a term that refers both to a package of application programs for solving technical computing problems and to the programming language used in this package. MATLAB is used by more than 1,000,000 engineers and scientists and runs on most modern operating systems, including GNU/Linux, macOS, Solaris and Microsoft Windows.

Maple is a software package, a computer algebra system. It is a product of Waterloo Maple Inc., which since 1984 has produced and marketed software products oriented toward complex mathematical calculations, data visualization and modeling.

The Maple system is designed for symbolic computations, although it also has a number of tools for the numerical solution of differential equations and the evaluation of integrals. It has advanced graphics facilities and its own programming language reminiscent of Pascal.

Mathematica is a computer algebra system from Wolfram Research. It contains many functions both for analytical transformations and for numerical calculations. In addition, the program supports graphics and sound, including the construction of two- and three-dimensional graphs of functions, the drawing of arbitrary geometric shapes, and the import and export of images and sound.

Forecasting tools are software products that have functions for calculating forecasts. Forecasting is one of the most important kinds of human activity today. Even in ancient times, forecasts allowed people to calculate the periods of droughts, the dates of solar and lunar eclipses, and many other phenomena. With the advent of computer technology, forecasting received a powerful impetus for development. One of the first applications of computers was the calculation of the ballistic trajectory of projectiles, that is, in effect, the prediction of the point where a projectile hits the ground; this type of forecast is called a static forecast. There are two main categories of forecasts: static and dynamic. The key difference is that dynamic forecasts provide information about the behavior of the object under study over a significant period of time, whereas static forecasts reflect the state of the object only at a single point in time and, as a rule, the time during which the object undergoes changes plays an insignificant role in them. Today there is a large number of tools that make it possible to produce forecasts. All of them can be classified according to many criteria:

| Instrument name | Scope of application | Implemented models | Required user training | Ready for use |
|---|---|---|---|---|
| Microsoft Excel, OpenOffice.org | general purpose | algorithmic, regression | basic knowledge of statistics | significant refinement is required (implementation of models) |
| Statistica, SPSS, EViews | research | a wide range of regression and neural network models | | boxed product |
| MATLAB | research, application development | algorithmic, regression, neural network | special mathematical education | programming required |
| SAP APO | business forecasting | algorithmic | deep knowledge is not required | |
| ForecastPro, ForecastX | business forecasting | algorithmic | deep knowledge is not required | boxed product |
| Logility | business forecasting | algorithmic, neural network | deep knowledge is not required | significant refinement is required (for business processes) |
| ForecastPro SDK | business forecasting | algorithmic | basic knowledge of statistics required | programming required (software integration) |
| iLog, AnyLogic, iThink, Matlab Simulink, GPSS | application development, simulation | simulation | special mathematical education is required | programming required (for the specifics of the domain) |
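As a minimal illustration of the simplest "algorithmic/regression" class of forecasts listed in the table above, the following sketch fits a linear trend to invented past observations and extrapolates it (a real forecasting tool would, of course, use far richer models):

```python
import numpy as np

# Past observations (e.g. monthly sales); the numbers are invented for illustration.
history = np.array([102.0, 108.0, 115.0, 119.0, 127.0, 131.0])

# Fit a linear trend y = a*t + b to the history (simple regression model).
t = np.arange(len(history))
a, b = np.polyfit(t, history, deg=1)

# Static one-step-ahead forecast: the value expected at the next time point.
next_value = a * len(history) + b
print(f"forecast for period {len(history)}: {next_value:.1f}")

# A crude 'dynamic' forecast: extrapolate the same trend over several future periods.
future = a * np.arange(len(history), len(history) + 6) + b
print("six-period trend extrapolation:", np.round(future, 1))
```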

PC LIRA is a multifunctional software package intended for the design and analysis of mechanical-engineering and building structures for various purposes. Calculations in the program are performed for both static and dynamic loads. The calculations are based on the finite element method (FEM). Various plug-in modules (processors) make it possible to select and check sections of steel and reinforced-concrete structures, simulate soil, calculate bridges and the behavior of buildings during erection, and so on.

A computer experiment with a system model during its research and design is carried out in order to obtain information about the characteristics of the process of functioning of the object under consideration. The main task of planning computer experiments is to obtain the necessary information about the system under study under resource constraints (computer time, memory, etc.). Among the particular tasks solved when planning computer experiments are the tasks of reducing the cost of computer time for modeling, increasing the accuracy and reliability of modeling results, checking the adequacy of the model, etc.

The effectiveness of computer experiments with models depends significantly on the choice of the experimental plan, since it is the plan that determines the volume and order of the calculations performed on the computer and the methods for accumulating and statistically processing the simulation results. The main task of planning computer experiments with a model is therefore formulated as follows: to obtain information about the object of modeling, specified in the form of a modeling algorithm (program), with minimal or limited expenditure of machine resources on the modeling process.

The advantage of computer experiments over full-scale ones is the ability to reproduce completely the conditions of an experiment with the model of the system under study. Another significant advantage is the ease of interrupting and resuming computer experiments, which allows the use of sequential and heuristic planning techniques that may not be feasible in experiments with real objects. When working with a computer model it is always possible to interrupt the experiment for the time needed to analyze the results and make decisions about its further course (for example, about the need to change the values of the model characteristics).

The disadvantage of computer experiments is that the results of some observations depend on the results of one or more previous ones, and therefore they contain less information than independent observations.

In relation to a database, a computer experiment means the manipulation of data in accordance with a set goal using the tools of the DBMS. The goal of the experiment can be formulated on the basis of the general goal of the simulation, taking into account the requirements of a particular user. For example, suppose there is a database "Dean's office". The overall goal of creating this model is management of the educational process. If information about students' academic performance is needed, a query can be made, that is, an experiment can be conducted to select the required information.

The DBMS environment toolkit allows you to perform the following operations on data:

1) sorting - ordering data according to some attribute;

2) search (filtering) - selection of data that satisfies a certain condition;

3) creation of calculation fields - transformation of data into another form based on formulas.

Management of an information model is inextricably linked with the development of various criteria for searching and sorting data. Unlike paper card files, where sorting is possible by one or two criteria at most and searching is generally done by hand, by leafing through the cards, computer databases allow any sorting order to be set for various fields and any search criteria to be specified. The computer will sort or select the necessary information by the given criterion without any expenditure of time.

For successful work with an information model, database software environments allow calculated fields to be created, in which the original information is converted into a different form. For example, on the basis of semester grades a special built-in function can calculate a student's grade point average. Such calculated fields are used either as additional information or as criteria for searching and sorting.
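A hedged sketch of these operations (filtering, sorting and a calculated grade-point-average field) on a toy "Dean's office" table; the table structure and the data are invented for illustration, using SQLite as the DBMS:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, grade1 INTEGER, grade2 INTEGER, grade3 INTEGER)")
conn.executemany(
    "INSERT INTO students VALUES (?, ?, ?, ?)",
    [("Ivanov", 5, 4, 5), ("Petrova", 3, 4, 3), ("Sidorov", 5, 5, 5)],
)

# Calculated field: grade point average computed from the semester grades,
# combined with filtering (GPA above 4) and sorting (best students first).
query = """
    SELECT name, (grade1 + grade2 + grade3) / 3.0 AS gpa
    FROM students
    WHERE (grade1 + grade2 + grade3) / 3.0 > 4
    ORDER BY gpa DESC
"""
for name, gpa in conn.execute(query):
    print(f"{name}: {gpa:.2f}")
```

The first row of such a sorted selection answers, for example, the question of which student should receive an increased scholarship.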

A computer experiment includes two stages: testing (checking the correctness of operations) and conducting an experiment with real data.

After formulating formulas for calculated fields and filters, you need to make sure that they work correctly. To do this, you can enter test records for which the result of the operation is known in advance.

The computer experiment ends with the output of the results in a form convenient for analysis and decision making. One of the advantages of computer information models is the ability to create various forms of presentation of the output information, called reports. Each report contains information that meets the purpose of a particular experiment. The convenience of computer reports lies in the fact that they allow you to group information according to given criteria, add summary fields for counting records within groups and for the database as a whole, and then use this information for decision-making.

The environment allows you to create and store several typical, frequently used report forms. On the basis of the results of some experiments you can create a temporary report, which is deleted after it has been copied into a text document or printed. Some experiments do not require a report at all: for example, if you need to select the most successful student to receive an increased scholarship, it is enough to sort the records by the average semester grade; the required information will be the first entry in the list of students.