The principle of complementarity: where it applies, its manifestations and essence

The principle of complementarity is a methodological postulate originally formulated by the great Danish physicist and philosopher Niels Bohr in relation to the field of quantum mechanics. Bohr's principle of complementarity, most likely, came into being because the Austrian logician Kurt Gödel had earlier proposed and formulated his famous theorem on the properties of deductive systems, which belongs to the field of mathematical logic. Niels Bohr extended Gödel's logical conclusions to quantum mechanics and formulated the principle approximately as follows: in order to know an object of the microworld reliably and adequately, it must be investigated in mutually exclusive, that is, complementary, systems of description. This formulation went down in history as the principle of complementarity in quantum mechanics.

An example of such an approach to the problems of the microworld was the treatment of light within two theories at once, the wave theory and the corpuscular theory, which led to a scientific result of remarkable fruitfulness and revealed to man the physical nature of light.

Niels Bohr went even further in his understanding of the conclusion he had reached. He attempted to interpret the principle of complementarity through the prism of philosophical knowledge, and it is here that the principle acquired universal scientific significance. The formulation of the principle now sounded like this: in order to reproduce a phenomenon in a sign (symbolic) system for the purpose of cognition, it is necessary to resort to additional concepts and categories. In plainer language, the principle of complementarity presupposes that in cognition it is not only possible but in some cases necessary to use several methodological systems, which make it possible to obtain objective data about the subject of research. The principle of complementarity, in this sense, reflects an acceptance of the metaphorical nature of the logical systems of methodology: they can manifest themselves in one way or another. Thus, with the appearance and comprehension of this principle it was in effect recognized that logic alone is not enough for cognition, and therefore non-logical moves in the research process were recognized as admissible. Ultimately, the application of Bohr's principle contributed to a significant change in the methodology of science.

Later, Yu. M. Lotman expanded the methodological significance of Bohr's principle and transferred its regularities to the sphere of culture, in particular to the description of culture. Lotman formulated the so-called "paradox of the amount of information", the essence of which is that human existence takes place mainly under conditions of informational insufficiency, and as development proceeds this insufficiency keeps growing. Using the principle of complementarity, the lack of information can be compensated by translating it into another semiotic (sign) system. This device, in fact, led to the emergence of computer science and cybernetics, and later of the Internet. Subsequently, the working of the principle was confirmed by the physiological adaptation of the human brain to this type of thinking, which is connected with the asymmetry of the activity of its hemispheres.

Another proposition mediated by the operation of Bohr's principle is the uncertainty relation discovered by the German physicist Werner Heisenberg. Its content can be stated as the recognition that two quantities cannot be described simultaneously with the same accuracy if they belong to different, mutually exclusive systems of description. A philosophical analogue of this conclusion was given by L. Wittgenstein in his work "On Certainty", where he argued that in order to assert the certainty of something, one must doubt something else.

Thus, Bohr's principle has acquired enormous methodological significance in various fields.

The fundamental principle of quantum mechanics, along with the uncertainty relation, is the principle of complementarity, to which N. Bohr gave the following formulation:

"The concepts of particle and wave complement each other and at the same time contradict each other, they are complementary pictures of what is happening."

The contradictions between the corpuscular and wave properties of micro-objects are the result of the uncontrolled interaction between micro-objects and macro-devices. There are two classes of devices: in some, quantum objects behave like waves; in others, like particles. In experiments we observe not reality as such but only a quantum phenomenon, which includes the result of the interaction of the device with the micro-object. As M. Born figuratively noted, waves and particles are "projections" of physical reality onto the experimental situation.

Firstly, the idea of wave-particle duality means that any material object possessing wave-particle duality has an energy shell. A similar energy shell exists in the Earth and also in man, where it is most often called an energy cocoon. This energy shell can play the role of a sensory membrane that screens the material object from the external environment and makes up its outer "gravitational sphere". This sphere can play the same role as the membrane in the cells of living organisms: it lets inside only "filtered" signals whose level of disturbance exceeds a certain limiting value. Signals that have exceeded a certain sensitivity threshold of the shell can also pass through it in the opposite direction.

Secondly, the presence of an energy shell in material objects brings the hypothesis of the French physicist L. de Broglie about the truly universal nature of wave-particle duality to a new level of understanding.

Thirdly, owing to the evolution of the structure of matter, the corpuscular-wave dualism of the electron may be a reflection of the corpuscular-wave dualism of photons. This means that the photon, being a neutral particle, has a meson-like structure and is the most elementary micro-atom, in whose image and likeness all material objects of the Universe are built; moreover, this construction is carried out according to the same rules.

Fourthly, corpuscular-wave dualism makes it possible to explain naturally the phenomenon of gene memory of particles, atoms, molecules and living organisms, and to understand the mechanisms of such memory, whereby a structureless particle "remembers" all its creations in the past and possesses an "intelligence" for selective synthesis processes aimed at forming new "particles" with selected properties.

The uncertainty principle is a physical law which states that it is impossible to measure accurately both the coordinate and the momentum of a microscopic object at the same time, because the process of measurement disturbs the system. The product of these two uncertainties is always of the order of Planck's constant or larger. This principle was first formulated by Werner Heisenberg.

It follows from the uncertainty principle that the more precisely one of the quantities entering the inequality is determined, the less definite is the value of the other. No experiment can lead to a simultaneous exact measurement of such dynamical variables; at the same time, the uncertainty of the measurements is connected not with the imperfection of experimental technique but with the objective properties of matter.
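For reference, the relation in question is usually written today in the following standard form (a textbook statement added here for clarity, not a quotation from the source):

$$\Delta x\,\Delta p_x \ \ge\ \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi} \approx 1.05\times10^{-34}\ \text{J·s},$$

with an analogous relation $\Delta E\,\Delta t \ge \hbar/2$ holding for energy and time.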

The uncertainty principle, discovered in 1927 by the German physicist W. Heisenberg, was an important step in elucidating the patterns of intra-atomic phenomena and building quantum mechanics. An essential feature of microscopic objects is their corpuscular-wave nature. The state of a particle is completely determined by the wave function (a value that completely describes the state of a micro-object (electron, proton, atom, molecule) and, in general, of any quantum system). A particle can be found at any point in space where the wave function is non-zero. Therefore, the results of experiments to determine, for example, coordinates are of a probabilistic nature.

Example: the motion of an electron is the propagation of its own wave. If an electron beam is shot through a narrow hole in a wall, a narrow beam passes through it. But if the hole is made even smaller, so that its diameter is comparable to the electron's wavelength, the beam diverges in all directions. This is not a deflection caused by the nearest atoms of the wall, which could be eliminated: it is due to the wave nature of the electron. Try to predict what will happen next to an electron that has passed through the wall, and you will be powerless. You know exactly where it crossed the wall, but you cannot say how much transverse momentum it will acquire. Conversely, in order to be sure that the electron emerges with a definite momentum in the original direction, the hole must be enlarged so that the electron wave passes straight through, only slightly diverging due to diffraction. But then it is impossible to say exactly where the electron-particle passed through the wall: the hole is wide. However much you gain in the accuracy of determining the momentum, you lose just as much in the accuracy with which its position is known.
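A rough order-of-magnitude sketch of this trade-off (added for illustration; the notation is standard, but the estimate itself is not part of the original text): if the hole has width $d$, the transverse position of the electron is fixed to within $\Delta y \approx d$, so the uncertainty relation forces a transverse momentum spread

$$\Delta p_y \ \gtrsim\ \frac{\hbar}{d}, \qquad \theta \ \approx\ \frac{\Delta p_y}{p} \ \sim\ \frac{\lambda}{2\pi d},$$

where $\lambda = h/p$ is the electron's de Broglie wavelength: the narrower the hole, the wider the angular spread of the emerging beam.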

This is the Heisenberg uncertainty principle. It played an extremely important role in the construction of the mathematical apparatus for describing particle waves in atoms. Its strict interpretation in experiments with electrons is that, like light waves, electrons resist any attempt to make measurements with the utmost precision. This principle also changes the picture of the Bohr atom. One can determine exactly the momentum of an electron (and, consequently, its energy level) in any one of its orbits, but then its location will be entirely unknown: nothing can be said about where it is. From this it is clear that it makes no sense to draw a definite orbit of an electron and mark the electron on it. At the end of the 19th century, many scientists believed that the development of physics was complete, for the following reasons:

the laws of mechanics and the theory of universal gravitation had existed for more than 200 years;

the molecular-kinetic theory had been developed;

a solid foundation had been laid for thermodynamics;

Maxwell's theory of electromagnetism had been completed;

the fundamental conservation laws (of energy, momentum, angular momentum, mass and electric charge) had been discovered.

At the end of the 19th and the beginning of the 20th century, W. Roentgen discovered X-rays, A. Becquerel discovered the phenomenon of radioactivity, and J. Thomson discovered the electron. However, classical physics was unable to explain these phenomena.

A. Einstein's theory of relativity required a radical revision of the concept of space and time. Special experiments confirmed the validity of J. Maxwell's hypothesis about the electromagnetic nature of light. It could be assumed that the radiation of electromagnetic waves by heated bodies is due to the oscillatory motion of electrons. But this assumption had to be confirmed by comparing theoretical and experimental data.

For a theoretical consideration of the laws of radiation, the model of an absolutely black body was used, i.e., a body that completely absorbs electromagnetic waves of any wavelength and, accordingly, emits electromagnetic waves of all wavelengths.

An example of an absolutely black body in terms of emissivity is the Sun; in terms of absorption, a cavity with mirror walls and a small hole.

Austrian physicists J. Stefan and L. Boltzmann established that the total energy E radiated in 1 s by a completely black body from a unit surface is proportional to the fourth power of the absolute temperature T:

E = σT⁴,

where σ = 5.67×10⁻⁸ W/(m²·K⁴) is the Stefan-Boltzmann constant.

This law was called the Stefan-Boltzmann law. It made it possible to calculate the radiation energy of a completely black body from a known temperature.
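As a rough illustration (a numerical estimate added here, assuming a surface temperature of about 6000 K, close to that of the Sun):

$$E = \sigma T^4 \approx 5.67\times10^{-8}\ \text{W/(m}^2\cdot\text{K}^4)\times(6000\ \text{K})^4 \approx 7\times10^{7}\ \text{W/m}^2.$$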

In an effort to overcome the difficulties of the classical theory in explaining black-body radiation, M. Planck in 1900 put forward a hypothesis: atoms emit electromagnetic energy in separate portions - quanta. The energy of each quantum is E = hν, where ν is the radiation frequency and h = 6.63×10⁻³⁴ J·s is Planck's constant.

It is sometimes convenient to measure the energy and Planck's constant in electron volts.

Then h = 4.136×10⁻¹⁵ eV·s. In atomic physics the quantity ħ = h/(2π) is also used.

(1 eV is the energy acquired by an elementary charge when passing through an accelerating potential difference of 1 V; 1 eV = 1.6×10⁻¹⁹ J.)
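For example (an illustrative calculation added here, taking green light with a wavelength of about 500 nm as an assumed input):

$$E = h\nu = \frac{hc}{\lambda} \approx \frac{6.63\times10^{-34}\ \text{J·s}\times 3\times10^{8}\ \text{m/s}}{5\times10^{-7}\ \text{m}} \approx 4\times10^{-19}\ \text{J} \approx 2.5\ \text{eV}.$$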

Thus, M. Planck pointed the way out of the difficulties faced by the theory of thermal radiation, after which the modern physical theory called quantum physics began to develop.

Physics is the principal natural science, since it reveals truths about the relations between several basic variables that are valid for the entire universe. Its universality is inversely proportional to the number of variables it introduces into its formulas.

The progress of physics (and of science in general) is associated with a gradual rejection of direct visual obviousness. It might seem that this conclusion contradicts the fact that modern science, and physics above all, is based on experiment, i.e., on empirical experience that takes place under conditions controlled by man and can be reproduced at any time, any number of times. But the point is that some aspects of reality are invisible to superficial observation, and apparent obviousness can be misleading.

Quantum mechanics is a physical theory that establishes the way of description and the laws of motion at the micro level.

Classical mechanics describes particles by specifying their positions and velocities and the dependence of these quantities on time. In quantum mechanics, identical particles under identical conditions can behave differently.

Statistical laws can only be applied to large populations, not to individuals. Quantum mechanics refuses to search for individual laws of elementary particles and establishes statistical laws. On the basis of quantum mechanics, it is impossible to describe the position and speed of an elementary particle or predict its future path. Probability waves tell us the probability of encountering an electron in a particular place.

The importance of experiment has grown in quantum mechanics to such an extent that, as Heisenberg writes, "observation plays a decisive role in an atomic event and that reality differs depending on whether we observe it or not."

The fundamental difference between quantum mechanics and classical mechanics is that its predictions are always probabilistic. This means that we cannot predict exactly where, for example, an electron will land in the experiment discussed above, no matter how perfect the means of observation and measurement. One can only estimate its chances of arriving at a certain place, and consequently apply the concepts and methods of probability theory, which serves to analyze uncertain situations.

In quantum mechanics, any state of a system is described using the so-called density matrix, but, unlike classical mechanics, this matrix determines the parameters of its future state not reliably, but only with varying degrees of probability. The most important philosophical conclusion from quantum mechanics is the fundamental uncertainty of measurement results and, consequently, the impossibility of accurately predicting the future.
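In the standard notation of the formalism (added here for reference, not a quotation from the source), a mixed state and the predictions made from it are written as

$$\rho = \sum_i p_i\,|\psi_i\rangle\langle\psi_i|, \qquad \langle A\rangle = \operatorname{Tr}(\rho\,\hat{A}),$$

where the weights $p_i$ are probabilities rather than certainties.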

This, combined with the Heisenberg uncertainty principle and other theoretical and experimental evidence, has led some scientists to suggest that microparticles have no inherent properties at all and appear only at the moment of measurement. Others have suggested that the role of the experimenter's consciousness is key to the existence of the entire Universe, since, according to quantum theory, it is observation that creates, or partially creates, the observed.

Determinism is the doctrine of the initial determinability of all processes occurring in the world, including all processes of human life, either by God (theological determinism, or the doctrine of predestination), or only by the phenomena of nature (cosmological determinism), or specifically by the human will (anthropological-ethical determinism), for whose freedom, as well as for responsibility, there would then be no room left.

Being determined here means the philosophical assertion that every event that occurs, including human actions and behavior, is uniquely determined by the set of causes that immediately precede it.

In this light, determinism can also be defined as the thesis that there is only one, precisely given, possible future.

Indeterminism is a philosophical doctrine and methodological position that denies either the objectivity of causal relations or the cognitive value of causal explanation in science.

In the history of philosophy, from ancient Greek philosophy (Socrates) up to the present, indeterminism and determinism have acted as opposing conceptions on the problems of the conditionality of a person's will and choice and of a person's responsibility for his actions.

Indeterminism treats the will as an autonomous force, arguing that the principles of causality do not apply to the explanation of human choice and behavior.

The term determination was introduced by the ancient Greek philosopher Democritus in his atomistic conception, which denied chance, taking it simply for unknown necessity. From Latin, determination is translated as determining, the obligatory determinability of all things and phenomena in the world by other things and phenomena. At first, to determine meant to define an object by identifying and fixing the features that distinguish it from others. Causality was equated with necessity, while randomness was excluded from consideration and regarded simply as non-existent. Such an understanding of determination implied the existence of a cognizing subject.

With the emergence of Christianity, determinism found expression in two new concepts - divine predestination and divine grace - and the old principle of free will collided with this new, Christian determinism. For the general ecclesiastical consciousness of Christianity it was from the very beginning equally important to keep both assertions intact: that everything without exception depends on God and that nothing depends on man. In the 5th century, in the West, Pelagius raised the issue of Christian determinism in the aspect of free will. Blessed Augustine spoke out against Pelagian individualism. In his polemical writings, in the name of the demands of Christian universality, he often carried determinism to erroneous extremes incompatible with moral freedom. Augustine developed the idea that a person's salvation depends entirely and exclusively on the grace of God, which is granted and acts not according to a person's own merits but as a gift, according to the free choice and predestination of the Divine.

Determinism received further development and substantiation in the natural science and materialist philosophy of modern times (F. Bacon, Galileo, Descartes, Newton, Lomonosov, Laplace, Spinoza, the French materialists of the 18th century). In accordance with the level of development of natural science, the determinism of this period was mechanistic and abstract.

Relying on the works of his predecessors and on the fundamental ideas of the natural science of I. Newton and C. Linnaeus, Laplace, in his work "A Philosophical Essay on Probabilities" (1814), carried the ideas of mechanistic determinism to their logical conclusion: he proceeds from the postulate that from knowledge of the initial causes the consequences can always be unambiguously deduced.

The methodological principle of determinism is at the same time a fundamental principle of the philosophical doctrine of being. One of the fundamental ontological ideas laid at the foundation of classical natural science by its creators (G. Galileo, I. Newton, J. Kepler and others) was the concept of determinism. This concept consisted in the adoption of three basic statements:

1) nature functions and develops in accordance with its inherent internal, "natural" laws;

2) the laws of nature are an expression of the necessary (unambiguous) connections between the phenomena and processes of the objective world;

3) the purpose of science, corresponding to its purpose and capabilities, is the discovery, formulation and justification of the laws of nature.

Among the diverse forms of determination, reflecting the universal interconnection and interaction of phenomena in the surrounding world, the cause-and-effect, or causal (from Latin causa - cause) connection is especially distinguished, the knowledge of which is indispensable for correct orientation in practical and scientific activity. Therefore, it is the cause that is the most important element of the system of determining factors. And yet the principle of determinism is wider than the principle of causality: in addition to cause-and-effect relationships, it includes other types of determination (functional connections, connection of states, target determination, etc.).

In its historical development, determinism passed through two main stages - the classical (mechanistic) and the post-classical, dialectical in its essence.

Epicurus's teaching on the spontaneous deviation of the atom from the straight line contained a foreshadowing of the modern understanding of determinism; but since in Epicurus randomness itself is not determined by anything (is uncaused), it can be said without serious error that indeterminism originates with Epicurus.

Indeterminism is the doctrine that there are states and events for which a cause does not exist or cannot be specified.

In the history of philosophy, two types of indeterminism are known:

· The so-called "objective" indeterminism, which completely denies causality as such, not only its objective reality, but also the possibility of its subjectivist interpretation.

· Idealistic indeterminism, which, denying the objective nature of the relations of determination, declares causality, necessity, regularity as products of subjectivity, and not attributes of the world itself.

This means (in Hume, Kant and many other philosophers) that cause and effect, like other categories of determination, are only a priori forms of our thinking, i.e., forms received not from experience. Many subjective idealists declare the use of these categories a "psychological habit" of man to observe one phenomenon following another and to declare the first phenomenon the cause and the second the effect.

The stimulus for the revival of indeterministic views at the beginning of the 20th century was the increased role of statistical regularities in physics, whose existence was declared to refute causality. However, the dialectical-materialist interpretation of the correlation of chance and necessity and of the categories of causality and law, as well as the development of quantum mechanics, which revealed new types of objective causal connection of phenomena in the microworld, showed the failure of attempts to use the presence of probabilistic processes in the foundations of the microworld to deny determinism.

Historically, the concept of determinism is associated with the name of P. Laplace, although already among his predecessors, for example, Democritus and Spinoza, there was a tendency to identify the "law of nature", "causality" with "necessity", considering "chance" as a subjective result of ignorance of "true" causes .

Classical physics (particularly Newtonian mechanics) developed a specific idea of ​​a scientific law. It was taken as obvious that for any scientific law the following requirement must necessarily be satisfied: if the initial state of a physical system (for example, its coordinates and momentum in Newtonian mechanics) and the interaction that determines the dynamics are known, then in accordance with scientific law it is possible and should calculate its state at any moment of time both in the future and in the past.
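A minimal computational sketch of this classical picture (illustrative only; the oscillator, the integration scheme and all parameters are arbitrary assumptions, not taken from the text): given the initial coordinates and momentum and the force law, the state at any later moment follows uniquely.

# A minimal sketch of Newtonian/Laplacian determinism: given the initial
# state (x, v) and the force law, the future trajectory is computed
# uniquely, step by step.

def simulate(x0, v0, force, mass=1.0, dt=1e-3, steps=10_000):
    """Integrate Newton's second law with the velocity Verlet scheme."""
    x, v = x0, v0
    a = force(x) / mass
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt**2
        a_new = force(x) / mass
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

# Example: a harmonic oscillator F = -k*x (parameters chosen for illustration).
k = 4.0
final_state = simulate(x0=1.0, v0=0.0, force=lambda x: -k * x)
print(final_state)  # identical inputs always yield the identical output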

The causal relationship of phenomena is expressed in the fact that one phenomenon (cause) under certain conditions necessarily brings to life another phenomenon (consequence). Accordingly, it is possible to give working definitions of cause and effect. A cause is a phenomenon whose action brings to life, determines the subsequent development of another phenomenon. Then the effect is the result of the action of a certain cause.

In the determination of phenomena, in the system of their certainty, along with the cause, conditions also enter - those factors, without the presence of which the cause cannot give rise to an effect. This means that the cause itself does not work in all conditions, but only in certain ones.

The system of determination of phenomena (especially social ones) often includes a reason - one or another factor that determines only the moment, the time of the occurrence of the effect.

There are three types of temporal orientation of causal relationships:

1) determination by the past. Such a determination is essentially universal, because it reflects an objective regularity, according to which the cause in the final analysis always precedes the effect. This regularity was very subtly noticed by Leibniz, who gave the following definition of a cause: "A cause is that which causes some thing to begin to exist";

2) determination by the present. Knowing nature, society, our own thinking, we invariably discover that many things, being determined by the past, are also in a determining interaction with things that coexist simultaneously with them. It is no coincidence that we encounter the idea of a simultaneous determining relationship in various fields of knowledge - physics, chemistry (when analyzing equilibrium processes), biology (when considering homeostasis), etc.

The determinism of the present is also directly related to those paired categories of dialectics, between which there is a causal relationship. As you know, the form of any phenomenon is under the determining influence of the content, but this does not mean at all that the content precedes the form in general and at its original point can be formless;

3) determination by the future. Such a determination, as emphasized in a number of studies, although it occupies a more limited place among the determining factors compared to the types considered above, at the same time plays a significant role. In addition, one must take into account the entire relativity of the term "determination by the future": future events are still absent, and one can speak of their reality only in the sense that they are necessarily present as tendencies in the present (and were present in the past). And yet the role of this type of determination is very significant. Let us turn to an example related to the subjects already discussed.

Determination by the future underlies the explanation of the phenomenon, discovered by Academician P. K. Anokhin, of anticipatory reflection of reality by living organisms. The meaning of such anticipation, as emphasized in the chapter on consciousness, lies in the ability of a living being to respond not only to objects that are directly affecting it now, but also to changes that seem indifferent to it at the moment but are in reality signals of probable future influences. The cause here operates, as it were, from the future.

There are no uncaused phenomena. But this does not mean that all connections between phenomena in the surrounding world are causal.

Philosophical determinism, as the doctrine of the material, law-governed conditioning of phenomena, does not exclude the existence of non-causal types of conditioning. Non-causal relations between phenomena can be defined as relations in which there is interconnection, interdependence and mutual conditioning between them, but no relation of genetic production of one by the other and no temporal asymmetry.

The most characteristic example of non-causal conditioning or determination is the functional relationship between individual properties or characteristics of an object.

The links between causes and effects can be not only necessary, rigidly conditioned, but also random, probabilistic. The knowledge of probabilistic causal relationships required the inclusion of new dialectical categories in the causal analysis: chance and necessity, possibility and reality, regularity, etc.

Randomness is a concept polar to necessity. Random is such a relation of cause and effect in which the causal grounds allow the realization of any of many possible alternative consequences. Which particular variant of the connection will be realized depends on a combination of circumstances, on conditions that are not amenable to exact accounting and analysis. Thus, a random event occurs as the combined result of an indefinitely large number of diverse and not precisely known causes. The onset of a random event-consequence is in principle possible but not predetermined: it may or may not occur.

In the history of philosophy, the point of view is widely represented, according to which there is no real accident, it is a consequence of necessary causes unknown to the observer. But, as Hegel first showed, a random event in principle cannot be caused by internal laws alone, which are necessary for this or that process. A random event, as Hegel wrote, cannot be explained from itself.

The unpredictability of chance events seems to contradict the principle of causality. But it does not, because random events and causal relations are consequences of conditions and causes that, although not known in advance and in full, nevertheless really exist and are fairly definite. They do not arise randomly or out of "nothing": the possibility of their appearance is connected with causal grounds, though not rigidly or unambiguously, yet in a law-like way. These connections and laws are discovered by studying a large number (a flow) of homogeneous random events, described with the apparatus of mathematical statistics, and are therefore called statistical. Statistical regularities are objective in nature, but they differ essentially from the regularities of single phenomena. The use of quantitative methods for analyzing and calculating the characteristics of random phenomena and processes obeying statistical laws made them the subject of a special branch of mathematics - probability theory.

Probability is a measure of the possibility of a random event occurring. The probability of an impossible event is zero, the probability of a necessary (reliable) event is one.
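A small numerical illustration of how a stable statistical regularity emerges from individually unpredictable events (a sketch added here; the biased coin and its probability are arbitrary assumptions):

import random

# Each trial is individually unpredictable, but the relative frequency of
# the event converges to the underlying probability as the number of
# trials grows - a statistical law of the ensemble, not of the individual.
random.seed(0)
p_event = 0.3  # assumed probability of the random event
for n in (100, 10_000, 1_000_000):
    occurrences = sum(random.random() < p_event for _ in range(n))
    print(n, occurrences / n)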

The probabilistic-statistical interpretation of complex causal relations has made it possible to develop and apply in scientific research fundamentally new and very effective methods of cognizing the structure and laws of development of the world. Modern advances in quantum mechanics, chemistry and genetics would be impossible without an understanding of the ambiguity of the relations between the causes and effects of the phenomena under study, without the recognition that the subsequent states of a developing object cannot always be completely deduced from the preceding one.

To explain the uncertainty relation, N. Bohr put forward the principle of complementarity, contrasting it with the principle of causality. When an instrument is used that can accurately measure the coordinates of particles, the momentum can be anything, and therefore there is no causal relationship. Using instruments of another class, one can accurately measure the momentum, while the coordinates become arbitrary. In this case the process, according to N. Bohr, supposedly takes place outside space and time, i.e., one should speak either of causality or of space and time, but not of both at once.

The principle of complementarity is a methodological principle. In a generalized form, the requirements of the principle of complementarity, as a method of scientific research, can be formulated as follows: in order to reproduce the integrity of a phenomenon at a certain intermediate stage of its cognition, it is necessary to apply mutually exclusive and mutually limiting "complementary" classes of concepts, which can be used separately depending on particular conditions, but only taken together exhaust all the information that can be defined and communicated.

So, according to the principle of complementarity, obtaining experimental information about some physical quantities describing a micro-object (elementary particle, atom, molecule) is inevitably associated with the loss of information about some other quantities that are additional to the first ones. Such mutually complementary quantities can be considered the coordinate of the particle and its velocity (momentum), kinetic and potential energy, direction and magnitude of the momentum.

The principle of complementarity made it possible to reveal the need to take into account the corpuscular-wave nature of micro-phenomena. Indeed, in some experiments, microparticles, for example, electrons, behave like typical corpuscles, in others they behave like wave structures.

From a physical point of view, the principle of complementarity is often explained by the influence of the measuring instrument on the state of the micro-object. When one of the complementary quantities is measured accurately, the other quantity undergoes a completely uncontrolled change as a result of the interaction of the particle with the device. Although such an interpretation of the principle of complementarity is confirmed by the analysis of the simplest experiments, from a general point of view it encounters objections of a philosophical nature. From the point of view of modern quantum theory, the role of the instrument in measurements is to "prepare" a certain state of the system. States in which mutually complementary quantities would simultaneously have precisely defined values are fundamentally impossible, and if one of these quantities is exactly defined, then the values of the other are completely indeterminate. Thus, in fact, the principle of complementarity reflects the objective properties of quantum systems that are not related to the observer.

1. Description of micro-objects in quantum mechanics

The limited applicability of classical mechanics to micro-objects, the impossibility of describing the structure of the atom from classical positions, and the experimental confirmation of de Broglie's hypothesis about the universality of wave-particle duality led to the creation of quantum mechanics, which describes the properties of microparticles taking their features into account.

The creation and development of quantum mechanics covers the period from 1900 (Planck's formulation of the quantum hypothesis) to the end of the 20s of the twentieth century and is associated primarily with the work of the Austrian physicist E. Schrödinger, the German physicists M. Born and W. Heisenberg and the English physicist P. Dirac.

As already mentioned, de Broglie's hypothesis was confirmed by experiments on electron diffraction. Let us try to understand what the wave nature of the motion of an electron is and what kind of waves we are talking about.

The diffraction pattern observed for microparticles is characterized by an unequal distribution of fluxes of microparticles scattered or reflected in different directions: more particles are observed in some directions than in others. The presence of a maximum in the diffraction pattern from the point of view of wave theory means that these directions correspond to the highest intensity of de Broglie waves. On the other hand, the intensity of de Broglie waves is greater where there are more particles. Thus, the intensity of de Broglie waves at a given point in space determines the number of particles that hit that point.

The diffraction pattern for microparticles is a manifestation of a statistical (probabilistic) regularity, according to which particles fall into those places where the intensity of de Broglie waves is greater. The need for a probabilistic approach to the description of microparticles is an important distinguishing feature of quantum theory. Is it possible to interpret de Broglie waves as probability waves, that is, to assume that the probability of detecting microparticles at different points in space varies according to the wave law? Such an interpretation of de Broglie waves is incorrect, if only because then the probability of detecting a particle at some points in space could be negative, which makes no sense.

To eliminate these difficulties, the German physicist M. Born (1882–1970) suggested in 1926 that it is not the probability itself that varies according to the wave law, but the probability amplitude, called the wave function. The description of the state of a micro-object with the help of the wave function has a statistical, probabilistic character: namely, the square of the modulus of the wave function (the square of the modulus of the amplitude of the de Broglie waves) determines the probability of finding the particle at a given time within a certain bounded volume.
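In the standard notation (added here for reference, not a quotation from the source), this reads

$$dP = |\Psi(\mathbf{r},t)|^{2}\,dV, \qquad \int |\Psi(\mathbf{r},t)|^{2}\,dV = 1,$$

where the integral over all space expresses the certainty that the particle is somewhere.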

The statistical interpretation of de Broglie waves and the Heisenberg uncertainty relation led to the conclusion that the equation of motion in quantum mechanics, which describes the motion of microparticles in various force fields, should be an equation from which the experimentally observed wave properties of particles would follow. The basic equation should be an equation for the wave function, since the square of its modulus determines the probability of finding a particle at a given time in a given specific volume. In addition, the desired equation must take into account the wave properties of particles, that is, it must be a wave equation.

The basic equation of quantum mechanics was formulated in 1926 by E. Schrödinger. The Schrödinger equation, like all the basic equations of physics (for example, Newton's equations in classical mechanics and Maxwell's equations for the electromagnetic field), is not derived but postulated. The correctness of the Schrödinger equation is confirmed by the agreement with experiment of the results obtained with its help, which in turn gives it the character of a law of nature.
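For reference, the time-dependent equation has the standard form (added here; this is the textbook formulation, not a quotation from the source)

$$i\hbar\,\frac{\partial \Psi}{\partial t} = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\Psi + U(\mathbf{r},t)\,\Psi,$$

where $U(\mathbf{r},t)$ is the potential energy of the particle in the external force field.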

The wave function that satisfies the Schrödinger equation has no analogues in classical physics. Nevertheless, at very short de Broglie wavelengths, the transition from quantum equations to classical equations is automatically made, just as wave optics passes into ray optics for short wavelengths. Both passages to the limit are mathematically performed similarly.

The discovery of a new structural level of matter and the quantum mechanical method of its description laid the foundations of solid-state physics. The structure of metals, dielectrics and semiconductors, and their thermodynamic, electrical and magnetic properties, came to be understood. Ways were opened for a purposeful search for new materials with required properties, and for the creation of new industries and new technologies. Great strides have been made as a result of the application of quantum mechanics to nuclear phenomena. Quantum mechanics and nuclear physics explained that the source of the colossal energy of stars is nuclear fusion reactions occurring at stellar temperatures of tens and hundreds of millions of degrees.

Quantum mechanics was also applied to physical fields. A quantum theory of the electromagnetic field was built - quantum electrodynamics - which explained many new phenomena. The photon, a particle of the electromagnetic field that has no rest mass, took its place in the series of elementary particles. The synthesis of quantum mechanics and the special theory of relativity, carried out by the English physicist P. Dirac, led to the prediction of antiparticles. It turned out that each particle should have, as it were, its own "double" - another particle with the same mass but with the opposite electric or some other charge. Dirac predicted the existence of the positron and the possibility of converting a photon into an electron-positron pair and vice versa. The positron, the antiparticle of the electron, was experimentally discovered in 1932.

In everyday life there are two ways to transfer energy in space - by means of particles or waves. In order, say, to knock off the table a domino balanced on its edge, you can give it the necessary energy in two ways. First, you can throw another domino at it (that is, transfer a point impulse using a particle). Second, you can build a row of dominoes leading along a chain to the one on the edge of the table and drop the first onto the second: in this case the impulse will be transmitted along the chain - the second domino will knock over the third, the third the fourth, and so on. This is the wave principle of energy transfer. In everyday life there are no visible contradictions between the two mechanisms of energy transfer: a basketball is a particle, sound is a wave, and everything is clear.

Let us summarize what has been said. If photons or electrons are directed into such a chamber one at a time, they behave like particles; however, if sufficient statistics of such single experiments are collected, it turns out that, in the aggregate, these same electrons or photons are distributed on the back wall of the chamber so that a familiar pattern of alternating maxima and minima of intensity is observed on it, indicating their wave nature. In other words, in the microworld objects that behave like particles at the same time seem to "remember" their wave nature, and vice versa. This strange property of microworld objects is called quantum-wave dualism. Many experiments were carried out in order to "reveal the true nature" of quantum particles: various experimental techniques and set-ups were used, including those that would make it possible to detect, halfway to the receiver, the wave properties of an individual particle or, conversely, to determine the wave properties of a light beam through the characteristics of individual quanta. All in vain. Apparently, quantum-wave dualism is objectively inherent in quantum particles.
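A toy numerical illustration of this statistical build-up (a sketch under assumed parameters, not a description of any real apparatus): individual detection points are drawn one at a time from a two-slit probability distribution, and the familiar fringe pattern appears only in the aggregate.

import math
import random

# Toy two-slit intensity: I(x) ~ cos^2(pi*x/fringe) under a Gaussian envelope.
# Each "particle" is detected at a single point; fringes emerge only in aggregate.
def intensity(x, fringe=1.0, envelope=4.0):
    return math.cos(math.pi * x / fringe) ** 2 * math.exp(-(x / envelope) ** 2)

def detect_one(xmax=10.0):
    """Draw one detection coordinate by rejection sampling from intensity(x)."""
    while True:
        x = random.uniform(-xmax, xmax)
        if random.random() < intensity(x):
            return x

random.seed(1)
hits = [detect_one() for _ in range(20_000)]

# Crude text histogram: counts per bin trace out the interference fringes.
bins = 40
counts = [0] * bins
for x in hits:
    counts[min(bins - 1, int((x + 10.0) / 20.0 * bins))] += 1
for i, c in enumerate(counts):
    print(f"{-10 + 20 * (i + 0.5) / bins:6.2f} {'#' * (c // 50)}")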

The complementarity principle is a simple statement of this fact. According to this principle, if we measure the properties of a quantum object as a particle, we see that it behaves like a particle. If we measure its wave properties, for us it behaves like a wave. The two views are by no means contradictory; they complement one another, which is reflected in the name of the principle.

As I already explained in the Introduction, I believe that the philosophy of science has benefited from this wave-particle duality incomparably more than it would have in its absence, with a strict distinction between corpuscular and wave phenomena. Today it is quite obvious that the objects of the microworld behave in a fundamentally different way than the objects of the macroworld to which we are accustomed. But why? On what tablets is it written? And just as medieval natural philosophers struggled to decide whether the flight of an arrow was "free" or "forced", so modern philosophers struggle to resolve quantum-wave dualism. In fact, electrons and photons are neither waves nor particles, but something very special in their intrinsic nature - and therefore not amenable to description in terms of our everyday experience. If we keep trying to squeeze their behavior into the framework of paradigms familiar to us, more and more paradoxes are inevitable. So the main conclusion here is that the dualism we observe is generated not by the inherent properties of quantum objects, but by the imperfection of the categories in which we think.

The correspondence principle

A new theory that claims a deeper knowledge of the essence of the universe, a fuller description and a wider application of its results than the previous one, should include the previous theory as a limiting case. Thus, classical mechanics is the limiting case of quantum mechanics and of the mechanics of the theory of relativity. Relativistic mechanics (the special theory of relativity) passes into classical (Newtonian) mechanics in the limit of small velocities. This is the content of the methodological principle of correspondence formulated by N. Bohr in 1923.

The essence of the correspondence principle is as follows: any new, more general theory that is a development of previous classical theories whose validity was experimentally established for certain groups of phenomena does not reject these classical theories but includes them. The previous theories retain their significance for certain groups of phenomena as a limiting form and special case of the new theory. The latter determines the boundaries of applicability of the previous theories, and in certain cases it is possible to pass from the new theory to the old one.

In quantum mechanics, the correspondence principle reveals the fact that quantum effects are significant only when considering quantities comparable to Planck's constant h. When considering macroscopic objects, Planck's constant can be regarded as negligibly small (h → 0). As a result, the quantum properties of the objects under consideration turn out to be insignificant, and the representations of classical physics remain valid. Therefore, the significance of the correspondence principle extends beyond the boundaries of quantum mechanics. It will become an integral part of any new theory.
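As an illustration of why quantum effects vanish on the macroscopic scale (a rough comparison added here, with the particle parameters chosen arbitrarily), compare the de Broglie wavelengths λ = h/(mv) of an electron moving at 10⁶ m/s and of a 0.1 kg ball moving at 10 m/s:

$$\lambda_{\text{electron}} = \frac{6.63\times10^{-34}}{9.1\times10^{-31}\times10^{6}} \approx 7\times10^{-10}\ \text{m}, \qquad \lambda_{\text{ball}} = \frac{6.63\times10^{-34}}{0.1\times10} \approx 7\times10^{-34}\ \text{m}.$$

The first is comparable to atomic dimensions, so wave properties matter; the second is immeasurably smaller than any macroscopic scale, so they play no role.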

The complementarity principle is one of the most profound ideas of modern natural science. A quantum object is neither a wave nor a particle taken separately. Experimental study of micro-objects involves the use of two types of instruments: one allows the wave properties to be studied, the other the corpuscular ones. These properties are incompatible in terms of their simultaneous manifestation. Nevertheless, they equally characterize the quantum object, and therefore they do not contradict but complement each other.

The principle of complementarity was formulated by N. Bohr in 1927, when it turned out that in the experimental study of micro-objects accurate data can be obtained either about their energies and momenta (the energy-momentum picture) or about their behavior in space and time (the space-time picture). These mutually exclusive pictures cannot be applied simultaneously. Thus, if we organize a search for a particle with the help of precise physical instruments that fix its position, the particle is found with equal probability at any point in space. However, these properties equally characterize the micro-object, which means that instead of one single picture two must be used: the energy-momentum picture and the space-time picture.

In a broad philosophical sense, N. Bohr's principle of complementarity is manifested in the characterization of different objects of research within the framework of a single science.