Theoretical and Numerical Methods for Modified Gravity / Casalino, Alessandro. - (2021 Jul 22), pp. 1-195. [10.15168/11572_313053]
Abstract
In the past century, two great discoveries revolutionized our understanding of the Universe. The first came from the study of the galaxy NGC 3198 in the 1980s. Measuring the rotation velocity of objects in the galaxy with respect to its center, the so-called rotation curve of the galaxy, revealed an anomaly: the curve did not seem to obey the known laws of physics, as the velocities of objects far from the center of NGC 3198 were much larger than any theoretical prediction. The explanation of this anomaly invokes the presence of an additional unknown form of matter in galaxies, called dark matter. An interesting property of dark matter is the nature of its interactions. Since we have never observed this matter with telescopes, but only inferred it from its effect on objects in galaxies, we believe dark matter interacts with standard matter (baryons, leptons, ...), radiation, and neutrinos only gravitationally, and not through electromagnetic interactions. Indeed, to date, dark matter has not been observed directly in ground-based detection experiments.

The second discovery came from the observation of type Ia supernovae. Although we had known since the beginning of the 20th century that the Universe is expanding, at the end of the millennium two independent experiments discovered that this expansion is accelerating. Within General Relativity, the acceleration cannot be explained by the known standard matter, radiation, or neutrinos, nor by a different spatial curvature. This tension led to the hypothesis of a new component whose density remains constant throughout the whole history of the Universe, a behavior achieved through negative pressure (made quantitative in the equations below). This component is called dark energy.

These two breakthroughs are the basis of the standard model of cosmology. However, even though this model has proven astonishingly accurate in describing the history of the Universe, cosmologists still struggle with some fundamental questions about dark energy and dark matter. It is important to stress that both are called "dark" to underline our ignorance about their fundamental nature. We should think of them as parametrizations of our ignorance rather than as solutions of the two problems mentioned above. We know that some matter with specific properties must exist to explain the two observational phenomena, and its inclusion in the theoretical model leads to satisfactory agreement between theory and observation. However, we know nothing about the fundamental nature of dark matter and dark energy, nor has any direct observation proven their existence. Moreover, additional fundamental and mathematical questions arise when dark matter and dark energy are postulated in the form proposed by the standard model.

In the decades since these discoveries, we have witnessed two trends in theoretical and observational cosmology. On one hand, experiments are becoming more accurate and precise in extracting information from the Universe. One of the most recent examples is the Planck mission, whose space telescope observed photons coming from a moment in the Universe's history called recombination, about 13.8 billion years ago. At that moment, photons and electrons decoupled, letting the former travel freely and unscattered. This radiation is called the Cosmic Microwave Background (CMB).
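To make the two observations above concrete, here is a minimal sketch using the standard textbook relations; these formulas are common knowledge in cosmology, not expressions taken from the thesis itself.

```latex
% Rotation curves: with only the visible mass M(r) enclosed at radius r,
% Newtonian gravity predicts a Keplerian fall-off of the circular velocity,
% v(r) \propto r^{-1/2} beyond the luminous disk, whereas the measured curve
% of NGC 3198 stays roughly flat, hinting at unseen mass with M(r) growing
% linearly with r.
v(r) = \sqrt{\frac{G\,M(r)}{r}}

% Dark energy: for a perfect fluid with density \rho and pressure p in an
% expanding universe with Hubble rate H, the continuity equation reads
\dot{\rho} + 3H\,(\rho + p) = 0 ,
% so a density that stays constant in time (\dot{\rho} = 0) requires the
% negative pressure p = -\rho.
```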
We can collect these photons with a detector and create a snapshot of the photon intensity distribution at that time. This distribution is tightly linked to the properties of dark matter and dark energy, helping physicists shed some light on the two dark components. In general, the accuracy of new experiments puts very tight constraints on theoretical models.

On the other hand, many theoretical models have been proposed as alternatives to the standard model to solve the problems mentioned above. These models modify the equations of motion of General Relativity, the Einstein equations, either replacing the dark matter and dark energy content of the standard model or modifying the geometry of spacetime. They achieve this by including additional dynamical quantities, or degrees of freedom, whose evolution can explain the accelerated expansion, the dark matter effects, or both. For instance, a scalar field can be considered, but models with more complex additional degrees of freedom have also been proposed. All these models are usually called Modified Gravity theories.

In the last few years, most modified gravity models have come under scrutiny thanks to the increase in observational data. For instance, the predictions for the CMB might change when we consider modified gravity models for dark energy or dark matter, putting constraints on the theory parameters or ruling the model out. The data are becoming accurate enough to place very tight constraints on modified gravity models. Nevertheless, the analysis of the CMB power spectrum and similar observables is not an easy task. One of the main obstacles in checking the viability of theoretical models against experimental data is the complexity of the theoretical study of these crucial observables: no analytical solution of the equations of motion valid at all times in the Universe's history can be found, and additional degrees of freedom further increase the cost of each evaluation in modified gravity models. We should also consider that a single evaluation of the Universe's history is not enough to draw conclusions about the viability of a model. When we compare a theoretical model with experiments, we must minimize the difference between the predictions and the data while varying the theory parameters. This procedure usually needs an enormous number of evaluations and significant computational power, even with the most efficient Monte Carlo algorithms (a toy version of this sampling is sketched below).

Moreover, especially in the modified gravity context, it is essential to distinguish predictive power that comes from the real physical and mathematical content of a new proposal from improvement that comes merely from adding new degrees of freedom. It is easier to fit three points with a parabola than with a straight line, at the price of adding a new parameter to the theory. However, is it always necessary? In physics, as in other fields, Occam's razor tells us that the simplest adequate theory should be preferred. With Bayesian probability we can compare models and find the ones that fit the data with fewer parameters (the relevant evidence integral is recalled below). But, again, this procedure is computationally expensive.

Finally, we can ask ourselves whether there is a way to parametrize modified gravity models in a model-independent way. In other words, is there a way to write a general action, or Lagrangian, that can include all modified gravity models?
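As a toy illustration of the Monte Carlo exploration described above, here is a minimal Metropolis-Hastings sketch in Python. It is not the pipeline used in the thesis: a real cosmological analysis replaces the quadratic `model` below with a full Boltzmann-code evaluation of the observables, and the Gaussian likelihood, flat priors, and parameter values are illustrative assumptions.

```python
import numpy as np

def model(theta, x):
    # Toy 2-parameter "theory prediction". In a real analysis this single
    # call is an expensive evaluation of the Universe's history.
    a, b = theta
    return a * x + b * x**2

def log_likelihood(theta, x, y, sigma):
    # Gaussian likelihood: chi-squared comparison of prediction and data.
    resid = (y - model(theta, x)) / sigma
    return -0.5 * np.sum(resid**2)

def metropolis_hastings(x, y, sigma, theta0, n_steps=20000, step=0.05):
    rng = np.random.default_rng(0)
    theta = np.asarray(theta0, dtype=float)
    logl = log_likelihood(theta, x, y, sigma)
    chain = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        # Symmetric Gaussian proposal; accept with probability
        # min(1, L_proposed / L_current) (flat priors assumed).
        proposal = theta + step * rng.standard_normal(theta.size)
        logl_prop = log_likelihood(proposal, x, y, sigma)
        if np.log(rng.random()) < logl_prop - logl:
            theta, logl = proposal, logl_prop
        chain[i] = theta
    return chain

# Synthetic data from known parameters, then recovered by the chain.
x = np.linspace(0.0, 1.0, 50)
true_theta, sigma = (1.0, -0.5), 0.05
y = model(true_theta, x) + sigma * np.random.default_rng(1).standard_normal(x.size)

chain = metropolis_hastings(x, y, sigma, theta0=(0.0, 0.0))
print("posterior mean:", chain[len(chain) // 2:].mean(axis=0))  # discard burn-in
```

Each chain step costs one model evaluation, which is why an expensive theory prediction makes the whole fit expensive, exactly the bottleneck described above.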
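The Bayesian comparison invoked by Occam's razor is usually phrased in terms of the evidence; the following is the standard textbook definition, not a formula specific to the thesis.

```latex
% Evidence of model M: the likelihood of the data d averaged over the prior
% \pi(\theta | M); models that spend prior volume on parameters the data do
% not require are penalized automatically (Occam's razor).
Z_M = \int \mathcal{L}(d \mid \theta, M)\, \pi(\theta \mid M)\, d\theta

% Two models are ranked by the Bayes factor,
B_{12} = \frac{Z_{M_1}}{Z_{M_2}} ,
% and computing each Z means integrating over the full parameter space,
% which is the computational expense mentioned in the text.
```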
Returning to that question: the power of such a generalization would be undeniable, since we could compute the equations of motion from one single action and apply them to every modified gravity model. Such a theory, which we will call the Effective Field Theory (EFT) of Gravity, has been developed, and it works for any theory with one additional degree of freedom with respect to General Relativity. The major drawback is that the general form of the EFT of Gravity does not provide an immediate physical interpretation of its Lagrangian terms; therefore, a mapping between a "standard" modified gravity theory and its EFT counterpart is always preferred.
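For orientation, this is the general shape such an action takes in the effective field theory of dark energy literature, written in the unitary gauge where the extra scalar degree of freedom is absorbed into the time coordinate. The operator set and coefficient names below follow common conventions in that literature; they are an assumption about, not a quotation of, the thesis's own notation.

```latex
% M_* is a mass scale, R the Ricci scalar, g^{00} a metric component,
% \delta g^{00} and \delta K the perturbations of g^{00} and of the extrinsic
% curvature of constant-time surfaces; f(t), \Lambda(t), c(t), M_2(t),
% \bar{M}_1(t), ... are free functions of time whose choice selects a model.
S = \int d^4x \,\sqrt{-g}\;\Big[ \tfrac{M_*^2}{2}\, f(t)\, R
    - \Lambda(t) - c(t)\, g^{00}
    + \tfrac{M_2^4(t)}{2}\,\big(\delta g^{00}\big)^2
    - \tfrac{\bar{M}_1^3(t)}{2}\, \delta g^{00}\, \delta K
    + \dots \Big]
```

As an example of the mapping mentioned above, a minimally coupled scalar field (quintessence) with potential V corresponds to f = 1, c(t) = \dot{\phi}_0^2(t)/2, and \Lambda(t) = V(\phi_0(t)), with all the higher perturbative operators switched off.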
File | Description | Type | License | Size | Format
---|---|---|---|---|---
Tesi.pdf (open access) | Tesi | Doctoral thesis (Tesi di dottorato) | All rights reserved | 8.14 MB | Adobe PDF