
Holographic Principle



“Holographic universe” redirects here. For the Scar Symmetry album, see Holographic Universe (album). For the Epica album, see The Holographic Principle.

The holographic principle isn’t some abstract philosophical musing, but a concrete, if profoundly unsettling, property attributed to certain string theories and a highly anticipated characteristic of any viable theory of quantum gravity. In essence, it posits that the entire physical description of a given volume of space can be entirely captured, or “encoded,” on a lower-dimensional boundary that surrounds this region. Think of it as the universe’s most elaborate compression algorithm, where three dimensions of reality somehow emerge from information residing on a two-dimensional surface. This boundary is often conceptualized as a light-like surface, such as the infamous gravitational horizon of a black hole.

This rather audacious concept was initially put forth by the Dutch theoretical physicist Gerard ’t Hooft in 1993, [3] who dared to suggest that the fundamental degrees of freedom of gravity might reside on a boundary. His insights were later given a more rigorous and precise interpretation within the realm of string theory by Leonard Susskind. [4] Susskind, combining ’t Hooft’s foundational ideas with earlier, prescient observations from Charles Thorn, [4][5] articulated the principle with a clarity that still manages to disturb. He famously declared, “The three-dimensional world of ordinary experience—the universe filled with galaxies, stars, planets, houses, boulders, and people—is a hologram, an image of reality coded on a distant two-dimensional surface.” [6] One might almost detect a hint of cosmic weariness in such a statement, as if the universe itself is just a particularly elaborate projection. Indeed, as Raphael Bousso later highlighted, [7] Thorn had already intuited something similar in 1978, noting that string theory inherently permitted a lower-dimensional description from which gravity itself could emerge—a mechanism that would, in hindsight, be precisely what we now refer to as holographic. The most compelling and mathematically robust manifestation of this holographic correspondence to date is the celebrated AdS/CFT correspondence.

The genesis of the holographic principle can be traced back to the perplexing realm of black hole thermodynamics and, specifically, the Bekenstein bound. This bound, a cornerstone of black hole physics, conjectures that the maximum possible entropy that can be contained within any given region of space scales not with the volume (as one might intuitively expect for a three-dimensional object), but rather with the area of its boundary (specifically, the radius squared). For a black hole, this counter-intuitive scaling led to the profound realization that the entire information content of everything that has ever succumbed to the black hole’s gravitational embrace—every particle, every field, every fleeting thought—could be entirely encoded within the subtle fluctuations of its two-dimensional event horizon. This insight provides a potential resolution to the notorious black hole information paradox when viewed through the lens of string theory. [6] However, like all grand theories, it’s not without its thorns. There exist specific classical solutions to the Einstein equations—dubbed “Wheeler’s bags of gold”—which theoretically allow for entropy values that exceed the area-dependent limit imposed by the holographic principle. The mere existence of such solutions poses a direct challenge to the holographic interpretation, and their precise implications within a comprehensive quantum theory of gravity that embraces the holographic principle remain, rather inconveniently, an open and actively researched problem. [8] One might say the universe still enjoys throwing a curveball or two, just to keep us on our toes.
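
Stated schematically (with the precise coefficient fixed only later, by Hawking’s calculation discussed below), the bound says that the maximum entropy of a spherical region of radius r tracks its boundary area rather than its volume:

    S_{\max} \;\propto\; A = 4\pi r^{2}, \qquad \text{not} \quad S_{\max} \propto V = \tfrac{4}{3}\pi r^{3} .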

High-level summary

The universe, in its grand, indifferent way, is typically perceived as a vast expanse of “matter” and “energy.” Yet, in a 2003 article for Scientific American magazine, the distinguished physicist Jacob Bekenstein presented a rather speculative, yet increasingly compelling, summary of a paradigm shift initiated by John Archibald Wheeler. This evolving perspective suggests that scientists might eventually come to “regard the physical world as made of information, with energy and matter as incidentals.” It’s a thought that, if true, would relegate the tangible stuff of existence to mere footnotes in the cosmic ledger. Bekenstein provocatively queried, “Could we, as William Blake memorably penned, ‘see a world in a grain of sand’, or is that idea no more than ‘poetic license’?” [9] He was, of course, gesturing towards the profound implications of the holographic principle, implying that perhaps the poetic vision held more truth than previously imagined.

Unexpected connection

Bekenstein’s illuminating overview, aptly titled “A Tale of Two Entropies” [10], meticulously outlined the potentially revolutionary ramifications of Wheeler’s information-centric view of the cosmos. A significant aspect of this narrative was the elucidation of a previously unappreciated, almost serendipitous, link between the abstract realm of information theory and the concrete rules of classical physics. This unexpected conceptual bridge first began to manifest shortly after the groundbreaking 1948 papers by the American applied mathematician Claude Shannon. Shannon’s seminal work introduced what remains the most widely accepted and utilized quantitative measure of information content – a metric now universally known as Shannon entropy. This objective measure of information has proven to be an indispensable tool, forming the fundamental bedrock for the design and operation of virtually all contemporary communication and data storage technologies, from the ubiquitous cellular phone to the intricate workings of modems, hard disk drives, and DVDs. The sheer ubiquity of its application underscores its practical, if not always obvious, importance.
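
For concreteness, Shannon’s measure for a source that emits symbol i with probability p_i is the expected surprisal, in bits:

    H \;=\; -\sum_{i} p_i \log_{2} p_i .

A fair coin toss carries exactly one bit per toss; a coin biased to land heads 90% of the time carries only about 0.47 bits.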

Concurrently, within the venerable field of thermodynamics—the branch of physics dedicated to the study of heat and its relation to other forms of energy and work—entropy has long been colloquially understood as a measure of “disorder” within a physical system composed of matter and energy. However, in 1877, the Austrian physicist Ludwig Boltzmann provided a far more precise and fundamental definition. He framed entropy in terms of the sheer number of distinct microscopic arrangements, or “microstates,” that the constituent particles of a macroscopic “chunk” of matter could adopt while still presenting the same macroscopic appearance. To illustrate, consider the air filling a room: its thermodynamic entropy would precisely correspond to the logarithm of the total count of all the myriad ways the individual gas molecules could be spatially distributed throughout the room, coupled with all the possible permutations of their individual motions. The complexity hidden within apparent simplicity, laid bare.
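
Boltzmann’s definition makes the room-of-air example precise: if W is the number of microstates compatible with a given macrostate, then

    S \;=\; k_B \ln W, \qquad k_B \approx 1.38 \times 10^{-23}\ \mathrm{J/K} .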

Energy, matter, and information equivalence

Shannon’s rigorous pursuit of a method to quantify the information embedded within, say, a simple telegraph message, led him down an unexpected path. His resulting mathematical formula for information content remarkably mirrored the exact form of Boltzmann’s entropy formula for thermodynamic disorder. In his August 2003 Scientific American article, Bekenstein provided a concise summary that truly connects these disparate fields: “Thermodynamic entropy and Shannon entropy are conceptually equivalent: the number of arrangements that are counted by Boltzmann entropy reflects the amount of Shannon information one would need to implement any particular arrangement” of matter and energy. The only superficial distinction between the thermodynamic entropy of physics and Shannon’s information entropy lies in their respective units of measure. The former is traditionally expressed in units of energy divided by temperature (e.g., joules per kelvin), while the latter is presented in essentially dimensionless “bits” of information, a stark reminder of its abstract, yet fundamental, nature.
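
The unit conversion amounts to a single multiplicative constant: one bit of Shannon entropy corresponds to k_B ln 2 of thermodynamic entropy,

    S_{\text{thermo}} \;=\; (k_B \ln 2)\, H_{\text{bits}} \;\approx\; 9.57 \times 10^{-24}\ \mathrm{J/K}\ \text{per bit} .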

This profound equivalence underpins the holographic principle’s most radical assertion: the entropy of all ordinary mass, not merely the exotic confines of black holes, is similarly proportional to its surface area, rather than its volume. This implies a startling conclusion: volume itself, the very perception of three-dimensional extent, is merely an illusion. Instead, the universe, in its entirety, is proposed to be a gigantic hologram, a projection that is mathematically isomorphic to the “information” painstakingly inscribed upon the two-dimensional surface of its cosmic boundary. [11] A rather inconvenient truth, if you ask me, making everything you perceive just a shadow of something else.

AdS/CFT correspondence

Conjectured relationship of AdS/CFT

The anti-de Sitter/conformal field theory correspondence, often referenced as Maldacena duality (a nod to the Argentinian physicist Juan Maldacena who first proposed it [12]) or, more generically, gauge/gravity duality, stands as a formidable conjecture bridging two seemingly disparate types of physical theories. On one side of this profound relationship reside anti-de Sitter spaces (AdS), which are specific types of curved spacetimes frequently employed in theoretical frameworks attempting to describe quantum gravity, particularly when formulated within the intricate structures of string theory or M-theory. On the other side of this correspondence lie conformal field theories (CFTs), which are a class of quantum field theories (QFTs) that exhibit scale invariance and conformal symmetry. These include theories structurally analogous to the Yang–Mills theories that so effectively describe the interactions of elementary particles within the Standard Model.
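
The canonical instance, quoted here in its standard (still conjectural) form, pairs type IIB string theory on AdS₅ × S⁵ with N = 4 super Yang–Mills theory in four dimensions, with a dictionary relating the AdS radius L, the string length √α′, the string coupling g_s, and the gauge-theory parameters:

    \frac{L^{4}}{\alpha'^{2}} \;=\; 4\pi g_s N \;=\; g_{\mathrm{YM}}^{2} N \;\equiv\; \lambda .

A large ’t Hooft coupling λ corresponds to an AdS radius much larger than the string length, so strongly coupled gauge dynamics maps onto weakly curved, classically tractable gravity; this is the “strong-weak” inversion discussed below.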

This duality is not merely an academic exercise; it represents a truly monumental leap in our comprehension of both string theory and the elusive nature of quantum gravity. [13] Its significance stems from two critical contributions: firstly, it furnishes a non-perturbative formulation for string theory under specific boundary conditions, allowing for calculations that are otherwise intractable. Secondly, and perhaps more importantly in the context of this discussion, it stands as the most robust and successful concrete realization of the holographic principle discovered to date.

Beyond its theoretical elegance, the AdS/CFT correspondence has also provided a remarkably potent analytical toolkit for delving into the complexities of strongly coupled quantum field theories. [14] The immense utility of this duality arises from a “strong-weak” relationship: when the constituent fields within the quantum field theory are experiencing intensely strong interactions (making them notoriously difficult to analyze), the corresponding fields in the dual gravitational theory are, by fortunate contrast, weakly interacting and thus far more amenable to mathematical treatment. This ingenious inversion has been exploited to great effect, allowing physicists to explore numerous challenging facets of nuclear physics and condensed matter physics by effectively translating these problems into more tractable, albeit abstract, problems within the framework of string theory.

The conceptual genesis of the AdS/CFT correspondence can be precisely dated to late 1997, when Juan Maldacena unveiled his revolutionary proposal. [12] Crucial elaborations and deeper understandings of this correspondence were subsequently provided through the collaborative efforts of Steven Gubser, Igor Klebanov, and Alexander Markovich Polyakov, alongside independent contributions from Edward Witten. The profound impact of Maldacena’s original paper is perhaps best underscored by its citation count: by 2015, it had amassed over 10,000 citations, cementing its status as the most highly cited article in the entire field of high energy physics. [15] A testament to a genuinely interesting idea, I suppose.

Black hole entropy

Consider an object possessing a relatively high degree of entropy—it’s microscopically chaotic, like a volume of hot gas where individual molecules jostle about in countless random configurations. In stark contrast, a precisely defined configuration of classical fields, such as static electric and magnetic fields or propagating gravitational waves, possesses zero entropy; there is absolutely no inherent randomness in their description. Since black holes were, for a long time, understood as exact, pristine solutions to Einstein’s equations—mathematically perfect and devoid of internal structure—they were initially thought to possess no entropy whatsoever. A clean, simple picture.

However, Jacob Bekenstein sagely pointed out that this seemingly straightforward conclusion led to a rather inconvenient violation of the second law of thermodynamics. Imagine tossing a quantity of hot gas, replete with its inherent entropy, into a black hole. Once this gas irrevocably crosses the event horizon, its entropy would, according to the prevailing understanding, simply vanish from the observable universe. The chaotic, random properties of the gas would be entirely subsumed and effectively disappear once the black hole had absorbed it and settled into its new, larger, but still perfectly described state. To salvage the integrity of the second law—a law that has, after all, proven remarkably resilient—Bekenstein proposed a radical solution: black holes must, in fact, be profoundly random objects themselves, possessing an intrinsic entropy that increases by an amount precisely greater than the entropy of any matter they consume. A clever way to avoid breaking fundamental physics, if a bit of a workaround.

This line of reasoning led to a further compelling deduction: within any fixed volume of space, a black hole whose event horizon precisely encompasses that volume should inherently represent the object capable of containing the absolute maximum amount of entropy. If one could hypothetically conceive of something else within that volume possessing even greater entropy, then the act of feeding additional mass into that hypothetical entity would inevitably lead to its collapse into a black hole with less entropy than the original object, thereby creating a clear violation of the second law. [4] The universe, it seems, prefers its entropy to always increase, even when it involves forming cosmic devourers.

The conceptual framework of entropic gravity, the holographic principle, and the distribution of entropy within spacetime offers a path to derive the fundamental Einstein General Relativity equations from these very considerations. In a rather elegant twist, when the Bekenstein and Hawking equations are applied, the Einstein equation itself takes on a form remarkably analogous to the first law of thermodynamics, suggesting a deep, underlying connection between gravity and thermodynamics.
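
Concretely, the analogy rests on the first law of black hole mechanics, which, in natural units (G = c = ℏ = k_B = 1) and with Hawking’s identifications T = κ/2π and S = A/4, reads exactly like dE = T dS:

    dM \;=\; \frac{\kappa}{8\pi}\, dA + \Omega\, dJ + \Phi\, dQ \;=\; T\, dS + \Omega\, dJ + \Phi\, dQ ,

where κ is the surface gravity of the horizon, and Ω dJ and Φ dQ are the usual work terms for rotation and charge.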

Within a spherical region of radius R, the entropy of a relativistic gas systematically increases as its total energy content rises. The only known physical constraint on this increase is gravitational in nature; beyond a certain critical energy density, the gas inevitably collapses under its own weight to form a black hole. Bekenstein leveraged this critical threshold to establish a universal upper bound on the entropy that could possibly be contained within any given region of space. Crucially, this bound was found to be directly proportional to the area of the region’s boundary, not its volume. From this, he concluded that the black hole entropy is unequivocally proportional to the area of its event horizon. [16] Furthermore, the bizarre effects of gravitational time dilation dictate that, from the detached perspective of a remote observer, time appears to halt completely at the event horizon. Given the fundamental cosmic speed limit—the speed of light—this temporal distortion effectively prevents any falling object from ever truly crossing the event horizon in the observer’s frame, no matter how infinitesimally close it approaches. Consequently, since any alteration in a quantum state necessitates the passage of time, all objects and their associated quantum information are, in effect, perpetually “imprinted” upon the event horizon itself. This led Bekenstein to solidify his conclusion: for any distant observer, the black hole entropy is indeed directly proportional to the area of the event horizon.

Prior to these insights, Stephen Hawking had already demonstrated a rather elegant property: the total horizon area of any collection of black holes never decreases over time. The event horizon itself is a complex boundary delineated by light-like geodesics—these are the light rays that are just barely unable to escape the black hole’s gravitational pull. If any neighboring geodesics begin to converge, they will eventually intersect, and their subsequent trajectories will lead them inexorably into the interior of the black hole. Therefore, to maintain the definition of the horizon, these geodesics must always be diverging or remaining parallel, ensuring that the number of geodesics generating the boundary—and thus the area of the horizon—can never decrease. Hawking’s groundbreaking result was subsequently dubbed the second law of black hole thermodynamics, a direct and compelling analogy to the universal law of entropy increase in classical thermodynamics.
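
Hawking’s area theorem, in compact form (it assumes classical general relativity and matter obeying the null energy condition):

    \frac{dA_{\text{horizon}}}{dt} \;\ge\; 0 .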

Initially, Hawking himself did not take this analogy with thermodynamics too seriously. He reasoned that a black hole must possess zero temperature, given that, by definition, black holes do not emit radiation and therefore cannot exist in thermal equilibrium with any black body possessing a positive temperature. [17] A perfectly logical deduction, at the time. However, to his profound surprise, a more meticulous analysis led him to a startling discovery: black holes do radiate. They emit a faint, thermal spectrum of particles, and they do so in precisely the right manner to establish equilibrium with a surrounding gas at a finite, non-zero temperature. This phenomenon, now known as Hawking radiation, fundamentally altered our understanding of black holes. Hawking’s detailed calculation also fixed the constant of proportionality in Bekenstein’s area law: the entropy of a black hole is precisely one-quarter its horizon area when measured in fundamental Planck units. [18] A rather elegant, if deeply counter-intuitive, numerical constant.
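
For a Schwarzschild black hole of mass M and horizon area A, the two quantities Hawking fixed are, in SI units:

    T_H \;=\; \frac{\hbar c^{3}}{8\pi G M k_B}, \qquad S_{\mathrm{BH}} \;=\; \frac{k_B c^{3} A}{4 G \hbar} \;=\; \frac{k_B A}{4\,\ell_P^{2}} .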

The very definition of entropy involves the logarithm of the number of distinct microstates—the myriad ways a system’s microscopic components can be arranged while its macroscopic properties remain unchanged. The concept of black hole entropy is, therefore, deeply perplexing. It explicitly states that the logarithm of the sheer number of possible internal states of a black hole is proportional to the area of its horizon, not to the volume of its interior. [11] This is the core enigma that the holographic principle attempts to unravel, suggesting that the surface holds all the secrets.

The pursuit of generalizing these entropy bounds to more complex and diverse spacetimes has been a fertile area of research. Notable among these efforts was the Fischler–Susskind holographic bound. [19] Subsequently, Raphael Bousso introduced a more universal and elegant formulation, generalizing this concept into the covariant entropy bound. This bound is formulated based upon null hypersurfaces and demonstrates its applicability to any surface within any spacetime, providing a far more encompassing framework. [20]

Black hole information paradox

Hawking’s calculation, revolutionary as it was, presented another profound challenge: it suggested that the thermal radiation emanating from black holes carried no discernible information about the specific matter that had originally fallen in. The outgoing light rays, which constitute the Hawking radiation, are theorized to originate precisely at the edge of the black hole and linger for an extended period near the horizon. In contrast, any infalling matter only reaches and crosses the horizon much later. The interaction between this infalling matter and the outgoing radiation is minimal, occurring only at the point where they effectively “cross” paths. It seemed highly improbable, to put it mildly, that the entire quantum state of the outgoing radiation could be fully determined by such a minuscule, residual scattering event. [citation needed]

Based on this, Hawking initially interpreted his findings to mean that when black holes absorb photons in a pristine pure state, characterized by a precise wave function, they subsequently re-emit new photons in a thermal mixed state, described by a density matrix. This implication, however, struck at the very heart of quantum mechanics itself. In the standard formulation of quantum mechanics, a system’s evolution is unitary, meaning that states which are superpositions with definite probability amplitudes should never irreversibly transform into probabilistic mixtures of different possibilities, except, perhaps, in the act of measurement itself, which a black hole should not be performing. [note 1] The suggestion that black holes could destroy information challenged a fundamental tenet of quantum theory, provoking a furious debate among physicists.

This contentious idea was subsequently refined and addressed with greater precision by Leonard Susskind, who had been independently developing his own concepts of holography. Susskind put forth the argument that the subtle oscillations and dynamics of a black hole’s horizon actually contain a complete description [note 2] of both the matter falling into the black hole and the radiation emanating from it. This was a crucial insight, drawing parallels to the world-sheet theory of string theory, which inherently offers a holographic description of fundamental strings. While elementary, short strings are theorized to possess zero entropy, Susskind ingeniously identified highly excited, extended string states with the complex properties of ordinary black holes. This represented a profound conceptual breakthrough, as it established that the abstract mathematical constructs of strings could possess a tangible, classical interpretation in terms of these enigmatic cosmic objects.

This groundbreaking work ultimately demonstrated that the vexing black hole information paradox could indeed be resolved, provided that quantum gravity is described in a specific, and perhaps initially unusual, string-theoretic manner. This resolution hinges on the crucial assumption that the string-theoretical description is entirely complete, unambiguous, and devoid of redundancy. [22] Within this framework, the very fabric of space-time in quantum gravity would not be a fundamental given, but rather an emergent, effective description arising from the intricate theory of oscillations occurring on a lower-dimensional black-hole horizon. This further suggested that any black hole possessing the appropriate characteristics, and not solely the theoretical constructs of strings, could potentially serve as a foundational basis for a comprehensive description of string theory itself.

The year 1996 marked another significant milestone when Susskind, collaborating with esteemed physicists Tom Banks, Willy Fischler, and Stephen Shenker, presented a novel formulation of the then-emerging M-theory. Their approach employed a holographic description framed in terms of charged point black holes, specifically the D0 branes characteristic of type IIA string theory. The matrix theory they proposed had its conceptual roots in earlier work by Bernard de Wit, Jens Hoppe, and Hermann Nicolai, who first suggested it as a description of two branes within eleven-dimensional supergravity. Susskind and his collaborators, however, reinterpreted these very same matrix models as a dynamic description of point black holes under particular limiting conditions. The power of holography allowed them to conclude that the intricate dynamics of these black holes furnished a complete, non-perturbative formulation of M-theory. Then, in 1997, Juan Maldacena delivered another paradigm-shifting contribution, providing the first holographic descriptions for a higher-dimensional object: the 3+1-dimensional type IIB membrane. This particular breakthrough resolved a long-standing and challenging problem: finding a consistent string description that could accurately describe a gauge theory. These rapid and interconnected developments not only provided profound insights into the nature of quantum gravity but also simultaneously elucidated the intricate relationships between string theory and various forms of supersymmetric quantum field theories.

Limit on information density

The Bekenstein-Hawking entropy of a black hole, a quantity proportional to the surface area of the black hole when expressed in fundamental Planck units, imposes a stark and inescapable limit on the amount of information any region of the universe can contain.

To understand this, consider that information content is formally defined as the logarithm of the reciprocal of the probability that a system occupies a specific microstate. Building upon this, the information entropy of a system is simply the expected value of its information content. This definition of entropy is, rather conveniently, entirely equivalent to the standard Gibbs entropy that has long been employed in classical physics. When this definition is rigorously applied to any physical system, it leads to a rather profound conclusion: for any given amount of energy confined within a particular volume, there exists an absolute upper limit to the density of information—the renowned Bekenstein bound—regarding the precise whereabouts and states of all the constituent particles of matter within that volume. More dramatically, any given volume of space has an ultimate ceiling on the information it can possibly hold, beyond which point it will inevitably collapse into a black hole. The universe, it seems, has its own hard drive capacity limits.
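
The bound itself, for a system of total energy E confined inside a sphere of radius R, is usually quoted as:

    S \;\le\; \frac{2\pi k_B R E}{\hbar c} .

Saturating it requires a black hole; everyday systems fall short of this ceiling by enormous factors.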

This inherent information ceiling carries a significant implication for the very structure of matter. It strongly suggests that matter itself cannot be infinitely subdivided into ever-smaller components. Instead, there must exist an ultimate, irreducible level of fundamental particles. If, hypothetically, a particle could be infinitely subdivided into an endless hierarchy of lower-level particles, then the cumulative degrees of freedom of the original particle (which are the product of all the degrees of freedom of its sub-particles) would become infinite. Such an infinite number of degrees of freedom would catastrophically violate the maximal limit of entropy density mandated by the Bekenstein bound. Therefore, the holographic principle intrinsically implies that the process of subdivision must, at some fundamental level, cease. There’s a bottom to the rabbit hole, apparently.
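
As a back-of-the-envelope illustration of the “hard drive capacity” metaphor above, the short script below (a sketch; the constants are rounded standard values) evaluates the Bekenstein–Hawking entropy of a solar-mass black hole in bits:

    import math

    # Rounded physical constants (SI units)
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8          # speed of light, m/s
    hbar = 1.055e-34     # reduced Planck constant, J*s
    M_sun = 1.989e30     # solar mass, kg

    # Schwarzschild radius and horizon area for a solar-mass black hole
    r_s = 2 * G * M_sun / c**2      # about 2.95 km
    area = 4 * math.pi * r_s**2     # horizon area, m^2

    # Planck area l_P^2 = G*hbar/c^3; entropy S/k_B = area / (4 * l_P^2)
    planck_area = G * hbar / c**3
    S_nats = area / (4 * planck_area)
    S_bits = S_nats / math.log(2)

    print(f"Schwarzschild radius: {r_s / 1e3:.2f} km")
    print(f"Horizon entropy: about {S_bits:.2e} bits")   # ~1.5e77 bits

Roughly 10⁷⁷ bits for a single solar mass, which gives a vivid sense of how far any laboratory storage medium sits below the holographic ceiling.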

While the AdS/CFT correspondence by Juan Maldacena stands as the most rigorous and extensively explored realization of the holographic principle, it’s worth noting an earlier, equally rigorous mathematical proof. In 1986, J. David Brown and Marc Henneaux demonstrated with precision that the asymptotic symmetry of 2+1 dimensional gravity (gravity in two spatial dimensions plus one time dimension) naturally gives rise to a Virasoro algebra. The corresponding quantum theory associated with this algebra is, remarkably, a 2-dimensional conformal field theory. [23] This early result provided a foundational, albeit simpler, instance of a gravitational theory being equivalent to a lower-dimensional field theory.
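
Their result is compact enough to quote: the asymptotic symmetries of 2+1-dimensional gravity with a negative cosmological constant form two copies of the Virasoro algebra with central charge

    c \;=\; \frac{3\ell}{2G} ,

where ℓ is the AdS₃ curvature radius and G is the three-dimensional Newton constant.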

Experimental tests

Figure: sensitivity of various experiments to quantum fluctuations in space and time. The horizontal axis (logarithmic) gives the size of the experimental apparatus, or equivalently the duration of the experiment multiplied by the speed of light, in meters; the vertical axis (logarithmic) gives the root-mean-square (rms) fluctuation amplitude in the same units. The bottom-left corner corresponds to the Planck length or time; the observable universe sits roughly 10²⁶ meters up the scale. The “holographic noise” line marks the rms transverse holographic fluctuation amplitude expected at each scale, should the holographic principle manifest as a measurable physical effect.

The Fermilab physicist Craig Hogan has been a vocal proponent of the idea that the holographic principle should, if truly fundamental, induce discernible quantum fluctuations in spatial position. [24] These fluctuations, he claims, would manifest as an apparent background noise, a “holographic noise,” which could theoretically be detected by highly sensitive gravitational wave detectors, specifically citing the GEO 600 interferometer. [25] However, these claims have, rather awkwardly, not garnered widespread acceptance or frequent citation among the broader community of quantum gravity researchers. Indeed, they appear to be in direct contradiction with the rigorous calculations derived from established string theory models. [26] One might charitably say the evidence remains… inconclusive, to put it mildly.

Further analyses conducted in 2011, based on measurements of the gamma ray burst GRB 041219A (observed in 2004 by the INTEGRAL space observatory, which was launched in 2002 by the European Space Agency), indicated a notable absence of Craig Hogan’s predicted noise. This lack of detection extended down to an astonishingly small scale of 10⁻⁴⁸ meters, a far cry from the 10⁻³⁵ meters that Hogan had originally predicted, and even further from the 10⁻¹⁶ meters observed in the operational measurements of the GEO 600 instrument. [27] Despite these rather inconvenient empirical findings, research under Hogan’s direction at Fermilab continued as of 2013, [28] a testament to persistence, if nothing else.

Separately, Jacob Bekenstein himself, ever the innovator, claimed to have devised a method to experimentally test the holographic principle using a relatively accessible tabletop experiment involving photons. [29] The promise of such a test, bringing the grand scales of cosmology down to a lab bench, remains tantalizing.

Celestial holography

In 2020, [30] Andrew Strominger, the distinguished Harvard theoretical physicist renowned for his development of the dS/CFT correspondence (a related holographic duality for de Sitter space), expressed his conviction that an equivalence relation he had introduced in his 2016 lecture series, “Lectures on the Infrared Structure of Gravity and Gauge Theory” [31]—specifically, the concept of the infrared triangle—could be leveraged. This theoretical framework, originally proposed to impose constraints on theories of quantum gravity [32], he believed, could demonstrate that the holography of the universe is not exclusively confined to anti-de Sitter (AdS) space. [30] This provocative idea has since blossomed into a burgeoning new subfield known as celestial holography. Its central ambition is to reformulate quantum field theories (QFTs) and quantum gravity within asymptotically flat spacetimes—the kind of spacetime that describes our actual universe at large distances—in terms of a lower-dimensional conformal field theory (CFT) defined on the celestial sphere at null infinity. [33] Strominger and his collaborators within this subfield are actively pursuing avenues to test these theoretical constructs against real-world physics, specifically advocating for the use of cutting-edge gravitational wave detection instruments such as LIGO or the planned LISA. [34] It seems even the cosmos’s most abstract theories are eventually subjected to empirical scrutiny.

Within the intricate framework of celestial holography, the scattering amplitudes that characterize particle interactions in four-dimensional flat spacetimes are posited to be fundamentally related, through the mathematical operation of Mellin transforms over the external particle energies, to the correlation functions of conformal primary operators within the corresponding celestial CFT. [35] The overarching conjecture driving this research is that this elegant correspondence provides a complete boundary description of the gravitational S-matrix [36] within flat Minkowski spacetime. [37] A truly ambitious claim, promising to unify gravity and quantum mechanics in a new holographic light.
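
Schematically (conventions vary across the literature), the celestial amplitude is obtained by Mellin-transforming each external energy ω_i into a conformal dimension Δ_i, with (z_i, z̄_i) marking the points where the particles pierce the celestial sphere:

    \widetilde{\mathcal{A}}(\Delta_i, z_i, \bar{z}_i) \;=\; \prod_{i} \int_{0}^{\infty} d\omega_i \, \omega_i^{\Delta_i - 1}\, \mathcal{A}(\omega_i, z_i, \bar{z}_i) .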

In a significant institutional development, the Celestial Holography Initiative at the esteemed Perimeter Institute for Theoretical Physics was established in 2021 by Sabrina Pasterski, specifically to propel this nascent subfield forward. [34] Further underscoring its growing recognition and potential, the Simons Foundation entered into a collaborative agreement with this initiative in 2023, with Strominger assuming the role of its director. [38] It appears the universe’s boundary, once thought to be a simple edge, is far more complex and interesting than we ever bothered to assume.

Notes

  • ^ Except in the case of measurements, which the black hole should not be performing.
  • ^ “Complete description” means all the primary qualities. For example, John Locke (and before him Robert Boyle) determined these to be size, shape, motion, number, and solidity. Such secondary quality information as color, aroma, taste and sound, [21] or internal quantum state is not information that is implied to be preserved in the surface fluctuations of the event horizon. (See however “path integral quantization”)