Humans, a species whose self-assigned moniker, Homo sapiens — 'wise man' or 'knowledgeable man' — often feels like an optimistic overstatement, are, at their core, primates. We belong to the rather dramatic biological family of great apes, distinguished by a peculiar combination of traits that, when viewed through a cosmic lens, are both remarkable and utterly predictable. Among these defining characteristics are our relative hairlessness (a curious evolutionary choice, really), our consistent bipedality – the upright strut that seems to demand attention – and, of course, our notoriously high, if not always well-utilized, intelligence.
Our oversized brains, disproportionately large compared to body size, have indeed gifted us with advanced cognitive skills. These aren't just for contemplating the void; they've facilitated our rather successful, if often destructive, adaptation to nearly every conceivable environment. They've driven the development of increasingly sophisticated tools – from chipped stone to digital interfaces – and enabled the construction of fantastically intricate social structures and sprawling civilizations. Whether these structures are truly better is, of course, a matter for endless, tiresome debate.
Humans are, to put it mildly, highly social creatures. Individuals inevitably find themselves entangled in multi-layered networks of distinct social groups. These range from the intimate, often chaotic confines of families and fleeting peer groups to the vast, impersonal machinations of corporations and the often-dysfunctional apparatus of political states. These incessant social interactions have, over millennia, forged an astonishingly diverse tapestry of values, unwritten social norms, complex languages, and deeply ingrained traditions, all bundled together under the rather abstract term of institutions. Each of these, in its own way, acts as a scaffold, bolstering the precarious edifice of human society.
Beyond mere survival and social maneuvering, humans exhibit a profound, almost pathological, curiosity. This relentless desire to comprehend – and, more often than not, to manipulate – phenomena has been the engine behind humanity's relentless pursuit of science, the ceaseless innovation of technology, the abstract meanderings of philosophy, the comforting narratives of mythology, and the often-conflicting tenets of religion. These, along with countless other frameworks of knowledge, represent our attempts to make sense of a universe that frankly doesn't care. Ironically, we also turn this intense scrutiny upon ourselves, through fields such as anthropology, social science, history, psychology, and medicine – as if self-analysis might actually yield a definitive answer to "what are we?" As of 2025, the sheer volume of this self-studying, self-replicating species is staggering: there are estimated to be more than 8 billion living humans crowding this small planet. A truly impressive, if somewhat concerning, number.
Historically, for the vast majority of our existence, humans were nomadic hunter-gatherers, a lifestyle that, one might argue, was far more ecologically sound. The shift to what we optimistically call "behavioral modernity" began to manifest roughly 160,000 to 60,000 years ago. Then came the Neolithic Revolution, a series of independent developments in multiple regions, kick-starting approximately 13,000 years ago in Southwest Asia. This era marked the advent of agriculture and the unsettling permanence of human settlement. This, in turn, inevitably led to the development of civilization – a process that ushered in an unbroken, and indeed still accelerating, period of population growth and astonishingly rapid technological change. Since that pivotal shift, countless civilizations have risen, flourished, and, with predictable regularity, crumbled into dust, each cycle marked by continuous sociocultural and technological advancements that have fundamentally, and often irrevocably, reshaped the human lifestyle.
Humans are, by dietary necessity, omnivorous, capable of processing a truly vast array of plant and animal matter. We have, with remarkable ingenuity, used fire and other forms of heat to prepare and cook our food since the time of Homo erectus, a practice that undoubtedly broadened our dietary horizons. Generally, we are diurnal creatures, typically demanding a solid seven to nine hours of sleep per day – a requirement often ignored by modern society, much to its detriment. Our species has exerted a dramatic, some might say catastrophic, effect on the environment. We are, by virtually any measure, apex predators, rarely finding ourselves in the unenviable position of being prey for other species. However, our relentless population growth, coupled with unchecked industrialisation, expansive land development, rampant overconsumption, and the ceaseless combustion of fossil fuels, has plunged the planet into an era of profound environmental destruction and pervasive pollution. This self-inflicted ecological wound is a significant, undeniable contributor to the ongoing mass extinction of countless other forms of life. Within the last century, our insatiable drive has pushed us to explore the most challenging environments imaginable: the desolate expanse of Antarctica, the crushing depths of the deep sea, and the silent, indifferent void of outer space. While human habitation in these extreme locales is usually fleeting and confined to scientific, military, or industrial expeditions, we have, with a mix of audacity and perhaps a touch of desperation, even visited the Moon and launched human-made spacecraft to other distant celestial bodies, thereby becoming the first known species to reach beyond its planetary cradle.
Although the scientific term "humans" technically encompasses all members, living and extinct, of the genus Homo, in common, less precise usage, it almost exclusively refers to Homo sapiens, our own species, which is, rather conveniently, the only extant member. All other members of the genus Homo, now tragically extinct, are relegated to the category of "archaic humans," while the qualifier "modern human" is often employed to draw a distinction between Homo sapiens and these earlier, less fortunate hominins. Anatomically modern humans first graced the stage at least 300,000 years ago in Africa, evolving from Homo heidelbergensis or a closely related species. From this African crucible, they embarked on a series of migrations out of Africa, gradually, and often ruthlessly, replacing, or indeed interbreeding with, the local populations of archaic humans they encountered. Multiple hypotheses swirl around the extinction of these archaic human species, such as Neanderthals, ranging from direct competition and outright violence to genetic dilution through interbreeding with Homo sapiens, or simply a tragic inability to adapt to a shifting climate. Our human biology displays variation in a multitude of visible characteristics, physiology, susceptibility to disease, mental abilities, body size, and life span, all influenced by the intricate dance between our genes and the environment. Despite this apparent diversity, humans are, genetically speaking, among the least diverse primates. Any two individuals, picked at random, will share at least 99% of their genetic code – a testament to our relatively recent common ancestry.
Humans exhibit clear sexual dimorphism: typically, males possess greater overall body strength, while females generally carry a higher percentage of body fat. At the onset of puberty, both sexes develop distinct secondary sex characteristics. Females are capable of pregnancy, a biological window that generally spans from puberty, usually around 12 years of age, until menopause, which typically occurs around 50. Childbirth is, by any objective measure, a perilous undertaking, fraught with a high risk of complications and even death – a biological design flaw, if you ask me. Interestingly, in human societies, both the mother and the father often share the burden of caring for their offspring, who are notoriously helpless at birth and require prolonged nurturing.
Etymology and definition
One might wonder how this peculiar species came to be labeled. The renowned taxonomist Carl Linnaeus, in the tenth edition (1758) of his magnum opus, Systema Naturae, bestowed upon us the rather grandiose name Homo sapiens. This generic name, Homo, is a learned 18th-century derivation from the Latin homō, a term that, rather inclusively, referred to humans of either sex. And, as a matter of pedantic clarification, the word "human" can indeed refer to all members of the entire Homo genus, not just our self-congratulatory species. The name Homo sapiens itself translates, with a touch of hubris, to 'wise man' or 'knowledgeable man'. There remains a perpetual, and somewhat tiresome, disagreement among specialists as to whether certain extinct members of the genus, notably the Neanderthals, should be classified as a distinct species of "humans" or merely as a subspecies of H. sapiens. It's a debate that, frankly, reveals more about human squabbles over categorization than it does about the Neanderthals themselves.
The term "human" is a loanword that infiltrated Middle English from the Old French "humain," which itself ultimately stemmed from the Latin hūmānus. This Latin root was the adjectival form of homō, meaning 'man' in the broader sense of humanity. Interestingly, the native English term "man" also carried this dual meaning, referring to the species generally (as a synonym for "humanity") as well as, more specifically, to human males. It could, in older usage, even refer to individuals of either sex, adding another layer of linguistic ambiguity to our self-referential terms.
Despite the common, and frankly irritating, colloquial habit of using "animal" as an antonym for "human," a practice that flies in the face of basic biological understanding, and contrary to a prevalent biological misconception, humans are, unequivocally, animals. We are, after all, multicellular, heterotrophic eukaryotes. Furthermore, while the word "person" is often used interchangeably with "human," this too opens a philosophical Pandora's Box. Extensive debate persists regarding whether personhood should be universally applied to all humans, or perhaps extended to all sentient beings. Even more complex are the discussions surrounding whether a human can, under certain circumstances (such as entering a persistent vegetative state), lose their personhood, and, perhaps most controversially, what precisely constitutes the beginning of human personhood. Such discussions highlight our species' peculiar obsession with defining its own existence, even when the definitions are inherently slippery.
Evolution
Our lineage, a branch on the vast tree of life, firmly places humans among the apes (the superfamily Hominoidea). The evolutionary path that eventually culminated in us, Homo sapiens, first diverged from gibbons (the Hylobatidae family), then from orangutans (genus Pongo), followed by gorillas (genus Gorilla), and finally, from our closest living relatives, chimpanzees and bonobos (genus Pan). This final, most significant split, separating the human lineage from that of chimpanzees and bonobos, occurred roughly 8 to 4 million years ago, deep within the late Miocene epoch. A particularly notable event during this divergence was the fusion of two ancestral chromosomes, resulting in humans possessing only 23 pairs of chromosomes, a distinct difference from the 24 pairs found in other apes. Following this crucial split from chimpanzees and bonobos, the early hominins underwent a period of significant diversification, branching into numerous species and at least two distinct genera. Of this rich tapestry of early human-like forms, every lineage but one – the genus Homo, whose sole surviving species is Homo sapiens – has since vanished into the fossil record.
The genus Homo, to which we belong, ultimately evolved from Australopithecus. While the fossil record detailing this precise transition remains somewhat scarce, the earliest documented members of Homo undeniably exhibit several key anatomical traits shared with their Australopithecus ancestors. Due to this frustratingly scant evidence, a definitive consensus on the exact timing of the divergence into the genus Homo continues to elude researchers. Some studies, employing sophisticated molecular clock techniques, estimate the Homo genus made its appearance between 4.30 and 2.56 million years ago. However, other analyses, arguing that certain early Homo species might be incorrectly categorized, propose a more recent estimate, placing the genus's origin at approximately 1.87 million years ago. Such academic disagreements are, of course, entirely typical.
The earliest undisputed fossil evidence attributed to Homo is the 2.8 million-year-old specimen known as LD 350-1, unearthed in Ethiopia. The earliest formally named species within this genus are Homo habilis and Homo rudolfensis, both of which are believed to have evolved by 2.3 million years ago. A truly pivotal figure in our family tree, H. erectus (whose African variant is sometimes designated H. ergaster), emerged around 2 million years ago. This species holds the distinction of being the first archaic human to venture beyond the African continent, successfully dispersing across the vast expanse of Eurasia. H. erectus also pioneered the development of the characteristically human body plan, setting the stage for future evolutionary refinements. Our direct ancestors, Homo sapiens, made their debut in Africa at least 300,000 years ago, evolving from a species frequently identified as either H. heidelbergensis or H. rhodesiensis – the descendants of H. erectus who remained within Africa. From their African homeland, H. sapiens embarked on their grand migration, systematically colonizing the globe, either gradually replacing or, more intriguingly, interbreeding with the existing local populations of archaic humans. This period of "behavioral modernity," marked by complex symbolic thought and innovative tool-making, became evident approximately 160,000 to 70,000 years ago, with some evidence suggesting it may have begun even earlier. This crucial developmental leap was likely a product of intense selective pressures, honed amidst the dramatic natural climate change events that characterized the Middle to Late Pleistocene epochs in Africa.
The monumental "out of Africa" migration unfolded in at least two distinct waves. The first, a tentative probing, occurred around 130,000 to 100,000 years ago, while the second, more substantial dispersal, often referred to as the Southern Dispersal, commenced approximately 70,000 to 50,000 years ago. From there, H. sapiens proceeded with characteristic human thoroughness to colonize every continent and most major islands. They arrived in Eurasia as early as 125,000 years ago, reached Australia around 65,000 years ago, crossed into the Americas roughly 15,000 years ago, and, in a testament to their seafaring prowess, colonized remote islands such as Hawaii, Easter Island, Madagascar, and New Zealand much more recently, between 300 and 1280 CE.
It's crucial to understand that human evolution was not a neat, linear progression, nor a simple branching tree. Instead, it was a far messier affair, involving significant interbreeding between related species. Indeed, extensive genomic research has revealed that hybridization between substantially diverged lineages was a surprisingly common occurrence throughout human evolution. DNA evidence, rather inconveniently for those who prefer clean narratives, strongly suggests that several genes of Neanderthal origin are present within virtually all non-sub-Saharan African populations. Furthermore, Neanderthals and other archaic hominins, such as Denisovans, may have contributed a measurable portion – up to 6% – of their genome to present-day non-sub-Saharan African humans. This means many of us carry echoes of these "extinct" relatives within our very cells.
Human evolution is marked by a series of profound morphological, developmental, physiological, and behavioral transformations that have unfolded since the divergence from our last common ancestor with chimpanzees. Among the most significant of these adaptations are our peculiar hairlessness, our commitment to obligate bipedalism (standing upright, for all the world to see), a remarkable increase in brain size, and a notable decrease in sexual dimorphism compared to our ape cousins, alongside neoteny (the retention of juvenile traits into adulthood). The precise interplay and causal relationships between all these changes remain, as ever, the subject of ongoing and often heated scientific debate.
graph TD
    Hominoidea --> Hylobatidae["Hylobatidae (gibbons)"]
    Hominoidea --> Hominidae["Hominidae (hominids, great apes)"]
    Hominidae --> Ponginae
    Ponginae --> Pongo["Pongo (orangutans)"]
    Pongo --> Pongo_abelii["Pongo abelii"]
    Pongo --> Pongo_tapanuliensis["Pongo tapanuliensis"]
    Pongo --> Pongo_pygmaeus["Pongo pygmaeus"]
    Hominidae --> Homininae
    Homininae --> Gorillini
    Gorillini --> Gorilla["Gorilla (gorillas)"]
    Gorilla --> Gorilla_gorilla["Gorilla gorilla"]
    Gorilla --> Gorilla_beringei["Gorilla beringei"]
    Homininae --> Hominini
    Hominini --> Panina
    Panina --> Pan["Pan (chimpanzees)"]
    Pan --> Pan_troglodytes["Pan troglodytes"]
    Pan --> Pan_paniscus["Pan paniscus"]
    Hominini --> Hominina
    Hominina --> Homo_sapiens["Homo sapiens (humans)"]
History
Ah, human history. A long, convoluted narrative, mostly of ourselves.
Prehistory
For an astonishingly long stretch of time – until approximately 12,000 years ago – all humans lived as hunter-gatherers. It was a lifestyle dictated by immediate needs, a constant dance with nature. Then, the Neolithic Revolution, a transformative period marked by the invention of agriculture, began to unfold. This monumental shift first took place in Southwest Asia, gradually spreading across vast sections of the Old World over many subsequent millennia. Not to be outdone, agriculture also emerged independently in other distant locales, including Mesoamerica (around 6,000 years ago), various regions of China, Papua New Guinea, and the Sahel and West Savanna areas of Africa.
The consequences of this agricultural revolution were profound and irreversible. The establishment of permanent human settlements, the strategic domestication of animals, and the innovative use of metal tools collectively ushered in an unprecedented era of consistent food surplus – a luxury unknown for the entirety of prior human existence. This newfound abundance and sedentary lifestyle inevitably paved the way for the emergence of what we now refer to as early civilizations. And so, the path to complex, stratified societies was set.
Ancient
The 4th millennium BCE witnessed a pivotal "urban revolution" with the rise of complex city-states, particularly the sophisticated Sumerian cities nestled within Mesopotamia. It was within the bustling confines of these proto-cities that the earliest known form of writing, the intricate cuneiform script, first emerged around 3000 BCE – a testament to the growing complexity of human record-keeping. Concurrent with these developments, other major civilizations began to flourish, notably Ancient Egypt along the Nile and the enigmatic Indus Valley Civilisation. These burgeoning societies eventually engaged in intricate trade networks, exchanging goods and, more importantly, ideas. They innovated technologies that would fundamentally reshape human existence: the wheel, the plow, and the sail. Across the oceans, emerging by 3000 BCE, the Caral–Supe civilization established itself as the oldest complex civilization in the Americas, proving that ingenuity was not confined to a single continent. During this era, humans also made significant strides in astronomy and mathematics, culminating in monumental feats of engineering such as the construction of the Great Pyramid of Giza. However, even these early achievements were not immune to the whims of nature; evidence points to a severe drought lasting approximately a century, which may have contributed to the decline of several of these foundational civilizations, only for new ones to rise in their wake. The Babylonians eventually asserted dominance over Mesopotamia, while other distinct cultures, such as the Poverty Point culture in North America, the Minoans in the Aegean, and the Shang dynasty in China, ascended to prominence in new geographical spheres. The Late Bronze Age collapse, a period of widespread societal upheaval around 1200 BCE, led to the abrupt disappearance of numerous civilizations and marked the beginning of the Greek Dark Ages. During this turbulent transition, iron, a harder and more abundant metal, gradually superseded bronze, heralding the advent of the Iron Age.
By the 5th century BCE, the systematic recording of history began to emerge as a recognized discipline, offering a significantly clearer, if still biased, window into the daily lives and political machinations of the time. Between the 8th and 6th centuries BCE, Europe entered the profound era of classical antiquity, a period when the intellectual and cultural achievements of ancient Greece and ancient Rome reached their zenith, laying many of the foundations for later Western thought. Simultaneously, other powerful civilizations were rising across the globe. The Maya civilization in Mesoamerica embarked on ambitious city-building projects and developed incredibly complex calendars, demonstrating a sophisticated understanding of celestial cycles. In Africa, the formidable Kingdom of Aksum supplanted the waning Kingdom of Kush, becoming a crucial nexus for trade between India and the Mediterranean world. In West Asia, the Achaemenid Empire's pioneering system of centralized governance served as a blueprint for countless subsequent empires. Meanwhile, the Gupta Empire in India and the Han dynasty in China are widely regarded as golden ages within their respective regions, periods of unparalleled advancement and cultural flourishing.
Post-classical
Following the rather dramatic fall of the Western Roman Empire in 476 CE – a collapse that, in hindsight, was perhaps inevitable – Europe plunged into the era known as the Middle Ages. During this period, Christianity and the omnipresent Church became the undisputed founts of authority and education, shaping nearly every aspect of life. Meanwhile, in the Middle East, Islam rose to become the dominant religion, rapidly expanding its influence across North Africa and beyond. This expansion ushered in an extraordinary Islamic Golden Age, a period of intellectual and artistic brilliance that inspired remarkable achievements in architecture, saw the revival and reinterpretation of ancient scientific and technological advances, and cultivated a distinct, sophisticated way of life that, frankly, put much of contemporary Europe to shame. Predictably, the Christian and Islamic worlds would eventually collide, most notably in a series of brutal holy wars – with the Kingdom of England, the Kingdom of France, and the Holy Roman Empire leading crusades in an attempt to seize control of the Holy Land from Muslims.
Across the Atlantic, in the Americas, the period between 200 and 900 CE marked the Classic Period in Mesoamerica, a time of flourishing urban centers and cultural sophistication. Further north, complex Mississippian societies began to emerge around 800 CE, showcasing intricate social and ceremonial structures. On the other side of the world, the formidable Mongol Empire, under its relentless leaders, swept across and conquered much of Eurasia throughout the 13th and 14th centuries, leaving an indelible mark on global history. During this same tumultuous period, the Mali Empire in Africa expanded to become the continent's largest, stretching from Senegambia to the Ivory Coast, a testament to its organizational and military prowess. In Oceania, the Tuʻi Tonga Empire rose to power, extending its influence across numerous islands in the South Pacific. By the late 15th century, the Aztecs had established themselves as the dominant force in Mesoamerica, while the Inca reigned supreme in the Andes, each creating vast, intricate empires just before the arrival of outside forces.
Modern
The early modern period in Europe and the Near East (roughly 1450–1800) commenced with the final defeat of the Byzantine Empire and the subsequent, inevitable rise of the Ottoman Empire, which reshaped geopolitical power in the region. Meanwhile, Japan entered its culturally distinct Edo period, the powerful Qing dynasty ascended to rule in China, and the magnificent Mughal Empire held sway over much of India. Europe, following its "dark ages," experienced the vibrant intellectual and artistic explosion known as the Renaissance, beginning in the 15th century. This period soon transitioned into the Age of Discovery, an era defined by relentless exploration and the subsequent, often brutal, colonizing of newly encountered regions. This expansion infamously included the colonization of the Americas, leading to the vast ecological and cultural exchange known as the Columbian Exchange. However, this era also brought with it the horrific Atlantic slave trade and the tragic genocide of the Americas' indigenous peoples, dark stains on the ledger of human ambition. This period also heralded the Scientific Revolution, a time of unprecedented intellectual ferment and groundbreaking advances in mathematics, mechanics, astronomy, and physiology, fundamentally altering humanity's understanding of the natural world.
The late modern period (1800–present) then saw the profound and accelerating impact of the Industrial and Technological Revolution. This era brought forth a torrent of discoveries, from sophisticated imaging technology to revolutionary innovations in transport and energy development, irrevocably altering the fabric of daily life. Influenced by the ideals of the Enlightenment, the Americas and Europe were convulsed by a period of profound political upheaval, collectively known as the Age of Revolution. The Napoleonic Wars tore through Europe in the early 1800s, leaving a trail of destruction and redrawing national borders. Spain, once a dominant colonial power, lost most of its vast holdings in the New World, while European powers continued their relentless expansion into Africa – where their control skyrocketed from a mere 10% to almost 90% in less than 50 years – and into Oceania. By the 19th century, the British Empire had swelled to become the world's largest empire, its influence spanning the globe.
However, this tenuous balance of power among European nations proved inherently unstable, collapsing spectacularly in 1914 with the catastrophic outbreak of the First World War, a conflict that quickly became one of the deadliest in recorded history. The 1930s then ushered in a worldwide economic crisis, a period of profound instability that, predictably, facilitated the rise of ruthless authoritarian regimes and culminated in the even more devastating Second World War, a global conflagration involving almost all of the world's countries. The sheer scale of destruction wrought by this war led to the inevitable collapse of most global empires, sparking widespread decolonization movements across the world.
Following the formal conclusion of the Second World War in 1945, a new geopolitical order emerged, dominated by two colossal powers: the United States and the Soviet Union, who quickly established themselves as the remaining global superpowers. This uneasy bipolarity immediately ignited the Cold War, a protracted struggle for global influence that played out through proxy conflicts, ideological battles, a terrifying nuclear arms race, and even a competitive space race, before finally concluding with the collapse of the Soviet Union. The current era, often dubbed the Information Age, is characterized by the explosive development of the Internet and increasingly sophisticated artificial intelligence systems. It sees the world becoming ever more profoundly globalized and intricately interconnected, for better or worse.
Habitat and population
Early human settlements, quite logically, were entirely dependent on proximity to reliable water resources and – depending on the prevailing lifestyle – other essential natural resources vital for subsistence. This included, for instance, robust populations of animal prey for hunting and fertile arable land suitable for cultivating crops and grazing livestock. Modern humans, however, have developed an almost unparalleled capacity for fundamentally altering their habitats through the relentless application of technology, elaborate irrigation systems, ambitious urban planning, massive construction projects, widespread deforestation, and the often-unintended consequence of desertification. Despite our supposed mastery over nature, human settlements remain stubbornly vulnerable to the destructive forces of natural disasters, particularly those unwisely situated in hazardous locations or constructed with a lamentable lack of quality. The pervasive human tendency for grouping and deliberate habitat alteration is almost always driven by a set of predictable goals: the desire for enhanced protection, the accumulation of comforts or material wealth, the expansion of available food sources, the pursuit of aesthetics, the relentless increase of knowledge, or the optimization of resource exchange.
Humans, despite possessing a surprisingly low or narrow biological tolerance for many of Earth's truly extreme environments, are nonetheless one of the most adaptable species. Currently, our species maintains a presence in all eight recognized biogeographical realms. However, our foothold in the Antarctic realm remains severely limited, primarily confined to isolated research stations, with a noticeable annual population decline during the harsh winter months. Elsewhere, humans have, with characteristic fervor, established nation-states across the other seven realms, including diverse countries such as South Africa, India, Russia, Australia, Fiji, the United States, and Brazil, each nestled within a distinct biogeographical domain.
Within the relatively brief span of the last century, humans have pushed the boundaries of exploration further, venturing into the abyssal deep sea and the boundless expanse of outer space. Sustained human habitation within these profoundly hostile environments is both prohibitively restrictive and astronomically expensive, typically limited in duration and confined to highly specialized scientific, military, or industrial expeditions. Yet, with a mix of scientific ambition and raw determination, humans have visited the Moon itself and, through the deployment of human-made robotic spacecraft, have made our presence known on other distant celestial bodies. Since the year 2000, humanity has maintained a continuous presence in space through the ongoing habitation of the International Space Station – a testament to our stubborn refusal to stay put.
Through the ingenious application of advanced tools and sophisticated clothing, humans have managed to dramatically extend their physiological tolerance to an astonishingly wide spectrum of temperatures, humidities, and altitudes. As a direct consequence, humans are now a truly cosmopolitan species, found in virtually every region of the world. We inhabit everything from dense tropical rainforests to parched arid deserts, from the brutally cold arctic regions to the densely packed, often heavily polluted urban sprawl of our cities. This stands in stark contrast to most other species, whose geographical distribution is severely restricted by their limited adaptability. The human population, however, is far from uniformly distributed across the Earth's surface. Population density varies wildly from one region to another, with vast stretches of the planet remaining almost completely uninhabited, such as the frozen expanse of Antarctica and the immense, silent depths of the ocean. The majority of humans (a staggering 61%) reside in Asia; the remainder are distributed across the Americas (14%), Africa (14%), Europe (11%), and Oceania (a mere 0.5%).
A rather sobering statistic: humans and their extensively domesticated animals now account for a staggering 96% of all mammalian biomass on Earth. All wild mammals, by comparison, represent a paltry 4%. This stark imbalance speaks volumes about our species' dominance – and its ecological footprint.
Estimates regarding the global human population at the dawn of agriculture, around 10,000 BCE, vary widely, ranging from a modest 1 million to a more substantial 15 million individuals. By the 4th century CE, it's believed that approximately 50–60 million people inhabited the combined eastern and western Roman Empire. However, the course of human population history has been anything but smooth. Devastating bubonic plagues, first documented in the 6th century CE, dramatically reduced populations, with the infamous Black Death alone claiming the lives of an estimated 75–200 million people across Eurasia and North Africa. Despite such catastrophic setbacks, the human population eventually rebounded and accelerated. It is believed to have reached one billion around 1800, then embarked on an exponential surge, hitting two billion by 1930, three billion by 1960, four by 1975, five by 1987, and six billion by 1999. It surpassed seven billion in 2011 and, with relentless momentum, passed eight billion in November 2022. To put this into perspective, it took over two million years of human prehistory and history for the global population to reach one billion, yet only a mere 207 years more to multiply to seven billion. The total carbon biomass of all humans on Earth in 2018 was estimated at 60 million tons, approximately 10 times greater than that of all non-domesticated mammals combined.
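As a quick arithmetic check on those milestones, the intervals between them can be tallied directly. The short Python sketch below assumes 1804 as the commonly cited year for the one-billion mark (the figure behind the "207 years" claim); the later years are taken from the paragraph above.

# Back-of-the-envelope check on the population milestones cited above.
# 1804 is an assumed, commonly cited year for the one-billion mark;
# the remaining years come from the text.
milestones = {
    1: 1804, 2: 1930, 3: 1960, 4: 1975,
    5: 1987, 6: 1999, 7: 2011, 8: 2022,
}
previous_year = None
for billions, year in milestones.items():
    gap = "" if previous_year is None else f" (+{year - previous_year} years)"
    print(f"{billions} billion reached around {year}{gap}")
    previous_year = year
# From one billion to seven billion: 2011 - 1804 = 207 years.
print("One to seven billion took about", milestones[7] - milestones[1], "years")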
In 2018, a significant demographic milestone was reached: 4.2 billion humans (55% of the global total) resided in urban areas, a dramatic increase from just 751 million in 1950. The most heavily urbanized regions include Northern America (82%), Latin America (81%), Europe (74%), and Oceania (68%). Conversely, Africa and Asia continue to host nearly 90% of the world's 3.4 billion rural population. While cities offer undeniable advantages, they also present a myriad of problems for their human inhabitants, including various forms of pollution and pervasive crime, particularly concentrated in inner-city and suburban slums. A testament, perhaps, to our inability to perfectly manage our own creations.
Biology
Let's delve into the mechanics of this species, shall we?
Anatomy and physiology
Most aspects of human physiology are, rather unsurprisingly, closely homologous to corresponding aspects of animal physiology. Our dental formula is 2.1.2.3/2.1.2.3 (two incisors, one canine, two premolars, and three molars in each quadrant of the upper and lower jaws), a pattern shared with other catarrhines. However, humans possess proportionately shorter palates and significantly smaller teeth than other primates. We are, in fact, the only primates to exhibit short, relatively flush canine teeth, a feature that, if you think about it, somewhat undermines our claim to being fearsome predators. Our teeth also have a characteristic tendency to be crowded, with gaps from lost teeth usually closing up rather quickly in younger individuals. A curious evolutionary trend sees humans gradually losing their third molars (wisdom teeth), with some individuals even being born congenitally without them – perhaps a sign of future dental simplicity, or just another minor inconvenience.
Humans, like chimpanzees, still carry the echo of a vestigial tail – a coccyx, hidden from view, a silent reminder of our distant past. We also share an appendix (another delightful vestige), flexible shoulder joints, grasping fingers, and, crucially, opposable thumbs – the latter being a key ingredient in our tool-making prowess. Our chest, in contrast to the funnel shape common in other apes, is more barrel-shaped, an adaptation directly linked to our bipedal respiration. Beyond the obvious distinctions of bipedalism and brain size, humans primarily differ from chimpanzees in their senses of smelling and hearing, and in their methods of digesting proteins. While we possess a density of hair follicles comparable to other apes, the vast majority of it is vellus hair – so short and wispy as to be virtually invisible. This near-nakedness is compensated by an abundance of approximately 2 million sweat glands distributed across our entire bodies, a stark contrast to chimpanzees, whose sweat glands are sparse and mainly concentrated on their palms and soles. This adaptation, it seems, helps us cool off after a particularly strenuous intellectual endeavor, or perhaps just a brisk walk.
It's estimated that the global average height for an adult human male hovers around 171 cm (5 ft 7 in), while adult females average about 159 cm (5 ft 3 in). A rather predictable shrinkage of stature may commence in middle age for some, but it tends to become a typical, if somewhat disheartening, characteristic in the extremely aged. Throughout recorded history, human populations have, on a broad scale, consistently increased in height, a phenomenon likely attributable to improved nutrition, better healthcare, and generally enhanced living conditions. The average mass of an adult human is approximately 59 kg (130 lb) for females and 77 kg (170 lb) for males. Like many other physiological conditions, body weight and body type are influenced by a complex interplay of genetic susceptibility and environmental factors, leading to considerable variation among individuals.
Humans possess a significantly faster and more accurate throw than virtually any other animal – a skill that was undoubtedly crucial for hunting and, let's be honest, for throwing things at each other. We are also among the most accomplished long-distance runners in the animal kingdom, though, ironically, we are often outpaced over short bursts. Our thinner body hair and more efficient sweat glands are key physiological adaptations that help us avoid heat exhaustion during prolonged periods of running. Compared to other apes, the human heart exhibits a greater stroke volume and cardiac output, and our aorta is proportionately larger – perhaps a necessary upgrade for powering our large, demanding brains.
Genetics
Humans are, much like the majority of animals, plants, and fungi, a eukaryotic species, and, like most animals, we are diploid. Each somatic cell within our bodies contains two complete sets of 23 chromosomes, with each set meticulously inherited from one parent. Our gametes, however, possess only a single set of chromosomes, a carefully shuffled mixture of the two parental sets. Among these 23 pairs of chromosomes, 22 are autosomes, and one crucial pair consists of the sex chromosomes. Following the typical mammalian XY sex-determination system, females possess two X chromosomes (XX), while males carry one X and one Y chromosome (XY). The complex interplay of genes and the environment profoundly influences human biological variation, manifesting in everything from visible characteristics and physiology to disease susceptibility and mental abilities. The precise extent of influence exerted by genes and environment on specific traits remains, as ever, a subject of ongoing scientific inquiry.
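To make the chromosome bookkeeping concrete, here is a minimal illustrative sketch, not drawn from any source cited here, of how the XY system plays out: each gamete carries one chromosome from every pair, the egg always contributes an X, and the sperm's X or Y settles the offspring's sex.

import random

# Toy model of the XY sex-determination system described above.
# Eggs always carry an X; sperm carry an X or a Y with roughly equal odds.
def gamete_sex_chromosome(parent_pair):
    """Pick one chromosome of the parent's sex-chromosome pair at random."""
    return random.choice(parent_pair)

def conceive():
    egg = gamete_sex_chromosome(("X", "X"))    # mother is XX
    sperm = gamete_sex_chromosome(("X", "Y"))  # father is XY
    offspring_pair = egg + sperm               # the offspring's 23rd pair
    return "female (XX)" if offspring_pair == "XX" else "male (XY)"

# Over many simulated conceptions, the split comes out close to 50/50.
sample = [conceive() for _ in range(10_000)]
print(sample.count("male (XY)") / len(sample))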
While it is a well-established fact that no two humans – not even monozygotic twins – are truly genetically identical, the average genetic similarity between any two randomly selected humans ranges from 99.5% to 99.9%. This makes our species remarkably homogeneous when compared to other great apes, including our chimpanzee cousins. This surprisingly small degree of genetic variation in human DNA, relative to many other species, strongly suggests a significant population bottleneck occurred during the Late Pleistocene epoch (approximately 100,000 years ago), a period when the entire human population was drastically reduced to a relatively small number of breeding pairs. Despite this ancient bottleneck, the relentless forces of natural selection have continued to exert their influence on human populations, with clear evidence indicating that certain regions of the genome have undergone directional selection within the last 15,000 years.
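To translate those percentages into absolute terms, a rough conversion helps; the sketch below assumes a haploid genome length of about 3.1 billion base pairs, a figure not stated in the text.

# Convert the quoted 99.5%-99.9% similarity into approximate counts of
# differing positions, assuming a ~3.1 billion base-pair haploid genome.
GENOME_BP = 3_100_000_000  # assumed genome length

for similarity in (0.995, 0.999):
    differing_bp = GENOME_BP * (1 - similarity)
    print(f"{similarity:.1%} identical -> roughly {differing_bp / 1e6:.1f} million differing base pairs")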
The human genome was first painstakingly sequenced, in draft form, in 2001, a monumental scientific achievement. By 2020, hundreds of thousands of individual human genomes had been sequenced, painting an increasingly detailed picture of our genetic landscape. In 2012, the ambitious International HapMap Project had already compared the genomes of 1,184 individuals drawn from 11 diverse populations, identifying a staggering 1.6 million single nucleotide polymorphisms (SNPs). African populations, as expected given our species' origin, harbor the highest number of unique genetic variants. While many common variants found in non-African populations are also present on the African continent, a significant number of private variants are found exclusively in other regions, particularly Oceania and the Americas. As of 2010, estimates suggest that humans possess approximately 22,000 genes. By analyzing mitochondrial DNA, which is inherited exclusively from the mother, geneticists have concluded that the last female common ancestor whose genetic marker is present in all modern humans, a figure famously dubbed mitochondrial Eve, must have lived roughly 90,000 to 200,000 years ago.
Life cycle
The primary mode of human reproduction involves internal fertilization through sexual intercourse, though modern advances in assisted reproductive technology offer alternative pathways. The average gestation period for a human pregnancy is approximately 38 weeks, although a normal, healthy pregnancy can naturally vary by as much as 37 days. Embryonic development in humans encompasses the first eight weeks; from the beginning of the ninth week until birth, the developing organism is referred to as a fetus. When medically necessary, labor can be induced early or a caesarean section performed. In developed countries, infants typically weigh between 3–4 kg (7–9 lb) and measure 47–53 cm (19–21 in) in length at birth. However, low birth weight remains a prevalent and concerning issue in developing countries, tragically contributing to the high rates of infant mortality in these regions.
Compared with other species, human childbirth is notoriously dangerous, carrying a significantly higher risk of complications and even death. The size of the fetal head is, rather inconveniently, much more closely matched to the dimensions of the pelvis than in other primates. The precise evolutionary reasons for this tight fit are not yet fully understood, but it undeniably contributes to a painful and often protracted labor that can easily extend for 24 hours or even longer. Thankfully, the chances of a successful labor increased dramatically during the 20th century in wealthier nations, largely due to the advent of new medical technologies and interventions. In stark contrast, pregnancy and natural childbirth continue to be hazardous ordeals in many developing regions of the world, where maternal death rates are tragically approximately 100 times greater than in developed countries.
Unusually among primates, both the mother and the father often actively participate in the care of human offspring, a stark contrast to most other primate species where parental care is predominantly, if not exclusively, provided by the mother. Utterly helpless at birth, humans undergo an extended period of growth and development, typically reaching sexual maturity between 15 and 17 years of age. The human life span is broadly divided into various stages, typically ranging from three to twelve distinct phases. Common classifications include infancy, childhood, adolescence, adulthood, and old age. The precise duration of these stages has fluctuated across different cultures and historical periods, but human development as a whole is distinguished by an unusually rapid growth spurt during adolescence. Human females undergo menopause, a biological transition that renders them infertile around the age of 50. It has been proposed that menopause, rather than being a reproductive endpoint, actually enhances a woman's overall reproductive success by allowing her to redirect more time and resources towards her existing offspring and, crucially, their children – a concept known as the grandmother hypothesis.
The ultimate life span of an individual human is determined by two primary factors: their genetic endowment and the cumulative impact of their lifestyle choices. For a multitude of reasons, including underlying biological and genetic predispositions, women, on average, tend to live approximately four years longer than men. As of 2018, the global average life expectancy at birth for a girl was estimated to be 74.9 years, compared to 70.4 years for a boy. However, significant geographical disparities in human life expectancy persist, largely correlating with levels of economic development. For instance, life expectancy at birth in Hong Kong stands at an impressive 87.6 years for girls and 81.8 years for boys, while in the Central African Republic, these figures tragically plummet to 55.0 years for girls and a mere 50.6 years for boys. The developed world is currently experiencing a demographic shift, characterized by an aging population, with the median age hovering around 40 years. In contrast, the developing world maintains a significantly younger median age, typically between 15 and 20 years. This disparity is stark: while one in five Europeans is 60 years of age or older, only one in twenty Africans falls into this age bracket. In 2012, the United Nations estimated that there were approximately 316,600 living centenarians (humans aged 100 or older) worldwide – a testament to both medical advancements and sheer stubbornness.
(Image gallery: human life stages, from infant and child through adolescent and adult to elderly, shown for both males and females.)
Diet
Humans, those famously adaptable creatures, are fundamentally omnivorous, capable of consuming and deriving sustenance from a remarkably diverse array of both plant and animal material. Indeed, different human groups across the globe have adopted a wide spectrum of dietary patterns, ranging from strictly vegan to predominantly carnivorous. While certain extreme dietary restrictions can, in some cases, lead to unfortunate deficiency diseases, stable human populations have, through a combination of subtle genetic specialization and ingrained cultural conventions, successfully adapted to numerous dietary patterns by strategically utilizing nutritionally balanced food sources. The human diet, it's worth noting, is deeply ingrained in human culture, having given rise to the entire field of food science – a testament to our ongoing obsession with what we put into our mouths.
Before the revolutionary advent of agriculture, Homo sapiens relied exclusively on a hunter-gatherer method for acquiring food. This involved a pragmatic combination of collecting stationary food sources such as fruits, grains, tubers, mushrooms, insect larvae, and aquatic mollusks, alongside the pursuit and capture of wild game. It has been widely proposed that humans have, with increasing sophistication, utilized fire to prepare and cook food since the time of Homo erectus, a practice that undoubtedly detoxified, tenderized, and diversified our ancestral diet. The deliberate domestication of wild plants by humans began approximately 11,700 years ago, a gradual, transformative process known as the Neolithic Revolution, which fundamentally reshaped our relationship with food. These profound dietary changes likely brought about corresponding alterations in human biology; for instance, the widespread adoption of dairy farming introduced a novel and rich food source, leading to the evolutionary development of the ability to digest lactose into adulthood in certain populations. The specific types of food consumed, and the methods by which they are prepared, have, predictably, varied enormously across different times, geographical locations, and cultures.
In general, humans possess a remarkable, if somewhat gruesome, capacity to survive for up to eight weeks without food, a duration largely dependent on their stored body fat reserves. Survival without water, however, is far more precarious, typically limited to a mere three or four days, with an absolute maximum of one week. The tragic irony of human existence is starkly illustrated by the fact that, as of 2020, an estimated 9 million humans were dying each year from causes directly or indirectly linked to starvation. Childhood malnutrition remains distressingly common, contributing significantly to the global burden of disease. Yet, the distribution of food globally is anything but equitable. While some suffer, obesity among other human populations has surged rapidly, leading to a host of health complications and increased mortality rates in many developed and even some developing countries. Worldwide, over one billion individuals are now classified as obese, and in the United States, a staggering 35% of the population is obese, leading to this being accurately described as an "obesity epidemic." Obesity is fundamentally caused by consuming more calories than are expended, meaning excessive weight gain is typically a direct result of an energy-dense diet combined with insufficient physical activity.
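The "calories in versus calories out" point can be made concrete with a rough energy-balance calculation. The sketch below leans on the common rule of thumb that a kilogram of body fat stores roughly 7,700 kcal; that conversion factor is an assumption for illustration, not a figure from the text.

# Rough energy-balance arithmetic: a sustained daily calorie surplus,
# converted to body fat with the ~7,700 kcal/kg rule of thumb (an
# approximation used for illustration, not a physiological constant).
KCAL_PER_KG_FAT = 7_700

def yearly_weight_change_kg(daily_surplus_kcal):
    """Approximate weight gained (or lost, if negative) over one year."""
    return daily_surplus_kcal * 365 / KCAL_PER_KG_FAT

for surplus in (100, 250, 500):
    print(f"{surplus:+} kcal/day -> about {yearly_weight_change_kg(surplus):.1f} kg per year")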
Food consumption, of course, is merely the initial stage of the complex digestive process, which ultimately culminates in humans expelling feces, with a frequency that can range from multiple times per day to several times per week – a rather unglamorous, yet vital, biological function.
Biological variation
Despite our species' overall genetic homogeneity, there is, undeniably, considerable biological variation within the human species. This manifests in a multitude of traits, including blood type, susceptibility to genetic diseases, subtle differences in cranial features and facial features, variations in organ systems, a spectrum of eye color and hair color and texture, differences in height and build, and, most visibly, the wide range of skin color observed across the globe. The typical height of an adult human generally falls between 1.4 and 1.9 meters (4 ft 7 in and 6 ft 3 in), though this measurement can fluctuate significantly based on individual sex, ethnic origin, and family bloodlines. Body size itself is a complex trait, partly determined by one's genetic blueprint but also substantially influenced by environmental factors such as diet, regular exercise, and consistent sleep patterns.
Human hair color, a rather superficial yet fascinating trait, ranges from fiery red to sunny blond, through various shades of brown, to deep black, the latter being the most frequent. Hair color is fundamentally dependent on the amount of melanin pigment present, with concentrations naturally diminishing with age, leading to the eventual appearance of grey or even white hair. Skin color, perhaps the most striking visual variation, spans from the darkest brown to the lightest peach, or even nearly white or colorless in cases of albinism. This variation tends to follow a clinal pattern, generally correlating with the intensity of ultraviolet radiation in a particular geographical area, with darker skin tones predominantly found closer to the equator. Darker skin pigmentation is thought to have evolved as a protective mechanism against the damaging effects of intense ultraviolet solar radiation. Conversely, lighter skin pigmentation offers protection against the depletion of vitamin D, a vital nutrient whose synthesis in the body requires exposure to sunlight. Human skin also possesses the remarkable capacity to darken (or tan) in response to increased exposure to ultraviolet radiation, a further testament to its adaptive plasticity.
Crucially, there is relatively little genetic variation between different human geographical populations; the vast majority of human genetic diversity actually exists at the individual level. Much of human variation, whether phenotypic or genetic, is continuous, lacking clear, distinct points of demarcation. Genetic data consistently demonstrates that regardless of how population groups are defined, any two individuals from the same population group are almost as genetically distinct from each other as any two individuals chosen from different population groups. Furthermore, geographically disparate dark-skinned populations, such as those found in Africa, Australia, and South Asia, are not necessarily closely related to each other, highlighting convergent evolution.
Genetic research has unequivocally demonstrated that human populations indigenous to the African continent exhibit the highest degree of genetic diversity. This genetic diversity gradually diminishes with increasing migratory distance from Africa, a phenomenon likely attributable to successive bottlenecks experienced during human migration events. These non-African populations, having ventured beyond Africa, acquired new genetic inputs through local admixture with archaic populations, and consequently display greater genetic variation stemming from Neanderthals and Denisovans than is typically found in Africa, though Neanderthal admixture into African populations may be underestimated. Moreover, recent studies have revealed that populations in sub-Saharan Africa, and particularly West Africa, possess ancestral genetic variation that predates anatomically modern humans and has since been lost in most non-African populations. Some of this ancient ancestry is believed to originate from admixture with an unknown archaic hominin lineage that diverged even before the split between Neanderthals and modern humans.
Humans are a gonochoric species, meaning we are distinctly divided into male and female sexes. The greatest degree of genetic variation exists between males and females within our species. While the nucleotide genetic variation among individuals of the same sex across global populations is no greater than 0.1%–0.5%, the genetic difference between males and females is a more substantial 1% to 2%. On average, males are approximately 15% heavier and 15 cm (6 in) taller than females. Men, on average, also possess about 40–50% more upper-body strength and 20–30% more lower-body strength than women of the same body weight, a difference largely attributable to higher muscle mass and larger muscle fibers. Women, conversely, generally exhibit a higher body fat percentage than men. Women also tend to have lighter skin than men within the same population, a phenomenon that has been explained by a higher physiological requirement for vitamin D in females, particularly during periods of pregnancy and lactation. Given the fundamental chromosomal differences between females and males, certain X and Y chromosome-related conditions and disorders exclusively affect either men or women. After accounting for differences in body weight and vocal tract volume, the male voice is typically an octave deeper than the female voice. Furthermore, women consistently exhibit a longer life span in virtually every population across the globe. While the human population is predominantly dimorphic, intersex conditions do occur, though they are statistically rare.
Psychology
And now, for the inner workings.
Brain and cognition
The human brain, that marvel of biological engineering, serves as the undeniable focal point of the central nervous system in humans, orchestrating the intricate dance of the peripheral nervous system. Beyond merely controlling the "lower," involuntary, or primarily autonomic activities such as respiration and digestion, it is also the exclusive locus of our "higher" order functioning: the elusive processes of thought, the structured pathways of reasoning, and the profound capacity for abstraction. These intricate cognitive processes collectively constitute what we refer to as the mind, and, along with their observable behavioral consequences, they form the central subject of study in the vast field of psychology.
Humans possess a prefrontal cortex that is notably larger and more developed than that of other primates – a brain region intricately associated with our most complex cognition. This anatomical advantage has, perhaps predictably, led humans to confidently proclaim themselves the most intelligent species known. However, objectively defining intelligence remains a notoriously difficult and contentious endeavor, particularly when considering that other animals have evolved highly specialized senses and excel in areas where humans are demonstrably inferior. Perhaps our intelligence is simply different, not necessarily superior.
There are, however, some traits that, while not strictly unique to our species, certainly set humans apart from most other animals. Humans may very well be the only animals capable of true episodic memory – the ability to mentally re-experience past events – and who can engage in what is termed "mental time travel," projecting themselves into future scenarios. Even when compared with other highly social animals, humans exhibit an unusually high degree of flexibility and nuance in their facial expressions. We are also, remarkably, the only animals known to shed purely emotional tears. Furthermore, humans are among the very few animals capable of self-recognition in mirror tests, a benchmark often used to infer self-awareness. And, of course, there is ongoing, often heated, debate about the extent to which humans are truly the only animals possessing a sophisticated "theory of mind" – the ability to attribute mental states, such as beliefs and desires, to oneself and others.
Sleep and dreaming
Humans are, generally speaking, diurnal creatures, active during the day and resting at night. The average adult requires between seven and nine hours of sleep per day, while children typically need nine to ten hours. Elderly individuals, perhaps having seen enough of the world, usually manage with six to seven hours. Unfortunately, obtaining less sleep than these recommended amounts is a common, almost fashionable, habit among humans, despite the well-documented negative health effects of sleep deprivation. A sustained restriction of adult sleep to a mere four hours per day has been shown to induce significant alterations in both physiology and mental state, including impaired memory, chronic fatigue, increased aggression, and pervasive bodily discomfort.
During the enigmatic state of sleep, humans experience dreams, vivid sequences of sensory images and sounds. Dreaming is primarily stimulated by the pons region of the brain and predominantly occurs during the REM phase of sleep. The duration of a dream can vary wildly, from a fleeting few seconds to a sustained 30 minutes. Humans typically experience three to five dreams per night, with some individuals reporting up to seven. Dreamers are, perhaps inconveniently, more likely to recall their dreams if awakened during the REM phase. The events within dreams are generally beyond the conscious control of the dreamer, with the notable exception of lucid dreaming, a fascinating state where the dreamer achieves self-awareness within the dream itself. Dreams, in their unpredictable nature, can occasionally trigger a creative thought or impart a profound sense of inspiration – proving that even when our bodies are at rest, our minds are still, in their own chaotic way, at work.
Consciousness and thought
Human consciousness, at its most fundamental, is simply sentience or an awareness of internal or external existence. Despite centuries of meticulous analyses, countless definitions, elaborate explanations, and relentless debates by philosophers and scientists alike, the underlying nature of consciousness remains an enduring enigma, profoundly and maddeningly poorly understood. It is, as some have eloquently put it, "at once the most familiar and most mysterious aspect of our lives." The only universally agreed-upon notion regarding this topic is the rather intuitive conviction that it, indeed, exists. Opinions diverge sharply on what precisely constitutes the phenomenon to be studied and explained as consciousness. Some philosophers propose a division into phenomenal consciousness, which refers to raw sensory experience itself, and access consciousness, which is the aspect that can be utilized for reasoning or directly controlling actions. At times, consciousness is treated as synonymous with 'the mind,' while at other times it is considered merely an aspect of it. Historically, it has been intimately associated with introspection, private thought, imagination, and volition. Modern interpretations often broaden its scope to include various forms of experience, cognition, feeling, or perception. It can be understood as 'awareness,' or even, more complexly, 'awareness of awareness,' or the ultimate state of self-awareness. The very notion of consciousness may encompass different levels or orders of consciousness, or perhaps different kinds of consciousness, or simply a single kind manifesting with diverse features. Such is the delightful ambiguity of the human condition.
The process by which humans acquire knowledge and understanding through the intricate interplay of thought, experience, and the senses is known as cognition. The human brain meticulously perceives the external world through the various senses, and each individual human is profoundly shaped by their unique accumulation of experiences, inevitably leading to highly subjective interpretations of existence and the relentless passage of time. The fundamental nature of thought itself lies at the very heart of psychology and its related fields. Cognitive psychology specifically delves into the study of cognition, examining the underlying mental processes that drive observable behavior. Developmental psychology, largely focusing on the unfolding of the human mind across the entire life span, endeavors to unravel how individuals come to perceive, comprehend, and interact within the world, and how these complex processes evolve and transform as they age. This field may concentrate on intellectual, cognitive, neural, social, or moral development. To quantify and compare the relative intelligence of human beings, and to study its distribution within populations, psychologists have developed various intelligence tests and the widely known concept of the intelligence quotient (IQ). A rather crude metric, some might argue, for such a complex phenomenon.
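As a rough, purely illustrative sketch of how such a standardized score behaves, the snippet below assumes the conventional norming of modern IQ tests to a mean of 100 and a standard deviation of 15 (an assumption of this example, not a claim made above) and converts a score into an approximate population percentile using the normal distribution.

```python
# Illustrative sketch only: assumes IQ scores are normed to an approximately
# normal distribution with mean 100 and standard deviation 15 (the usual
# convention for modern tests), and converts a score to a percentile rank.
from math import erf, sqrt

IQ_MEAN = 100.0   # conventional norming mean (assumed here)
IQ_SD = 15.0      # conventional standard deviation (assumed here)

def iq_percentile(score: float) -> float:
    """Approximate percentile rank of a score under a normal(100, 15) model."""
    z = (score - IQ_MEAN) / IQ_SD
    return 0.5 * (1.0 + erf(z / sqrt(2.0))) * 100.0

if __name__ == "__main__":
    for s in (85, 100, 115, 130):
        print(f"IQ {s}: roughly the {iq_percentile(s):.0f}th percentile")
```

Under these assumptions, a score of 115 sits near the 84th percentile and 130 near the 98th; the quotient conveys a position within a distribution, nothing more.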
Motivation and emotion
Human motivation remains, to a significant extent, incompletely understood. From a purely psychological standpoint, Maslow's hierarchy of needs stands as a well-established and influential theory, conceptualizing human motivation as a hierarchical process of satisfying fundamental needs in ascending order of complexity. From a broader, more philosophical perspective, human motivation can be defined as an individual's commitment to, or deliberate withdrawal from, various goals that demand the application of their inherent human abilities. Furthermore, incentive and preference are both critical factors, as are any perceived connections between them. Volition, or willpower, may also play a significant role, adding another layer of complexity. Ideally, both motivation and volition work in concert to ensure the optimal selection, relentless striving for, and ultimate realization of goals, a function that commences in early childhood and continues throughout one's entire lifetime in an ongoing process known as socialization.
Emotions are, fundamentally, biological states intricately linked with the nervous system. They are typically triggered by discernible neurophysiological changes, which are, in turn, variously associated with specific thoughts, subjective feelings, observable behavioral responses, and a measurable degree of either pleasure or displeasure. Emotions are frequently intertwined with broader mood states, inherent temperament, established personality traits, individual disposition, creative drives, and, of course, overall motivation. Emotion exerts a profound influence on human behavior and significantly impacts our capacity for learning. Acting solely on extreme or uncontrolled emotions can, predictably, lead to social disorder and crime, with studies indicating that individuals involved in criminal behavior often exhibit lower emotional intelligence than the general population.
Emotional experiences can be broadly categorized as pleasant, such as the sensations of joy, interest, or contentment, or as acutely unpleasant, encompassing feelings like anxiety, sadness, anger, and despair. Happiness, or the subjective state of being happy, represents a core human emotional condition. The precise definition of happiness remains a perennial philosophical topic. Some define it as the experience of positive emotional affect while minimizing negative affect. Others view it as a broader appraisal of overall life satisfaction or perceived quality of life. Recent research, perhaps with a touch of realism, suggests that achieving happiness might actually involve experiencing some negative emotions, particularly when humans perceive them as warranted. A rather inconvenient truth, that.
Sexuality and love
For humans, sexuality is a complex, multi-faceted phenomenon encompassing biological, erotic, physical, emotional, social, or even spiritual feelings and behaviors. Because it is such a broad and historically fluid term, its precise definition remains elusive and has, predictably, varied significantly across different historical contexts. The biological and physical dimensions of sexuality are largely concerned with the intricacies of human reproductive functions, including the predictable stages of the human sexual response cycle. However, sexuality is also profoundly influenced by, and in turn influences, cultural, political, legal, philosophical, moral, ethical, and religious aspects of life – a truly comprehensive entanglement. Sexual desire, often referred to as libido, represents a fundamental mental state that initiates sexual behavior. Studies, to no one's great surprise, consistently show that men generally report desiring sex more frequently than women and tend to masturbate more often.
Humans exist along a continuous spectrum of sexual orientation, though the vast majority of individuals are heterosexual. While homosexual behavior is observed in some other animals, only humans and, rather curiously, domestic sheep have, to date, been found to exhibit an exclusive preference for same-sex relationships. Most available evidence points to nonsocial, fundamentally biological causes of sexual orientation, as cultures that are highly tolerant of homosexuality do not exhibit significantly higher rates of its occurrence. Furthermore, ongoing research in neuroscience and genetics strongly suggests that numerous other aspects of human sexuality are also significantly influenced by biological factors.
Love, that most celebrated and agonizing of human emotions, most commonly refers to a profound feeling of strong attraction or deep emotional attachment. It can manifest in impersonal forms (such as the love of an object, an abstract ideal, or a powerful political or spiritual connection) or, more commonly and dramatically, in interpersonal forms (the intense bond between humans). When an individual experiences the intoxicating state of being in love, a cascade of neurochemicals – including dopamine, norepinephrine, and serotonin – floods and stimulates the brain's pleasure center. This biochemical cocktail can lead to a predictable array of side effects, such as an increased heart rate, a disconcerting loss of appetite and sleep, and an almost overwhelming, intense feeling of excitement. A rather inefficient, if captivating, biological mechanism.
Culture
Humanity's truly unprecedented array of intellectual skills has, without a doubt, been the singular most important factor in our species' eventual technological ascendancy and the rather alarming concomitant domination of the biosphere. Setting aside the now-extinct hominids, humans are the only animals known to systematically teach generalizable information, to innately deploy recursive embedding to generate and communicate fantastically complex concepts, to engage in the sophisticated "folk physics" absolutely essential for competent tool design, or, indeed, to deliberately cook food in the wild. This capacity for systematic teaching and learning is what meticulously preserves the distinct cultural and ethnographic identity of human societies, ensuring continuity and transmission across generations. Other traits and behaviors, while perhaps not entirely exclusive, are overwhelmingly unique to humans, including the deliberate starting of fires, the complex phoneme structuring that underpins our languages, and the remarkable capacity for vocal learning – the ability to imitate and produce novel sounds.
Language
While a multitude of species engage in various forms of animal communication, language as we understand it is undeniably unique to humans. It stands as a defining feature of humanity itself and is, strikingly, a true cultural universal, found in every known human society. Unlike the often-limited and rigid communication systems of other animals, human language is fundamentally open – an infinite number of meanings can be generated by creatively combining a finite, limited number of symbols. Human language also possesses the remarkable capacity of displacement, allowing us to use words to represent things and events that are not physically present or locally occurring, but instead exist within the shared, abstract imagination of those communicating. A truly powerful tool, that.
Language further distinguishes itself from other forms of communication by being modality independent. This means the same underlying meanings can be effectively conveyed through a variety of different media: audibly in speech, visually through the intricate gestures of sign language or the permanence of writing, and even through tactile media such as braille. Language is not merely a means of conveying information; it is absolutely central to the intricate tapestry of communication between humans, and profoundly shapes the very sense of identity that binds together nations, cultures, and ethnic groups. Today, approximately six thousand distinct languages are actively in use, including numerous sign languages, alongside countless thousands more that have tragically become extinct – silent echoes of past human thought.
The arts
The arts, in their myriad forms, represent a profound and uniquely human expression. They can manifest across a vast spectrum, including the visual arts (ranging from traditional paintings and sculptures to modern film, intricate fashion design, and monumental architecture), the literary arts (encompassing diverse forms such as prose, evocative poetry, and drama), and the dynamic performing arts (which generally involve the communal experiences of theatre, the universal language of music, and the expressive movements of dance). Humans, ever the synthesizers, frequently combine these different artistic forms – consider, for instance, the ubiquitous music video, a fusion of sound and image. Other creations, perhaps less conventionally, have also been described as possessing artistic qualities, including the meticulous preparation of food, the immersive narratives of video games, and even the nuanced practice of medicine. Beyond their undeniable capacity to entertain and to transmit knowledge across generations, the arts are also, quite frequently, employed for explicit political purposes, serving as powerful tools for propaganda, protest, and social commentary.
Art stands as a defining characteristic of human existence, and compelling evidence suggests a deep, intricate relationship between human creativity and the emergence of language. The earliest documented evidence of artistic endeavor predates even anatomically modern humans, with shell engravings made by Homo erectus dating back several hundred thousand years. Art unequivocally attributed to H. sapiens appeared at least 75,000 years ago, with the discovery of intricate jewelry and detailed drawings within caves in South Africa. Various hypotheses attempt to explain why humans have adapted to engage in the arts. These include the ideas that art allowed for better problem-solving, provided a means to control or influence other humans, encouraged cooperation and social cohesion within a society, or enhanced the likelihood of attracting a potential mate. It is plausible that the development of imagination through artistic expression, combined with logical thought, conferred a significant evolutionary advantage upon early humans.
Remarkably, evidence of humans engaging in musical activities actually predates the earliest known cave art, underscoring its deep roots in our species. To this day, music has been practiced by virtually all known human cultures, making it a truly universal expression. A vast diversity of music genres and ethnic musics exists, and human musical abilities are intrinsically linked to other complex human social behaviors. Scientific studies have shown that human brains actively respond to music, becoming synchronized with its rhythm and beat – a fascinating process known as entrainment. Dance is another ubiquitous form of human expression, found in every culture, and may have evolved as a fundamental means for early humans to communicate complex social information. Both listening to music and observing dance stimulate the orbitofrontal cortex and other pleasure-sensing regions of the brain, highlighting the deep, intrinsic reward humans derive from these artistic forms.
Unlike the seemingly effortless acquisition of speech, the skills of reading and writing do not come naturally to humans; they are complex cognitive abilities that must be explicitly taught. Nevertheless, the concept of literature predates the invention of writing, with 30,000-year-old paintings adorning the walls of some caves, vividly portraying sequences of dramatic scenes. One of the oldest surviving works of literature is the Epic of Gilgamesh, which was first painstakingly engraved on ancient Babylonian clay tablets approximately 4,000 years ago. Beyond its role in simply transmitting knowledge across generations, the creation and sharing of imaginative fiction through storytelling may have played a crucial role in developing humans' communication capabilities and, perhaps, even increased the likelihood of securing a mate. Storytelling also serves as a powerful means to impart moral lessons and foster cooperation within an audience, binding individuals through shared narratives and ethical frameworks.
Tools and technologies
The earliest evidence of stone tools being utilized by proto-humans dates back at least 2.5 million years. The ingenious use and systematic manufacture of tools have been proposed as the defining ability of humans, distinguishing us more profoundly than anything else, and have historically been recognized as a critical evolutionary step. This early technology became significantly more sophisticated around 1.8 million years ago, and the controlled use of fire emerged approximately 1 million years ago. The wheel and wheeled vehicles, those foundational innovations, appeared simultaneously in several geographically distinct regions sometime in the fourth millennium BC. The continuous development of increasingly complex tools and technologies proved absolutely essential in allowing land to be cultivated and animals to be domesticated, thereby forming the bedrock of agriculture – the transformative process known as the Neolithic Revolution.
China, a cradle of innovation, independently developed paper, the transformative printing press, the destructive yet powerful gunpowder, the indispensable compass, and a multitude of other important inventions that profoundly shaped global history. The relentless improvements in smelting techniques enabled the sophisticated forging of copper, bronze, iron, and eventually steel – a material that underpins modern infrastructure, from railways and towering skyscrapers to countless other manufactured products. The advent of mass-produced steel coincided with the profound societal shifts of the Industrial Revolution, an era where the invention of automated machines brought about truly seismic changes to human lifestyles. Modern technology is observed to progress exponentially, a relentless acceleration of innovation. Major breakthroughs in the 20th century alone include the harnessing of electricity, the life-saving discovery of penicillin, the development of semiconductors that power our digital world, the ubiquitous internal combustion engine, the interconnected Internet, revolutionary nitrogen-fixing fertilizers, the miracle of airplanes, the omnipresent computer, the personal freedom of automobiles, the societal impact of contraceptive pills, the destructive power of nuclear fission, the transformative green revolution in agriculture, the widespread communication of radio, the precision of scientific plant breeding, the ambition of rockets, the comfort of air conditioning, the entertainment of television, and the efficiency of the assembly line. A truly dizzying array of advancements, for better or worse.
Religion and spirituality
Definitions of religion are, predictably, varied and contested. According to one widely accepted definition, a religion constitutes a belief system centered around the supernatural, the sacred, or the divine, coupled with a set of practices, values, institutions, and rituals associated with such beliefs. Many religions also incorporate a distinct moral code or ethical framework. The evolution and historical origins of the first religions have become active and intriguing areas of scientific investigation. Credible evidence suggesting the presence of religious behavior in humans dates back to the Middle Paleolithic era, roughly 45,000 to 200,000 years ago. It has been theorized that religion may have evolved to play a crucial role in reinforcing and encouraging cooperation among humans, providing a shared framework for social cohesion.
Religion manifests in an astonishing diversity of forms. It can encompass a belief in life after death, offer explanations for the origin of life, propose intricate narratives about the nature of the universe (religious cosmology) and its ultimate destiny (eschatology), and provide comprehensive moral or ethical teachings. Views on transcendence and immanence vary substantially across traditions, which variously espouse monism (the belief in a single ultimate reality), deism (belief in a creator god who does not intervene), pantheism (the belief that God is identical with the universe), and theism (including both polytheism with multiple gods and monotheism with a single god).
Although precisely measuring religiosity across diverse populations is inherently difficult, a significant majority of humans worldwide profess some form of religious or spiritual belief. In 2015, the plurality of the global population identified as Christian, followed by Muslims, Hindus, and Buddhists. Notably, as of 2015, approximately 16% of the global population, or slightly under 1.2 billion humans, were categorized as irreligious, a group that includes those with no specific religious beliefs and those who do not identify with any organized religion. A growing segment, perhaps, of those who simply find the narratives unconvincing.
Science and philosophy
An aspect truly unique to humans – and one that, admittedly, has yielded some useful results – is our species' remarkable ability to transmit knowledge from one generation to the next, and, crucially, to continually build upon this inherited information. This iterative process allows us to develop increasingly sophisticated tools, formulate robust scientific laws, and achieve other intellectual advancements, which are then, in turn, passed on to future generations. This accumulated body of knowledge can be rigorously tested through observation and experimentation to answer profound questions or to make accurate predictions about the fundamental workings of the universe. This systematic approach has proven extraordinarily successful in advancing human ascendancy, for better or worse.
Aristotle, with his systematic empirical investigations, has been aptly described as the first true scientist, laying crucial groundwork for the significant rise of scientific thought throughout the Hellenistic period. Other pivotal early advances in science emerged from the intellectually vibrant Han dynasty in China and, later, during the flourishing Islamic Golden Age. The scientific revolution, which blossomed near the end of the Renaissance, marked a watershed moment, leading directly to the emergence of what we now recognize as modern science.
A complex chain of events and intellectual influences eventually coalesced to produce the scientific method – a rigorous, self-correcting process of systematic observation, hypothesis formation, and empirical experimentation. This method is the fundamental bedrock used to differentiate genuine science from deceptive pseudoscience. A sophisticated understanding of mathematics is another cognitive faculty largely unique to humans, although other animal species do exhibit some rudimentary forms of numerical cognition. The entire edifice of science can be broadly categorized into three major branches: the formal sciences (such as logic and mathematics), which are concerned with abstract formal systems; the applied sciences (including fields like engineering and medicine), which focus on practical applications of knowledge; and the empirical sciences, which are grounded in empirical observation and are further subdivided into the natural sciences (e.g., physics, chemistry, biology) and the social sciences (e.g., psychology, economics, sociology).
Philosophy is a profound and enduring field of study where humans relentlessly seek to understand fundamental truths about themselves and the bewildering world they inhabit. Philosophical inquiry has been a major, indeed defining, feature in the development of humanity's intellectual history, a relentless quest for meaning. It has often been described, rather poetically, as the "no man's land" situated precariously between definitive scientific knowledge and dogmatic religious teachings – a space where questions are often more important than answers. Major fields of philosophy include metaphysics (the study of fundamental reality), epistemology (the theory of knowledge), logic (the study of valid reasoning), and axiology (which encompasses both ethics and aesthetics).
Society
Society is, at its most basic, the intricate system of organizations and institutions that arise from the ceaseless interaction between humans. As established, humans are profoundly social creatures, tending to organize themselves into large, complex social groups. These groups can then be further subdivided and stratified according to various factors, including income, accumulated wealth, wielded power, earned reputation, and other less tangible attributes. The specific structure of this social stratification and the degree of potential social mobility within it differ significantly, particularly when comparing modern and traditional societies. Human groups themselves range dramatically in scale, from the intimate bonds of families to the vast, complex structures of entire nations. The earliest form of human social organization is widely believed to have resembled the relatively egalitarian hunter-gatherer band societies.
Gender
Human societies, with predictable regularity, typically construct and exhibit distinct gender identities and gender roles. These social constructs serve to differentiate between characteristics deemed masculine and those considered feminine, and, perhaps more restrictively, prescribe the acceptable range of behaviors and attitudes for their members based on their assigned sex. The most prevalent categorization, historically and globally, is the rigid gender binary of men and women. However, some societies, with varying degrees of acceptance, recognize the existence of a third gender, or, less commonly, even a fourth or fifth gender category. In other societies, the umbrella term "non-binary" is used to encompass a diverse range of gender identities that do not exclusively align with either male or female.
Historically, gender roles have often been associated with a stark division of norms, practices, dress, behavior, rights, duties, privileges, status, and power. It is a rather bleak historical truth that men have, in most societies, both past and present, enjoyed more rights and privileges than women. As a quintessential social construct, gender roles are not immutable; they are fluid and demonstrably vary historically within any given society. Challenges to predominant gender norms have, unsurprisingly, recurred throughout history in many societies. Little definitive information is known about the precise nature of gender roles in the earliest human societies. However, it is thought that early modern humans likely exhibited a range of gender roles similar to those observed in modern cultures from at least the Upper Paleolithic. In contrast, the Neanderthals were less sexually dimorphic, and evidence suggests that the behavioral differences between males and females in their societies were minimal – perhaps a more egalitarian existence, one might infer.
Kinship
All human societies, without exception, possess intricate systems for organizing, recognizing, and classifying various types of social relationships. These systems are fundamentally based on direct relations between parents, children, and other descendants (consanguinity), as well as relations established through the formal institution of marriage (affinity). A third, more flexible type of kinship is often applied to chosen relationships, such as godparents or adoptive children (fictive kinship). These culturally defined relationships are collectively referred to as kinship. In many societies, kinship stands as one of the most important social organizing principles, playing a crucial role in the transmission of social status and inheritance across generations. All human societies, for reasons that are both biological and cultural, enforce strict rules of incest taboo, prohibiting marriage between certain close kin relations. Some societies also possess rules dictating preferential marriage with specific kin relations, further solidifying social bonds.
Pair bonding, the formation of strong, often long-lasting, exclusive or semi-exclusive relationships between adults, is a ubiquitous feature of human sexual relationships. This can manifest in various forms, including serial monogamy (a series of exclusive relationships), polygyny (one man with multiple wives), or polyandry (one woman with multiple husbands). Genetic evidence strongly suggests that humans were predominantly polygynous for the vast majority of our existence as a species. However, this pattern began to shift during the Neolithic, when monogamy started to become more widespread, concomitant with the transition from nomadic to sedentary agricultural societies. Further anatomical evidence, derived from the analysis of second-to-fourth digit ratios (a known biomarker for prenatal androgen effects), likewise indicates that modern humans were largely polygynous during the Pleistocene epoch. The shift to monogamy, it seems, was a relatively recent and culturally driven development.
Ethnicity
Human ethnic groups are fundamentally a social category where individuals identify together as a group based on a shared constellation of attributes that serve to distinguish them from other groups. These shared attributes can include a common set of traditions, a perceived shared ancestry, a common language, a collective history, a shared society, a distinct culture, a sense of nationhood, a common religion, or even shared experiences of social treatment within their residing area. Ethnicity is a distinct concept from race, which is typically based on observable physical characteristics, although both concepts are, crucially, socially constructed. Assigning a definitive ethnicity to a particular population is often a complicated endeavor, as even within commonly recognized ethnic designations, there can exist a diverse range of subgroups, and the very makeup of these ethnic groups can evolve and change over time at both the collective and individual level. Indeed, there is no universally accepted definition of what precisely constitutes an ethnic group. Nonetheless, ethnic groupings can play an incredibly powerful role in shaping the social identity and fostering solidarity within ethnopolitical units. This phenomenon has been intimately tied to the rise of the nation state as the predominant form of political organization throughout the 19th and 20th centuries.
Government and politics
As burgeoning farming populations gathered into larger and increasingly dense communities, the frequency and complexity of interactions both within and between these diverse groups inevitably escalated. This increasing complexity directly spurred the development of formal governance mechanisms, both internal to communities and for managing inter-community relations. Humans have, rather conveniently, evolved the remarkable ability to change their affiliation with various social groups relatively easily, even abandoning previously strong political alliances, if such a shift is perceived as offering personal advantages. This inherent cognitive flexibility allows individual humans to adapt and even fundamentally alter their political ideologies, with individuals exhibiting higher cognitive flexibility being demonstrably less likely to support rigid authoritarian and nationalistic stances. A rather hopeful observation, perhaps.
Governments, in their various forms, are established to create and enforce laws and policies that directly impact the citizens they govern. Throughout human history, there have been many forms of government, each characterized by distinct means of obtaining power and the capacity to exert diverse controls over its population. Currently, approximately 47% of humans live under some form of democracy, while 17% reside in a hybrid regime (a blend of democratic and authoritarian elements), and a significant 37% endure under an authoritarian regime. Many countries also participate in various international organizations and alliances; the largest and most prominent of these is the United Nations, which boasts 193 member states – a testament to our ongoing, if often fraught, attempts at global cooperation.
Trade and economics
Trade, defined as the voluntary exchange of goods and services, is widely considered a characteristic that profoundly differentiates humans from other animals, and has been cited as a practice that conferred a major advantage upon Homo sapiens over other hominids. Evidence strongly suggests that early H. sapiens established and utilized long-distance trade routes for the exchange of goods and, crucially, ideas, leading to remarkable "cultural explosions" and providing critical additional food sources during periods of scarce hunting. Such sophisticated trade networks, notably, did not appear to exist for the now-extinct Neanderthals. Early trade likely involved essential materials for tool creation, such as obsidian. The first truly international trade routes, spanning vast distances and connecting diverse cultures, emerged around the spice trade throughout the Roman and medieval periods.
Early human economies were more likely to be structured around systems of gift giving than around barter and direct exchange. The earliest forms of money consisted of tangible commodities, with cattle serving as the oldest known form, and cowrie shells becoming one of the most widely used. Money, of course, has since evolved considerably, progressing through government-issued coins and paper currency to the ephemeral realm of electronic money. The study of economics constitutes a social science that meticulously examines how societies allocate scarce resources among their diverse populations. A rather stark reality of human societies is the massive inequality in the division of wealth; a truly sobering statistic reveals that the eight richest individuals on Earth hold as much wealth as the poorest half of the entire human population. A testament to our species' capacity for extreme disparity.
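As a small, hedged illustration of how such disparities are commonly quantified (the Gini coefficient is a standard measure used by economists, though it is not mentioned above, and the toy numbers below are invented purely for demonstration), the following sketch computes that coefficient, where 0 denotes perfect equality and values approaching 1 denote extreme concentration of wealth.

```python
# Minimal illustrative sketch (not drawn from the text above): the Gini
# coefficient, a standard summary of inequality, computed as the mean absolute
# difference between all pairs of holdings, normalised by twice the mean.
def gini(wealth: list[float]) -> float:
    """Return the Gini coefficient of a list of non-negative wealth values."""
    n = len(wealth)
    mean = sum(wealth) / n
    if mean == 0:
        return 0.0
    total_abs_diff = sum(abs(a - b) for a in wealth for b in wealth)
    return total_abs_diff / (2 * n * n * mean)

if __name__ == "__main__":
    # Hypothetical toy populations, purely for illustration:
    print(gini([1.0, 1.0, 1.0, 1.0]))    # 0.0  -> perfectly equal
    print(gini([0.0, 0.0, 0.0, 100.0]))  # 0.75 -> highly concentrated
```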
Conflict
Humans commit acts of violence against other humans at a rate broadly comparable to other primates. However, a chilling distinction lies in our increased preference for killing adults, whereas infanticide tends to be more prevalent among other primate species. Phylogenetic analysis, peering into our deep past, predicts that approximately 2% of early H. sapiens would have been murdered, a rate that tragically surged to 12% during the medieval period, before, thankfully, dropping back below 2% in more modern times. Yet, there remains immense variation in violence rates among human populations, with homicide rates as low as 0.01% in societies that possess robust legal systems and strong cultural attitudes explicitly against violence.
The enduring willingness of humans to engage in the mass killing of other members of their own species through organized conflict (i.e., war) has long been a subject of intense debate and philosophical contemplation. One school of thought rigidly maintains that war evolved as an innate means to eliminate competitors, thus an inherent and unavoidable human characteristic. Another perspective, perhaps more optimistically, suggests that war is a relatively recent phenomenon, emerging primarily due to shifting social conditions and pressures. While this debate is far from settled, current evidence tends to indicate that warlike predispositions only became widespread and common approximately 10,000 years ago, and in many regions, much more recently than that. War has, without question, extracted an unimaginably high cost in human lives; it is estimated that during the 20th century alone, between 167 million and 188 million people perished as a direct result of war. Reliable war casualty data is, predictably, less available for pre-medieval times, especially global figures. However, when compared with any period over the past 600 years, the last 80 years (post-1946) have, rather remarkably, witnessed a very significant drop in global military and civilian death rates attributable to armed conflict. Perhaps we are, slowly, learning. Or perhaps we are just more efficient at it now.
See also
- List of human evolution fossils
- Timeline of human evolution
Notes
- ^ Traditionally this has been explained by conflicting evolutionary pressures involved in bipedalism and encephalization (called the obstetrical dilemma), but recent research suggest it might be more complicated than that.