
Computer Engineering

An engineering discipline specializing in the design of computer hardware and software, a field that, despite its name, involves far more than merely understanding which end of a circuit board goes where.

Not to be confused with Computational engineering, which, while related, doesn't quite capture the sheer breadth of existential despair required for this particular discipline.

"Hardware engineering" redirects here. For engineering other types of hardware, one might consider the more... grounded pursuits of Mechanical engineering.

Occupation
Names: Computer engineer
Occupation type: Engineering
Activity sectors: Electronics, telecommunications, signal processing, computer hardware, software
Specialty: Hardware engineering, software engineering, hardware-software interaction, robotics, networking
Competencies: Technical knowledge, hardware design, software design, advanced mathematics, systems design, abstract thinking, analytical thinking
Fields of employment: Science, technology, engineering, industry, military, exploration

Computer engineering (CE, CoE, CpE, or CompE) is a branch of engineering specifically tailored for those who find the inherent complexities of both computer hardware and software equally fascinating and infuriating. It's where the abstract logic of computation collides, often violently, with the stubborn realities of physical electronics. This isn't just about making things work; it's about making them work together, a feat often requiring the patience of a saint and the diagnostic skills of a seasoned detective.

This article is part of a series on Engineering, a vast and sometimes bewildering collection of human endeavors to bend the natural world to our will.

It integrates several fields of electrical engineering, electronics engineering, and computer science—a truly ambitious cocktail of disciplines, if you ask me. This means a computer engineer is expected to navigate the quantum uncertainties of electron flow while simultaneously wrestling with the abstract logic of algorithms. Some institutions, clearly unwilling to commit to a single title, refer to computer engineering as Electrical and Computer Engineering or, even more generically, Computer Science and Engineering. It's a testament to the field's sprawling nature, or perhaps just a lack of decisive naming committees.

Computer engineers, bless their hearts, require rigorous training in the delicate art of hardware-software integration, the meticulous craft of software design, and the enduring agony of software engineering. The scope of their knowledge is vast, encompassing areas that range from the fundamental forces of electromagnetism to the unsettling future of artificial intelligence (AI), the mechanical ballet of robotics, the invisible threads of computer networks, the intricate blueprints of computer architecture, and the very operating principles of operating systems. Essentially, they are the polymaths of the digital age, expected to understand everything from the microscopic dance of electrons in a microcontroller to the grand symphony of a supercomputer. They are involved in countless hardware and software facets of computing, from the initial circuit design of individual microprocessors and personal computers to the overarching strategy of integrating these complex systems into a cohesive, functional whole. This isn't just about crafting components; it's about understanding how those components fit into, and fundamentally shape, the larger technological landscape. Robotics, for instance, stands as a prime application, demanding seamless harmony between physical mechanisms and digital intelligence.

In practice, computer engineering typically delves into areas that include the rather mundane, yet critical, task of writing software and firmware for embedded microcontrollers—the tiny brains behind countless devices you barely notice. It also involves the intricate art of designing Very-large-scale integration (VLSI) chips, crafting sensitive analog sensors, and assembling complex mixed signal circuit boards. Beyond the purely electronic, they grapple with the principles of thermodynamics (because heat, apparently, is always an issue) and the precise mechanics of control systems. Given their unique blend of skills, computer engineers are particularly well-suited for robotics research, a field that relies heavily on their ability to deploy digital systems for the precise command and vigilant monitoring of electrical systems such as motors, intricate communications protocols, and an array of perceptive sensors.

It's a common, and frankly sensible, practice in many institutions of higher learning to allow computer engineering students to select areas of deeper specialization during their junior and senior years. This is not out of generosity, but rather a pragmatic acknowledgment that the sheer, intimidating breadth of knowledge required for both the design and practical application of modern computers simply cannot be crammed into a single undergraduate degree. Other institutions, perhaps less trusting of nascent engineers, may mandate that engineering students endure one or two years of general engineering coursework before they're finally permitted to declare computer engineering as their primary, and likely all-consuming, focus. It’s almost as if they want you to suffer a little before you commit to a lifetime of debugging.

[[File:STM32F407 die shot.jpg|thumb|A die shot of an STM32 Microcontroller. This chip is both designed by computer engineers and is utilized by them to make other systems]]

History

The journey to modern computer engineering is, like most histories, a winding path paved with both brilliance and incremental improvements. Before the digital age, mechanical marvels hinted at the future.

[[File:Difference Engine No. 2, London - Science Museum - geograph.org.uk - 1715053.jpg|thumb|The Difference Engine, a mechanical marvel that prefigured digital computation.]] [[File:Eniac.jpg|thumb|The ENIAC, the first electronic computer, a machine that filled rooms and heralded a new era.]]

The true genesis of what we now recognize as computer engineering can be traced back to 1939. This was the year when John Vincent Atanasoff and his graduate student, Clifford Berry, embarked on a rather ambitious project: developing the world's very first electronic digital computer. Their groundbreaking work was a fascinating confluence of physics, mathematics, and the nascent field of electrical engineering. Atanasoff, then a physics and mathematics instructor at Iowa State University, brought the theoretical rigor, while Berry, a former graduate student with expertise in both electrical engineering and physics, contributed the practical ingenuity. Together, they meticulously crafted the Atanasoff–Berry computer, famously known as the ABC, a project that consumed five years of their lives to bring to fruition.

While the original ABC, unfortunately, met an ignominious end, being dismantled and discarded in the 1940s, its legacy was not forgotten. In a touching, if somewhat belated, tribute to these pioneering inventors, a meticulous replica of the ABC was constructed in 1997. This endeavor took a dedicated team of researchers and engineers four years of effort and an investment of $350,000 to painstakingly recreate the machine. A considerable sum for a piece of history, but perhaps a necessary one to remind us where it all began.

The true explosion of the modern personal computer era, however, only truly dawned in the 1970s, following a series of pivotal breakthroughs in semiconductor technology. It was a chain reaction of innovation:

  • 1947: The very first working transistor was unveiled by the formidable trio of William Shockley, John Bardeen, and Walter Brattain at Bell Labs. This tiny device, a seemingly insignificant piece of material, would eventually shrink entire rooms down to the palm of your hand.
  • 1955: The critical development of silicon dioxide surface passivation by Carl Frosch and Lincoln Derick at Bell Labs. This seemingly technical detail was, in fact, a foundational step, enabling the protection of semiconductor surfaces and paving the way for more reliable and smaller components.
  • 1957: Frosch and Derick continued their work, demonstrating the first planar silicon dioxide transistors. This was a crucial refinement, allowing for more consistent and mass-producible transistors.
  • Later developments: The planar process itself, championed by Jean Hoerni, revolutionized semiconductor manufacturing by allowing the creation of multiple transistors on a single wafer simultaneously. This was a game-changer for scalability.
  • 1959: The concept of the monolithic integrated circuit chip was brought to life by Robert Noyce at Fairchild Semiconductor. This allowed multiple transistors and other components to be fabricated on a single piece of silicon, leading to exponentially more powerful and compact electronics.
  • 1960: The metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS transistor) was demonstrated by Mohamed Atalla and Dawon Kahng at Bell Labs. This invention proved to be the workhorse of the digital age, offering superior performance and scalability compared to its bipolar predecessors.
  • 1971: The crowning achievement, the single-chip microprocessor—specifically the Intel 4004—was brought into existence by the combined genius of Federico Faggin, Marcian Hoff, Masatoshi Shima, and Stanley Mazor at Intel. This was the moment the "brain" of a computer could fit onto a fingernail-sized piece of silicon, forever changing the trajectory of technology.

History of computer engineering education

The formalization of computer engineering as an academic discipline began relatively recently. The very first dedicated computer engineering degree program in the United States was established in 1971 at Case Western Reserve University in the decidedly un-futuristic setting of Cleveland, Ohio. One can only imagine the initial skepticism.

As of 2015, the landscape had shifted considerably, with a respectable 250 ABET-accredited computer engineering programs scattered across the U.S. In Europe, the accreditation of computer engineering schools is handled by a diverse array of agencies, all operating under the umbrella of the EQANIE network, ensuring a certain baseline of competence, or at least compliance. The ever-increasing demands from industry for engineers who possess the rare ability to concurrently design hardware, develop software, craft firmware, and effectively manage the complex tapestry of computer systems used in modern production, has led many tertiary institutions worldwide to offer a specific bachelor's degree generally designated as computer engineering. It's a recognition that this multidisciplinary skillset isn't just a niche; it's a necessity. Both computer engineering and electronic engineering curricula inherently include robust training in both analog and digital circuit design—because, despite appearances, the digital world still relies on the very analog flow of electrons. As with the vast majority of engineering disciplines, a solid, unyielding grasp of mathematics and the fundamental principles of science is not merely helpful for aspiring computer engineers; it is absolutely indispensable.

Education

Computer engineering is, as previously noted, sometimes referred to as computer science and engineering at various universities, a semantic distinction that probably matters more to academic departments than to the actual engineers. For most entry-level positions in this field, a minimum of a bachelor's degree in computer engineering, electrical engineering, or computer science is typically required. It's not a path for the mathematically faint of heart; one must invariably grapple with an array of advanced mathematics, including the delightful intricacies of calculus, the linear elegance of linear algebra, and the often-bewildering world of differential equations. All this, of course, alongside a comprehensive dive into computer science.

Degrees in electronic or electrical engineering are often considered sufficient, largely due to the significant, almost indistinguishable, overlap between these fields and computer engineering. Given that hardware engineers frequently find themselves entangled with computer software systems, a robust background in computer programming is not just beneficial, it's a fundamental requirement. The Bureau of Labor Statistics (BLS) rather succinctly puts it: "a computer engineering major is similar to electrical engineering but with some computer science courses added to the curriculum." For those seeking positions in larger corporations or highly specialized roles, the pursuit of a master's degree often becomes a non-negotiable requirement.

It is also, and this is crucial, absolutely imperative for computer engineers to maintain a relentless pace of learning to keep abreast of the dizzying, often exhausting, advances in technology. Therefore, many find themselves in a perpetual state of education throughout their entire careers. This continuous pursuit of knowledge is not merely academic; it's a practical necessity, especially when it comes to acquiring new proficiencies or honing existing ones. Consider, for instance, the escalating cost of rectifying a bug: the further along a bug is discovered in the software development cycle, the exponentially more expensive it becomes to fix. This inconvenient truth underscores the immense financial benefits of developing and rigorously testing for quality code as early as humanly possible in the process, ideally long before any official release. It’s almost like preventing a catastrophe is better than cleaning up the mess, a concept many still struggle with.

Applications and practice


When one distills the vastness of computer engineering, two primary, often intertwined, focuses emerge: the tangible realm of hardware and the ethereal domain of software.

Computer hardware engineering

According to the United States Bureau of Labor Statistics (BLS), the current job outlook for computer hardware engineers indicates an expected ten-year growth of 7% from 2024 to 2034. This translates to a projected total of 71,100 jobs. However, the BLS, in its own understated wisdom, labels this growth as "slower than average" when juxtaposed against other occupations. A quick glance at historical projections reveals a trend: this 7% is an increase from the 2019 to 2029 estimate of 2% and a slight bump from the 2014 to 2024 BLS estimate of 3% (totaling 77,700 jobs), but merely level with the 7% projected for 2012 to 2022, and a stark decline from the 9% estimated in the BLS's 2010 to 2020 outlook. So, "slower than average" seems to be the polite way of saying "don't hold your breath for a boom." Today, the field of computer hardware engineering is increasingly seen as somewhat synonymous with electronic and computer engineering (ECE) and has, in its relentless march towards specialization, fragmented into numerous subcategories, with the most undeniably significant being embedded system design.

Computer software engineering

According to the U.S. Bureau of Labor Statistics (BLS), the growth projections for computer applications software engineers and computer systems software engineers for the period of 2024 to 2034 are estimated at a respectable 15%. This figure is quite close to the 2014 to 2024 growth rate, which stood at an estimated 17%, accounting for a substantial 1,114,000 jobs in that year. However, it's worth noting that this is a step down from the BLS's 2012 to 2022 estimate of 22% for software developers, and even further removed from the rather optimistic 30% projection seen in the 2010 to 2020 BLS outlook. The silver lining, if you can call it that, is that growing societal concerns over cybersecurity continue to propel computer software engineering well above the average rate of increase for all fields. People, it seems, are finally realizing the digital world is a dangerous place.

However, a significant portion of this work, in a predictable turn of events, will continue to be outsourced to foreign countries. This means that while the overall demand remains high, domestic job growth in the U.S. will not match the frenetic pace of the last decade, as roles that historically would have gone to computer software engineers in the United States are increasingly being filled by their counterparts in nations such as India. It's the inevitable globalized shuffle. Furthermore, the BLS job outlook for dedicated Computer Programmers (those focused on, say, embedded systems rather than application development) tells a rather bleak story: a consistent decline, with an −8% outlook for 2014–24, followed by −9% for 2019–29, a 10% decline for 2021–2031, and now an 11% decline projected for 2022–2032. Perhaps we’re just getting too good at automating away the actual programming. And, adding another layer of pervasive disappointment, the representation of women in software fields has been steadily declining over the years, a trend that appears to be accelerating even faster than in other engineering fields. A curious regression, isn't it?

Specialty areas

Within the sprawling domain of computer engineering, numerous specialty areas have emerged, each demanding its own brand of obsessive focus.

Processor design

The intricate process of processor design begins with the fundamental decision of selecting an instruction set—the very language the processor will understand—and then committing to a particular execution paradigm, such as Very Long Instruction Word (VLIW) or Reduced Instruction Set Computing (RISC). This foundational choice ultimately culminates in a detailed microarchitecture, often meticulously described using hardware description languages like VHDL or Verilog. The design of a Central Processing Unit (CPU) itself is a multi-faceted endeavor, typically segmented into the design of several crucial components:

  • the datapaths, the computational arteries responsible for operations like those performed by Arithmetic Logic Units (ALUs) and the sequential flow within pipelines;
  • the control unit, the complex logic orchestrating the precise ballet of the datapaths;
  • the memory components, ranging from high-speed register files to hierarchical caches;
  • the ever-present clock circuitry, including clock drivers, Phase-Locked Loops (PLLs), and the intricate networks distributing timing signals across the chip;
  • the pad transceiver circuitry for external communication; and
  • the underlying logic gate cell library from which all the higher-level logic is meticulously constructed.

It's a miracle it ever works.
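To make the datapath idea slightly less abstract, here is a toy 8-bit ALU sketched in Python. This is purely illustrative: real datapaths are written in an HDL such as VHDL or Verilog, and the opcode names below are invented for the example rather than taken from any real instruction set.

```python
# A toy 4-operation ALU, modeled in Python for illustration only.
# The opcode names (ADD, SUB, AND, OR) are hypothetical.

WIDTH = 8
MASK = (1 << WIDTH) - 1  # model an 8-bit datapath

def alu(opcode: str, a: int, b: int) -> tuple[int, bool]:
    """Return (result, zero_flag) for an 8-bit operation."""
    if opcode == "ADD":
        result = (a + b) & MASK   # wrap on overflow, like real hardware
    elif opcode == "SUB":
        result = (a - b) & MASK   # two's-complement subtraction
    elif opcode == "AND":
        result = a & b
    elif opcode == "OR":
        result = a | b
    else:
        raise ValueError(f"unknown opcode: {opcode}")
    return result, result == 0    # the zero flag feeds the control unit

print(alu("ADD", 250, 10))  # (4, False): 260 wraps around in 8 bits
print(alu("SUB", 7, 7))     # (0, True): zero flag set
```

The zero flag is the hook by which the control unit steers branches, which is exactly the datapath/control split described above.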

Coding, cryptography, and information protection

[[File:C programming language.png|thumb|left|Source code written in the C programming language, the lingua franca for many low-level system operations.]] Computer engineers immersed in the realms of coding, applied cryptography, and information protection are tasked with the Sisyphean labor of developing novel methodologies for safeguarding an ever-expanding universe of digital information. This includes, but is by no means limited to, protecting the integrity of digital images and digital music, combating the pervasive issues of data fragmentation, fending off the relentless tide of copyright infringement, and generally thwarting any other forms of unauthorized tampering. Techniques employed can range from the subtle, such as imperceptible digital watermarking, to the overtly complex mathematical gymnastics required for robust encryption. It's a constant, exhausting battle against those who would rather exploit than create.
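A minimal sketch of the integrity-protection side of this work, using Python's standard hashlib. A cryptographic digest detects tampering; it is not encryption and provides no confidentiality. The byte strings below are invented stand-ins for real content.

```python
# Tamper detection with a SHA-256 digest (Python standard library).
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest acting as a tamper-evident fingerprint."""
    return hashlib.sha256(data).hexdigest()

original = b"digital image bytes..."
stored_digest = fingerprint(original)  # recorded at publication time

# Later: verify the content has not been modified.
tampered = b"digital image bytes!!!"
print(fingerprint(original) == stored_digest)  # True: intact
print(fingerprint(tampered) == stored_digest)  # False: tampering detected
```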

Communications and wireless networks

Those engineers who choose to dedicate their intellect to the intricacies of communications and wireless networks are perpetually engaged in pushing the boundaries of telecommunications systems and network infrastructure, with a particular emphasis on the notoriously finicky domain of wireless networks. Their work delves deep into the theoretical underpinnings and practical applications of modulation techniques, the vital art of error-control coding to ensure data integrity across noisy channels, and the fundamental principles of information theory itself. This specialty encompasses the demanding task of high-speed network design, the relentless pursuit of interference suppression and optimized modulation schemes, the meticulous design and rigorous analysis of fault-tolerant systems (because things will break), and the development of efficient data storage and transmission schemes. It’s all about making sure the bits get from point A to point B, quickly and reliably, despite the universe’s best efforts to disrupt them.
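The humblest possible error-control code, a single even-parity bit, can be sketched in a few lines of Python. Real links use far stronger codes (Hamming, CRC, LDPC); this only demonstrates the core principle of adding redundancy so that corruption can be detected.

```python
# Single even-parity bit: the simplest error-detecting code.

def add_parity(bits: list[int]) -> list[int]:
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(codeword: list[int]) -> bool:
    """True if no single-bit error is detected."""
    return sum(codeword) % 2 == 0

word = [1, 0, 1, 1]
sent = add_parity(word)          # [1, 0, 1, 1, 1]
print(check_parity(sent))        # True: arrived intact

corrupted = sent.copy()
corrupted[2] ^= 1                # flip one bit in transit
print(check_parity(corrupted))   # False: error detected
```

Note that parity detects any odd number of flipped bits but cannot correct them, which is precisely why the stronger codes mentioned above exist.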

Compilers and operating systems

[[File:Windows 10 Start Menu.png|thumb|Windows 10, an example of an operating system that requires continuous engineering effort.]] This specialty area is the domain of those who revel in the fundamental underpinnings of software: the design and development of compilers and operating systems. Engineers in this field are the architects of the digital world's foundational layers, constantly striving to devise novel operating system architectures, pioneer sophisticated program analysis techniques, and forge new methodologies to assure the elusive quality of these complex systems. Examples of their work are often invisible but critical, including the development of advanced post-link-time code transformation algorithms to optimize software performance, and the continuous, painstaking creation of entirely new operating system paradigms. They are the ones who make your computer actually do things, and then try to make it do them better, faster, and without spontaneously combusting.
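As a taste of the program-analysis work this specialty involves, here is a toy constant-folding pass in Python. The tuple-based expression tree is invented for the example; real compilers operate on much richer intermediate representations.

```python
# Toy constant folding: replace constant subexpressions with their value.
# Leaves are ints (constants) or strings (variable names); interior
# nodes are (operator, left, right) tuples. This IR is hypothetical.

def fold(node):
    """Recursively fold constant subexpressions."""
    if isinstance(node, (int, str)):   # constants and variables are leaves
        return node
    op, left, right = node
    left, right = fold(left), fold(right)
    if isinstance(left, int) and isinstance(right, int):
        if op == "+":
            return left + right
        if op == "*":
            return left * right
    return (op, left, right)  # a variable remains: cannot fold further

# (x + (2 * 3)) folds the constant part but keeps the variable.
expr = ("+", "x", ("*", 2, 3))
print(fold(expr))  # ('+', 'x', 6)
```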

Computational science and engineering

Computational science and engineering represents a relatively nascent, yet rapidly expanding, discipline that bridges the theoretical and the practical. According to the Sloan Career Cornerstone Center, individuals engaged in this area leverage "computational methods [to] formulate and solve complex mathematical problems in engineering and the physical and the social sciences." This isn't just about running simulations; it's about translating complex real-world phenomena into mathematical models that can be interrogated and understood through the power of computation. The breadth of its application is impressive, encompassing everything from the aerodynamic optimization of aircraft design and the precise plasma processing of nanometer-scale features on semiconductor wafers, to the intricate dance of VLSI circuit design, the detection capabilities of radar systems, the nuanced transport of ions through biological channels, and a myriad of other equally challenging problems. It's where raw computing power meets the most intractable questions of existence.
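The model-then-compute workflow can be sketched in a few lines of Python: take a physical model (here, exponential decay, dy/dt = -ky) and integrate it numerically with forward Euler. Euler is the simplest and least accurate method; production codes use far better integrators, but the shape of the workflow is the same.

```python
# Forward Euler integration of dy/dt = f(t, y).
import math

def euler(f, y0: float, t0: float, t1: float, steps: int) -> float:
    """Integrate from t0 to t1 with a fixed number of steps."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)   # step along the local slope
        t += h
    return y

# Exponential decay with k = 1: exact answer at t = 1 is e^-1.
approx = euler(lambda t, y: -y, y0=1.0, t0=0.0, t1=1.0, steps=1000)
exact = math.exp(-1.0)
print(abs(approx - exact) < 1e-3)  # True: close to e^-1 ≈ 0.3679
```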

Computer networks, mobile computing, and distributed systems

In this particular specialty, engineers shoulder the responsibility of constructing integrated environments that seamlessly blend computing power, communication capabilities, and pervasive information access. Their work is the invisible backbone of our interconnected world. Examples of their endeavors include the optimization of shared-channel wireless networks to handle ever-increasing traffic, the development of sophisticated adaptive resource management strategies within various complex systems, and the relentless pursuit of improving the elusive quality of service in both mobile and Asynchronous Transfer Mode (ATM) environments. Further practical applications extend to the design and implementation of robust wireless network systems and the deployment of high-speed Ethernet cluster wired systems, ensuring that data flows freely, efficiently, and with minimal fuss, even if the users rarely appreciate the monumental effort involved.

Computer systems: architecture, parallel processing, and dependability

[[File:Intel Core i7-8700K Coffee Lake CPU.jpg|thumb|An example of a computer CPU, the heart of a computer system, a marvel of engineering.]] Engineers who immerse themselves in the realm of computer systems are perpetually engaged in research projects aimed at delivering computer systems that are not only reliable and secure but also boast exceptionally high performance. Their work often involves tackling some of the most fundamental challenges in computing. Projects within this field include the meticulous design of processors optimized for multithreading and the complex orchestration required for parallel processing—making multiple tasks run simultaneously without collapsing into chaos. Other significant contributions in this domain involve the development of groundbreaking new theories, sophisticated algorithms, and an array of innovative tools all designed to incrementally, or sometimes dramatically, enhance the performance and efficiency of computer systems. This specialty's scope for computer architecture alone is vast, encompassing the intricate details of CPU design, the strategic layout of the cache hierarchy (where speed meets cost), the optimized arrangement of memory organization, and the critical art of load balancing to distribute computational tasks effectively. They are the guardians of digital efficiency and stability.
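The partition-and-combine pattern at the heart of parallel processing can be sketched with Python's standard concurrent.futures. For genuinely CPU-bound work one would reach for processes or native threads rather than Python threads; the structure, not the speedup, is the point here.

```python
# Split a workload into chunks, process them concurrently, combine results.
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk: range) -> int:
    return sum(chunk)

# Sum of 1..1,000,000, partitioned into four chunks.
N = 1_000_000
chunks = [range(i, min(i + 250_000, N + 1))
          for i in range(1, N + 1, 250_000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(chunk_sum, chunks))

print(total == N * (N + 1) // 2)  # True: matches Gauss's formula
```

Load balancing, discussed above, is exactly the question of how to size and distribute those chunks when the work per item is uneven.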

Computer vision and robotics

[[File:Humanoid robot HRP-2.jpg|thumb|An example of a humanoid robot, where computer vision and robotics converge.]] In this captivating, and sometimes unsettling, specialty, computer engineers direct their considerable talents towards developing advanced visual sensing technology. Their goal is to enable machines to perceive and interpret their environment with increasing sophistication, to create accurate internal representations of that environment, and ultimately, to interact with and manipulate it effectively. The rich, three-dimensional information gathered through these sensory systems is then meticulously processed and implemented to perform a diverse array of tasks. These applications span from the refinement of human modeling and the optimization of image communication to the creation of intuitive human-computer interfaces, as well as the design of specialized devices such as cameras equipped with versatile vision sensors. It's the quest to give machines eyes, and then teach them what to do with what they see—a truly double-edged sword.
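A toy glimpse of visual sensing: a pure-Python 2D convolution with a horizontal-edge kernel, the primitive underneath most vision pipelines. The 4x4 "image" is invented for illustration.

```python
# Valid-mode 2D convolution (no padding) over nested lists.

def convolve(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = sum(image[i + di][j + dj] * kernel[di][dj]
                      for di in range(kh) for dj in range(kw))
            row.append(acc)
        out.append(row)
    return out

# Dark top half, bright bottom half: a single horizontal edge.
image = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [9, 9, 9, 9],
         [9, 9, 9, 9]]
edge_kernel = [[-1, -1, -1],   # responds where brightness changes
               [ 0,  0,  0],   # vertically between rows
               [ 1,  1,  1]]
print(convolve(image, edge_kernel))  # [[27, 27], [27, 27]]: strong edge response
```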

Embedded systems

[[File:Several embedded systems.jpg|thumb|Examples of devices that rely on the unseen power of embedded systems.]] Individuals toiling in the realm of embedded systems are dedicated to designing technology that inherently enhances the speed, bolsters the reliability, and boosts the overall performance of countless systems. These ubiquitous, often invisible, systems are the silent workhorses found in an astonishing array of devices, from the simplest FM radio you might absentmindedly tune into, all the way up to the mind-boggling complexity of a space shuttle. According to the Sloan Cornerstone Career Center, the ongoing developments in embedded systems are truly pushing the boundaries of what is possible, encompassing projects like "automated vehicles and equipment to conduct search and rescue, automated transportation systems, and human-robot coordination to repair equipment in space." It's a field that constantly strives to make machines smarter, more autonomous, and frankly, more pervasive. As of 2018, the specialized focus within computer embedded systems has expanded to include cutting-edge areas such as system-on-chip design, the architectural challenges of edge computing, and the ever-growing, interconnected web of the Internet of things.
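For a flavor of embedded firmware logic, here is a debounce state machine, modeled in Python for readability, that only accepts a button press once the raw input has been stable for several consecutive samples. On real hardware this would be C running from a timer interrupt, and the sample count is an arbitrary choice for the example.

```python
# Debounce state machine: filter out mechanical switch bounce.

STABLE_SAMPLES = 3  # arbitrary threshold for this sketch

class Debouncer:
    def __init__(self):
        self.state = 0      # debounced output level
        self.candidate = 0  # last raw level seen
        self.count = 0      # consecutive samples at that level

    def sample(self, raw: int) -> int:
        if raw == self.candidate:
            self.count += 1
            if self.count >= STABLE_SAMPLES:
                self.state = raw     # accept the new stable level
        else:
            self.candidate = raw     # level changed: restart timing
            self.count = 1
        return self.state

db = Debouncer()
# Noisy press: the contact bounces, then settles high.
readings = [1, 0, 1, 1, 1, 1]
print([db.sample(r) for r in readings])  # [0, 0, 0, 0, 1, 1]
```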

Integrated circuits, VLSI design, testing and CAD

This highly specialized facet of computer engineering demands a profound and intricate understanding of both electronics and electrical systems. Engineers who choose this path are dedicated to the relentless pursuit of enhancing the speed, fortifying the reliability, and optimizing the energy efficiency of next-generation Very-large-scale integration (VLSI) circuits and microsystems. These are the microscopic worlds where the future of computing is literally etched into silicon. A tangible example of the work performed in this specialty includes the painstaking efforts dedicated to reducing the power consumption of complex VLSI algorithms and architectural designs—because even tiny chips can be energy hogs, and efficiency is paramount.
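The power-reduction work can be illustrated with the standard CMOS dynamic-power relation, P = αCV²f (activity factor, switched capacitance, supply voltage, clock frequency), here as a back-of-the-envelope Python sketch. The numbers are invented, not taken from any real chip.

```python
# CMOS dynamic power: P = activity * capacitance * voltage^2 * frequency.

def dynamic_power(activity: float, cap_farads: float,
                  vdd_volts: float, freq_hz: float) -> float:
    return activity * cap_farads * vdd_volts ** 2 * freq_hz

base = dynamic_power(0.2, 1e-9, 1.0, 2e9)    # hypothetical design at 1.0 V
scaled = dynamic_power(0.2, 1e-9, 0.8, 2e9)  # same design at 0.8 V
print(f"{base:.2f} W -> {scaled:.3f} W")     # quadratic saving from voltage scaling
```

The quadratic dependence on supply voltage is why voltage scaling is the first lever engineers reach for when a chip runs hot.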

Signal, image and speech processing

Computer engineers operating in this fascinating area are focused on developing significant improvements in human–computer interaction, striving to make our conversations with machines less awkward and more intuitive. Their work encompasses the intricate fields of speech recognition and synthesis, the complex analysis and manipulation inherent in medical and scientific imaging, and the foundational elements of advanced communications systems. Other pivotal work within this domain includes the continuous evolution of computer vision technologies, such as the increasingly sophisticated and, at times, unnerving recognition of human facial features. It's all about teaching machines to perceive and understand the chaotic, analog world we inhabit, and frankly, they're getting disturbingly good at it.
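The simplest possible signal-processing example is a moving-average FIR filter smoothing a noisy sample stream, sketched below in Python. Real filters have carefully designed tap weights; uniform weights are the degenerate case.

```python
# Moving-average filter: average each sample with its recent predecessors.

def moving_average(samples: list[float], window: int) -> list[float]:
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)     # shorter window at the stream's start
        chunk = samples[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

noisy = [1.0, 9.0, 1.0, 9.0, 1.0, 9.0]
print(moving_average(noisy, window=2))  # [1.0, 5.0, 5.0, 5.0, 5.0, 5.0]
```

The alternating spikes average out to a flat line, which is exactly the low-pass behavior speech and imaging pipelines rely on before any higher-level analysis.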

Quantum computing

This emerging, and frankly mind-bending, area of computer engineering involves the ambitious integration of the inherently strange quantum behaviors of subatomic particles—such as the paradoxical state of superposition, the wave-like phenomena of interference, and the enigmatic connection known as entanglement—with the more conventional realm of classical computers. The ultimate goal is to craft systems capable of solving complex problems and formulating algorithms with an efficiency that utterly dwarfs anything achievable by current computational paradigms. Individuals who venture into this field often focus on highly specialized areas like Quantum cryptography (for truly unbreakable communication, supposedly), the development of physical simulations that leverage quantum mechanics, and the theoretical and practical construction of novel quantum algorithms. It's a field where the rules of the universe itself are bent to compute, and frankly, it feels like we're just asking for trouble.
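Superposition can at least be simulated classically with plain complex arithmetic: applying a Hadamard gate to |0> makes both measurement outcomes equally likely. A minimal Python sketch, with no quantum library involved:

```python
# State-vector simulation of one qubit: Hadamard on |0> yields an
# equal superposition, so both outcomes have probability 0.5.
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-entry state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

h = 1 / math.sqrt(2)
HADAMARD = [[h, h],
            [h, -h]]

ket0 = [1 + 0j, 0 + 0j]              # qubit starts in |0>
state = apply_gate(HADAMARD, ket0)   # amplitudes [1/sqrt(2), 1/sqrt(2)]
probs = [abs(a) ** 2 for a in state] # Born rule: |amplitude|^2
print([round(p, 3) for p in probs])  # [0.5, 0.5]: equal superposition
```

Of course, a classical simulation like this scales exponentially in the number of qubits, which is precisely why real quantum hardware is interesting.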
