Doğukan Barkan
05.09.2022 - 15:08
The Future of Hardware in a Software World
In 2014, Facebook (now Meta) purchased WhatsApp, a text messaging app used widely across the globe, for an astonishing $21.8 billion [1]. Two years later, Microchip Technology Incorporated acquired Atmel Corporation, one of the largest designers and manufacturers of semiconductors, mainly focused on embedded systems built around microcontrollers, for $3.56 billion [2]. On the one hand, we have software: a messaging app that connects people, valued at $21.8 billion. On the other hand, a company that produces hardware crucial for running such software is sold for $3.56 billion. Both software and hardware are important: without hardware you cannot even develop software, and without software we would still have mechanical computers, punch cards, and so on. Yet we see a six-fold difference between the two valuations. There are many factors behind this gap, from company valuation methods to market structure to the expectations of the acquiring companies and their shareholders. Clearly, though, market demand is on the software side. When we look at the most valuable companies, we see software-based firms such as Microsoft, Alphabet (Google), Meta (Facebook), and Amazon [3]. Note that each of these also has some interest in hardware.
Before we start, one fundamental concept needs some explanation to support this article’s view: how are chips designed? Without getting too deep into the details, a chip contains logic gates, built from transistors, which pass signals according to their inputs; for example, an ‘AND’ gate outputs 1 (high) only if both inputs are 1, and 0 (low) otherwise. Alongside the gates are capacitors to hold charge and passive components such as resistors and inductors. By combining these components, you can create anything from a simple voltage comparator to a complex multi-layer central processing unit (CPU). This arrangement is called the chip architecture, and each manufacturer has a unique approach, since the number of possible configurations explodes as the number of components grows.
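To make the composition idea concrete, here is a toy sketch in Python (purely illustrative; real gates are transistor circuits, not software): it builds NOT, AND, and OR from a single NAND primitive, then combines them into a 1-bit equality comparator, mirroring how chips build complex blocks from simple gates.

```python
# Toy illustration: composing logic gates from a NAND primitive,
# the way real chips compose transistors into gates into larger blocks.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    # NAND with both inputs tied together inverts the signal.
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def equal(a: int, b: int) -> int:
    """1-bit comparator: outputs 1 if and only if a == b."""
    return or_(and_(a, b), and_(not_(a), not_(b)))

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  AND={and_(a, b)}  EQUAL={equal(a, b)}")
```

NAND is chosen as the primitive because it is functionally complete: every other gate, and therefore in principle any digital circuit, can be built from it alone.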
So, what will happen to hardware in this software-driven world? Hardware development has mainly focused on making integrated chips smaller and smaller. Nonetheless, physics is pushing back: as transistors approach atomic scales, the steady shrinking described by Moore’s Law is slowing down. That phenomenon deserves a separate article of its own, so here we will focus specifically on the future of hardware.
The Future of Hardware is Software
This phrase seems to contradict the idea that software is nothing without hardware. However, developments in general-purpose GPU (graphics processing unit) computing helped launch the deep learning era and created a mutual relationship between the two fields: deeper learning studies required more ML (machine learning)-oriented GPUs, and better GPUs enabled deeper studies.
The demand for hyper-efficient custom ML hardware created the challenge of exploiting the hardware’s full potential on the software side. New, powerful hardware is not practical if nobody can use its full potential, and comprehending and utilizing new tools seems to require too much work from software developers. This dilemma produces flopped hardware projects such as Intel’s Itanium, a 64-bit microprocessor that suffered from a lack of software support and was eventually discontinued [4].
To prevent these flops and to create more market-targeted, efficient, powerful, and cost-effective integrated chips, the industry is starting to apply machine learning algorithms to chip architecture. Conventional methodologies for such tasks are largely manual and therefore time-consuming and resource-intensive. In contrast, the learning strategies of artificial intelligence (AI) provide numerous automated approaches for handling complex, data-intensive tasks in very-large-scale integration (VLSI) design and testing. Employing AI and ML algorithms in VLSI design and manufacturing reduces the time and effort needed to understand and process data within and across different abstraction levels [5]. Companies that implement these automated algorithms are, and will remain, discreet about it, since their architectures and methodologies are trade secrets.
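Since the real flows are proprietary, here is only a toy sketch of the underlying idea of automated design-space search. It uses simulated annealing (a classical optimization heuristic, not any company’s actual ML pipeline) to order five hypothetical cells so that the total wire length between connected cells shrinks; the netlist `NETS` is invented purely for illustration.

```python
import math
import random

# Toy automated placement: anneal a 1-D ordering of "cells" to minimize
# total wire length between connected cells. Real VLSI flows are far more
# sophisticated (and proprietary); this only illustrates letting an
# algorithm, rather than a human, search the design space.
NETS = [(0, 3), (1, 2), (2, 4), (0, 4), (1, 3)]  # hypothetical connections

def wirelength(order):
    pos = {cell: i for i, cell in enumerate(order)}
    return sum(abs(pos[a] - pos[b]) for a, b in NETS)

def anneal(n_cells=5, steps=2000, seed=0):
    rng = random.Random(seed)
    order = list(range(n_cells))
    cost = wirelength(order)
    best, best_cost = order[:], cost
    temp = 2.0
    for _ in range(steps):
        i, j = rng.sample(range(n_cells), 2)
        order[i], order[j] = order[j], order[i]  # try swapping two cells
        new_cost = wirelength(order)
        # Accept improvements always; accept regressions with a probability
        # that shrinks as the temperature cools.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = order[:], cost
        else:
            order[i], order[j] = order[j], order[i]  # undo the swap
        temp *= 0.999
    return best, best_cost

if __name__ == "__main__":
    order, cost = anneal()
    print("placement:", order, "wire length:", cost)
```

The accept-some-regressions rule is what lets the search escape local optima, the same reason stochastic methods are attractive for huge, bumpy design spaces like chip layout.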
So, in the future, we will get our hands on more and more hyper-efficient hardware, advertised not for smaller packages but for performance and ‘smart architectures’ powered by software.
Quantum Computing
Classical computing is built on bits, each taking one of two values (0 or 1), represented as a voltage relative to a common ground at 0 volts. While engineers have made transistors ever smaller and more numerous, broadening the kinds of problems computer scientists can solve, in principle the technology is not much different from the pre-transistor devices of the 1930s and ’40s based on valves (vacuum tubes).
In quantum computing, transistors are replaced with devices that represent quantum bits, or “qubits,” which are capable of representing both a 0 and a 1 at the same time. In a quantum processor, these qubits are manipulated to solve computational problems of high complexity [6], problems that would take a modern classical CPU much longer. Real-life applications include finding the optimal route for a salesperson covering multiple cities to save time and fuel, and enabling financial companies to better analyze their data to detect fraud or balance their investment portfolios.
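A minimal classical simulation can make the “both 0 and 1” idea concrete. The sketch below (an illustration, not real quantum hardware) represents one qubit as two amplitudes and applies a Hadamard gate to turn a definite 0 into an equal superposition.

```python
import math

# Minimal one-qubit state-vector sketch (a classical simulation, not real
# quantum hardware): a qubit state is two amplitudes (a0, a1) for the
# basis states |0> and |1>, with a0**2 + a1**2 = 1.
def hadamard(a0, a1):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    s = 1 / math.sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

a0, a1 = hadamard(1.0, 0.0)   # start in a definite |0>
p0, p1 = a0 ** 2, a1 ** 2     # Born rule: squared amplitudes give
                              # measurement probabilities
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each: "0 and 1 at once"
```

Measuring such a qubit yields 0 or 1 with equal probability; the computational power comes from manipulating many such superposed amplitudes at once before measuring.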
As the hardware we use today becomes increasingly complex and small, engineers need to calculate new architectures for chips and new topologies for PCB (printed circuit board) routing to optimize space. Designing and manufacturing complex, critical hardware, such as a satellite’s CPU or the image sensor of a sophisticated thermal camera, produces thousands of data points, and quantum computers could be used to make this data comprehensible. Furthermore, that output could be fed to an AI “black box” to generate the desired hardware.