Doğukan Barkan

05.09.2022 - 15:08

The Future of Hardware in a Software World

In 2014, Facebook (now Meta) purchased WhatsApp, a text messaging app used widely across the globe, for an astonishing $21.8 billion [1]. Two years later, Microchip Technology Incorporated acquired Atmel Corporation, one of the largest designers and manufacturers of semiconductors, mainly focused on embedded systems built around microcontrollers, for $3.56 billion [2]. On the one hand, we have software: a messaging app that connects people, valued at $21.8 billion. On the other hand, a company that produces crucial hardware, without which that software could not run, sold for $3.56 billion. Both software and hardware matter: without hardware you cannot even develop software, and without software we would still have mechanical computers, punch cards, and so on. Yet we see a roughly sixfold difference between the two valuations. There are many factors behind this difference, from company valuation methods to market structure to the expectations of the acquiring company and its shareholders. Clearly, though, market demand is on the software side. When we look at the most valuable companies, we see software-based firms such as Microsoft, Alphabet (Google), Meta (Facebook), and Amazon [3]. Note that each of these companies has some interest in hardware.

Before we start, one fundamental concept needs some explanation to support this article’s view: how are chips designed? Without getting too deep into the details, a chip consists of logic gates, built from transistors, that pass or block signals according to their inputs; for example, an ‘AND’ gate outputs 1 (high) only if both of its inputs are 1, and 0 (low) otherwise. Alongside the gates sit capacitors to hold charge and passive components such as resistors and inductors. By combining these components, you can build anything from a simple voltage comparator to a complex multi-layer central processing unit (CPU). This arrangement is called the chip architecture, and each manufacturer takes a unique approach, since the number of possible arrangements grows essentially without bound as the component count increases.
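
To make the gate example concrete, here is a minimal Python sketch of that behavior (real silicon is specified in hardware description languages such as Verilog or VHDL, not Python), modeling a few gates as functions and composing them into a half-adder, so that arithmetic emerges from nothing but gates:

```python
# Minimal gate-level sketch: each gate is a function on bits (0 or 1).

def AND(a, b):
    return a & b  # 1 (high) only if both inputs are 1, else 0 (low)

def OR(a, b):
    return a | b  # 1 if at least one input is 1

def XOR(a, b):
    return a ^ b  # 1 if the inputs differ

def half_adder(a, b):
    """Add two 1-bit numbers; composing gates already yields arithmetic."""
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

# Exhaustive truth table: 1 + 1 = 10 in binary (sum 0, carry 1).
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```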

So, what will happen to hardware in this software-driven world? Hardware development has mainly focused on making integrated chips smaller and smaller. Nonetheless, physics is pushing back: as transistors approach atomic scales, the steady shrinking described by Moore’s Law is becoming ever harder to sustain. That phenomenon deserves a separate article of its own, so here we will focus specifically on the future of hardware.

The Future of Hardware is Software

At first glance, this phrase contradicts the idea that software is nothing without hardware. However, developments in general-purpose GPU (Graphics Processing Unit) computing helped launch the deep learning era and created a mutual relationship between the two fields: deeper study of learning required more ML (Machine Learning) oriented GPUs.

The demand for hyper-efficient custom ML hardware created a new problem on the software side: exploiting the hardware’s full potential. New, powerful hardware is not practical if no one can use it fully, and comprehending and adopting new tools often demands too much work from software developers. This dilemma produces flopped hardware projects such as Intel’s Itanium, a 64-bit microprocessor that suffered from a lack of software support and was eventually discontinued [4].

To prevent such flops and to create efficient, powerful, and cost-effective integrated chips better targeted at specific markets and software, the industry is starting to apply machine learning algorithms to chip architecture. Conventional methodologies for such tasks are largely manual and thus time-consuming and resource-intensive. In contrast, the learning strategies of artificial intelligence (AI) offer numerous exciting automated approaches for handling complex, data-intensive tasks in very-large-scale integration (VLSI) design and testing. Employing AI and ML algorithms in VLSI design and manufacturing reduces the time and effort needed to understand and process data within and across different abstraction levels via automated learning algorithms [5]. Companies that implement these automated algorithms are, and will remain, discreet about it, since their architectures and methodologies are trade secrets.
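
As a toy illustration of the idea, and not any company’s actual flow, the sketch below fits a simple regression model on made-up data from past layouts to predict a timing metric from a few design features; the feature set, numbers, and target here are entirely hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical historical data: one row per past layout, with columns
# (gate count in thousands, average wire length in mm, supply voltage in V).
# All values are invented for illustration only.
X = np.array([
    [120, 3.1, 0.9],
    [200, 4.0, 0.9],
    [310, 5.2, 0.8],
    [450, 6.8, 0.8],
    [520, 7.5, 0.7],
])
# Target: measured worst-case timing slack in nanoseconds (also invented).
y = np.array([1.8, 1.4, 1.1, 0.7, 0.4])

model = LinearRegression().fit(X, y)

# Score a candidate layout cheaply before running a slow, full simulation.
candidate = np.array([[380, 6.0, 0.8]])
print(f"Predicted slack: {model.predict(candidate)[0]:.2f} ns")
```

Production flows use far richer models, features, and objectives, but the principle is the same: learn from past designs instead of exhaustively exploring by hand.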

So, in the future, we will get our hands on more and more hyper-efficient hardware, advertised not with a focus on smaller packages but on performance and ‘smart architectures’ powered by software.

Quantum Computing in Hardware

Classical computing is built on bits, which take one of two values (1 and 0), represented as a voltage relative to a common ground of 0 volts. While engineers have made transistors ever smaller and more numerous, broadening the kinds of problems computer scientists can solve, the technology is not fundamentally different from the pre-transistor devices of the 1930s and 40s based on valves or vacuum tubes.

In quantum computing, transistors are replaced with devices that represent quantum bits, or “qubits,” which can represent both a 0 and a 1 at the same time. In a quantum processor, these qubits are manipulated to solve computational problems of high complexity [6], problems that would take a modern classical CPU far longer. Real-life applications include finding the optimal route for a salesperson covering multiple cities to save time and fuel, and enabling financial companies to analyze their data better, whether to detect fraud or to balance their investment portfolios.
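
The “both 0 and 1 at the same time” idea can be made precise with a little linear algebra. Below is a minimal NumPy sketch, a classical simulation rather than a real quantum device, that applies a Hadamard gate to a qubit in state |0⟩ and recovers the 50/50 measurement probabilities:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # the qubit state |0>

# Hadamard gate: rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                    # (|0> + |1>) / sqrt(2)
probabilities = np.abs(state) ** 2  # Born rule: |amplitude|^2

print(state)          # [0.70710678+0.j 0.70710678+0.j]
print(probabilities)  # [0.5 0.5] -- a 50/50 chance of measuring 0 or 1
```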

As the hardware we use today becomes increasingly complex and small, engineers need to devise new chip architectures and new PCB (printed circuit board) routing topologies to optimize space. Designing and manufacturing complex, critical hardware, such as the CPU of a satellite or the image sensor of a sophisticated thermal camera, produces enormous amounts of data to compute over, and quantum computers can help make that data comprehensible. Furthermore, this output can be fed to an AI black box to create the desired hardware.

However, quantum computing comes with its own unique problems. A major one facing today’s quantum computers is that entangled qubits quickly lose coherence with respect to one another, so an algorithm needs to finish its work before the qubits decohere. There are different solutions to this problem, and each quantum computer manufacturer has its own approach to building machines, much like the differing architecture choices in classical computers. Another problem is that quantum computers require specialized cryogenic refrigerators to maintain superconducting temperatures, which adds to the cost of operating one.
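
To give a feel for why an algorithm must race against decoherence, here is a toy numerical sketch assuming a pure-dephasing model with an illustrative coherence time T2: the off-diagonal density-matrix entries, which encode the superposition, decay as exp(-t/T2), so the useful quantum information simply fades away:

```python
import numpy as np

T2 = 100e-6  # illustrative coherence time of 100 microseconds (assumed)

# Equal superposition |+> = (|0> + |1>)/sqrt(2) as a 2x2 density matrix;
# the off-diagonal 0.5 entries carry the superposition (the "coherence").
for t in (0.0, 50e-6, 100e-6, 300e-6):
    coherence = 0.5 * np.exp(-t / T2)  # pure-dephasing decay of rho[0,1]
    print(f"t = {t * 1e6:5.0f} us -> coherence = {coherence:.3f}")

# Output: 0.500, 0.303, 0.184, 0.025 -- after a few T2, the qubit behaves
# like a classical coin flip, so computation must finish well before that.
```
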
Quantum computers are presently targeted at problems that are compute-intensive and not latency-sensitive. Today’s quantum computer architectures are also not yet mature enough to handle large data sizes. As a result, a quantum computer is typically deployed alongside a classical computer in a hybrid manner. IBM is currently working on quantum computers that are serverless, with more qubits and faster computation [7]. Whether the development of such machines stays focused on designing new and better hardware, or they simply become as common and easy to use as classical computers, they have the potential to revolutionize the future of hardware in both microchip architecture and optimized PCB design.

Conclusion

In the short term, the future of hardware will probably be determined by customer demand in the market, as the valuations of software companies suggest; for the next few years, we won’t see a major shift in how hardware is valued. Hardware specialized for efficiency should grow from a market perspective, as ever more data needs to be stored, analyzed, and processed. We, as end users, may not feel the relatively slow pace of hardware development, but the services we use will adopt it in their infrastructure, so our collective experience can be preserved or improved. The next big thing, the Metaverse, may also have an impact on this hands-on hardware and its development. Check out this article for the Metaverse!

References
[1] A. L. Deutsch, "WhatsApp: The Best Meta Purchase Ever?," 2022. [Online]. Available: https://www.investopedia.com/articles/investing/032515/whatsapp-best-facebook-purchase-ever.asp#citation-10.
[2] C. Assis, "Microchip Technology buys chip maker Atmel in $3.56 billion deal," 2016. [Online]. Available: https://www.marketwatch.com/story/microchip-technology-buys-chip-maker-atmel-in-356-billion-deal-2016-01-19.
[3] "Largest Companies by Market Cap," 2022. [Online]. Available: https://companiesmarketcap.com.
[4] M. Lee, "Intel's Itanium is finally dead," 2021. [Online]. Available: https://www.techspot.com/news/90622-intel-itanium-finally-dead.html.
[5] D. Amuru, "AI/ML Algorithms and Applications in VLSI Design and Technology," 2022.
[6] K. V. David Hall, "The Future of Quantum Computing," 2022. [Online].
[7] J. Gambetta, "Expanding the IBM Quantum roadmap to anticipate the future of quantum-centric supercomputing," IBM, 2022. [Online]. Available: https://research.ibm.com/blog/ibm-quantum-roadmap-2025.
[8] B. Yu, "Machine Learning and Pattern Matching in Physical Design," 2015.

