Let’s talk about AI Hardware — the horsepower behind AI advancements
Most of the latest innovations are backed by software, and software is ultimately backed by hardware; AI is no different. Let’s talk about how AI hardware differs from general-purpose machines, and take one step back to see how advances in hardware can accelerate AI’s impact on the globe.
At first glance, this hardware looks costly, so why go that way at all? Isn’t it like developing and investing in new ways of consuming fossil fuels that we know for sure will be exhausted very soon? The simple answer is that we use this hardware horsepower to train the system once, and then, through transfer learning, deploy that trained model on millions of commercial systems with modest hardware. These economies of scale help us distribute the impact and, in turn, minimize the negative effect of high power consumption on the environment and the globe. Moreover, we can improve on the hardware front, optimizing it for lower power consumption and higher speed, leaving extra time and power budget for more analysis and trials.
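The train-once, deploy-everywhere idea above can be sketched in a few lines. This is a minimal illustration, not a real pipeline: the "pretrained" feature extractor here is just a fixed linear map standing in for a large network trained on expensive hardware, and only a tiny task-specific head is trained on the cheap device.

```python
# Hypothetical "pretrained" feature extractor: its weights were learned once
# on powerful hardware and are now frozen (never updated on the device).
FROZEN_W = [[0.5, -0.3], [0.2, 0.8]]

def extract_features(x):
    # Fixed linear map standing in for a large pretrained network.
    return [sum(w * xi for w, xi in zip(row, x)) for row in FROZEN_W]

# Lightweight task-specific head: the only part trained on the cheap device.
head_w = [0.0, 0.0]

def predict(x):
    f = extract_features(x)
    return sum(w * fi for w, fi in zip(head_w, f))

# Tiny made-up regression dataset for the new task.
data = [([1.0, 0.0], 0.7), ([0.0, 1.0], -0.1), ([1.0, 1.0], 0.6)]

lr = 0.1
for _ in range(200):
    for x, y in data:
        f = extract_features(x)
        err = predict(x) - y
        # Gradient step updates ONLY the head; frozen weights stay untouched.
        for i in range(len(head_w)):
            head_w[i] -= lr * err * f[i]
```

Training just the small head takes a few hundred cheap updates here, while the expensive part (the frozen extractor) was paid for only once — that asymmetry is the whole economic argument.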
Even an improvement of 100% would already be remarkable. However, the new machine that Google has come up with delivers around a 250% improvement. Check out the comparison graph below, published by Google.
The race to develop hardware for AI started long back, in the 1990s. Companies wanted more powerful hardware, so they turned to NVIDIA’s GPUs, which are known for performing huge numbers of calculations, especially floating-point calculations, in very little time. But GPUs are costly and power-hungry, which is why they drain the battery and heat up the system when we play graphics-heavy video games. For commercial purposes, however, they are a very good fit.
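To see the kind of floating-point work GPUs excel at, consider a naive matrix multiplication — the workhorse of both graphics and neural networks. Each output element is an independent dot product, which is exactly why a GPU can compute thousands of them in parallel; the sketch below only shows the structure of the computation, not the parallelism itself.

```python
# Naive matrix multiply: every C[i][j] is an independent dot product,
# so a GPU can assign each one to a separate thread and run them in parallel.
def matmul(A, B):
    n, k, m = len(A), len(B), len(B[0])
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
C = matmul(A, B)  # → [[19.0, 22.0], [43.0, 50.0]]
```

On a CPU these dot products run mostly one after another; on a GPU the same loop body is executed by many cores at once, which is where the speedup on large matrices comes from.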
But it seems this was not enough. Hardware companies asked themselves: if we cannot create competing software, what can we do? They played to their strength and set out to build better, specialized hardware for AI. Obviously, it will consume more power, but…