Commoditization of Computing Hardware and the Bugs It Contains

Artificial Intelligence as the Next Host of Cyber Attacks

Computer hardware is a commodity item nowadays; anyone who wants a machine can buy one relatively cheaply, and there is a healthy second-hand market as well. We have come full circle: technology is now mostly supplied from cheap-labor markets to rich markets, the opposite of the situation 30 years ago. With the commoditization of the once-niche chips that make up today's technological gadgets, have we reached a point of no return? A situation where we have effectively lost control over how chips are made, and where whoever manufactures a chip can insert bugs or functionality that was never part of the chip's design?

The fiasco that Intel, AMD, ARM, and other chip manufacturers recently faced with the Meltdown and Spectre controversy has been a very hard pill to swallow. Globally speaking, people who depend on affected chips had essentially two options: the red pill or the blue pill. The blue pill is declining the CPU microcode update, preserving the processor's expected performance but leaving the system exposed to real risk. The red pill is installing the CPU microcode update, degrading the chip's performance in the process but keeping it safe from cybercriminals exploiting the bugs.

There is also the issue of big tech companies relying on Chinese labor, with contractors allegedly capable of embedding unofficial, undocumented chips into legitimate products where they should never be in the first place. Apple, Amazon, and other tech companies that contract Chinese firms to manufacture their products have already denied the allegations.

Vulnerabilities in hardware are very difficult to resolve, as both Intel and AMD have already demonstrated: the CPU microcode updates reduce the performance of their processors. That is a real loss for their customers, whose performance-per-dollar expectations are no longer met. It is very difficult to maintain iron-clad security in the face of growing complexity. With today's chips containing billions of individual transistors, it is easy to overlook something as serious as Meltdown/Spectre.

Designers address correctness concerns through verification, the process of extensively validating all the functionalities of a circuit throughout the development process. Simulation-based techniques are central to this process: they exercise a design with relevant test sequences trying to expose latent bugs. However, this approach is often incapable of fully exercising the design space of modern processors.
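The trade-off can be sketched in miniature. The snippet below (all names are hypothetical, and the "design" is a toy model rather than real RTL) compares a deliberately buggy 4-bit adder against a golden reference model: a short random test run can easily miss the one bad corner case, while exhaustive enumeration, feasible only for tiny designs, is guaranteed to find it.

```python
import random

# Hypothetical device-under-test: a 4-bit adder model with a latent bug
# that only manifests for a single point in the input space.
def buggy_adder(a, b):
    if a == 13 and b == 3:       # latent bug: one corner case mis-handled
        return 1
    return (a + b) & 0xF         # correct 4-bit wrap-around otherwise

def golden_adder(a, b):
    return (a + b) & 0xF         # reference model

def random_simulation(trials, seed=0):
    """Drive the design with random test sequences, checking each result."""
    rng = random.Random(seed)
    for _ in range(trials):
        a, b = rng.randrange(16), rng.randrange(16)
        if buggy_adder(a, b) != golden_adder(a, b):
            return (a, b)        # bug exposed by this stimulus
    return None                  # bug escaped this test run

print(random_simulation(20))     # may well print None: the bug escapes

# Exhaustive enumeration of the whole input space always finds it,
# but only scales to toy-sized designs like this one.
bugs = [(a, b) for a in range(16) for b in range(16)
        if buggy_adder(a, b) != golden_adder(a, b)]
print(bugs)                      # [(13, 3)]
```

For a real processor the input space is astronomically larger, which is exactly why simulation alone cannot cover it.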

Formal verification techniques have emerged to address the non-exhaustive nature of simulation-based methods. Formal methods use mechanisms such as theorem proving and model checking to show that a component upholds or violates a certain property. The primary drawback of formal techniques, however, is that they do not scale to the complexity of modern designs, constraining their use to only a few components within the processor.
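The idea behind model checking can be sketched with an explicit-state search over a toy design (the names here are illustrative, not any real tool's API): enumerate every reachable state of a tiny two-light controller and check the safety property "both lights are never green at the same time".

```python
from collections import deque

def successors(state):
    """Naive controller: either light may toggle on any step."""
    a, b = state                      # each light is 'red' or 'green'
    yield ('green' if a == 'red' else 'red', b)
    yield (a, 'green' if b == 'red' else 'red')

def successors_interlocked(state):
    """Fixed controller: a light may only turn green while the other is red."""
    a, b = state
    if a == 'green':
        yield ('red', b)
    elif b == 'red':
        yield ('green', b)
    if b == 'green':
        yield (a, 'red')
    elif a == 'red':
        yield (a, 'green')

def safe(state):
    return state != ('green', 'green')

def model_check(initial, step):
    """Breadth-first search of every reachable state."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not safe(state):
            return state              # counterexample: property violated
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None                       # property holds on all reachable states

print(model_check(('red', 'red'), successors))             # ('green', 'green')
print(model_check(('red', 'red'), successors_interlocked)) # None
```

The catch is state-space explosion: this toy has four states, while a real processor component has astronomically many, which is why formal methods are reserved for small, critical blocks.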

The bottom line: as Moore's Law continues, the performance lost to security fixes will be recovered over time as processors keep improving. It is a bump in the road, not a wall that will stop everyone from improving productivity with the use of technology.

Kevin Jones

Kevin Jones, Ph.D., is a research associate and a cyber security author with experience in penetration testing, vulnerability assessments, monitoring solutions, surveillance, and offensive technologies. He currently writes freelance coverage of the latest security news and other happenings. He has authored numerous articles and exploits, which can be found on popular security sites.

