
1.3.3 Connectivity versus scaling attacks

To summarize, connectivity exposes devices and IT systems to remote attacks that target network-facing software (and thus benefit directly from continuously increasing software complexity). Such attacks are very cheap to launch, can be mounted by a large number of threat actors, and have zero marginal cost.

In addition, there exists a market for zero-day exploits [190] that allows even script kiddies to launch highly sophisticated remote attacks, infecting target systems with advanced malware that can open a remote shell and completely take over the infected device.

As a result, connectivity creates an attack surface that facilitates cybersecurity attacks that scale.

1.4 Increasing complexity

While it can be argued that the problem of increasing complexity is not directly mitigated by modern cryptography (in fact, many crypto-related products and standards suffer from this problem themselves), there is no doubt that increasing complexity is a major cause of security problems. We included the complexity problem in our list of crucial factors for the development of cryptography because cryptography can help limit the damage caused by attacks that are themselves enabled by excessive complexity.

Following Moore’s law [191], a prediction made in 1965 by Gordon Moore, co-founder of Fairchild Semiconductor and Intel, the number of transistors in an integrated circuit, particularly in a microprocessor, has kept doubling roughly every two years (see Figure 1.5).
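As a back-of-the-envelope illustration of what this doubling rule implies, the following sketch projects a transistor count forward from a 1971 baseline. The Intel 4004 figure used here is a widely cited approximation, and the snippet is purely illustrative; it is not part of the data set plotted in Figure 1.5:

```python
# Back-of-the-envelope sketch of Moore's law: a quantity that doubles
# every `doubling_period` years grows by a factor of
# 2 ** ((year - year0) / doubling_period).
# The 4004 baseline below is a widely cited approximation, used here
# purely for illustration.

def moore_projection(n0: float, year0: int, year: int,
                     doubling_period: float = 2.0) -> float:
    """Project a transistor count from a baseline (n0, year0) to `year`."""
    return n0 * 2 ** ((year - year0) / doubling_period)

# Intel 4004 (1971): roughly 2,300 transistors.
print(f"{moore_projection(2_300, 1971, 2021):,.0f}")
# -> ~77,000,000,000: the right order of magnitude for 2021-era chips
#    with tens of billions of transistors.
```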

Figure 1.5: Increasing complexity of hardware: Transistors. Data is taken from https://github.com/barentsen/tech-progress-data

Semiconductor manufacturers were able to build ever bigger and ever more complex hardware with ever more features. This went so far that in the late 1990s, the Semiconductor Industry Association alarmed the industry by warning that productivity gains in Integrated Circuit (IC) manufacturing were outpacing the capabilities of the Electronic Design Automation (EDA) tools used for IC design. Entire companies in the EDA sector were successfully built on this premise.

Continuously growing hardware resources paved the way for ever more complex software with ever more functionality. Operating systems became ever more powerful and feature-rich, the number of layers in software stacks kept increasing, and the software libraries and frameworks used by programmers became ever more comprehensive. As predicted by the laws of software evolution formulated by computer science pioneers Manny Lehman and Les Belady, software exhibited continuing growth and increasing complexity [181] (see also Figure 1.6).

Figure 1.6: Increasing complexity of software: Source Lines of Code (SLOC) in operating systems. Data is taken from https://github.com/barentsen/tech-progress-data

Why should increasing complexity be a problem? According to leading cybersecurity experts Bruce Schneier and Niels Ferguson [65], “Complexity is the worst enemy of security, and it almost always comes in the form of features or options”.

While one might debate whether complexity really is the worst enemy of security, it is certainly true that complex systems, whether realized in hardware or software, tend to be error-prone. Schneier and Ferguson go so far as to claim that there are no complex systems that are secure.

Complexity negatively affects security in several ways, including the following:

  • Insufficient testability, owing to the combinatorial explosion of configurations in systems with a large number of features (see the sketch after this list)
  • Unanticipated—and unwanted—behavior that emerges from a complex interplay of individual features
  • A high number of implementation bugs and, potentially, architectural flaws due to the sheer size of a system
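To give the first point a sense of scale, here is a minimal, hypothetical sketch: exhaustively testing a system whose behavior depends on n independent boolean features requires covering 2^n configurations. The feature counts below are chosen purely for illustration:

```python
# Minimal sketch of the combinatorial explosion in testing: a system with
# n independent boolean features has 2 ** n possible configurations, so
# exhaustive testing quickly becomes infeasible. The feature counts are
# hypothetical and chosen purely for illustration.

for n_features in (10, 20, 30, 40):
    configs = 2 ** n_features
    print(f"{n_features} features -> {configs:,} configurations")

# 10 features -> 1,024 configurations
# 20 features -> 1,048,576 configurations
# 30 features -> 1,073,741,824 configurations
# 40 features -> 1,099,511,627,776 configurations
```

Even at these modest feature counts, exhaustive testing is out of reach, which is why feature-rich systems inevitably ship with untested interaction paths.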
