Bigger, more complex, and more violent.
Time Complexity is how runtime grows with input size.
for (int i = 0; i < n; i++) { }  // one pass over the input: O(n)
Space Complexity is how memory usage grows with input size.
int arr[n];  // allocates n integers: O(n)
Cyclomatic Complexity is the number of linearly independent execution paths.
if (a) { }
else if (b) { }
else { }  // three paths
Logical Complexity is how complicated the decision logic is.
if (a && (b || c) && !d) { }  // four conditions in one predicate
Halstead Complexity is the number of symbols (operators and operands).
a = b + c * d;  // operators {=, +, *} = 3, operands {a, b, c, d} = 4, total 7
Algorithmic Complexity, or Kolmogorov complexity, is the minimum description size of a computation: the smallest program needed to reach a specific state.
abababab // low complexity
4c1j5b90 // high complexity
Some others, harder to pin down:
- Lines of Code: Length of a program, counted in lines, semicolons, or parens, as a proxy for complexity.
- Test Complexity: How many tests are needed to cover behavior paths.
- Dependency Complexity: How many external components a unit depends on.
- Cognitive Complexity: How hard the code is for a human to understand at a glance.
In 1949, Claude Shannon was characterizing information loss and needed a term for the degree to which information is scrambled. Visiting the mathematical physicist John von Neumann, he received the following advice:
You should call it entropy... nobody knows what entropy really is, so in a debate you will always have the advantage.

incoming: notation malleable computing collapse computing stack