In his book “Ubiquity”, Mark Buchanan argues that systems have a natural tendency to organize themselves into what he calls the “critical state” — a “knife-edge of stability”. Once a system reaches the critical state, all it takes is a small nudge to create a catastrophe.
For anyone interested in breaking software and uncovering defects, this raises a natural question about how to test better: how do you ensure that potential critical failures lurking in mature systems can still be uncovered?
Read this interesting article by T Ashok (ash_thiru on Twitter), where he draws an analogy to highlight that it is a series of small errors that results in a catastrophe: “A typical accident takes seven consecutive errors,” writes Malcolm Gladwell in his book “Outliers”.