The convergence of a sequence of random variables to some limit random variable is an important concept in probability theory, with applications to statistics and stochastic processes. Stochasticity refers to the property of being well described by a random probability distribution.
In a sequence of random variables, notice the pattern in which fluctuations appear and then cancel out around one predominant value; that value is identified as the limit random variable.
Stochasticity and randomness are distinct in that the former refers to a modeling approach and the latter refers to phenomena themselves.
Translated into mathematical terms, this means that some variables are easier to calculate than others, and some are more random than others: less predictable across a single set of outcomes (the former), or more patterned in the shape of their distribution (the latter). In either case, we experience these variables as chaos, whether random or controlled.
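To make the convergence idea concrete, here is a minimal sketch, assuming Python with NumPy and using a fair coin flip purely for illustration: the running sample mean is a sequence of random variables whose limit, by the law of large numbers, is the constant 0.5.

```python
import numpy as np

# Minimal sketch: the running mean of i.i.d. coin flips forms a sequence of
# random variables X_1, X_2, ... that converges (almost surely, by the strong
# law of large numbers) to the constant limit 0.5.
rng = np.random.default_rng(seed=42)
flips = rng.integers(0, 2, size=100_000)                      # i.i.d. Bernoulli(0.5) draws
running_mean = np.cumsum(flips) / np.arange(1, flips.size + 1)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>7}: running mean = {running_mean[n - 1]:.4f}")
# As n grows, the printed values settle around 0.5: the limit random variable
# in this example is the degenerate (constant) variable 0.5.
```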
Simply put, the world seems quite chaotic at the moment, but that may well play to our advantage: we can steer the chaos fluidly away from the status quo.
Let's look at this in terms of a design paradigm. If we are designing a future system, the image of that future system and the model that represents it reveal variables of divergence and convergence.
As we observe this, we are able to apply our theory about that future system by testing our assumptions against reality - in other words, the market, the environment or the paradigm shows us potential problems along with potential solutions. The convergence of preexisting systems contrasts with the divergence of new phenomena, thus further revealing to us what is possible.
Which brings us to the phenomenon of stacking convergences.
Convergence denotes the merging of two or more systems, often in the form of disciplines and/or industries. In today's industrial context, stacking convergences means that these emerging domains either exhibit a hard limit on their expansion or growth, or have softer limits, or even no limits at all.
These limits are expressed in terms of each domain's centralized or decentralized attributes and, between those two poles, its level of autonomy.
For example, an emerging health sector is highly centralized, but it is breaking down significantly due to healthcare insurance risks and supply chain risks. This sector's hard limit will undoubtedly force a divergence from the old model and give rise to something more decentralized, even while certain health parameters remain unclear and are converging around that lack of clarity.
An emerging ecological sector is more decentralized in its infrastructure, and therefore has much softer limits on its expansion and a greater level of autonomy for market participation, due to factors like failed governance and a lack of infrastructure.
An emerging educational sector has a highly centralized infrastructure, while fragmenting at an accelerated clip due to teacher attrition, school shutdowns, and a lack of community resources in many neighborhoods. Learning programs are already decentralizing as new cohorts find ways to educate people outside of the current system, from MOOCs to other online curricula, virtual classrooms, and the like.
An emerging finance sector is even more dynamic, mainly because complex financial computations are fragmenting into smaller constituent parts (fractionating interest or debt, as one example), while decentralizing web3 applications move those computations into networks that reengineer the corresponding processes. Sooner rather than later, many people will find autonomy in the converging tensions between centralized infrastructure and decentralized applications, as those innovations become federated, which in turn yields more and more decentralized infrastructure that replaces the old.
All sectors are heading toward a point of general convergence whereby their constituent parts will evolve on their own, and in these decentralized capacities, they will federate or redistribute as that general point of convergence is exceeded.
This is something to be very excited about. Why? Because at no point in history have we ever experienced this much complexity and randomness happening in such a fluid manner.
In other words, we have many different futures, and bright future possibilities, which can be patterned, tracked and navigated.
As for new alternative systems, the key to navigating complexity and mitigating risk is to build simple systems that can scale via their core functions, per Gall's Law. Gall's Law is the precept that a complex system that works invariably evolves from a simple system that worked; it is where environmental selection tests meet systems design.
So, if you want to build a system that works, the best approach is to build a simple system that meets the environment's current selection tests first, then improve it incrementally. Over time, you'll end up with a complex system that actually works.
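Purely as an illustration of that approach (the names and numbers below are hypothetical, not a prescribed implementation), Gall's Law might read like this in code: a simple core that passes today's selection tests, then a later version that adds capability without breaking them.

```python
# Illustrative sketch of Gall's Law: start with a simple system that passes
# today's selection test, and only then layer on new complexity.

def core_quote(price: float) -> float:
    """v1: the simplest working system -- quote a flat 5% fee."""
    return round(price * 1.05, 2)

def quote(price: float, tier: str = "standard") -> float:
    """v2: evolved from the working core; the old behavior stays the default."""
    fees = {"standard": 1.05, "premium": 1.08}
    return round(price * fees[tier], 2)

# Selection tests: the environment's current requirements. v2 must still pass
# everything v1 passed before any new complexity is trusted.
assert core_quote(100.0) == 105.0
assert quote(100.0) == 105.0             # backward-compatible core behavior
assert quote(100.0, "premium") == 108.0  # new capability layered on later
```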
A working simple system is precisely where real change is manifested for the better and sustained over time. Just as we all know that change is a constant (Planck), we also know that change is constantly augmented by variability that is either random or more controlled.
Reorienting the factors behind random variability, and then finding and developing satisfactory calculations that reduce invariable risk, are the key functions for sustaining a system over time.
As we do this, establishing organic baseline algorithms and constantly evolving network coefficients, we can pattern the intervals of convergence, thereby getting a solid handle on what our innovative systems will actually do as they integrate with other systems, or replace them outright.
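One way to picture that, sketched here under my own assumptions (an illustrative metric and a standard normal-approximation interval, not any particular production method), is to track a running estimate alongside an interval that should tighten as the system converges.

```python
import numpy as np

# Hedged sketch: track a running estimate of a system metric together with a
# ~95% confidence interval. A steadily shrinking interval around a stable
# estimate is one signature of convergence; a widening or drifting interval
# flags divergence worth investigating. The metric is purely illustrative.
rng = np.random.default_rng(seed=7)
observations = rng.normal(loc=0.62, scale=0.15, size=5_000)   # e.g. a daily adoption rate

for n in (50, 500, 5_000):
    sample = observations[:n]
    estimate = sample.mean()
    half_width = 1.96 * sample.std(ddof=1) / np.sqrt(n)       # normal-approximation CI
    print(f"n = {n:>5}: estimate = {estimate:.3f} +/- {half_width:.3f}")
```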
As a wise man once said: In order to build a better future, we must be able to see that future, design it, and calculate the scenarios for its potential impact on the world.