Title: Using differential equations to model the impact of feedback loops, delays, and thresholds on a variable

## 1. Introduction

My fascination with complex systems began while managing my digital metropolis in the game *Cities: Skylines*. For weeks, I watched my population swell as I built up more and more housing and businesses from nothing. Once I reached a certain scale, however, things became tricky. Growth slowed as resources ran short. Sometimes a tiny tax adjustment would set off a catastrophic cascade of businesses shuttering and people leaving. What I had supposed to be a simple game turned out to be far more complex and harder to control, especially in the later stages. I was left perplexed: how could a system built on logical, deterministic rules produce such violent and unpredictable reactions to a small adjustment in input? And why isn't simple, steady growth possible?

This phenomenon echoes my broader interest in complex systems -- cities, economies, ecosystems -- which exhibit behavior that defies simple, linear intuition (Mitchell, 2011). The game is a tangible demonstration of these principles, a world where the intricate dance between reinforcing loops (population booms), balancing loops (depletion of resources, pollution, etc.), and time delays (construction lag, information delay, etc.) dictates prosperity or ruin. Understanding the mathematical engine that drives these phenomena is a worthy investigation into the fundamental nature of the complex, interconnected systems that govern our own world, from urban planning to climate science.

My exploration will therefore be a quest to deconstruct and rebuild the growth engine of my virtual city using the language of mathematics. I will begin by modeling the simple, naive assumption of exponential growth that I started my game with. Then, in response to the model's failures and my own in-game observations, I will systematically introduce layers of complexity by developing and comparing a series of differential equation models. This progression will allow me to explore the stabilizing influence of carrying capacity, the oscillating effects of time delays, chaos, and tipping points. Hence, I aim to investigate how the integration of different feedback loops, delays, and thresholds can model the emergence of complex behavior.

---

## 2. The Two Forces

### 2.1 Exponential Growth & Positive Feedback

Let us start with the idea of "exponential growth". We hear the phrase everywhere -- "Our economy is experiencing exponential growth," "The virus is spreading exponentially," "Technological advancements are happening at an exponential rate" -- and people seem fascinated by it. Everyone is looking for exponential growth, whether in personal development or in business. We live in a world where linear growth is not particularly valuable -- the traditional factory-era assumption that "more input leads to more output" is obsolete; instead, we are looking for inputs that lead to exponentially more output. The same held in my virtual city: by building more housing and businesses, especially in the early stages, the population grew at an increasing rate.

But what is "exponential growth" anyway? In IB Math AA HL, the textbook gives us the following characteristics of exponential growth functions.

![[Screnshot [email protected]]]
*Figure 2.1.1 Awada, N. (2019) “Generalizing Relationships: Exponents, Logarithms and Integration,” in Mathematics: Analysis and Approaches. Higher level. Course companion. Oxford: Oxford University Press, p. 475.*
However, none of these touches upon the most fundamental characteristic of an exponential growth function -- that only appears later, once we learn calculus:

![[Screnshot [email protected]]]
*Figure 2.1.2 Awada, N. (2019) “Generalizing Relationships: Exponents, Logarithms and Integration,” in Mathematics: Analysis and Approaches. Higher level. Course companion. Oxford: Oxford University Press, p. 477.*

But still, what does being "invariant under differentiation" mean? We know that the derivative of a second-degree polynomial is a linear function, and the derivative of a linear function is a constant -- yet here is a function which, when differentiated, gives back itself! What does this strange "coincidence" imply about the nature of exponential growth?

To understand it, we have to go back to the definition of the derivative. I like to think of the derivative as the "rate of change" -- not the change in a quantity, but how fast that change is happening. The rate of change determines the size of the change over a given span of time. Now, here's the interesting part: we just said that the derivative of an exponential function is the function itself -- in other words, the quantity itself determines its own rate of change. So, for an exponential growth function:

1. its rate of change determines the size of the change in the quantity (the definition of the derivative)
2. the quantity determines its rate of change (derivative = itself)

Notice anything interesting? If not, a diagram will make things clear.

```mermaid
flowchart LR
    Q["Quantity"]
    R["Rate of Change"]
    Q -- determines --> R
    R -- determines (the change of) --> Q
```

On one hand, a larger quantity leads to a larger rate of change; on the other hand, a larger rate of change leads to a larger quantity. I learned on my own that such patterns are known as feedback loops -- “Feedback is a process whereby an initial cause ripples through a chain of causation ultimately to reaffect itself. (Gould, Tobochnik and Christian, 2007)” If you are familiar with system dynamics, you will instantly recognize this as a positive, or reinforcing, feedback loop.

> “Feedback is positive if an increase in a variable, [...], leads to a further increase in the same variable. (Martin, 1997)”

Take population as an example: suppose every pair of parents gives birth to two children. Then the more pairs of parents we have, the more children will be born -- and the more children are born, who become future parents, the more grandchildren we have. This is the underlying force that drives the exponential growth of everything from bacteria to people to economies.

How do we translate this finding into mathematical language? Our discovery is that "the instantaneous rate at which a population grows is proportional to the size of the population". In mathematics, we use $\frac{dP}{dt}$ to represent the derivative, or instantaneous rate of change, of the population $P$. As a result,

$\frac{dP}{dt} = r P(t)$

where $r$ is a constant of proportionality.
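Before solving this equation analytically, a quick numerical sketch can confirm the feedback-loop intuition. The short Python snippet below is only an illustration: the growth rate $r$, the starting population, and the step size are values I assumed for demonstration, not taken from the game or the textbook. It repeatedly applies the rule "rate of change $= r \times$ current quantity" in small time steps.

```python
# Minimal Euler sketch of the positive feedback loop dP/dt = r * P.
# r, P0, dt and the number of steps are assumed, illustrative values.

r = 0.05       # intrinsic growth rate per unit time (assumed)
P = 100.0      # initial population P0 (assumed)
dt = 0.1       # Euler time step
steps = 500    # simulate 50 time units

for n in range(steps + 1):
    t = n * dt
    if n % 100 == 0:                # report every 10 time units
        print(f"t = {t:5.1f}   P = {P:12.1f}")
    dP_dt = r * P                   # the quantity sets its own rate of change
    P += dP_dt * dt                 # the rate of change feeds back into the quantity
```

Each printed value is roughly $e^{0.5} \approx 1.65$ times the previous one -- the multiplicative pattern characteristic of exponential growth. The analytic solution below explains why.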
Solving the differential equation gives

$\frac{dP}{dt} = r P(t)$

$\frac{1}{P} \, dP = r \, dt$

$\int \frac{1}{P} \, dP = \int r \, dt$

$\ln |P| = rt + C$

$|P| = e^{rt + C}$

$|P| = e^C \cdot e^{rt}$

$P(t) = A e^{rt} \quad \text{where } A = \pm e^C \text{ (positive for a population)}$

To find the specific solution, apply the initial condition $P(0) = P_0$:

$P(0) = A e^{0} = A = P_0$

$\boxed{P(t) = P_0 e^{rt}}$

And voila, this is the standard form of the exponential growth functions we are familiar with. To sum up, the essence of exponential growth is a positive feedback loop -- a larger population leads to a higher rate of growth, and vice versa. But does the positive feedback go on forever?

### 2.2 Carrying Capacity & Negative Feedback

As my virtual city proved, however, all dreams of infinite exponential growth are fantasies. We live in a world of finite resources -- food, air, and even space are limited. Infinite growth in a finite world is simply not possible (Meadows, Club of Rome, and Potomac Associates, 1974). Hence the concept of carrying capacity ($K$): the maximum population that the environment can sustain. Any value above the carrying capacity is dragged back down towards $K$ by some restoring force. But how exactly does this "force" restrict the growth of the population?

To understand this, I studied the concept of negative feedback. While a positive feedback loop drives unlimited growth, a negative feedback loop works to hold a value at a certain target -- once the value strays from the target, the negative feedback loop creates a force that drags it back. Again, in diagram form:

```mermaid
flowchart LR
    Value -->|compare| Target
    Target --> Differences
    Differences --> RateOfChange
    RateOfChange --> Value
    %% Labels
    Value[Value]
    Target[Target]
    Differences[Difference]
    RateOfChange[Rate of Change]
```

Imagine a thermometer whose reading, $T$, gradually adjusts towards the temperature of the room it sits in, which acts as the target value $T_t$. The difference is $T - T_t$. Suppose the current reading is higher than the room temperature; then $T - T_t$ is positive, and we need the rate of change to be negative so that the reading falls towards the target. Conversely, when the reading is lower than the room temperature ($T - T_t$ is negative), the rate of change should be positive. To achieve this, we multiply the difference ($T - T_t$) by a negative constant $-k$ (with $k > 0$) and make the product the rate of change:

$\frac{dT}{dt} = -k(T - T_t)$

And voila, this is the standard form of a negative feedback loop!

1. **If $T > T_t$ (temperature higher than target):** the term $T - T_t$ is positive. Since $-k$ is negative, $\frac{dT}{dt}$ is negative. The temperature decreases, moving it closer to the target value.
2. **If $T < T_t$ (temperature lower than target):** the term $T - T_t$ is negative. Since $-k$ is negative, $\frac{dT}{dt}$ is positive. The temperature increases, moving it closer to the target value.
3. **If $T = T_t$ (temperature equal to target):** the term $T - T_t$ is zero, so $\frac{dT}{dt}$ is zero. The reading is no longer changing; the thermometer has reached equilibrium with the room.

Notice the crucial aspect here: the _difference_ between the current state ($T$) and the target state ($T_t$) directly dictates the _direction and magnitude_ of the change. This is the hallmark of a negative feedback loop: the system continuously measures its deviation from the target and adjusts itself to reduce that deviation.
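Before solving this equation analytically, we can again check the behavior numerically. In the Python sketch below, the feedback strength $k$, the room temperature $T_t$, and the two starting readings are illustrative values I assumed; whichever side of the target the reading starts on, the deviation shrinks towards zero.

```python
# Minimal Euler sketch of the negative feedback loop dT/dt = -k * (T - T_target).
# k, T_target, the starting readings and the step size are assumed values.

k = 0.3          # strength of the corrective feedback (assumed)
T_target = 22.0  # room temperature the reading relaxes towards (assumed)
dt = 0.1         # Euler time step
steps = 200      # 20 time units

for T0 in (30.0, 15.0):              # start once above and once below the target
    T = T0
    for n in range(steps + 1):
        if n % 50 == 0:              # report every 5 time units
            print(f"T0 = {T0:4.1f}   t = {n*dt:5.1f}   T = {T:6.2f}")
        dT_dt = -k * (T - T_target)  # the deviation sets the rate of change
        T += dT_dt * dt
    print()
```

Both runs converge to the assumed target of 22, which is exactly what the analytic solution derived next predicts.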
Solving the differential equation analytically:

$\frac{dT}{dt} = -k(T - T_t)$

$\frac{dT}{dt} = -kT + kT_t$

$\frac{dT}{dt} + kT = kT_t$

The integrating factor is $\mu(t) = e^{\int k\,dt} = e^{kt}$. Multiplying both sides by the integrating factor:

$e^{kt}\left(\frac{dT}{dt} + kT\right) = kT_t e^{kt}$

$\frac{d}{dt}\left(T e^{kt}\right) = kT_t e^{kt}$

$\int \frac{d}{dt}\left(T e^{kt}\right)dt = \int kT_t e^{kt}\,dt$

$T e^{kt} = T_t e^{kt} + C$

$T = T_t + Ce^{-kt}$

Applying the initial condition $T(0) = T_0$:

$T(0) = T_t + C$

$C = T_0 - T_t$

$\boxed{T(t) = T_t + \bigl(T_0 - T_t\bigr)e^{-kt}}$

Putting this result into a function grapher that lets me tweak the parameters, I get the graphs below, with the same $T_t$ and different values of $T_0$.

![[Screenshot 20251219 at [email protected]]]

The graphs remind me of Newton's Law of Cooling from Physics class -- "the rate at which an object cools is proportional to the difference in temperature between the object and the object’s surroundings (Staff, 2022)." I realized that what the law describes is exactly a negative feedback loop. In summary, at the core of carrying capacity is the force of a negative feedback loop.

## 3. The Integration of the Two Forces

### 3.1 Logistic Differential Equation

In the previous section, we explored two fundamental forces governing change:

1. Positive feedback (exponential growth) -- $\frac{dP}{dt} = r P$
2. Negative feedback (stabilization towards a target) -- $\frac{dP}{dt} = -k(P - K)$

The challenge now is: how do we integrate these two opposing forces into a single model that describes how their interplay shapes one variable? Simply multiplying them together does not make sense, since the units of the result would be $[\text{population}]^2/[\text{time}]^2$, which is not what we want. Instead of creating a separate rate of change, the negative feedback *modifies* the rate of change created by the positive feedback. We must see the negative feedback not as a separate force to be added or multiplied, but as a mechanism that _regulates_ the positive feedback.

Let's start again from the basic model:

$\frac{dP}{dt} = (\text{growth rate}) \times P$

In this simple model, the "growth rate" is a constant $r$ -- the maximum, ideal, intrinsic per-capita growth rate, or roughly how many children each parent has on average. The key point is that in a limited environment this growth rate is not constant -- it must decrease as the population increases and approaches the carrying capacity $K$, because there would not be enough resources for the newcoming babies. Let's call this new, variable growth rate the *effective growth rate*, $r_{\text{eff}}$. Our equation is now

$\frac{dP}{dt} = r_{\text{eff}} \times P$

Naturally, the next step is to figure out how $r_{\text{eff}}$ behaves as the population grows. From first principles:

1. When the population is very small ($P \approx 0$), the effective growth rate should be at its maximum (the intrinsic rate $r$), because the environmental constraints are negligible -- $r_{\text{eff}} = r$ when $P = 0$.
2. When the population gradually reaches the carrying capacity $K$, the effective growth rate should approach zero, as there are no more resources for new births -- $r_{\text{eff}} = 0$ when $P = K$.

Thus we have two points, $(0, r)$ and $(K, 0)$, in the $r_{\text{eff}}$-$P$ diagram.
Following Occam's razor, the simplest mathematical relationship through these two points is a straight line. This assumption, while a simplification, provides the most straightforward representation that satisfies our two boundary conditions. Hence this graph:

![[Math IA 1st Draft.png|400]]

From the two points we easily get:

$r_{\text{eff}} = \left( -\frac{r}{K} \right) P + r$

Factoring out the common term $r$, we find the expression for our effective growth rate:

$r_{\text{eff}} = r \left(1 - \frac{P}{K} \right)$

This expression captures exactly the behavior we need: it starts at $r$ and decreases linearly to $0$ as $P$ goes from $0$ to $K$. Substituting the effective growth rate back into the growth equation, we get

$\frac{dP}{dt} = r_{\text{eff}} \cdot P = \left[r\left(1 - \frac{P}{K}\right)\right] \cdot P$

Rearranging into the standard form, we arrive at the final destination:

$\boxed{\frac{dP}{dt} = rP\left(1 - \frac{P}{K}\right)}$

This model is an integration of exponential growth and carrying capacity, where the negative feedback mechanism, represented by the term $(1 - \frac{P}{K})$, acts as a dynamic brake on the engine of positive feedback, $rP$. Another way to think of it: while $(K - P)$ is the *absolute* difference, or remaining capacity (the quantity driving the negative feedback loop), what we actually want is the *proportional* capacity remaining: $\frac{K-P}{K} = 1 - \frac{P}{K}$.

This is famously known as the logistic differential equation -- the product of the interplay between the two opposing forces: the positive feedback of exponential growth and the negative feedback of the carrying capacity. Given the logistic differential equation:

$\frac{dP}{dt} = rP\left(1 - \frac{P}{K}\right)$

Move all terms involving $P$ to one side and terms involving $t$ to the other:

$\frac{dP}{P\left(1 - \frac{P}{K}\right)} = r \, dt$

Simplify the denominator on the left side:

$\frac{dP}{P\left(\frac{K-P}{K}\right)} = \frac{K \, dP}{P(K-P)}$

$\frac{K \, dP}{P(K-P)} = r \, dt$

Integrate both sides. The right side is straightforward:

$\int r \, dt = rt + C_1$

For the left side, we need **partial fraction decomposition**. Let's decompose $\frac{K}{P(K-P)}$:

$\frac{K}{P(K-P)} = \frac{A}{P} + \frac{B}{K-P}$

Multiply both sides by $P(K-P)$:

$K = A(K-P) + BP$

To find $A$, set $P = 0$:

$K = A(K-0) + B(0) \implies K = AK \implies A = 1$

To find $B$, set $P = K$:

$K = A(K-K) + B(K) \implies K = BK \implies B = 1$

So the decomposition is:

$\frac{K}{P(K-P)} = \frac{1}{P} + \frac{1}{K-P}$

Now, integrate the left side:

$\int \left(\frac{1}{P} + \frac{1}{K-P}\right) \, dP = \int \frac{1}{P} \, dP + \int \frac{1}{K-P} \, dP = \ln|P| - \ln|K-P| + C_2$

Using logarithm properties ($\ln a - \ln b = \ln(a/b)$):

$= \ln\left|\frac{P}{K-P}\right| + C_2$

Finally, combine the two integrated sides:

$\ln\left|\frac{P}{K-P}\right| = rt + C$

(where $C = C_1 - C_2$ is a new arbitrary constant). Exponentiate both sides:

$\left|\frac{P}{K-P}\right| = e^{rt+C}$

$\frac{P}{K-P} = A e^{rt}$

(where $A = \pm e^C$ is a new arbitrary constant, unrelated to the partial-fraction constant above; it can be positive or negative, but for population models $P$ is positive and we will see that $A$ is typically positive). Now, isolate $P$:

$P = A e^{rt} (K-P)$

$P = AK e^{rt} - A P e^{rt}$

Move the terms containing $P$ to one side:

$P + A P e^{rt} = AK e^{rt}$

Factor out $P$:

$P(1 + A e^{rt}) = AK e^{rt}$

$P(t) = \frac{AK e^{rt}}{1 + A e^{rt}}$

Let $P(0) = P_0$ be the initial population at time $t = 0$.
$P_0 = \frac{AK e^{r(0)}}{1 + A e^{r(0)}}$

$P_0 = \frac{AK}{1 + A}$

Solve for $A$:

$P_0(1+A) = AK$

$P_0 + AP_0 = AK$

$P_0 = AK - AP_0$

$P_0 = A(K - P_0)$

$A = \frac{P_0}{K - P_0}$

Substitute $A$ back into the solution:

$P(t) = \frac{\frac{P_0}{K - P_0} K e^{rt}}{1 + \frac{P_0}{K - P_0} e^{rt}}$

To simplify, multiply the numerator and denominator by $(K - P_0)$:

$P(t) = \frac{P_0 K e^{rt}}{(K - P_0) + P_0 e^{rt}}$

Divide the numerator and denominator by $P_0 e^{rt}$:

$P(t) = \frac{K}{\frac{K - P_0}{P_0 e^{rt}} + 1}$

$P(t) = \frac{K}{1 + \left(\frac{K}{P_0} - 1\right) e^{-rt}}$

Let $C_0 = \frac{K}{P_0} - 1$, a constant determined by the initial conditions.

$\boxed{P(t) = \frac{K}{1 + C_0 e^{-rt}}}$

This is the general solution of the logistic equation, and graphing it gives the standard S-shaped curve.

![[Screenshot 20251219 at [email protected]]]

---

As a result, the combination of a simple positive and a simple negative feedback loop creates this S-shaped curve: initial exponential growth, a slowing growth rate as $P$ approaches $K$, and eventual stabilization at $K$. This perfectly explains why, in my game, the beginning is easy and the difficulty only increases as I play on -- there is more and more pressure from the limited resources, and the negative feedback loop gradually takes over dominance from the exponential growth.
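As a sanity check on the algebra, the short Python sketch below compares a direct Euler integration of the logistic differential equation with the closed-form solution just derived. The parameter values ($r$, $K$, $P_0$, and the step size) are illustrative assumptions of mine; the two columns agree up to the small error of the numerical method.

```python
# Compare Euler integration of dP/dt = r*P*(1 - P/K) with the derived formula
# P(t) = K / (1 + C0 * e^(-r t)).  r, K, P0 and dt are assumed, illustrative values.
import math

r, K, P0 = 0.5, 1000.0, 10.0   # assumed parameters
C0 = K / P0 - 1.0              # constant fixed by the initial condition P(0) = P0
dt = 0.01
steps = 3000                   # 30 time units

P = P0
for n in range(steps + 1):
    t = n * dt
    if n % 500 == 0:           # report every 5 time units
        exact = K / (1.0 + C0 * math.exp(-r * t))
        print(f"t = {t:5.1f}   Euler P = {P:8.2f}   formula P = {exact:8.2f}")
    P += r * P * (1.0 - P / K) * dt   # dP = rP(1 - P/K) dt
```

Both columns trace the same S-shaped curve: near-exponential growth at first, a slowdown around $K/2$, and a plateau at the carrying capacity.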
### 3.2 Logistic Differential Equation with Delay

Now we have found the model, but how does it explain the non-linear booms and busts I experienced in my game? The S-curve is a picture of predictability, nothing like the turbulent dynamics I witnessed. However we tweak the parameters of the logistic differential equation, the resulting curve is always smooth. What key ingredient is missing?

I began reading more about system dynamics and the behavior of complex systems. Donella Meadows's book *Thinking in Systems* provided a key insight.

> “Delays in feedback loops are critical determinants of system behavior. They are common causes of overshoots and oscillations. (Meadows and Wright, 2011)”

When we talked about feedback loops, we assumed that information and feedback flow instantaneously -- a growth in population instantly signals the growth rate to adjust as well. But we forgot that feedback loops -- more essentially, the flow of information -- take time, giving rise to delays. While my differential equation adjusts its rate of change instantaneously based on the current population, in reality growth rarely responds to feedback instantly. A new citizen's decision to move in is based on the conditions of the recent past -- the availability of jobs, housing, and resources _then_, not _now_. There is a lag between perception, action, and consequence. The system is always reacting to outdated information.

How can we embed this reality into our equation? The concept of delay tells us that rather than reacting to current information, the negative feedback loop responds to the population value at some time $\tau$ in the past. Modifying our model:

$\frac{dP(t)}{dt} = rP(t) \left( 1 - \frac{P(t - \tau)}{K} \right)$

This is the famous delayed logistic differential equation, or Hutchinson's equation. I researched Hutchinson's equation and found that with this simple addition of $\tau$, the model's personality transforms radically. Instead of a smooth curve, the model can now display wildly different behaviors depending on the delay $\tau$ and the growth rate $r$ (Hutchinson, 1948). Hutchinson's equation is notoriously hard to solve analytically, so I will present the known findings directly. The system's behavior critically depends on the product of the intrinsic growth rate $r$ and the time delay $\tau$. This product, $r\tau$, acts as a single control parameter (what I call a stability index) that dictates the system's fate (Rao and Preetish, 2012).

Case 1: Nearly no impact ($r\tau < \frac{1}{e}$)

When $r\tau$ is below the threshold $\frac{1}{e}$, the delay is small enough that the system reacts quickly and does not overshoot, just like the smooth curve we see without the delay.

![[Screenshot 20251219 at [email protected]]]
_Figure 3.2.1: Nearly no impact. The system doesn't overshoot._

Case 2: Damped Oscillations ($\frac{1}{e} < r\tau < \frac{\pi}{2}$)

When the product lies in this intermediate range, the delay is short compared with the system's response time. The feedback, while not instantaneous, is quick enough to prevent significant miscalculation. The population converges towards the carrying capacity $K$. For values of $r\tau$ just above $\frac{1}{e}$, this convergence is still nearly smooth, much like the original logistic model. As $r\tau$ increases but remains below the critical threshold of $\frac{\pi}{2}$, the system exhibits **damped oscillations**, overshooting $K$ slightly before spiraling in towards equilibrium.

![[Screenshot 20251219 at [email protected]]]
_Figure 3.2.2: Damped oscillations. The system corrects its overshoot and stabilizes._

Case 3: Stable Oscillations ($r\tau > \frac{\pi}{2}$)

As we increase the delay so that the product $r\tau$ crosses the critical value of $\frac{\pi}{2}$, a **Hopf bifurcation** occurs. The equilibrium at $P = K$ loses its stability. The system's "reaction time" $\tau$ is now too long for its growth rate $r$. The population, reacting to the low density of the past, grows rapidly and significantly overshoots $K$. By the time the negative feedback kicks in, the population is far too high. This triggers a sharp decline that undershoots $K$, and the cycle repeats. The system no longer settles but enters a **stable limit cycle**, a continuous and predictable boom-and-bust rhythm.

![[Screenshot 20251219 at [email protected]]]
*Figure 3.2.3: Stable limit cycle. The system is trapped in a perpetual boom-and-bust cycle.*

The lesson at heart: the introduction of a simple time delay has shattered the predictable stability of the logistic model, revealing a spectrum of behaviors from simple equilibrium to sustained oscillations, depending on the sizes of the growth rate and the delay.
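Since Hutchinson's equation has no elementary closed-form solution, I explored these cases numerically. The sketch below keeps a buffer of past values so that the braking term can read the population from $\tau$ time units ago; the values of $r$, $K$, $\tau$, and the constant initial history are illustrative assumptions. With the values shown, $r\tau = 2 > \frac{\pi}{2}$, so the run lands in Case 3; shrinking $\tau$ moves it back through Cases 2 and 1.

```python
# Euler sketch of the delayed logistic equation dP/dt = r*P(t)*(1 - P(t - tau)/K).
# r, K, tau, the initial history and the step size are assumed, illustrative values.

r, K, tau = 1.0, 1000.0, 2.0   # r*tau = 2.0 > pi/2, so a limit cycle is expected
P0 = 50.0                      # population before and at t = 0 (assumed constant history)
dt = 0.01
steps = 6000                   # 60 time units
lag = round(tau / dt)          # number of steps corresponding to the delay

history = [P0] * (lag + 1)     # values for t = -tau ... 0
for n in range(steps):
    P_now = history[-1]                     # P(t)
    P_past = history[-1 - lag]              # P(t - tau)
    dP_dt = r * P_now * (1.0 - P_past / K)  # braking term reads outdated information
    history.append(P_now + dP_dt * dt)

for n in range(0, steps + 1, 500):          # print every 5 time units
    print(f"t = {n*dt:5.1f}   P = {history[n + lag]:9.1f}")
```

The printed populations rise past $K$, crash below it, and keep cycling. With $\tau = 0.5$ instead (so $\frac{1}{e} < r\tau < \frac{\pi}{2}$) the oscillations die out, and with $\tau = 0.3$ (so $r\tau < \frac{1}{e}$) the familiar smooth S-curve returns.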
## 4. Logistic Map

### 4.1 From Continuous to Discrete

Our journey so far has been in the world of differential equations, modeling the smooth, continuous flow of time. The delayed logistic equation, $\frac{dP}{dt} = rP(t)(1 - \frac{P(t-\tau)}{K})$, was a major breakthrough. It showed us *that* a simple time lag could be the source of the oscillations and instability I witnessed in my virtual city. Yet the mathematical framework of differential equations inherently assumes an infinitesimal rate of change, where variables evolve smoothly over continuous time. This is a powerful idealization, but it doesn't always align with how systems are measured or how they fundamentally operate. Many processes, from population counts to economic cycles, are inherently discrete, with states updated at regular, distinct intervals rather than continuously. For instance, populations might be counted annually, or generations might be non-overlapping.

This realization prompts a fundamental question: what happens to our logistic model when we shift from continuous to discrete time? Instead of viewing time as a continuous flow, what if we model it in discrete steps? Instead of making the instantaneous rate of change proportional to the current population,

$\frac{dP}{dt} = rP\left(1 - \frac{P}{K}\right)$

let's make the left side the population of the "next" generation, computed from the current one:

$P_{n+1} = r P_n \left(1 - \frac{P_n}{K} \right)$

Let $x_n$ represent the population as a fraction of the carrying capacity $K$:

$x_n = \frac{P_n}{K}$

This means $P_n = Kx_n$, and similarly, for the next generation, $P_{n+1} = Kx_{n+1}$.

$Kx_{n+1} = r (Kx_n) \left(1 - \frac{Kx_n}{K} \right)$

Divide both sides by $K$:

$x_{n+1} = r x_n (1 - x_n)$

And thus we arrive at the famous logistic map.

### 4.2 From Predictable to Chaotic

The versatile and chaotic nature of the logistic map is world-renowned. The easiest way to learn about its behavior is to interact with it directly, so I coded an interactive program to help me understand its complexity (a simplified sketch of its core loop appears at the end of this section).

When $1 < r < 3$, no matter the initial population, the map is always attracted to, and stabilizes at, a single fixed point, $x^* = 1 - \frac{1}{r}$ (for example, $0.5$ when $r = 2$).

![[Screenshot 20251219 at [email protected]]]

When $r$ enters the region between $3.0$ and roughly $3.57$, after some initial fluctuation we see a stable oscillation, much like the one observed for the delayed logistic equation.

![[Screenshot 20251219 at [email protected]]]

When $r$ is bigger than about $3.57$, the interesting part begins. The first thing we notice is that the graph is no longer a clean oscillation; it appears completely random, with no period. Second, in this range the behavior of the map is sensitively dependent on the initial population $x_0$: a tiny difference in $x_0$ produces radically different results, especially at later steps. This is unlike the functions we study in class, where a tiny difference in input leads to only a tiny difference in output. In the graph, the red dotted line shows the effect of a $0.0001$ difference in $x_0$ -- as we can see, the red dotted trajectory ends up completely different from the black one.

![[Screenshot 20251219 at [email protected]]]

In fact, the behavior of the logistic map is so unpredictable that it can even be used to generate pseudorandom numbers (Zhou, 2024). The lesson here is that a seemingly simple, deterministic model can generate chaotic and unpredictable results -- completely defying the Newtonian notion that everything can be calculated and controlled given enough computing power. This is the underlying force behind complex systems like the weather and the economy, whose behavior often defies our linear-thinking brains and produces surprising outcomes. Most importantly, it offers a possible explanation for the sense of chaos I experienced in my game -- how a tiny move can lead to a cascade of effects, and how hard it is to actually plan, manage, and control.
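The interactive program behind these screenshots is too long to include in full, but its core loop is only a few lines. The simplified Python sketch below mirrors the sensitivity experiment described above; the values of $r$, the starting point, and the $0.0001$ perturbation are assumptions chosen for illustration.

```python
# Core of the logistic map x_{n+1} = r * x_n * (1 - x_n), iterated from two
# nearly identical starting points to show sensitive dependence on initial conditions.
# r, x0 and the perturbation size are assumed, illustrative values.

r = 3.9          # in the chaotic regime (r > 3.57)
x_a = 0.2000     # first initial condition (assumed)
x_b = 0.2001     # second initial condition, shifted by 0.0001

for n in range(51):
    if n % 10 == 0:
        print(f"n = {n:3d}   x_a = {x_a:.6f}   x_b = {x_b:.6f}   |diff| = {abs(x_a - x_b):.6f}")
    x_a = r * x_a * (1.0 - x_a)
    x_b = r * x_b * (1.0 - x_b)
```

For the first few iterations the two trajectories are almost indistinguishable, but within a few dozen steps their difference becomes as large as the values themselves -- the hallmark of chaos. Setting $r$ to $2.8$ or $3.2$ instead reproduces the single fixed point and the stable oscillation seen earlier.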
## 5. Tipping Points & Collapse Threshold

### 5.1 The Missing Piece

Up to now, the models have captured much of the oscillation, turbulence and unpredictability I saw in my city. However, there is a crucial piece of the story they all miss: in the logistic model, the population never hits zero. Mathematically, for any population $P$ between $0$ and the carrying capacity $K$, the growth rate in the logistic equation $\frac{dP}{dt} = rP(1 - \frac{P}{K})$ is positive, so the population is always pushed *away* from extinction. But in reality, and in my city, the population doesn't just fluctuate -- it sometimes collapses. When I pushed too hard, it hit a point of no return. This implies there is a fundamental mechanism at play that my models haven't accounted for -- a **tipping point** that, once crossed, leads not to recovery but to ruin.

### 5.2 The Allee Effect

Following more research, I realized that the flaw in my previous models was the assumption that the per-capita growth rate is always highest when the population is smallest. This is obviously wrong. A city with only a handful of people cannot function: it lacks the critical mass to support businesses, run public services, or attract new residents. In fact, its decline would accelerate, creating a self-reinforcing downward spiral (Courchamp, Clutton-Brock and Grenfell, 1999). This concept is known in ecology as the **Allee effect**: for some species, there is a minimum viable population required to sustain growth (Courchamp, 2008). Below this threshold, the population is doomed.

To incorporate this into my model, I need to modify the logistic equation with a new mathematical term that turns the growth rate negative when the population drops below a certain critical threshold, which I will call $A$. Similar to how we derived the factor $(1 - \frac{P}{K})$, we can take this factor to be $(\frac{P}{A} - 1)$, or $\frac{P-A}{A}$ -- the proportional difference between the current population and the tipping point $A$ (Courchamp, Clutton-Brock and Grenfell, 1999). This choice comes from the observations that:

* If the population $P$ is greater than the threshold $A$, this term is positive, and the population grows towards the carrying capacity $K$.
* However, if $P$ falls below $A$, this term becomes negative. It flips the sign of the entire equation, turning growth into decay. The population is now forced downwards, towards extinction, no matter how high $K$ is.

Adding this new factor to our logistic equation gives

$\frac{dP}{dt} = rP\left(1 - \frac{P}{K}\right)\left(\frac{P}{A} - 1\right)$

Again, I modeled it in a program.

![[Screenshot 20251219 at [email protected]]]

This equation now tells a much richer story. We still have our growth engine $rP$ and our carrying-capacity brake $(1 - \frac{P}{K})$. But the new term, $(\frac{P}{A} - 1)$, acts as a critical switch. This single modification introduces the point of no return I was looking for. The system now has three key population levels: $P = 0$ (extinction), $P = K$ (carrying capacity), and the new tipping point at $P = A$. While $K$ is a stable state where the population wants to be, $A$ is an unstable threshold. If the population falls below $A$, it cannot recover; it is pulled inexorably towards zero.

![[Screenshot 20251219 at [email protected]]]
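To see the tipping point in action, the sketch below integrates the modified equation from two starting populations, one just above and one just below the threshold. All parameter values ($r$, $K$, $A$, the starting populations, and the step size) are illustrative assumptions.

```python
# Euler sketch of the logistic model with an Allee threshold:
# dP/dt = r*P*(1 - P/K)*(P/A - 1).
# r, K, A, the starting populations and the step size are assumed values.

r, K, A = 0.5, 1000.0, 100.0   # carrying capacity K and tipping point A (assumed)
dt = 0.01
steps = 4000                   # 40 time units

for P0 in (110.0, 90.0):       # just above and just below the threshold A
    P = P0
    for n in range(steps + 1):
        if n % 1000 == 0:      # report every 10 time units
            print(f"P0 = {P0:5.1f}   t = {n*dt:5.1f}   P = {P:8.2f}")
        dP_dt = r * P * (1.0 - P / K) * (P / A - 1.0)
        P = max(P + dP_dt * dt, 0.0)   # a population cannot go negative
    print()
```

Starting at $110$, the population climbs all the way to the carrying capacity; starting at $90$ -- only $20$ lower -- it slides irreversibly towards zero. The same small disturbance that a healthy city shrugs off becomes fatal once the threshold is crossed.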
### 5.3 The Implications

This mathematical journey through collapse thresholds brings me to a realization about sustainability. The models reveal that complex systems -- whether virtual cities, real ecosystems, or global economies -- don't degrade linearly. They can absorb stress while appearing stable, then suddenly collapse when pushed past invisible thresholds. The key insight is that sustainability isn't just about staying below the carrying capacity $K$; it's about maintaining sufficient distance from the collapse threshold $A$. The buffer zone between our current state and these thresholds represents our **resilience** -- our capacity to absorb shocks without triggering irreversible collapse. In a real world facing challenges from climate tipping points to economic instability, understanding these collapse thresholds is essential knowledge for navigating a future where small actions can have catastrophic consequences, where prevention is easier than cure, and where the mathematics of collapse can help us choose the mathematics of sustainability.

## 6. Conclusion & Further Implications

My journey began with my failure to control my digital metropolis in *Cities: Skylines*. This sparked a mathematical investigation that started with the naive assumption of exponential growth, driven by a positive feedback loop. Recognizing its limitations, I integrated the stabilizing force of a carrying capacity, or negative feedback, to form the logistic model. However, this smooth S-curve failed to capture the turbulence I witnessed. The introduction of time delays transformed the model and unlocked behaviors ranging from damped to sustained oscillations, mirroring the booms and busts of my city. Shifting from continuous to discrete time with the logistic map further underscored how simple, deterministic rules can generate profoundly chaotic outcomes that are sensitive to initial conditions. Finally, the model was completed by incorporating the Allee effect, which introduced a critical collapse threshold.

This progression fulfilled my aim: to demonstrate how integrating feedback loops, time delays, and tipping points within a series of differential equation models can capture the emergence of complex, stable, and chaotic behavior. Each layer of mathematical complexity brought the model closer to the rich, non-linear dynamics of the system I sought to understand.

The implications of this exploration extend far beyond my game on the screen. Understanding the mathematics of tipping points is an essential tool for navigating a future where the interconnectedness of our world means that small, seemingly isolated actions can have profound and lasting consequences. Our world is an inherently complex system that refuses to be simplified with the mathematical tools we learn in high school. It is therefore extremely dangerous for policy-makers to naively assume that the world behaves like a simple function that changes linearly with our actions, and to try to fix errors by simply tweaking inputs such as tax rates. It is more important than ever to recognize and acknowledge the complexity of our world, and to embrace it, so that we can better dance with it towards a sustainable future.

## Bibliography

Awada, N. (2019) “Generalizing Relationships: Exponents, Logarithms and Integration,” in _Mathematics: Analysis and Approaches. Higher level. Course companion_. Oxford: Oxford University Press, pp. 475, 477.

Courchamp, F. (2008) _Allee Effects in Ecology and Conservation_. Oxford: Oxford University Press.

Courchamp, F., Clutton-Brock, T. and Grenfell, B. (1999) “Inverse density dependence and the Allee effect,” _Trends in Ecology & Evolution_, 14(10), pp. 405–410. Available at: [https://doi.org/10.1016/s0169-5347(99)01683-3](https://doi.org/10.1016/s0169-5347\(99\)01683-3).

Gould, H., Tobochnik, J. and Christian, W. (2007) _An Introduction to Computer Simulation Methods: Applications to Physical Systems_. 3rd edn. San Francisco: Pearson Addison Wesley.

Hutchinson, G.E. (1948) “Circular Causal Systems in Ecology,” _Annals of the New York Academy of Sciences_, 50(4), pp. 221–246. Available at: [https://doi.org/10.1111/j.1749-6632.1948.tb39854.x](https://doi.org/10.1111/j.1749-6632.1948.tb39854.x).
Martin, L.A. (1997) “An Introduction to Feedback.”

Meadows, D.H., Club of Rome and Potomac Associates (eds) (1974) _The Limits to Growth: A Report for the Club of Rome’s Project on the Predicament of Mankind_. 2nd edn. New York: Universe Books (A Potomac Associates book).

Meadows, D.H. and Wright, D. (2011) _Thinking in Systems: A Primer_. White River Junction, VT: Chelsea Green Publishing.

Mitchell, M. (2011) _Complexity: A Guided Tour_. New York, NY: Oxford University Press.

Rao, M.M. and Preetish, K.L. (2012) “Stability and Hopf Bifurcation Analysis of the Delay Logistic Equation.” arXiv. Available at: [https://doi.org/10.48550/arXiv.1211.7022](https://doi.org/10.48550/arXiv.1211.7022).

Staff, C. (2022) “Newton’s Law of Cooling - Carolina Knowledge Center,” 24 August. Available at: [https://knowledge.carolina.com/discipline/interdisciplinary/math/newtons-law-of-cooling/](https://knowledge.carolina.com/discipline/interdisciplinary/math/newtons-law-of-cooling/) (Accessed: December 19, 2025).

Zhou, C. (2024) “Analysis of Logistic Map for Pseudorandom Number Generation in Game Development.” arXiv. Available at: [https://doi.org/10.48550/arXiv.2403.00864](https://doi.org/10.48550/arXiv.2403.00864).