Kevin HY Hau - Hello World 👋

Shiny App - Dynamic Stochastic Model


Objective:

This algorithm models a stochastic process that combines the forward-branching structure of a binomial tree with the memory-retaining dynamics of a recurrent neural network (similar to a Hopfield network when the historical window is k = 3 steps), while computing expected values at every step under binary (0/1) outcomes.
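The dynamics described above can be sketched in a few lines. This is an illustrative Python simulation, not the Shiny/R implementation: the function name, the `base_p`/`weight` parameters, and the choice of "mean of the last k outcomes" as the memory rule are all assumptions for demonstration.

```python
import random

def simulate_path(n_steps, k=3, base_p=0.5, weight=0.3, seed=None):
    """Simulate one binary (0/1) path where the probability at each step
    is shifted by the mean of the last k outcomes (the process 'memory').
    All parameter names and defaults are illustrative assumptions."""
    rng = random.Random(seed)
    history = []
    for _ in range(n_steps):
        recent = history[-k:]
        # with no history yet, treat the memory as neutral (0.5)
        mem = sum(recent) / len(recent) if recent else 0.5
        # drift the transition probability toward recent behaviour, clipped to (0, 1)
        p = min(max(base_p + weight * (mem - 0.5), 0.01), 0.99)
        history.append(1 if rng.random() < p else 0)
    return history
```

With `weight = 0`, this collapses to an ordinary memoryless binomial process; a positive `weight` makes runs of identical outcomes self-reinforcing.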

Applications:


Instructions:


Details:

In a binomial tree, each node's probability depends only on the immediately preceding state. Here, the transition probability at each step is instead derived from a grouped summary of the previous k steps, effectively treating the history as a compact state vector whose influence decays or aggregates in a controlled manner.

This is similar to the recurrent connections in a Hopfield network, where the network's energy landscape stores and retrieves patterns from past activations. However, unlike a Hopfield network (which typically performs deterministic or probabilistic retrieval), this algorithm injects a forward-looking expected-value layer.

This expectation is propagated forward, turning the process into a hybrid recurrent stochastic process that remembers past configurations while maintaining an analytically tractable martingale-like property.
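The forward propagation of expectations can be checked by brute force on a small tree: enumerate every binary path, weight it by the product of its history-dependent transition probabilities, and confirm the probability mass still sums to one. The memory rule below (`prob_one`) is a simplified illustrative assumption, not the app's actual rule:

```python
from itertools import product

def prob_one(history, k=3, base_p=0.5, weight=0.3):
    """Hypothetical history-dependent rule: shift base_p by the mean of
    the last k outcomes (neutral 0.5 when there is no history yet)."""
    recent = history[-k:]
    mem = sum(recent) / len(recent) if recent else 0.5
    return min(max(base_p + weight * (mem - 0.5), 0.01), 0.99)

def expected_value(n_steps, prob_fn=prob_one, payoff=sum):
    """Enumerate all 2**n_steps binary paths, weight each by the product
    of its transition probabilities, and return (expected payoff, total
    probability mass). The default payoff counts the 1-outcomes."""
    total_p = exp_val = 0.0
    for path in product((0, 1), repeat=n_steps):
        p_path, hist = 1.0, []
        for outcome in path:
            p1 = prob_fn(hist)
            p_path *= p1 if outcome == 1 else 1.0 - p1
            hist.append(outcome)
        total_p += p_path
        exp_val += p_path * payoff(path)
    return exp_val, total_p
```

Whatever memory rule is plugged in, the total mass comes out to 1, which is the analytically tractable, martingale-like property described above; exhaustive enumeration is only feasible for short trees, which is why the real computation propagates expectations step by step instead.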



TO DO (based on funding):

  • Offer different input-method options
  • Extend the number of forward steps
  • Extend the number of historical steps
  • Allow a different parameter set for each step, up to a cycle of 12 steps
  • Include term lengths for loans
  • Include a breakdown of the number of events at each step