Showing posts from October, 2024

Introduction to Markov Chains

Introduction

Markov chains are a way of modeling the probability of moving between two or more "states" using only the current state. They are typically run over thousands of data points to estimate what the next state will be. Let's take the diagram below as an example.

(Diagram credit: Medium)

Suppose we feed the past 5 years of weather data into this simulation. The three big circles we have identified are what we call "states." The arrows give the probability of moving from one state to another over a time interval of your choosing; we will use a one-day interval. In our case, we have three weather states: rainy, sunny, and cloudy. The chance of the weather going from sunny to cloudy is 40% (as represented by the arrow). The chance of going from sunny to rainy is 10%. It could also stay sunny the next day, and that chance is 50%. Notice that the probabilities leaving the sunny state add up to 100%; this holds for every other state as well.

T...
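The weather chain above can be sketched in a few lines of Python. The "sunny" row below uses the 50/40/10 probabilities from the post; the "cloudy" and "rainy" rows are made-up placeholder values (the diagram's numbers for those states aren't given here), so treat them as assumptions. The only rule is that each row must sum to 100%.

```python
import random

# Transition probabilities for the three weather states, one day per step.
# The "sunny" row matches the post; the other two rows are assumed examples.
transitions = {
    "sunny":  {"sunny": 0.5, "cloudy": 0.4, "rainy": 0.1},
    "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
    "rainy":  {"sunny": 0.2, "cloudy": 0.5, "rainy": 0.3},
}

def next_state(state):
    """Pick tomorrow's weather using only today's state."""
    states, probs = zip(*transitions[state].items())
    return random.choices(states, weights=probs, k=1)[0]

def simulate(start, days):
    """Simulate a run of daily weather states starting from `start`."""
    chain = [start]
    for _ in range(days):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("sunny", 7))  # e.g. a week of simulated weather
```

Because each step looks only at the current state, this is exactly the "memoryless" behavior that defines a Markov chain.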