Introduction to Markov Chains
Introduction
Markov Chains are just a fancy way of estimating the probability of moving between two or more things ("states") using only the current state. They are typically built from thousands of pieces of data to estimate what the next state will be. Let's take below as an example,
(Weather state-transition diagram; image credit: Medium)
Suppose we feed the past 5 years' data on the weather into this simulation.
The three big circles that we have identified are what we call "states." The arrows give the probability of moving from one state to another over a time interval you choose; we will use a one-day interval. In our case, we have three states of the weather: rainy, sunny, and cloudy. The chance of the weather going from sunny to cloudy is 40% (as represented by the arrow), and the chance of going from sunny to rainy is 10%. It could also stay sunny the next day, and that chance is 50%. Notice that the probabilities leaving the Sunny state all add up to 100%. This holds for all the other states as well.
This could also be represented in a table, with one row per current state and one column per possible next state.
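If you prefer code to pictures, here is a minimal sketch of that table as a nested dictionary in Python. Only the Sunny row comes from the diagram above; the Rainy and Cloudy rows are placeholder values made up for illustration.

```python
import random

# Transition table for the weather example.
# The "Sunny" row comes from the diagram above; the "Rainy" and "Cloudy"
# rows are placeholder values, since the post only gives the Sunny row.
transitions = {
    "Sunny":  {"Sunny": 0.50, "Cloudy": 0.40, "Rainy": 0.10},
    "Cloudy": {"Sunny": 0.30, "Cloudy": 0.40, "Rainy": 0.30},  # placeholder
    "Rainy":  {"Sunny": 0.20, "Cloudy": 0.50, "Rainy": 0.30},  # placeholder
}

# Each row should add up to 1.0 (100%), just like in the diagram.
for state, row in transitions.items():
    assert abs(sum(row.values()) - 1.0) < 1e-9, f"{state} row does not sum to 1"

def next_state(current: str) -> str:
    """Pick tomorrow's weather given today's, using the current row's probabilities."""
    states = list(transitions[current].keys())
    weights = list(transitions[current].values())
    return random.choices(states, weights=weights)[0]

# Simulate a week of weather starting from a sunny day.
day = "Sunny"
forecast = [day]
for _ in range(6):
    day = next_state(day)
    forecast.append(day)
print(forecast)
```

Each run of the simulation can produce a different week, which is exactly the point: the chain encodes probabilities, not certainties.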
Why?
Markov Chains also have a built-in way of accounting for uncertainty. Stocks, for example, can be extremely volatile in some cases (take the 2008 Stock Market Crash), and Markov Chains are good at representing that kind of randomness. They are a building block of stochastic modeling and prediction, and they are found in several different places:
1. Finance/Economics
2. Betting (e.g. sports betting)
3. Biology (e.g. predicting what the next base in a DNA sequence could be)
4. What lunch the cafeteria is going to have (👀)
5. Game Theory
6. Natural Language Processing (e.g. guessing what the next word in a paragraph will be; see the sketch after this list)
7. Chemistry (e.g. modeling chemical reactions)
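To make item 6 concrete, here is a minimal word-level sketch: count which word follows which in some text, then sample the next word from those counts. The training sentence here is a made-up toy; a real model would be built from a large corpus.

```python
import random
from collections import defaultdict

# Toy word-level Markov chain built from bigram counts.
text = "it was sunny and then it was cloudy and then it was rainy"
words = text.split()

# Count how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
for current, nxt in zip(words, words[1:]):
    counts[current][nxt] += 1

def next_word(current: str) -> str:
    """Sample the next word in proportion to how often it followed `current`."""
    followers = counts[current]
    choices = list(followers.keys())
    weights = list(followers.values())
    return random.choices(choices, weights=weights)[0]

print(next_word("was"))  # one of "sunny", "cloudy", or "rainy"
```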
Pros/Cons
Pros
1. Simple and easy to use right out of the gate.
2. Handles uncertainty naturally, which many deterministic models do not.
3. Flexible in how the transitions can be modeled.
4. Markov chains express probabilities more naturally than many combinatorial methods (which focus on counting how objects can be arranged).
Cons
1. They aren't good at "understanding" content; the next state depends only on the current one (one reason modern language models use attention mechanisms instead).
2. They can't know every possible state or predict every real-world scenario (e.g. a biology model might not account for a pathogen already in the bloodstream).
3. They require a lot of data to estimate transition probabilities properly (see the counting sketch after this list).
4. Structuring your data into states and the "lines" (transitions) between them can be difficult; it is not always clear where one state ends and the next begins.
5. Markov properties only hold for the specific problem you modeled. A chain built for one task can't be reused for a very different kind of probabilistic modeling (e.g. there is no easy bridge from a weather chain to an AI assistant for low-level tasks such as creating a to-do list).
6. That same simplicity means complex processes can be extremely difficult to model (and, in most cases, are better modeled by something other than Markov Chains).
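Con #3 is worth making concrete. The sketch below estimates transition probabilities by counting how often each state follows another in an observed sequence and then normalizing each row; the observation sequence is made up, and with so few data points the estimates would be far too noisy to trust.

```python
from collections import Counter, defaultdict

# Estimate transition probabilities from an observed sequence of states.
# The observations here are made up purely for illustration.
observations = ["Sunny", "Sunny", "Cloudy", "Rainy", "Cloudy", "Sunny", "Sunny", "Cloudy"]

# Count each observed (today -> tomorrow) transition.
counts = defaultdict(Counter)
for today, tomorrow in zip(observations, observations[1:]):
    counts[today][tomorrow] += 1

# Normalize the counts into probabilities, one row per state.
estimated = {
    state: {nxt: n / sum(row.values()) for nxt, n in row.items()}
    for state, row in counts.items()
}
print(estimated)
# With only 8 observations these estimates are very noisy, which is why
# Markov Chains need a lot of data (con #3) to estimate transitions well.
```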