Uncertainty - Markov Chain
Introduction:
A Markov chain is a sequence of random variables where the distribution of each variable follows the Markov assumption: each event in the chain occurs based only on the probability of the event immediately before it. More generally, a Markov chain is a mathematical system that transitions from one state to another according to certain probabilistic rules. The defining characteristic of a Markov chain is that no matter how the process arrived at its present state, the probability distribution over the possible future states is fixed. In other words, the probability of transitioning to any particular state depends solely on the current state, not on the sequence of states that came before it.
To start constructing a Markov chain, we need a transition model that specifies the probability distribution of the next event based on the possible values of the current event. Suppose we have a transition model for rainy and sunny days: if today is a sunny day, the probability that tomorrow is sunny is 0.8, and if today is a rainy day, the probability that tomorrow is rainy is 0.7. Using this kind of transition model, it is possible to sample a Markov chain: start with a day being either rainy or sunny, and then sample the next day based on the probability of it being sunny or rainy given today's weather.
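The sampling idea described above can be sketched in plain Python before turning to any library. This is a minimal sketch: the helper name sample_chain, the starting state, and the fixed seed are assumptions for illustration, while the transition probabilities are the ones from the text.

```python
import random

# Transition model from the text:
# P(sun tomorrow | sun today) = 0.8, P(rain tomorrow | rain today) = 0.7
transitions = {
    "sun": {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def sample_chain(start, length, seed=None):
    """Sample a Markov chain of `length` states, beginning at `start`."""
    rng = random.Random(seed)
    state = start
    chain = [state]
    for _ in range(length - 1):
        options = list(transitions[state])
        weights = [transitions[state][s] for s in options]
        # The next state is chosen using only the current state
        # (the Markov assumption).
        state = rng.choices(options, weights=weights)[0]
        chain.append(state)
    return chain

print(sample_chain("sun", 10, seed=1))
```

Each step looks only at `transitions[state]`, which is exactly the Markov assumption: the history before the current day never enters the computation.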
This is an Artificial Intelligence course conducted at City University by Nuruzzaman Faruqui. This is the best course in Bangladesh, because Sir explains all the necessary things to us well. For each problem we deal with in the lab, Sir first gives us a very good idea of the problem using many examples. After that, he explains the solutions: how we can solve the problem and how they work. Because of this, we understand everything very easily.
Problem:
The problem is :
In this picture, we see a transition model for a rainy day and a sunny day. If it is a sunny day, the probability of sun tomorrow is 0.8 and the probability of rain is 0.2; if it is a rainy day, the probability of sun is 0.3 and the probability of rain is 0.7. Using this transition model, it is possible to sample a Markov chain like this.
The Python code to implement sampling:
from pomegranate import *

# Define starting probabilities
start = DiscreteDistribution({
    "sun": 0.5,
    "rain": 0.5
})

# Define transition model
transitions = ConditionalProbabilityTable([
    ["sun", "sun", 0.8],
    ["sun", "rain", 0.2],
    ["rain", "sun", 0.3],
    ["rain", "rain", 0.7]
], [start])

# Create Markov chain
model = MarkovChain([start, transitions])

# Sample 10 states from chain
chain = model.sample(10)
print(chain)
Result:
If everything is fine, the program prints a list of ten sampled states, each either "sun" or "rain".
Because we sampled 10 states from the chain, we get 10 states in this Markov chain. If we pass 50 instead, we will get 50 states in the Markov chain.
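To see the effect of a longer sample, here is a small sketch in plain Python (not the pomegranate code above; the seed and starting state are assumptions for illustration, and the transition model is the one from the text) that samples 50 states and counts how often each appears:

```python
import random
from collections import Counter

# Same transition model as in the text
transitions = {
    "sun": {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

rng = random.Random(0)   # fixed seed (assumption, for repeatability)
state = "sun"            # assumed starting state
chain = [state]
for _ in range(49):      # 49 more transitions -> 50 states in total
    options = list(transitions[state])
    weights = [transitions[state][s] for s in options]
    state = rng.choices(options, weights=weights)[0]
    chain.append(state)

print(len(chain))      # 50 states, because we asked for 50
print(Counter(chain))  # how many states were "sun" and how many "rain"
```

Changing the loop count changes the chain length in the same way that changing the argument to model.sample changes the number of sampled states.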