Chathumal, RAK; Dias, WPS
[Conference-Abstract]
A Markov chain is a model of a sequence of event transitions in which the transition from one event to another occurs with a fixed probability over a fixed time step. In general, both the events and time steps are ...
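The fixed-probability transition process described in the abstract can be sketched minimally as follows. The two-state example and its probabilities are illustrative assumptions, not taken from the paper:

```python
import random

# Illustrative two-state chain: each row gives the fixed probabilities of
# moving from the current event to each possible next event in one time step.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng=random.random):
    """Advance the chain one time step using the fixed transition probabilities."""
    r = rng()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point round-off

def simulate(start, n_steps, seed=0):
    """Generate a realisation of the chain for n_steps transitions from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], rng.random))
    return states
```

Because the transition probabilities depend only on the current state, the simulated sequence exhibits the memoryless property that characterises a Markov chain.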