Author_Institution :
Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore
Abstract :
Various Markov models have been proposed to model individuals' mobility, i.e., the transitions between locations. Although these studies achieve high accuracy in predicting individuals' next move, two basic assumptions underlying them, namely the stationarity of individuals' mobility sequences and the dependency among visits to locations, have never been validated. Moreover, a well-known recent study suggests that individuals revisit locations merely according to visitation frequency. In this paper, we study these two presumed properties of individuals' mobility sequences. Specifically, stationarity is validated by analyzing the autocorrelation of the mobility sequences encoded under three popular schemes. With an appropriate choice of spatial and temporal scales, two types of periodicity, namely daily and weekly, are observed. The visiting dependency between locations is validated by comparing the entropy of the mobility sequence with the entropies of sequences generated from either i.i.d. or Markov sources. Based on these two statistical properties, we then construct two Markov models with three revisiting rules to model individuals' mobility. The average log-loss values show that the best revisiting rule for the Markov model is a combination of the visitation frequency and the distance from the next location to the current location. This finding suggests that individuals' mobility is influenced by both temporal and spatial localities, rather than by either form of locality alone, as suggested in the previous study.
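The abstract names a few concrete quantities: autocorrelation of an encoded mobility sequence, entropy compared against i.i.d. and Markov baselines, and the average log-loss of a frequency-based revisiting rule. The following is a minimal Python sketch of how such quantities could be computed on a toy encoded sequence; the hourly integer encoding, the add-alpha smoothing, and all function names are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def autocorrelation(seq, lag):
    """Sample autocorrelation of an encoded (numeric) mobility sequence at a given lag."""
    x = np.asarray(seq, dtype=float)
    x = x - x.mean()
    if lag == 0:
        return 1.0
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

def empirical_entropy(seq):
    """Entropy (bits) of the empirical location distribution, i.e., an i.i.d. baseline."""
    _, counts = np.unique(seq, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def markov_entropy_rate(seq):
    """Entropy rate (bits) of a first-order Markov chain fitted to the sequence."""
    states = {s: i for i, s in enumerate(sorted(set(seq)))}
    n = len(states)
    trans = np.zeros((n, n))
    for a, b in zip(seq[:-1], seq[1:]):
        trans[states[a], states[b]] += 1
    row_sums = trans.sum(axis=1, keepdims=True)
    probs = np.divide(trans, row_sums, out=np.zeros_like(trans), where=row_sums > 0)
    stationary = row_sums.ravel() / row_sums.sum()   # empirical state frequencies as a stand-in
    with np.errstate(divide="ignore", invalid="ignore"):
        logp = np.where(probs > 0, np.log2(probs), 0.0)
    return -np.sum(stationary[:, None] * probs * logp)

def avg_log_loss_frequency_rule(seq, alpha=1.0):
    """Average log-loss (bits per visit) of a revisiting rule that predicts the next
    location in proportion to past visitation frequency, with add-alpha smoothing.
    Assumes the location set is known in advance (an illustrative simplification)."""
    locations = sorted(set(seq))
    counts = {loc: alpha for loc in locations}
    loss, total = 0.0, alpha * len(locations)
    for loc in seq:
        loss += -np.log2(counts[loc] / total)
        counts[loc] += 1
        total += 1
    return loss / len(seq)

# Toy hourly sequence with a daily (24-step) repeat: home -> work -> gym -> home.
day = [0] * 8 + [1] * 9 + [2] * 2 + [0] * 5
seq = day * 14                                   # two weeks of hourly locations
print("autocorr @ lag 24 :", round(autocorrelation(seq, 24), 3))
print("i.i.d. entropy    :", round(empirical_entropy(seq), 3), "bits")
print("Markov entropy    :", round(markov_entropy_rate(seq), 3), "bits")
print("freq-rule log-loss:", round(avg_log_loss_frequency_rule(seq), 3), "bits/visit")
```

On a strongly periodic sequence like the toy one above, the lag-24 autocorrelation is high and the Markov entropy rate falls well below the i.i.d. entropy, which is the kind of contrast the paper's stationarity and dependency tests are designed to detect.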
Keywords :
Markov processes; mobile computing; statistical analysis; Markov models; average log-loss values; individual mobility sequence; revisiting rules; spatial localities; spatial scale; stationarity; statistical properties; temporal localities; temporal scale; visitation frequency; Accuracy; Context; Correlation; Entropy; Hidden Markov models; Predictive models; Data mining; Information entropy