The Moving Cloud: Predictive Placement in the Wild

2012 
Latency presents an enduring, and worsening, challenge to mobile systems designers. With the increasing adoption of cellular devices as the primary avenue of network connectivity for many users around the world, the "reach" of latency as a first-class concern is extending. Bandwidth grows roughly as the square of latency, while storage capacity is growing faster still [16]. There are several reasons for this. The instantaneous nature of latency makes it hard to improve the metric by simply packing more bits on the wire. To make things worse, additional devices on the network, such as firewalls and switches, add delay to packets while minimally impacting aggregate bandwidth. Finally (and perhaps most importantly), bandwidth is simply easier to sell in the marketplace.

Unfortunately, even in systems with an adequate balance between latency and bandwidth (and such systems will become increasingly rare), humans are acutely sensitive to delay and jitter. Performance analyses of interactive applications, ranging from video streaming to interactive web services, show that a modest increase in latency can make a session noticeably annoying or even unusable [14]. At a few hundred milliseconds of latency, all too common in the cellular networks that often provide developing-region connectivity, users report that many applications become noticeably annoying [23]. For highly interactive applications, user experience degrades significantly much sooner.

The latency problem is even more pronounced in the challenged network environments endemic to developing regions, where resources are limited to begin with. With cellular links and shared dial-up connections in Internet kiosks as the typical ways to connect to the Internet [17], developing countries face significant challenges in network access. This makes even simple network tasks unpleasant, and rich media prohibitively difficult.
Working through an interactive session in one of these kiosks can be charitably described as frustrating [19]. Provisioning data and computational support as close to demand as possible is the key to solving this problem [20]. To address this provisioning problem, we propose the moving cloud, a framework for proactive data delivery. The moving cloud leverages route fingerprints in individual mobility, together with users' contextualized data-access behavior, for predictive data placement. Essentially, we trade bandwidth and storage for latency, exchanging the resources that grow most quickly for the one that grows most slowly. The moving cloud alleviates the latency problem by proactively placing content where it will be needed in the near future, so that resources are close at hand and readily available when the user requests them. This paradigm enables a number of networking scenarios, including bulk data access, mobile resource augmentation, on-demand social networks, and personal content distribution.

People are creatures of habit: they move in repeated patterns that can be learned probabilistically. As such, several models have been proposed for predicting human mobility [8, 21, 22]. Given the history of locations a user has visited, these models typically predict the user's next location. Our framework employs a new approach that augments these predictions with time bounds, producing actionable information for data placement. This temporal component of mobility is crucial, as data delivery often carries a freshness constraint. Our approach can be combined with most existing location predictors, enhancing them with an expected-time dimension. For example, coupled with a second-order Markov model [22], our system predicted time of arrival to within an hour for more than 90% of the hits in a sample dataset. Another important observation is the contextual nature of data use: not only do people move in repeated patterns, they also access data in a habitual manner.
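The text does not give the predictor's implementation; the following is a minimal sketch of how a second-order Markov location model might be augmented with an expected-time dimension, as described above. All class, method, and variable names here are illustrative, not taken from the paper: transitions are counted per (previous, current) location pair, and observed transit times are averaged to attach an expected time of arrival to each prediction.

```python
from collections import defaultdict

class OrderTwoMarkovPredictor:
    """Second-order Markov location predictor augmented with
    expected transit times (illustrative sketch, not the paper's code)."""

    def __init__(self):
        # (prev, cur) -> {next_location: transition count}
        self.trans = defaultdict(lambda: defaultdict(int))
        # (prev, cur, next_location) -> observed transit times (hours)
        self.times = defaultdict(list)

    def observe(self, prev, cur, nxt, transit_hours):
        """Record one observed move cur -> nxt (in context prev) and how long it took."""
        self.trans[(prev, cur)][nxt] += 1
        self.times[(prev, cur, nxt)].append(transit_hours)

    def predict(self, prev, cur):
        """Return (most likely next location, mean transit time in hours),
        or None if the (prev, cur) context has never been seen."""
        counts = self.trans.get((prev, cur))
        if not counts:
            return None
        nxt = max(counts, key=counts.get)
        observed = self.times[(prev, cur, nxt)]
        eta = sum(observed) / len(observed)
        return nxt, eta
```

A placement engine could use the returned (location, expected time) pair to decide whether prefetched data would still satisfy its freshness constraint when the user arrives.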
Data accessed in one context, say during school hours, is often different from data accessed in another, say while at