Automobiles have become one of the necessities of modern life, but they have also introduced numerous traffic accidents that threaten drivers and other road users. Most state-of-the-art safety systems are passively triggered, reacting to dangerous road conditions or driving maneuvers only after they happen and are observed, which greatly limits the last chance for collision avoidance. Timely tracking and prediction of driving maneuvers call for a more direct interface beyond the traditional steering wheel, brake, and gas pedal. In this paper, we argue that a driver's eyes are that interface, as they are the first and essential window for gathering external information during driving. Our experiments suggest that a driver's gaze patterns appear prior to, and correlate with, driving maneuvers, making them suitable for maneuver prediction. We accordingly present GazMon, an active driving maneuver monitoring and prediction framework for driving assistance applications. GazMon extracts gaze information through a front camera and analyzes facial features, including facial landmarks, head pose, and iris centers, through a carefully constructed deep learning architecture. Both our on-road experiments and driving-simulator-based evaluations demonstrate the superiority of GazMon in predicting driving maneuvers as well as other distracted behaviors. It is readily deployable using RGB cameras and allows reuse of existing smartphones toward safer driving.
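To make the pipeline concrete, below is a minimal sketch (in PyTorch) of how per-frame facial features could feed a sequence model for maneuver prediction. The feature layout (68 landmark coordinates, 3 head-pose angles, 2 iris centers), the LSTM architecture, and the maneuver classes are illustrative assumptions; the abstract does not specify GazMon's actual network.

```python
# Hypothetical sketch: per-frame gaze features -> maneuver logits.
import torch
import torch.nn as nn

# Assumed maneuver classes, not GazMon's actual label set.
MANEUVERS = ["straight", "left_turn", "right_turn", "left_lane_change", "right_lane_change"]

class GazeManeuverNet(nn.Module):
    # feat_dim assumes 68 landmarks x 2 + 3 head-pose angles + 2 iris centers x 2 = 143.
    def __init__(self, feat_dim=143, hidden=64, n_classes=len(MANEUVERS)):
        super().__init__()
        # Per-frame embedding of the concatenated facial features.
        self.embed = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU())
        # Temporal model over the gaze-pattern sequence preceding a maneuver.
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, frames, feat_dim)
        h = self.embed(x)
        _, (h_n, _) = self.lstm(h)   # last hidden state summarizes the clip
        return self.head(h_n[-1])    # maneuver logits

# Usage: 2 clips of 30 frames each, with random features as placeholders.
model = GazeManeuverNet()
logits = model(torch.randn(2, 30, 143))
print(logits.argmax(dim=1))          # predicted maneuver index per clip
```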
The 3G/4G cellular networks, as well as the emerging 5G, have led to explosive growth of mobile services across global markets. Massive numbers of base stations have been deployed to satisfy the demands on service quality and coverage, and their quantity will only grow in the foreseeable future. With many base stations deployed in remote rural areas, maintaining high service availability becomes quite challenging. In particular, these stations can suffer from frequent power outages. After such disasters as hurricanes or snow storms, power recovery can often take several days or even weeks, during which a backup battery becomes the only power source. Although power outages are rare in metropolitan areas, backup batteries are still necessary for base stations there, as any service interruption can cause unaffordable losses. Given that the backup battery group installed on a base station is usually the only power source during power outages, its working condition has a critical impact on the service availability of the base station. In this paper, we conduct a systematic analysis of a real-world dataset collected from the battery groups installed on the base stations of China Mobile Ltd., and we propose an event-driven battery profiling approach to precisely extract the features that cause the degradation of a battery group's working condition. We formulate prediction models for both battery voltage and lifetime and propose a series of solutions to yield accurate outputs. Through real-world trace-driven evaluations, we demonstrate that our approach can boost cellular network service availability by up to 18.09%.
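As a concrete illustration of event-driven profiling, the sketch below groups discharge samples into events, summarizes each event into a feature vector, and fits a regressor for remaining lifetime. The event definition (a contiguous run of negative-current samples), the feature set, and the gradient-boosting model are assumptions for illustration, not the paper's actual formulation.

```python
# Hypothetical sketch of event-driven battery profiling.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def discharge_events(records):
    """Group consecutive discharging samples (assumed: current < 0) into events."""
    event = []
    for rec in records:                      # rec: dict with t, voltage, current, temp
        if rec["current"] < 0:
            event.append(rec)
        elif event:
            yield event
            event = []
    if event:
        yield event

def event_features(event):
    """Summarize one discharge event into a fixed-length feature vector."""
    v = np.array([r["voltage"] for r in event])
    i = np.array([r["current"] for r in event])
    temp = np.mean([r["temp"] for r in event])
    duration = event[-1]["t"] - event[0]["t"]
    depth = v[0] - v.min()                   # voltage drop as a proxy for discharge depth
    return [duration, depth, i.mean(), temp]

# Toy training flow: one feature row per event, labeled with the battery
# group's observed remaining lifetime (e.g., in days). Values are placeholders.
X = np.array([[3600, 1.2, -10.5, 25.0], [7200, 2.3, -12.1, 31.0]])
y = np.array([420.0, 260.0])
model = GradientBoostingRegressor().fit(X, y)
print(model.predict(X))
```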
Base stations play a key role in today's cellular networks. Their reliability and availability heavily depend on the electrical power supply. The modern power grid is known to be highly reliable, but it still suffers from outages due to severe weather or human-driven accidents, particularly in remote areas. Most base stations are thus equipped with backup battery groups. Given their limited number and capacity, however, these batteries can hardly sustain a long power outage without a proper allocation strategy. A deep discharge will also accelerate battery degradation and eventually contribute to a higher battery replacement cost.
The explosion of online shopping brings great challenges to the traditional logistics industry, where massive parcel volumes and tight delivery deadlines impose a large cost on the delivery process, in particular the last-mile parcel delivery. On the other hand, modern cities never lack transportation resources such as private car trips. Motivated by these observations, we propose a novel and effective last-mile parcel delivery mechanism through car trip sharing, which leverages available private car trips to incidentally deliver parcels during their original trips. The major challenges lie in how to accurately estimate the cost of a parcel delivery trip and how to assign tasks to suitable car trips so as to maximize the overall performance. To this end, we develop Car4Pac, an intelligent last-mile parcel delivery system that addresses these challenges. Leveraging massive real-world car trip trajectories, we first build a 3D (time-dependent, driver-dependent, and vehicle-dependent) landmark graph that accurately predicts the travel time and fuel consumption of each road segment. Our prediction method considers not only traffic conditions at different times, but also the driving skills of different people and the fuel efficiencies of different vehicles. We then develop a two-stage solution for parcel delivery task assignment, which is optimal for one-to-one assignment and yields high-quality results for many-to-one assignment. Our extensive real-world trace-driven evaluations further demonstrate the superiority of our Car4Pac solution.
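For the optimal one-to-one stage, a standard choice is the Hungarian method over a detour-cost matrix; the sketch below uses scipy's linear_sum_assignment as one such solver. Whether Car4Pac uses this exact algorithm, and the cost values shown, are assumptions for illustration.

```python
# Hypothetical sketch of the one-to-one trip/parcel assignment stage.
import numpy as np
from scipy.optimize import linear_sum_assignment

INFEASIBLE = 1e9   # sentinel cost for pairs whose detour would break the deadline

def assign(cost):
    """cost[i][j]: predicted extra time + fuel for trip i to carry parcel j."""
    rows, cols = linear_sum_assignment(cost)   # Hungarian method: optimal pairing
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < INFEASIBLE]

# Toy example: 3 car trips, 3 parcels; trip 1 cannot meet parcel 2's deadline.
cost = np.array([
    [4.0, 7.5, 3.2],
    [6.1, 2.4, INFEASIBLE],
    [5.0, 6.6, 4.9],
])
print(assign(cost))   # minimum-total-cost pairing of trips to parcels
```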
Edge computing is a novel paradigm designed to improve the quality of service for latency-sensitive cloud applications. However, state-of-the-art edge services are designed for specific applications and are isolated from each other. To improve the utilization of edge nodes, public resource sharing among edges from distinct service providers should be economically encouraged. In this work, we employ payment channel techniques to design and implement EdgeToll, a blockchain-based toll collection system for heterogeneous public edge sharing. A testbed has been developed to validate the proposal, and preliminary experiments have been conducted to demonstrate the time and cost efficiency of the system.
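The sketch below illustrates the payment-channel idea EdgeToll builds on: micro-payments are exchanged as signed off-chain state updates, and only the final state is settled on chain. The class, its fields, and the HMAC stand-in for asymmetric signatures are all illustrative assumptions, not EdgeToll's actual protocol.

```python
# Hypothetical sketch of an off-chain payment channel for edge-resource tolls.
import hmac, hashlib, json

class PaymentChannel:
    def __init__(self, payer_deposit, payer_key):
        self.deposit = payer_deposit       # escrowed on chain when the channel opens
        self.paid = 0                      # cumulative amount owed to the edge provider
        self.nonce = 0                     # strictly increasing, prevents replay
        self.key = payer_key               # HMAC key as a stand-in for a signing key

    def pay(self, amount):
        """Off-chain micro-payment: a new signed state, no blockchain transaction."""
        assert self.paid + amount <= self.deposit, "channel exhausted"
        self.paid += amount
        self.nonce += 1
        state = json.dumps({"paid": self.paid, "nonce": self.nonce}).encode()
        sig = hmac.new(self.key, state, hashlib.sha256).hexdigest()
        return state, sig                  # provider keeps the latest signed state

# The provider later submits only the highest-nonce state on chain to settle.
chan = PaymentChannel(payer_deposit=100, payer_key=b"payer-secret")
for fee in (3, 5, 2):                      # three metered edge-usage payments
    state, sig = chan.pay(fee)
print(state, sig[:16], "...")
```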
Cognitive communication and computing have seen deep penetration into many networking areas in the past decades. With the recent advances in big data analysis and deep learning, we see great potential in exploring cognitive intelligence for a wide range of applications. A notable example is human activity recognition, especially through RFID. Existing RFID activity identification solutions are mostly designed for static or slowly moving targets, rendering them far from satisfactory. More importantly, we observe that they suffer serious performance degradation in typical indoor environments with multipath interference. In this article, we argue that the recent advance of deep learning brings new cognitive intelligence to human activity identification. We first review the literature and the research challenges posed by multipath effects in indoor environments. Then we introduce an advanced RFID activity identification framework, DeepTag, which uses a deep-learning-based approach for activity identification in multipath-rich environments. DeepTag gathers massive phase information from multiple tags and preprocesses it to extract such key features as the pseudospectrum and periodogram. We feed the preprocessed signal power and angle information into a deep learning architecture that combines a convolutional neural network (CNN) and a long short-term memory (LSTM) network. Our DeepTag framework adapts well to both tag-attached and tag-free activity identification scenarios. Our extensive experiments further demonstrate its superiority in activity identification in multipath-rich environments.
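The sketch below shows one way the described CNN + LSTM combination could be wired up in PyTorch, with the preprocessed pseudospectrum/periodogram features stacked as input channels. The layer sizes, channel count, and number of activity classes are illustrative assumptions rather than DeepTag's exact configuration.

```python
# Hypothetical sketch of a CNN + LSTM activity classifier over RFID features.
import torch
import torch.nn as nn

class DeepTagNet(nn.Module):
    def __init__(self, in_channels=8, n_activities=6):
        super().__init__()
        # CNN extracts local patterns from the multi-tag RFID feature channels.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models how those patterns evolve over the activity's duration.
        self.lstm = nn.LSTM(32, 64, batch_first=True)
        self.head = nn.Linear(64, n_activities)

    def forward(self, x):               # x: (batch, channels, time)
        h = self.cnn(x)                 # (batch, 32, time/2)
        h = h.transpose(1, 2)           # (batch, time/2, 32) for the LSTM
        _, (h_n, _) = self.lstm(h)
        return self.head(h_n[-1])       # activity logits

# Usage: 4 feature windows, 8 channels, 200 time steps -> (4, 6) logits.
model = DeepTagNet()
print(model(torch.randn(4, 8, 200)).shape)
```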
Base stations have been widely deployed to satisfy the service coverage and explosive demand increase in today's cellular networks. Their reliability and availability heavily depend on the electrical power supply. Battery groups are installed as backup power in most base stations in case of power outages due to severe weather or human-driven accidents, particularly in remote areas. The limited number and capacity of batteries, however, can hardly sustain a long power outage without a well-designed allocation strategy. As a result, service interruptions occur, along with increasing maintenance costs. Meanwhile, a deep discharge of a battery in such cases can also accelerate battery degradation and eventually contribute to a higher battery replacement cost. In this paper, we closely examine the base station features and backup battery features from a 1.5-year dataset of a major cellular service provider, covering 4,206 base stations distributed across 8,400 square kilometers and more than 1.5 billion records on base station and battery statuses. By exploiting the correlations between battery working conditions and battery statuses, we build a deep-learning-based model to estimate the remaining lifetime of backup batteries. We then develop BatAlloc, a battery allocation framework that addresses the mismatch between battery supporting ability and diverse power outage incidents. We present an effective solution that minimizes both the service interruption time and the overall cost. Our real-world trace-driven experiments show that BatAlloc cuts the average service interruption time from 4.7 hours to nearly zero at only 85 percent of the overall cost of the current practical allocation.
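To illustrate the allocation idea, the sketch below greedily matches each station's expected outage duration with the cheapest battery option whose predicted supporting time covers it. The greedy strategy, data fields, and numbers are illustrative assumptions; BatAlloc's actual optimization jointly minimizes interruption time and cost.

```python
# Hypothetical sketch of battery-to-station allocation.
from dataclasses import dataclass

@dataclass
class BatteryOption:
    name: str
    support_hours: float   # predicted supporting time from the lifetime model
    cost: float            # purchase + degradation cost

def allocate(stations, options):
    """stations: {station_id: expected outage hours}. Returns id -> option name."""
    plan = {}
    for sid, outage_h in sorted(stations.items(), key=lambda kv: -kv[1]):
        # Cheapest option that fully covers the expected outage;
        # fall back to the longest-lasting option if none can.
        feasible = [o for o in options if o.support_hours >= outage_h]
        best = min(feasible, key=lambda o: o.cost,
                   default=max(options, key=lambda o: o.support_hours))
        plan[sid] = best.name
    return plan

# Toy data: two battery configurations, three stations with outage estimates.
options = [BatteryOption("small", 4.0, 1.0), BatteryOption("large", 12.0, 2.5)]
stations = {"BS-001": 3.2, "BS-002": 9.5, "BS-003": 1.1}
print(allocate(stations, options))   # e.g. {'BS-002': 'large', 'BS-001': 'small', ...}
```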