In recent years, online education has matured into a recognised and heavily used alternative for delivering higher education programmes. Beyond its benefits, however, online education faces a number of challenges, some of which relate to engagement and its impact on student performance. To support ongoing research into these complex relationships, this study investigated the relationship between engagement and academic performance for students who undertake standalone online programmes. The input data comprised module content engagement records collected from an e-learning platform, including the number of content views, forum posts, completed assignments, and video views. Pearson correlation was used to evaluate the relationship between learner engagement and academic performance. The analysis revealed that student engagement was positively correlated with student performance, both for individual modules and across the cohort. In addition, the correlation between initial engagement with individual subjects and overall engagement was also strong, indicating that both variables are associated with improved academic results.
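The Pearson correlation mentioned above can be computed directly from paired observations. The sketch below is illustrative only: the engagement counts and marks are invented figures, not data from the study.

```python
# Hypothetical illustration of Pearson correlation between an engagement
# metric and final marks; all sample values below are invented.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented example: content views per student vs final module mark.
content_views = [12, 30, 45, 18, 60, 25]
final_marks   = [48, 62, 70, 55, 78, 58]

r = pearson(content_views, final_marks)
print(f"r = {r:.3f}")  # a value near +1 indicates a strong positive association
```

A coefficient close to +1, as in this toy sample, is what the study reports at both module and cohort level; values near 0 would indicate no linear relationship.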
Identity and Access Management (IAM) is an area posing significant challenges, particularly in the context of remote connectivity and distributed or cloud-based systems. Prior research has proposed a wide range of technical solutions, but integrating them in the commercial sector involves steps that significantly hamper their acceptance. This study aims to outline the current perception of, and security issues associated with, IAM solutions from the perspective of their beneficiaries. The analysis relies on a series of interviews with 45 cyber security professionals from different organisations around the world. The results showed that cloud IAM solutions and on-premises IAM solutions are affected by different issues. The main challenges for cloud-based IAM solutions were default configurations, poor management of non-human identities such as service accounts, poor certificate management, poor API configuration, and limited log analysis. In contrast, the challenges for on-premises solutions were multi-factor authentication, insecure default configurations, a lack of the skill sets required to manage IAM solutions securely, poor password policies, unpatched vulnerabilities, and compromise of single sign-on leading to the compromise of multiple entities. The study also determined that, regardless of the evolving functionality of cloud-based IAM solutions, 41% of respondents believe that on-premises solutions are more secure than cloud-based ones. As respondents pointed out, cloud IAM may expose organisations to a wider range of vulnerabilities due to the complexity of the underlying solutions, challenges with managing permissions, and compliance with dynamic IAM policies.
Software-defined networking (SDN) provides a centralised control framework with real-time control of network components, residential customer routers in particular, enabling automated per-user bandwidth allocation. However, employing dynamic traffic shaping for efficient bandwidth utilisation among residential users is a challenging task. In this context, understanding the application usage requirements of each individual user and translating them into network policies requires expertise beyond that of most residential users. This paper proposes a user-centric traffic optimisation scheme that profiles users based on their application trends, recorded using generic NetFlow records, to provide a better view of per-user utilisation. We also propose an SDN traffic monitoring and management application that implements Linux-based hierarchical token bucket (HTB) queues customised for individual user profiles in real time, according to user-defined priorities. The traffic management scheme scales well under both upstream and downstream network congestion by dynamically allocating dedicated bandwidth to users based on their profile priority, resulting in decreased packet loss and latency for a selected set of high-priority users.
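The token-bucket mechanism underlying HTB queues can be sketched minimally as follows. This models a single class only; the paper's actual HTB hierarchy, class borrowing, and Linux tc integration are not reproduced, and the rate and burst values are invented.

```python
# Hypothetical single-class token bucket, illustrating the mechanism
# behind HTB rate limiting; rates and sizes are invented for the sketch.

class TokenBucket:
    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps       # refill rate, bytes per second
        self.burst = burst_bytes   # maximum bucket depth, bytes
        self.tokens = burst_bytes  # bucket starts full
        self.last = 0.0            # timestamp of the last refill

    def allow(self, packet_bytes, now):
        """Refill tokens for the elapsed time, then try to send a packet."""
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True   # packet conforms to the allocated rate: transmit
        return False      # over the rate: queue or drop

bucket = TokenBucket(rate_bps=1000, burst_bytes=1500)
print(bucket.allow(1500, now=0.0))   # True: initial burst allowance
print(bucket.allow(1500, now=0.5))   # False: only 500 bytes of tokens refilled
print(bucket.allow(1500, now=2.0))   # True: enough time has passed to refill
```

In the HTB setting, one such bucket (with distinct rate and ceiling) would exist per traffic class, arranged hierarchically so child classes can borrow unused bandwidth from their parent.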
The multimedia-enriched traffic carried by today's Internet continues to pose a challenge to engineers attempting to provide QoS for its users. The popular approach of deploying Diffserv solutions is typically too static to accommodate mixed user traffic. This paper introduces a novel user-centric approach that dynamically evaluates and polices incoming Internet flows to control the ratio of traffic types for individual users. After describing its theoretical rationale, the proposed method is presented as an integrated architecture (Congestion Aware Packet Scheduler, CAPS), which allows seamless integration with existing Diffserv networks. An ns-2 implementation of the CAPS architecture is presented and investigated for a number of different scenarios. The evaluation indicates that CAPS outperforms best-effort, traditional Diffserv, and weighted-RED alternatives, providing a better, dynamic QoS balance for a wide range of traffic profiles without the need for explicit, predetermined precedence on traffic types.
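The idea of policing the ratio of traffic types per user can be sketched as below. This is not the CAPS algorithm from the paper: the class names and the 0.6 share threshold are invented, and real policing would operate on sliding windows of recent traffic rather than cumulative totals.

```python
# Hypothetical per-user ratio policer in the spirit of user-centric
# scheduling: when one traffic class exceeds its share of a user's
# observed bytes, its packets are flagged for lower precedence.
# Class names and the 0.6 threshold are invented for this sketch.

from collections import defaultdict

class RatioPolicer:
    def __init__(self, max_share=0.6):
        self.max_share = max_share
        self.bytes_by_class = defaultdict(int)

    def observe(self, traffic_class, size):
        """Record a packet; return True if its class exceeds its share."""
        self.bytes_by_class[traffic_class] += size
        total = sum(self.bytes_by_class.values())
        return self.bytes_by_class[traffic_class] / total > self.max_share

policer = RatioPolicer()
policer.observe("video", 1200)         # first packet: video is all the traffic
print(policer.observe("web", 400))     # web share 400/1600 = 0.25 -> False
print(policer.observe("video", 1200))  # video share 2400/2800 = 0.86 -> True
```

Flagged packets could then be remapped to a lower Diffserv precedence, which is what makes this style of policing compatible with an existing Diffserv network.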
Quality of Service (QoS) is a key aspect of Next Generation Telecommunication Networks. A traffic-saving QoS monitoring concept, based on the virtual grouping of user terminals, has been developed. This paper identifies a bootstrap issue within this concept and introduces a mechanism supporting the reliable initialisation of the monitoring concept.
Recent years have witnessed a significant increase in the monitoring of network traffic in order to profile user behaviour and to provide better service. This paper will provide a review of these efforts, highlighting the benefits brought by traffic profiling, particularly in relation to providing a better user experience and higher quality of service. The discussion will focus on three themes: identifying and profiling applications through statistical analysis of traffic; identifying users and detecting anomalies based on network interaction; and providing fairness in a heterogeneous user environment. Profiling applications is a challenging task in the context of encryption and tunnelling, but allows better provision of network resources, in line with the needs of each application, from email to video streaming. Identifying users may raise concerns in terms of privacy, but the primary aim is not to single them out but to cater for their needs at an aggregate level, both in terms of dealing with significant variations and in potentially acting as a first line of defence when anomalies are detected. Finally, while globally the behaviour of users may appear similar, there is significant variation in the demand, usage, and expectations of each user; ensuring fairness in such a diverse environment requires acknowledging user requirements and accommodating them within an environment that is heterogeneous in both provision and demand. The presentation will draw from a number of research studies undertaken over recent years in the above areas, both across the research community and at Plymouth University, and discuss how the findings impact the wider user community.
Cloud computing, supported by advancements in virtualisation and distributed computing, has become the default option for implementing the IT infrastructure of organisations. Medical data, and medical images in particular, have increasing storage space and remote access requirements. Cloud computing satisfies these requirements, but unclear safeguards on data security can expose sensitive data to possible attacks. Furthermore, recent changes in legislation have imposed additional security constraints on technology to ensure the privacy of individuals and the integrity of data stored in the cloud. In contrast with this trend, current data security methods based on encryption create an additional performance overhead, and often they are not allowed on public cloud servers. Hence, this paper proposes a mechanism that combines data fragmentation, to protect medical images on public cloud servers, with a NoSQL database, to ensure an efficient organisation of such data. The results indicate that the latency of the proposed method is significantly lower than that of AES, one of the most widely adopted data encryption mechanisms. The proposed method therefore offers an effective trade-off in environments with low latency requirements or limited resources.
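The core fragmentation idea can be sketched as follows. This is a simplification for illustration: the fragment size is invented, the paper's actual fragmentation scheme and NoSQL document layout are not reproduced, and real fragments would be distributed so that no single server holds enough to reconstruct the image.

```python
# Hypothetical sketch of data fragmentation: an image's bytes are split
# into fixed-size, indexed fragments that could be stored as separate
# NoSQL documents, then reassembled on retrieval. The fragment size is
# invented; real deployments would use far larger chunks.

FRAGMENT_SIZE = 4  # bytes per fragment (toy value)

def fragment(data: bytes):
    """Split data into ordered (index, chunk) pairs."""
    return [(i // FRAGMENT_SIZE, data[i:i + FRAGMENT_SIZE])
            for i in range(0, len(data), FRAGMENT_SIZE)]

def reassemble(fragments):
    """Rebuild the original byte stream from (index, chunk) pairs."""
    return b"".join(chunk for _, chunk in sorted(fragments))

image = b"\x89PNG-example-bytes"   # invented stand-in for image data
parts = fragment(image)
assert reassemble(parts) == image        # lossless round trip
assert reassemble(parts[::-1]) == image  # order recovered from indices
```

Because each fragment is an opaque slice of the original bytes, no per-fragment cipher pass is needed, which is the source of the latency advantage over full AES encryption that the paper reports.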