To test the following hypothesis in the assessment of head injury: only patients with 5 min or more of post-traumatic amnesia (PTA) are at risk of acute olfactory dysfunction (OD). This was a retrospective comparative study of olfactory status in head injury patients seen at a head injury clinic at Glasgow Royal Infirmary from 1985 to 2003. Of 828 clinic attenders, 101 had acute OD. These subjects were compared with a randomly selected control group of 102 head injury patients with normal olfactory function. The main outcome measure was the likelihood of acute OD in patients with PTA lasting 5 min or more compared with those with PTA of less than 5 min. Patients with PTA of 5 min or more were significantly more likely to have acute OD than those with PTA of less than 5 min, with an odds ratio of 9.6 (p<0.01). Examination of patients with 5 min or more of PTA should therefore include a simple test of the sense of smell. Patients with impaired smell sensation should be made aware of their condition prior to discharge from hospital. In addition, the need for a CT brain scan and appropriate follow-up should be considered.
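For illustration, an odds ratio such as the one reported above is computed from a 2×2 table of exposure (PTA of 5 min or more) against outcome (acute OD). The counts below are hypothetical placeholders, not the study's data:

```python
# Odds ratio from a 2x2 table: (a/b) / (c/d) = (a*d) / (b*c).
# All counts here are hypothetical, purely to show the arithmetic.
a, b = 80, 21   # PTA >= 5 min: with acute OD, without acute OD
c, d = 20, 81   # PTA < 5 min:  with acute OD, without acute OD

odds_ratio = (a * d) / (b * c)
```

The same table also yields the confidence interval and p-value in a full analysis; the ratio itself is only the point estimate.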
The significance of the cloud environment is growing in the current digital world: it offers reduced expenses, scalability, flexibility and enhanced collaboration. At the same time, digital forensic investigations face substantial difficulty in reconciling the need for efficient data analysis with growing concerns about privacy. As investigators analyse digital evidence to unearth crucial information, they must also navigate an intricate network of privacy rules and regulations. Given the increasing prevalence of remote work and the need for businesses to adapt quickly to shifting market conditions, cloud infrastructure has become a crucial asset for organisations of all sizes, yet it complicates forensic investigation with respect to data protection, ownership and jurisdictional boundaries; these concerns become more significant as more data is kept in the cloud. This paper analyses three core challenges facing digital forensics in the cloud environment: legitimacy, complexity and the increase in data volume, and examines their implications for data that may be subject to legal scrutiny in court. We also analyse the extent to which different data formats add complexity to forensic investigations in cyberspace. These challenges contribute to the backlog in digital forensic investigations because the procedures lack modularisation. To address these concerns, a modularisation model is proposed that integrates traditional processing functions while ensuring strict adherence to privacy protocols.
To overcome these challenges, we propose modularisation as a strategy for improving the operational efficiency of future digital forensic research, addressing the identified challenges of cloud-based investigations and demonstrating how organisations can mitigate the risks associated with storing sensitive information in the cloud.
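As a sketch of what modularisation might look like in practice, the hypothetical pipeline below composes independent processing stages (privacy filtering, format normalisation) behind a common interface, so stages can be added, removed, or reordered without touching the rest. All names are illustrative and not taken from the proposed model:

```python
from typing import Callable, List

# Each module maps a list of evidence items to a list of evidence items,
# so stages compose freely and privacy checks stay an explicit step.
Module = Callable[[List[dict]], List[dict]]

def privacy_filter(items: List[dict]) -> List[dict]:
    # Drop fields flagged as personal data before analysis (illustrative).
    return [{k: v for k, v in item.items() if k != "personal"}
            for item in items]

def normalise_format(items: List[dict]) -> List[dict]:
    # Reduce heterogeneous cloud data formats to one schema (illustrative).
    return [{"source": item.get("source", "unknown"),
             "payload": item.get("payload")} for item in items]

def run_pipeline(items: List[dict], modules: List[Module]) -> List[dict]:
    for module in modules:
        items = module(items)
    return items

evidence = [{"source": "vm-log", "payload": "login event",
             "personal": "a@b.com"}]
result = run_pipeline(evidence, [privacy_filter, normalise_format])
```

The point of the sketch is the interface, not the stages: a backlog-reducing pipeline would slot jurisdiction checks or format handlers in as further modules.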
The insider threat within organisational cybersecurity continues to be of great concern globally. Current insider threat detection strategies are acknowledged as ineffective, as evidenced by the rising number of reported high-profile insider threat and cyber data loss cases arising from insider and privilege misuse. The impact of insider incidents on Financial Service (FS) organisations is vast: operationally disruptive, and costly from a regulatory, financial, and reputational perspective. Many United Kingdom (UK) FS organisations have invested in insider risk programmes, but there is no sign of the insider threat diminishing. This paper addresses the following research questions: (1) What factors influence employees to become malicious insider threats, and how do these factors apply to employees working within the UK? (2) What preventative measures could be effectively operationalised within UK FS organisations to prevent malicious insider attacks? A literature review was conducted covering 54 articles in peer-reviewed journals; additional relevant articles were incorporated to enrich the review and further substantiate the academic currency and context of the study. The review reveals five primary emerging insider threat themes, subsequently discussed: behavioural indicators, information security behaviours, technical controls, insider threat strategies, and regulation. One primary challenge throughout the literature review was the scarcity of published articles concerning the FS industry; however, the studies reviewed were relevant, appropriate, and applicable across this review. Furthermore, the review also considers outcomes from a practitioner's perspective, offering insights into the limitations of insider threat approaches and strategies and offering potential recommendations.
Mobile crowdsensing (MCS) systems rely on the collective contribution of sensor data from numerous mobile devices carried by participants. However, the open and participatory nature of MCS renders these systems vulnerable to adversarial attacks and data poisoning attempts, in which threat actors inject malicious data into the system. A detection system that mitigates malicious sensor data is therefore needed to maintain the integrity and reliability of the collected information. This paper addresses this issue by proposing an adaptive and robust model for detecting malicious data in MCS scenarios involving sensor data from mobile devices. The proposed model incorporates an adaptive learning mechanism that enables the temporal convolutional network (TCN)-based model to continually evolve and adapt to new patterns, enhancing its capability to detect novel malicious data as threats evolve. We also present a comprehensive evaluation of the proposed model's performance using the SherLock datasets, demonstrating its effectiveness in accurately detecting malicious sensor data and mitigating potential threats to the integrity of MCS systems. Comparative analysis with existing models highlights the performance of the proposed TCN-based model in terms of detection accuracy, with an accuracy score of 98%. Through these contributions, the paper aims to advance the state of the art in ensuring the trustworthiness and security of MCS systems, paving the way for the development of more reliable and robust crowdsensing applications.
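A TCN of the kind mentioned above is built from dilated causal convolutions over the sensor time series. The NumPy sketch below shows only that core operation, with arbitrary weights and sizes, and is not the paper's actual model:

```python
import numpy as np

def dilated_causal_conv(x: np.ndarray, w: np.ndarray, dilation: int) -> np.ndarray:
    """1-D causal convolution: output at t sees only x[t], x[t-d], x[t-2d], ..."""
    k = len(w)
    pad = (k - 1) * dilation                 # left-pad so no future samples leak in
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[j] * xp[pad + t - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

# A toy utilisation trace and a 2-tap averaging filter with dilation 2.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = dilated_causal_conv(x, w=np.array([0.5, 0.5]), dilation=2)
```

Stacking such layers with growing dilation gives the model a long receptive field over sensor history while keeping predictions causal, which is what makes the architecture suitable for streaming MCS data.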
In today's ever-evolving technology landscape, malware is one of the most significant threats faced by individuals and corporate organizations. With the increasing sophistication of malware attacks, detection becomes harder as many malware variants use techniques such as obfuscation to evade it. Even though advanced techniques such as deep learning have proven highly successful in classifying malware, the high computational resources needed to train and deploy deep learning models may not be feasible for all organizations or individuals. It is therefore essential to understand how malware can be analysed using less computationally demanding techniques such as shared code execution. In this paper, we explored shared code execution as a novel approach for analyzing and understanding the behavior of malware. We dynamically analysed the shared code execution of malicious payloads by examining the dynamic link library NTDLL.dll. We demonstrated how samples make use of the LoadLibrary function with inline hooking techniques, overwriting the actual function code to create service execution and persistence through shared code execution. We identified functions that address the problem of encoding routines and domain obfuscation when malware uses SeDebugPrivilege to escalate privileges. Through realistic experiments, we found that executables such as the Mod_77D4 module change across instances, applying XOR encoding operations to each payload byte with a pre-defined key. This helps sophisticated malware create and bind address structures for remote control. Our proposed technique shows high analytical accuracy for sophisticated samples that use encoding and obfuscation methods to evade detection.
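The single-byte XOR encoding routine described above can be sketched in a few lines; the key and payload here are illustrative, not values recovered from the Mod_77D4 sample:

```python
def xor_encode(payload: bytes, key: int) -> bytes:
    # XOR every payload byte with a pre-defined single-byte key.
    # XOR is its own inverse, so the same call also decodes.
    return bytes(b ^ key for b in payload)

encoded = xor_encode(b"calc.exe", key=0x5A)   # key is an illustrative value
decoded = xor_encode(encoded, key=0x5A)
```

Because the transform is self-inverse, an analyst who recovers the key from one payload instance can decode every byte of the others, which is why key reuse across payload bytes is a weak point of this obfuscation.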
Objective: To assess whether there is a breath alcohol concentration (BrAC) below which confusion in the head injured patient should not be attributed solely to the acute effects of alcohol. Method: Based in the Accident and Emergency Ward in Glasgow Royal Infirmary, a prospective observational study was carried out over a five-month period. Patients admitted to the ward were recruited for the study if they had a primary diagnosis of head injury. The outcome measures recorded and analysed were sequential two-hourly BrAC readings (mg/L) and Glasgow Coma Scale findings (eye opening, motor and verbal responses). The relationship between these was investigated, which revealed additional relevant factors affecting level of consciousness. Results: The breath alcohol analyser was found to be a useful non-invasive, quick and easy-to-use tool. The results obtained were consistent with the expected pattern of reducing BrAC levels over a 6 hour period. Within this group of patients, a poor correlation was found between each of the three responses of the Glasgow Coma Scale and BrAC readings. For those patients who remained confused when their BrAC reading was less than 1 mg/L, other causes of a lowered level of consciousness were identified. Conclusion: Confusion in the head injured patient with a BrAC of less than 1 mg/L should alert one to the likelihood of causes other than alcohol intoxication.
Allocating resources is crucial in large-scale distributed computing, as networks of computers tackle difficult optimization problems. Within the scope of this discussion, the objective of resource allocation is to achieve maximum overall computing efficiency or throughput. Cloud computing is not the same as grid computing, a version of distributed computing in which physically separate clusters are networked and made accessible to the public. Because of the wide variety of application workloads, allocating multiple virtualized information and communication technology resources within a cloud computing paradigm is a difficult challenge. This research focused on an application of the Long Short-Term Memory (LSTM) algorithm that provides an intuitive dynamic resource allocation system, analysing heuristics of application resource utilization to determine the best additional resource to provide for that application. The software solution was simulated in near real-time, with resources allocated by the trained LSTM model. The benefits of integrating this with dynamic routing algorithms designed specifically for cloud data centre traffic were also discussed. Both LSTM and Monte Carlo Tree Search (MCTS) were investigated, and their efficiencies compared with one another. Consistent traffic patterns throughout the simulation were shown to improve MCTS performance; such a situation is usually impossible to realise in practice because of the rapidity with which traffic patterns shift. It was verified, however, that by employing LSTM this problem could be solved and an acceptable SLA achieved. The proposed model is compared with other load balancing techniques for the optimization of resource allocation. Based on the results, the proposed model's accuracy rate is enhanced by approximately 10–15% compared with other models.
The proposed model also reduces the error rate of the average request-blocking probability under traffic load by approximately 9.5–10.2% compared with other models. This means that the proposed technique improves network usage, requiring less time, memory, and central processing unit capacity owing to its good predictive approach relative to other models. In future research, we will implement a cloud data centre employing various heuristic and machine learning approaches for load balancing of energy-aware clouds using firefly algorithms.
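The LSTM underlying the allocator above can be reduced to its standard cell equations. The NumPy sketch below implements a single forward step with random weights, purely to show the mechanics of gating over a stream of utilisation metrics; it is not the trained model from the study:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step. W: (4H, D), U: (4H, H), b: (4H,)."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[0:H]))        # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))      # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))    # output gate
    g = np.tanh(z[3*H:4*H])              # candidate cell state
    c_new = f * c + i * g                # blend old memory with new input
    h_new = o * np.tanh(c_new)           # exposed hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 3, 4                              # e.g. 3 utilisation metrics, 4 hidden units
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in [rng.normal(size=D) for _ in range(5)]:   # 5 time steps of metrics
    h, c = lstm_step(x, h, c, W, U, b)
```

The forget gate is what lets the allocator discount stale traffic patterns, which is the property that makes LSTM robust where MCTS assumed stable traffic.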
Open fractures of the distal phalanx commonly present to the Accident and Emergency Department. Controversy surrounds the use of prophylactic antibiotics in treating this injury. A double-blind, prospective, randomized placebo-controlled study was undertaken comparing the use of prophylactic flucloxacillin to placebo in addition to meticulous wound toilet. One hundred and ninety-three adult patients with an open fracture of the distal phalanx were studied. Seven patients developed superficial infections, an overall infection rate of 4%. No patient developed osteitis or a deep wound infection. There were three cases of infection in the 98 patients (3%) in the antibiotic group and four cases of infection in the 95 patients (4%) in the placebo group. A difference of proportion test confirmed no significant difference. It is concluded that the addition of prophylactic flucloxacillin to thorough wound toilet and careful soft-tissue repair of open fracture of the distal phalanx confers no benefit.
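The difference-of-proportion test reported above can be reproduced from the published counts (3 infections in 98 antibiotic patients versus 4 in 95 placebo patients). The sketch below computes the pooled two-proportion z statistic:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z test statistic."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                         # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    return (p1 - p2) / se

# 3/98 in the antibiotic group vs 4/95 in the placebo group.
z = two_proportion_z(3, 98, 4, 95)
```

The resulting |z| falls well below the 1.96 threshold for significance at the 5% level, consistent with the study's conclusion that prophylactic flucloxacillin confers no benefit, though with only 7 events the test has limited power.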