Objective: PTEN, a tumor-suppressor gene located on chromosome 10q23.3, is implicated in various types of cancer, including breast cancer. The aim of this study was to investigate promoter methylation and loss of expression of the PTEN gene in breast cancer, to assess their significance, and to determine the correlation between promoter methylation and gene expression. Methods: Promoter methylation and loss of expression of the PTEN gene were analyzed using methylation-specific PCR and immunohistochemistry, respectively. The chi-square test was used to correlate promoter methylation and gene expression with clinicopathologic parameters. Results: We examined 53 breast cancer specimens and 10 normal tissues adjacent to tumor. Promoter methylation of the PTEN gene was detected in 58.5% of tumors and in none of the normal tissues. PTEN methylation was observed in advanced stages III-IV (81.8%, 18 of 22, P=0.015) and higher grades G2-G3 (71.4%, 20 of 28, P=0.043); its correlation with clinical stage and tumor grade was statistically significant. Nuclear PTEN expression was detected in 73.6% (39 of 53) of breast cancer cases, whereas loss of expression was observed in the remaining 26.4% (14 of 53). PTEN expression was retained in all normal tissues (10 of 10). Loss of PTEN expression correlated significantly with patient age (P=0.028) and clinical stage (P=0.029). Expression loss was observed in 12 (38.7%) of the 31 methylation-positive cases, whereas only 2 (9.1%) of the 22 methylation-negative cases were immunostaining-negative, a statistically significant difference (P=0.016). Conclusion: Promoter methylation and loss of expression of the PTEN gene occur frequently in breast cancer. Our results suggest that PTEN plays an important role in breast carcinogenesis.
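The reported methylation–expression association can be checked with a standard Pearson chi-square test on the 2×2 contingency table. A minimal sketch using only the Python standard library, assuming the counts given in the abstract (12 of 31 methylation-positive and 2 of 22 methylation-negative cases lacked PTEN staining); the choice to omit the Yates continuity correction is our assumption, made because it reproduces the reported P value:

```python
import math

# 2x2 contingency table built from the abstract's counts:
# rows = promoter methylation status, columns = PTEN staining lost / retained.
table = [[12, 31 - 12],   # methylation-positive: 12 of 31 lost expression
         [2,  22 - 2]]    # methylation-negative:  2 of 22 lost expression

row = [sum(r) for r in table]                 # row totals: 31, 22
col = [sum(c) for c in zip(*table)]           # column totals: 14, 39
n = sum(row)                                  # grand total: 53

# Pearson chi-square statistic (no continuity correction)
chi2 = sum((table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))

# For 1 degree of freedom the p-value follows from the normal tail:
p = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p ≈ 0.016, matching the abstract
```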
Healthcare is one of the emerging application fields of the Internet of Things (IoT). Stress is a heightened psycho-physiological state that arises in response to demanding events or stimuli, and stressors are the environmental factors that induce it. Long-term exposure to multiple stressors acting simultaneously can harm a person's emotional well-being and cause chronic health issues. To avoid stress-related problems, it is vital to recognize them at an early stage, which can only be done through regular stress monitoring. Wearable devices offer continuous, real-time data collection, which makes such monitoring practical. This work implements an investigation of stress detection using sensing devices and deep learning. It examines stress detection techniques built on sensing hardware such as electroencephalography (EEG), photoplethysmography (PPG), and galvanic skin response (GSR), in various conditions including traveling and learning. A genetic algorithm is used to select features, and an ECNN-LSTM is used to classify the data, using the DEAP dataset. Beforehand, preprocessing strategies are applied to remove artifacts from the signals. When the detected stress exceeds a threshold, an emergency/alert state is triggered, and an expert assessing the mental stress sends a report to the patient and doctor over the Internet. Finally, performance is evaluated and compared with traditional approaches in terms of accuracy, F1-score, precision, and recall.
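The abstract does not specify the genetic algorithm's encoding or fitness function, so the following is only an illustrative sketch of GA-based feature selection under our own assumptions: candidates are binary masks over a synthetic feature matrix (standing in for extracted EEG/PPG/GSR features), and fitness is the accuracy of a simple nearest-centroid classifier on the selected columns:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted physiological features (2 classes,
# only the first 5 of 20 features carry class information).
X = rng.normal(size=(200, 20))
y = rng.integers(0, 2, size=200)
X[:, :5] += y[:, None] * 2.0

def fitness(mask):
    """Accuracy of a nearest-centroid classifier on the selected features."""
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask.astype(bool)]
    c0, c1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
    pred = (np.linalg.norm(Xs - c1, axis=1) <
            np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean()

# Standard GA loop: tournament selection, uniform crossover, bit-flip mutation.
pop = rng.integers(0, 2, size=(30, 20))
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    # binary tournament selection
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                           idx[:, 0], idx[:, 1])]
    # uniform crossover with the neighboring parent, then mutation
    cross = rng.integers(0, 2, size=pop.shape).astype(bool)
    children = np.where(cross, parents, np.roll(parents, 1, axis=0))
    children ^= (rng.random(pop.shape) < 0.02).astype(children.dtype)
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "accuracy:", fitness(best))
```

In practice the fitness function would wrap the downstream classifier (here, the ECNN-LSTM) rather than a nearest-centroid stand-in.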
Applied to medical image analysis, Artificial Neural Networks (ANNs) have shown effective results in the early detection and treatment of pancreatic cancer. This research study summarizes current work using ANNs for AI-driven pancreatic cancer detection. Numerous studies have shown that ANNs can identify and diagnose pancreatic cancer with high accuracy from CT scans, MRIs, and other types of medical imaging. Early detection of pancreatic cancer using ANNs is very useful for improving patient outcomes. The design and implementation of ANNs for pancreatic cancer detection is not without difficulties, however. These include the demand for large and varied datasets, the necessity for ongoing model training, and the need for rigorous procedures to guarantee the precision and dependability of ANN-based diagnostic tools. In summary, using ANNs for AI-driven pancreatic cancer detection has enormous potential to improve patient outcomes. These technologies need to be further refined and validated, and the ethical and legal issues surrounding their use in clinical contexts need to be addressed.
In the realm of scene reconstruction, conventional methods often struggle with challenges posed by occlusions, lighting variations, and noisy data. To address these limitations, this paper introduces a Transduction-based Deep Belief Network (T-DBN) within a learning-based multi-camera fusion framework, offering robust scene reconstruction by effectively fusing data from multiple cameras and adapting to diverse conditions. The proposed T-DBN model overcomes the limitations above by fusing information from multiple cameras through a transduction scheme, allowing it to adapt to varying conditions. The network learns to decipher scene structures and characteristics by training on a diverse dataset. Experimental results demonstrate the superiority of the proposed T-DBN in achieving accurate and reliable scene reconstruction compared to existing techniques. This work presents a significant advancement in multi-camera fusion and scene reconstruction through the integration of deep learning and transduction strategies.
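The T-DBN architecture itself is not detailed in the abstract. As background only, a deep belief network is conventionally built by greedily stacking restricted Boltzmann machines (RBMs); the sketch below trains a single RBM with one-step contrastive divergence (CD-1) on toy binary data, purely as an assumption-laden illustration of the building block, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 16, 8, 0.1
W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible bias
b_h = np.zeros(n_hidden)    # hidden bias

# Toy dataset: two alternating binary patterns standing in for fused features.
data = np.array([[1, 0] * 8, [0, 1] * 8] * 50, dtype=float)

for _ in range(200):
    v0 = data
    # positive phase: hidden activations given the data
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # negative phase: one Gibbs step back to visible, then hidden again
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # CD-1 gradient approximation
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b_v += lr * (v0 - pv1).mean(0)
    b_h += lr * (ph0 - ph1).mean(0)

# After training, the RBM should reconstruct the patterns far better
# than the untrained model (whose error is about 0.5 per pixel).
recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
print("mean reconstruction error:", np.abs(data - recon).mean())
```

A DBN stacks several such RBMs, feeding each layer's hidden activations to the next; the transduction scheme described in the paper would sit on top of this pretraining stage.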
Diabetes is a chronic disease caused by unregulated blood glucose levels. Early detection may help avoid diabetic foot ulcers (DFUs) and other devastating outcomes. A diabetic patient who develops a DFU may ultimately require amputation of the lower limb. DFU is difficult to diagnose and usually requires a number of expensive and time-consuming clinical investigations by the treating physician. In today's age of data deluge, applying deep learning, machine learning, and computer vision techniques has produced a number of solutions that can help doctors make more accurate diagnoses in less time. As a result, researchers have recently focused more on developing methods for automatically identifying DFU. The suggested method comprises preprocessing, segmentation, feature extraction, and model training. Preprocessing performs noise reduction and RGB-to-HSI color space conversion. Otsu thresholding is used for segmentation, histogram-based features are extracted, and an Improved CNN-SVM is used for model training. The new method is compared with two common approaches, CNN and CNN-SVM, and outperforms both.
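The Otsu thresholding step can be sketched generically; the following is a plain NumPy implementation run on a toy bimodal image (the paper's exact pipeline and parameters are not reproduced here):

```python
import numpy as np

def otsu_threshold(gray):
    """Return the intensity threshold maximizing between-class variance (Otsu)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    global_mean = (hist * np.arange(256)).sum() / total
    best_t, best_var = 0, 0.0
    count0 = 0.0   # running pixel count of the below-threshold class
    sum0 = 0.0     # running intensity sum of the below-threshold class
    for t in range(256):
        count0 += hist[t]
        if count0 == 0 or count0 == total:
            continue
        sum0 += t * hist[t]
        w0 = count0 / total
        mu0 = sum0 / count0
        mu1 = (global_mean * total - sum0) / (total - count0)
        var_between = w0 * (1 - w0) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy image: dark background around 40, bright central region around 200,
# standing in for a grayscale channel of a DFU photograph.
rng = np.random.default_rng(2)
img = rng.normal(40, 10, size=(64, 64))
img[16:48, 16:48] = rng.normal(200, 10, size=(32, 32))
img = img.clip(0, 255)

t = otsu_threshold(img)
mask = img > t   # segmented foreground region
print("Otsu threshold:", t)
```

In the described pipeline this thresholding would be applied after RGB-to-HSI conversion, with the resulting mask isolating the ulcer region for histogram-based feature extraction.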