Nearly Infinitesimal Trial Analysis Algorithm: A Machine Learning Approach towards Probability

2020 
Machine learning has seen many remarkable advances in recent years through notable research, and this progress shows no sign of slowing. The field has grown more in the direction of statistics than of probability, for evident reasons. Whether probability should be viewed as a discipline distinct from statistics has been a long-running debate involving domain experts such as statisticians, machine learning engineers, and data scientists, owing to a discord of ideas that appear equally appealing. A germane phenomenon underlying this conflict is that perception tuned to discern only finiteness may be deluded by infinity, leading to inconsistent approaches. One epitome of this phenomenon is the classic Monty Hall problem. Through this paper, we propose a machine learning-based approach incorporating an algorithm that can run on and learn from nearly infinite trials, and that could be extended to areas not yet touched by machines, enabling corrections and making routine jobs faster and more efficient. This would advance the computational power of machines by a further large step. We also propose pertinent answers and justifications for several long-standing misconceptions, including random versus pseudo-random, probability versus possibility, and the question of machines dealing with infinity, to mention a few.
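As an illustration of the kind of large-trial simulation the abstract alludes to, the following minimal Python sketch estimates the Monty Hall win probabilities empirically by running a very large number of randomized trials. It is only an assumed, standard Monte Carlo treatment, not the paper's proposed algorithm, and the function names are hypothetical.

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Run one Monty Hall game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)    # door hiding the car
    pick = random.choice(doors)   # player's initial pick
    # Host opens a door that is neither the player's pick nor the car
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Player switches to the one remaining unopened door
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def estimate_win_rate(switch: bool, trials: int = 1_000_000) -> float:
    """Estimate the win probability over a large number of simulated trials."""
    wins = sum(monty_hall_trial(switch) for _ in range(trials))
    return wins / trials

if __name__ == "__main__":
    # With enough trials the estimates converge toward 2/3 (switch) and 1/3 (stay),
    # the result that finite intuition often gets wrong.
    print("switch:", estimate_win_rate(switch=True))
    print("stay:  ", estimate_win_rate(switch=False))
```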