    Transformation of the probability density function in an optical parametric amplifier: Application to rogue-wave-like statistics
    Keywords:
    Optical parametric amplifier
    Supercontinuum
    SIGNAL (programming language)
Inverse Problem Theory and Methods for Model Parameter Estimation, Chapter 6: Appendices, pp. 159–251. Chapter DOI: https://doi.org/10.1137/1.9780898717921.ch6

Excerpt, 6.1 Volumetric Probability and Probability Density. A probability distribution $A \mapsto P(A)$ over a manifold can be represented by a volumetric probability $F(\mathbf{x})$, defined through
$$P(A) = \int_A dV(\mathbf{x})\, F(\mathbf{x}), \tag{6.1}$$
or by a probability density $f(\mathbf{x})$, defined through
$$P(A) = \int_A d\mathbf{x}\, f(\mathbf{x}), \tag{6.2}$$
where $d\mathbf{x} = dx^1\, dx^2 \cdots$. While, under a change of variables, a probability density behaves as a density (i.e., its value at a point gets multiplied by the Jacobian of the transformation), a volumetric probability is a scalar (i.e., its value at a point remains invariant: it is defined independently of any coordinate system). Defining the volume density through
$$V(A) = \int_A d\mathbf{x}\, v(\mathbf{x}), \tag{6.3}$$
and considering the expression $V(A) = \int_A dV(\mathbf{x})$, we obtain
$$dV(\mathbf{x}) = v(\mathbf{x})\, d\mathbf{x}. \tag{6.4}$$
It follows that the relation between volumetric probability and probability density is
$$f(\mathbf{x}) = v(\mathbf{x})\, F(\mathbf{x}). \tag{6.5}$$
While the homogeneous probability distribution (the one assigning equal probabilities to equal volumes of the space) is, in general, not represented by a constant probability density, it is always represented by a constant volumetric probability. Although I prefer, in my own work, to use volumetric probabilities, I have chosen in this text to use probability densities (for pedagogical reasons).

Published: 2005. ISBN: 978-0-89871-572-9. eISBN: 978-0-89871-792-1. DOI: https://doi.org/10.1137/1.9780898717921. Book series: Other Titles in Applied Mathematics. Book code: OT89. Book pages: ix + 339. Key words: inverse problems, inverse methods, probability, uncertainties, least-squares.
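The excerpt's distinction — a density picks up the Jacobian under a change of variables, while a volumetric probability stays invariant — can be checked numerically. A minimal sketch in Python (the distribution and variable names are illustrative, not from the book): for X exponentially distributed and the change of variables y = √x, the density of Y must be f_X(y²) multiplied by the Jacobian dx/dy = 2y.

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exponential(1), with density f_X(x) = exp(-x).
# Change of variables y = sqrt(x), i.e. x = y**2, Jacobian dx/dy = 2*y.
# The density transforms as f_Y(y) = f_X(y**2) * 2*y,
# whereas a volumetric probability would keep its value at the point.
x = rng.exponential(1.0, 200_000)
y = np.sqrt(x)

# Empirical density of Y from a normalized histogram
hist, edges = np.histogram(y, bins=60, range=(0.0, 3.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Analytic transformed density, including the Jacobian factor
f_y = np.exp(-centers**2) * 2 * centers

max_err = np.max(np.abs(hist - f_y))  # small if the transformation rule holds
```

Dropping the `2 * centers` factor (treating the density as if it were a scalar) makes the comparison fail, which is exactly the point of Eq. (6.5).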
    Symmetric probability distribution
    Location parameter
    Constant (computer programming)
Probability provides the theoretical basis for almost all of machine learning and most of analytics, and it is a critical mindset for data scientists to adopt. This chapter builds up the subject of probability in an intuitive way. It presents two of the simplest, most intuitive, and most important probability models: coin flipping and dart throwing. Using these as motivation, the chapter gives a more formal treatment of probability concepts, focusing on several of the probability distributions most important for data scientists. A single-dimensional random variable (RV) is described by a probability mass function if it is discrete or a probability density function if it is continuous. We can also have an RV that returns a d-dimensional random vector; the concepts of probability mass function and probability density generalize naturally.
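The chapter's two motivating models can be sketched in a few lines of Python with SciPy (the specific parameter values are illustrative): a discrete RV (heads in repeated coin flips) carries a probability mass function that sums to 1, while a continuous RV (the radial position of a dart landing uniformly on a unit disk) carries a density that integrates to 1.

```python
import numpy as np
from scipy import stats

# Discrete RV: number of heads in 10 fair coin flips -> binomial PMF
n, p = 10, 0.5
pmf = stats.binom.pmf(np.arange(n + 1), n, p)
total_pmf = pmf.sum()  # a PMF sums to 1 over its support

# Continuous RV: radius of a dart landing uniformly on the unit disk.
# Its density is f(r) = 2r on [0, 1] (area grows like r**2).
r = np.linspace(0.0, 1.0, 1001)
f_r = 2 * r
dr = r[1] - r[0]
total_pdf = np.sum(0.5 * (f_r[:-1] + f_r[1:])) * dr  # trapezoid rule: integrates to 1
```

The parallel structure (sum vs. integral) is what lets both concepts generalize to d-dimensional random vectors.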
    Empirical probability
    Probability and statistics
    Symmetric probability distribution
    Frequentist probability
    The optical parametric effect based on third-order nonlinearity in fibers and semiconductor optical amplifiers has been applied to ultrafast all-optical signal processing. We describe recent experimental results on wavelength conversion, gate switching, and parametric amplification.
    Optical parametric amplifier
    SIGNAL (programming language)
    Semiconductor optical gain
The predictive posterior probability density function for the rectangular probability model is derived here. An example applying this probability density function to the calculation of measurement uncertainty in a radiofrequency measurement is also illustrated. This probability density function is particularly useful when no prior information is available concerning the distribution of a quantity (e.g., an electromagnetic field) over a region of space or a time interval.
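The paper's specific predictive posterior is not reproduced here, but the rectangular model itself is simple to sketch: a quantity known only to lie within ±a of a value is assigned a uniform (rectangular) distribution, whose standard uncertainty is a/√3. A minimal Monte Carlo check in Python, with hypothetical values for the measured quantity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Rectangular (uniform) model: the quantity lies in [x0 - a, x0 + a]
# with no further prior information.  Hypothetical values:
x0, a = 5.0, 0.3

# Standard uncertainty of the rectangular distribution: a / sqrt(3)
u_analytic = a / np.sqrt(3)

# Monte Carlo estimate from draws of the model
draws = rng.uniform(x0 - a, x0 + a, 500_000)
u_mc = draws.std()
```

The a/√3 result is the standard deviation of the uniform distribution and is the usual Type B evaluation for rectangular limits; the paper's contribution, the predictive posterior, refines this when the limits themselves are inferred from data.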
    Quantile function
    Location parameter
    Symmetric probability distribution
This chapter provides steps and snapshots for computing probabilities in Minitab. The Probability density option appears when the distribution is continuous and returns the ordinate of the probability density function at the given input value. The Probability option appears when the distribution is discrete and returns the probability associated with the input value. The Cumulative Probability option returns the probability of obtaining a value less than or equal to the input. The shape of probability distributions can easily be visualized using Graph > Probability Distribution Plot. A process is said to be 'Six Sigma' when the distance between the nominal value and the tolerance limits of the produced output equals six times the standard deviation with which the output is produced. Controlled vocabulary terms: computational statistics; Minitab; probability distribution; six sigma.
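The three Minitab options described above have direct analogues in SciPy; a short sketch (the input values and distribution parameters are illustrative, not from the chapter):

```python
from scipy import stats

# "Probability density" (continuous distribution): the ordinate of the
# pdf at the input value -- here a standard normal at x = 1.0
ordinate = stats.norm.pdf(1.0, loc=0.0, scale=1.0)

# "Probability" (discrete distribution): the pmf at the input value --
# here P(X = 3) for X ~ Binomial(n=10, p=0.5)
prob = stats.binom.pmf(3, n=10, p=0.5)

# "Cumulative Probability": P(X <= input) -- here the standard normal CDF at 1.0
cum = stats.norm.cdf(1.0, loc=0.0, scale=1.0)
```

As in Minitab, the density ordinate is not itself a probability (it can exceed 1 for narrow distributions), while the pmf and cdf values always lie in [0, 1].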
    Symmetric probability distribution
This paper presents closed-form probability density functions for the peaks, troughs, and peak-to-trough excursions of coastal waves in finite water depth. It is found that for non-Gaussian waves whose skewness is less than 1.2, the probability density function of peaks (and troughs) can be approximately represented by a Rayleigh distribution whose parameter is a function of three parameters characterizing the non-Gaussian waves. The agreement between the probability density functions and histograms constructed from data obtained by the Coastal Engineering Research Center is satisfactory.
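The Rayleigh limiting form the paper invokes is easy to check numerically; a minimal sketch in Python (the scale parameter is illustrative, not fitted to the paper's coastal data):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.0  # hypothetical Rayleigh scale parameter

# Rayleigh-distributed peak heights, generated directly
peaks = rng.rayleigh(sigma, 300_000)

# Compare a normalized histogram against the Rayleigh pdf
# f(x) = (x / sigma**2) * exp(-x**2 / (2 * sigma**2))
hist, edges = np.histogram(peaks, bins=50, range=(0.0, 4.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
f = (centers / sigma**2) * np.exp(-centers**2 / (2 * sigma**2))

max_err = np.max(np.abs(hist - f))  # small: histogram matches the pdf
```

This mirrors the paper's comparison of closed-form densities against measured histograms, with simulated rather than field data.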
    Rayleigh distribution
    Symmetric probability distribution
    Kurtosis
    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov–Smirnov tests, particularly Kuiper’s variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
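The CDF-based test the paper complements is available directly in SciPy; a short sketch (Kuiper's variant is not in SciPy, so the plain Kolmogorov–Smirnov test is shown, with illustrative sample sizes):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# i.i.d. draws actually from the specified density (standard normal)...
good = rng.normal(0.0, 1.0, 2000)
# ...and draws from a shifted density the test should reject
bad = rng.normal(0.5, 1.0, 2000)

# One-sample KS test against the standard normal CDF
p_good = stats.kstest(good, "norm").pvalue  # large: no evidence of mismatch
p_bad = stats.kstest(bad, "norm").pvalue    # tiny: mismatch detected
```

A mean shift moves a lot of cumulative mass, so the KS statistic catches it easily; the paper's point is that discrepancies confined to low-density regions are smoothed over in the CDF and need the complementary density-based tests.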
    Symmetric probability distribution
    Empirical probability
    Empirical distribution function
    Probability integral transform
An inexpensive probability analyzer is designed for measuring the probability density and distribution functions of random signals. The operating principle of the system is explained through a detailed discussion of each active stage. The analyzer uses Schmitt triggers with low hysteresis to provide good accuracy, and its output is measured with a counter. Experimental probability density and distribution functions are obtained for a sinusoidal input, and the measured results are in good agreement with the theoretical functions.
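The theoretical functions the analyzer is validated against are those of a sinusoid sampled at random phase, whose distribution function is F(x) = 1/2 + arcsin(x/A)/π for |x| < A. A minimal software analogue of the measurement in Python (amplitude and sample counts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
A = 1.0  # hypothetical sinusoid amplitude

# Samples of a sinusoid taken at uniformly random phase
phase = rng.uniform(0.0, 2.0 * np.pi, 400_000)
x = A * np.sin(phase)

# Empirical distribution function vs. the theoretical arcsine CDF
grid = np.linspace(-0.95, 0.95, 20)
emp_cdf = np.array([(x <= g).mean() for g in grid])
theo_cdf = 0.5 + np.arcsin(grid / A) / np.pi

max_err = np.max(np.abs(emp_cdf - theo_cdf))  # small: curves agree
```

The corresponding density, 1/(π·sqrt(A² − x²)), diverges at ±A, which is why the distribution function is the more robust comparison, in software as in the hardware analyzer.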
    Hysteresis