Majorization

In mathematics, majorization is a preorder on vectors of real numbers. For a vector $\mathbf{a} \in \mathbb{R}^{d}$, we denote by $\mathbf{a}^{\downarrow} \in \mathbb{R}^{d}$ the vector with the same components, but sorted in descending order. Given $\mathbf{a}, \mathbf{b} \in \mathbb{R}^{d}$, we say that $\mathbf{a}$ weakly majorizes (or dominates) $\mathbf{b}$ from below, written as $\mathbf{a} \succ_{w} \mathbf{b}$, iff

$$\sum_{i=1}^{k} a_{i}^{\downarrow} \geq \sum_{i=1}^{k} b_{i}^{\downarrow} \quad \text{for all } k = 1, \dots, d.$$

Equivalently, we say that $\mathbf{b}$ is weakly majorized (or dominated) by $\mathbf{a}$ from below, written as $\mathbf{b} \prec_{w} \mathbf{a}$. If $\mathbf{a} \succ_{w} \mathbf{b}$ and in addition $\sum_{i=1}^{d} a_{i} = \sum_{i=1}^{d} b_{i}$, we say that $\mathbf{a}$ majorizes (or dominates) $\mathbf{b}$, written as $\mathbf{a} \succ \mathbf{b}$. Equivalently, we say that $\mathbf{b}$ is majorized (or dominated) by $\mathbf{a}$, written as $\mathbf{b} \prec \mathbf{a}$.

Note that the majorization order does not depend on the order of the components of the vectors $\mathbf{a}$ or $\mathbf{b}$. Majorization is not a partial order, since $\mathbf{a} \succ \mathbf{b}$ and $\mathbf{b} \succ \mathbf{a}$ do not imply $\mathbf{a} = \mathbf{b}$; they only imply that the components of the two vectors are equal, but not necessarily in the same order. Note also that the notation is inconsistent in the mathematical literature: some authors use the reverse convention, i.e., $\succ$ is replaced with $\prec$.

A function $f: \mathbb{R}^{d} \to \mathbb{R}$ is said to be Schur convex when $\mathbf{a} \succ \mathbf{b}$ implies $f(\mathbf{a}) \geq f(\mathbf{b})$. Similarly, $f$ is Schur concave when $\mathbf{a} \succ \mathbf{b}$ implies $f(\mathbf{a}) \leq f(\mathbf{b})$.

The majorization order on finite vectors, described here, can be generalized to the Lorenz ordering, a partial order on distribution functions. For example, a wealth distribution is Lorenz-greater than another iff its Lorenz curve lies below the other. As such, a Lorenz-greater wealth distribution has a higher Gini coefficient and more income inequality.

The order of the entries does not affect the majorization, e.g., the statement $(1,2) \prec (0,3)$ is simply equivalent to $(2,1) \prec (3,0)$.
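The definitions above reduce to a comparison of sorted prefix sums, which is easy to test numerically. The following Python sketch illustrates the check; the function names `weakly_majorizes` and `majorizes` are ours, not standard library routines, and it also shows in passing how a Schur-convex function such as the sum of squares respects the order:

```python
import numpy as np

def weakly_majorizes(a, b):
    """a weakly majorizes b from below: every prefix sum of the
    descending-sorted a dominates the corresponding prefix sum of b."""
    a_desc = np.sort(a)[::-1]
    b_desc = np.sort(b)[::-1]
    return bool(np.all(np.cumsum(a_desc) >= np.cumsum(b_desc)))

def majorizes(a, b):
    """a majorizes b: weak majorization plus equal total sums."""
    return weakly_majorizes(a, b) and np.isclose(np.sum(a), np.sum(b))

# The order of entries is irrelevant: (1, 2) is majorized by (0, 3),
# which is the same statement as (2, 1) being majorized by (3, 0).
print(majorizes([0, 3], [1, 2]))   # True
print(majorizes([3, 0], [2, 1]))   # True

# The sum of squares is Schur convex, so it increases along the order:
# 0**2 + 3**2 = 9  >=  1**2 + 2**2 = 5.
print(sum(x**2 for x in [0, 3]) >= sum(x**2 for x in [1, 2]))  # True
```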
(Strong) majorization: $(1,2,3) \prec (0,3,3) \prec (0,0,6)$. For vectors with $n$ components,

$$\left(\tfrac{1}{n}, \ldots, \tfrac{1}{n}\right) \prec \left(\tfrac{1}{n-1}, \ldots, \tfrac{1}{n-1}, 0\right) \prec \cdots \prec \left(\tfrac{1}{2}, \tfrac{1}{2}, 0, \ldots, 0\right) \prec (1, 0, \ldots, 0).$$
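The chain $(1,2,3) \prec (0,3,3) \prec (0,0,6)$ can be verified by hand, exactly as in the check sketched earlier: sort each vector in descending order and list its prefix sums. All three vectors sum to $6$, and reading down each column the prefix sums weakly increase along the chain:

$$
\begin{aligned}
(3,2,1) &: \quad 3,\; 5,\; 6 \\
(3,3,0) &: \quad 3,\; 6,\; 6 \\
(6,0,0) &: \quad 6,\; 6,\; 6
\end{aligned}
$$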

[ "Inequality", "Matrix (mathematics)", "Combinatorics", "Discrete mathematics", "Algebra", "Stress majorization", "Schur-convex function", "Schur–Horn theorem" ]
Parent Topic
Child Topic
    No Parent Topic