    An information-spectrum approach to capacity theorems for the general multiple-access channel
51 Citations · 19 References · 10 Related Papers
    Abstract:
The paper deals with capacity problems for the general multiple-access channel, where the channel transition probabilities may be arbitrary for every blocklength n. The approach used here, called the information-spectrum approach, is quite different from the standard typical-sequence and/or AEP techniques. A general formula for the capacity region of the general multiple-access channel is thereby established. To examine its potential, we apply it to the mixed channel and obtain a formula for its capacity region. The case where input cost constraints are imposed is also considered.
    Keywords:
    Information Theory
Sequence
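As a pointer to the machinery behind the abstract (the definition below is the standard Han–Verdú notation, not something stated on this page), the information-spectrum approach replaces the ordinary mutual information with the spectral inf-mutual information rate:

```latex
\underline{I}(\mathbf{X};\mathbf{Y})
  = \operatorname*{p\text{-}liminf}_{n\to\infty}
    \frac{1}{n}\,\log\frac{P_{Y^n\mid X^n}(Y^n\mid X^n)}{P_{Y^n}(Y^n)}
```

where "p-liminf" denotes the limit inferior in probability; capacity regions are then expressed through such quantities without any stationarity or ergodicity assumptions on the channel.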
Abstract—A novel channel estimation algorithm using PN sequences in TD-SCDMA systems is presented. At the transmitter, the training sequence is designed from a PN sequence as described in the 3GPP specifications, and at the receiver the proposed algorithm is employed to estimate the channel responses. The algorithm makes use of the correlation property of the PN sequence to maximize the estimation gain. Thanks to this characteristic of PN sequences, the algorithm considerably reduces the computational load without sacrificing quality. The performance and computational cost are analyzed and compared with those of the traditional algorithm, and simulation results compare the performance of the two estimation algorithms.
Sequence
    Citations (0)
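As a sketch of the correlation idea this abstract relies on (the m-sequence length, LFSR feedback, and channel taps below are illustrative assumptions, not the paper's parameters), circular correlation of the received signal against shifted replicas of the PN training sequence recovers the channel taps:

```python
import numpy as np

def m_sequence(length=127, seed=1):
    """127-chip m-sequence (+/-1) from a 7-stage LFSR with x^7 + x + 1 feedback."""
    state = [(seed >> i) & 1 for i in range(7)]
    bits = []
    for _ in range(length):
        bits.append(state[-1])
        state = [state[5] ^ state[6]] + state[:-1]
    return np.array([1.0 - 2.0 * b for b in bits])

h = np.array([1.0, 0.5, 0.2])        # channel impulse response (assumed)
pn = m_sequence()
tx = np.concatenate([pn, pn])        # repeat so one clean period is circular
rx = np.convolve(tx, h)[len(pn):2 * len(pn)]

# Correlating with shifted PN replicas yields the tap estimates, because the
# m-sequence periodic autocorrelation is N at lag 0 and -1 at every other lag.
est = np.array([np.dot(rx, np.roll(pn, k)) / len(pn) for k in range(3)])
```

The small bias (of order 1/N per tap) comes from the -1 off-peak autocorrelation and shrinks as the sequence gets longer.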
To improve the performance of detecting the transmitted sequence, the Viterbi algorithm (VA) used in multipath channels is modified. A recursive metric equation is built and a renewed branch transition metric is given. The modified VA can estimate the channel information and detect the transmitted sequence simultaneously; meanwhile, the channel information can be estimated by an adaptive filter. Simulation results show that the modified VA performs close to the optimal algorithm while being computationally faster.
    Viterbi algorithm
Sequence
    Citations (0)
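A generic textbook Viterbi detector for a known two-tap ISI channel can be sketched as follows (this is not the modified joint channel-estimation VA the abstract describes; the channel values and BPSK alphabet are illustrative assumptions):

```python
def viterbi_isi(rx, h, symbols=(-1.0, 1.0)):
    """Hard Viterbi detection of a BPSK stream through a known 2-tap ISI
    channel y[n] = h[0]*s[n] + h[1]*s[n-1]; the state is the previous symbol."""
    INF = float("inf")
    metric = {0: INF, 1: 0.0}      # assume the symbol before time 0 was +1
    back = []
    for y in rx:
        new_metric, choice = {}, {}
        for cur in (0, 1):
            best, best_prev = INF, 0
            for prev in (0, 1):
                pred = h[0] * symbols[cur] + h[1] * symbols[prev]
                m = metric[prev] + (y - pred) ** 2   # branch transition metric
                if m < best:
                    best, best_prev = m, prev
            new_metric[cur], choice[cur] = best, best_prev
        metric = new_metric
        back.append(choice)
    state = min((0, 1), key=lambda i: metric[i])
    seq = []
    for choice in reversed(back):  # traceback along the survivor paths
        seq.append(symbols[state])
        state = choice[state]
    return seq[::-1]

h = [1.0, 0.4]                     # known channel taps (illustrative)
s = [1.0, -1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0]
rx = [h[0] * s[i] + h[1] * (s[i - 1] if i else 1.0) for i in range(len(s))]
det = viterbi_isi(rx, h)           # noiseless, so the sequence is recovered
```

The modified VA in the abstract would additionally update an estimate of h inside the recursion; here the channel is assumed known to keep the trellis logic visible.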
It is indispensable to consider low-complexity, high-accuracy channel estimation algorithms given the rapid growth of wireless communication. In this paper, we propose an algorithm for channel estimation using complementary sequences (CS). By utilizing the excellent autocorrelation property of CS, time-domain channel estimation can be easily achieved. We analyze the complementary sequence, the m-sequence, and the combinational Barker code (CB-c), including their autocorrelation functions (ACF) and expansion principles. Furthermore, the system model and the design of the channel estimation sequence are described. The normalized mean square error (NMSE) performance shows that the CS is superior to the other two kinds of sequence, and the BER performance confirms the feasibility of complementary sequences for time-domain channel estimation.
Sequence
Code
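As a sketch of the autocorrelation property the abstract exploits (the recursion and lengths below are the standard Golay construction, assumed here rather than taken from the paper), the two sequences of a complementary pair have aperiodic autocorrelations that cancel exactly at every nonzero lag:

```python
import numpy as np

def golay_pair(n_iter):
    """Length 2**n_iter Golay complementary pair via the concatenation
    recursion a' = a|b, b' = a|(-b), starting from the trivial pair ([1],[1])."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(n_iter):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def acf(x):
    """Aperiodic autocorrelation at non-negative lags."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) for k in range(n)])

a, b = golay_pair(4)          # a length-16 pair
total = acf(a) + acf(b)       # ideal: 2N at lag 0 and exactly 0 elsewhere
```

This exact sidelobe cancellation is what makes a matched-filter channel estimate free of self-interference, unlike the small but nonzero sidelobes of m-sequences or Barker codes.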
Shannon's theorem — Shannon's theorem is one of the most important results in the foundations of information theory (Shannon & Weaver, 1949). It says that the channel capacity c determines exactly what can effectively be transmitted across the channel. If you want to transmit less than c bits of information per time unit across the channel, you can manage to do it in such a way that you can recover the original information from the channel output with high fidelity (i.e., with low error probabilities). However, if you want to transmit more than c bits per time unit across the channel, this cannot be done with high fidelity. This theorem again underlines the fact that information is incompressible (like water) and that a given channel can only transmit a given amount of it in a given time.
    Shannon–Hartley theorem
    Information Theory
    Binary erasure channel
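For the band-limited AWGN channel, the capacity c above is given concretely by the Shannon–Hartley formula C = B log2(1 + S/N); a minimal sketch with illustrative numbers:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-grade channel at 30 dB SNR (illustrative numbers)
snr = 10 ** (30 / 10)                 # 30 dB -> a linear SNR of 1000
c = shannon_capacity(3000.0, snr)     # just under 30 kbit/s
```

Transmitting below this rate can be made reliable by suitable coding; above it, no coding scheme can keep the error probability low.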
A channel estimation algorithm based on a superimposed training sequence is proposed in this paper. The algorithm does not need a dedicated time slot for the training sequence, so it can estimate the channel coefficients without loss of bandwidth. The mean squared error, the Cramer-Rao bound, and a lower bound on the channel capacity are derived for the LS channel estimation algorithm, which has a simple structure and low computational requirements. Simulation results show that system performance is improved to a large extent by using a superimposed training sequence rather than a direct training sequence, and that system capacity is also increased.
Sequence
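A minimal sketch of the LS step under superimposed training (the channel taps, power split, and sequence length are assumptions for illustration, not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([0.9, 0.4, -0.2])      # unknown channel taps (assumed)
L, n = len(h), 4000

t = rng.choice([-1.0, 1.0], n)      # known training, sent *with* the data
d = rng.choice([-1.0, 1.0], n)      # data symbols, superimposed at low power
rx = np.convolve(t + 0.3 * d, h)[:n]  # no dedicated training slot is used

# LS estimate: build the training convolution matrix and solve; the
# superimposed data acts as extra noise that the LS average suppresses.
A = np.column_stack(
    [np.concatenate([np.zeros(k), t[:n - k]]) for k in range(L)])
h_hat, *_ = np.linalg.lstsq(A, rx, rcond=None)
```

Because the data term is uncorrelated with the training, its contribution to the estimate decays like 1/sqrt(n), which is the sense in which bandwidth is traded for a small, controllable estimation error.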
While Shannon information theory deals with the optimal coding of data in a bandwidth- and signal-to-noise-ratio (SNR)-constrained communications channel, it provides no guidance on what data to transmit through the channel. With the increased demand for information from networked sensors, we propose the expected information value rate (EIVR) as the criterion for maximizing the usage of the effective channel capacity in an information-constrained channel. This criterion, utilizing the concepts of both situation information and sensor information, can determine which information to acquire as well as which sensor has the best next collection opportunity to obtain it.
    Information Theory
    Value of information
    Citations (0)
Shannon's two-way channel problem has attracted the attention of information theorists for many years. In a classic paper, Shannon gave both an outer and an inner bound on the capacity region of the two-way channel. Schalkwijk recently obtained an improvement to the inner bound for the Blackwell multiplying channel (BMC). We present the first improvements on Shannon's outer bound. Calculations show that our results are close to optimum when applied to the BMC.
    Information Theory
    Citations (54)
The inference from the Shannon formula that a continuous channel with no noise possesses infinite channel capacity is analysed. It is shown that this inference is inconsistent with the derivation of the Shannon formula, that the definition of Shannon differential entropy is imperfect, and that there exists a problem of information singularity in Shannon information theory.
    Information Theory
    Shannon–Hartley theorem
    Differential entropy
    Citations (0)
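The inference in question follows directly from C = B log2(1 + S/N): as the noise power tends to zero, the formula diverges. A small numerical sketch (bandwidth and powers are arbitrary illustrative values):

```python
import math

def awgn_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon formula C = B * log2(1 + S/N); diverges as noise_power -> 0."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Unit bandwidth and signal power; shrink the noise by factors of ten
caps = [awgn_capacity(1.0, 1.0, 10.0 ** -k) for k in range(1, 6)]
# each tenfold noise reduction adds about log2(10) ~ 3.32 bits/s, without bound
```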
The application of the general channel-capacity formulae developed in quantum information theory to evaluating the information transmission capacity of optical channels is an interesting subject. In this review paper, we point out that a formulation based only on a classical-quantum channel mapping model may be inadequate when one takes a power constraint for a noisy channel into account. To define the power constraint well, we should explicitly consider how quantum states are conveyed through a transmission channel. Based on such considerations, we calculate a capacity formula for an attenuated noisy optical channel with general Gaussian state input; this represents progress beyond the example in our former paper.
    Classical capacity
In this paper, we study a multiple-input multiple-output system using relays arranged at an angle, showing how the topology of the system plays a role. We assume perfect channel knowledge at the transmitter and the relay(s), where the channel is represented by a matrix of zero-mean circularly symmetric complex Gaussian (ZMCSCG) random variables with known covariances, but imperfect channel state information (CSI) at the receiver. The imperfect CSI at the receiver causes errors in the received signal, and as a consequence the channel capacity saturates. We derive channel capacity formulas for two different transmission algorithms, uniform and waterfilling. Finally, we simulate and compare the results.
    Channel state information
    Citations (4)
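The two allocations compared in the abstract can be sketched over parallel subchannels (the gains and power budget are illustrative assumptions; bisection on the water level is one standard way to solve the water-filling condition):

```python
import numpy as np

def waterfill(gains, total_power):
    """Water-filling: p_i = max(0, mu - 1/g_i) with sum(p_i) = total_power,
    where mu (the water level) is found by bisection."""
    inv = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = inv.min(), inv.max() + total_power
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - inv).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - inv)

gains = [2.0, 1.0, 0.25]           # noise-normalized subchannel gains (assumed)
p_wf = waterfill(gains, 1.0)       # strongest subchannel gets the most power
p_un = np.full(3, 1.0 / 3)         # uniform allocation for comparison
cap_wf = sum(np.log2(1 + g * p) for g, p in zip(gains, p_wf))
cap_un = sum(np.log2(1 + g * p) for g, p in zip(gains, p_un))
```

With these numbers the weakest subchannel receives no power at all, and the water-filling capacity exceeds the uniform one, which is the qualitative gap the paper's two transmission algorithms exhibit.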