Resource Allocation and Throughput Maximization for IoT Real-time Applications

2020 
The anticipated enormous growth of mobile data will congest the available spectrum. To use this spectrum efficiently, the emerging fog computing paradigm is a promising solution. In this paper, we develop a fog-IoT network model that provides an $\varepsilon$-optimal resource allocation to maximize overall network throughput. A joint cloudlet selection and power allocation problem is formulated under association and Quality-of-Service (QoS) constraints. The formulated problem falls into the class of mixed-integer nonlinear programming (MINLP) problems, which are NP-hard in general. We solve it with a low-complexity linearization technique based on the outer approximation algorithm (OAA). The resulting resource and power allocation is obtained with far less computation than an exhaustive search.
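To illustrate the structure of a joint cloudlet selection and power allocation problem, the following is a minimal toy sketch, not the paper's actual model: it assumes hypothetical channel gains, a small discrete power set, and a per-device minimum-rate QoS constraint, and enumerates the tiny search space directly (the brute-force baseline that OAA-style methods avoid at scale).

```python
import itertools
import math

# Toy instance (hypothetical numbers, not from the paper):
# 2 IoT devices, 2 cloudlets, discrete candidate transmit powers.
gains = [[1.0, 0.4],   # channel gain of device i to cloudlet j
         [0.3, 0.9]]
powers = [0.5, 1.0, 2.0]   # candidate transmit powers (W)
noise = 0.1                # noise power (W)
r_min = 1.0                # QoS: minimum rate per device (bits/s/Hz)

def rate(g, p):
    # Shannon-style spectral efficiency for a single link.
    return math.log2(1 + g * p / noise)

best = (None, -1.0)
# Enumerate cloudlet selection (one cloudlet per device) jointly
# with the power level of each device.
for assign in itertools.product(range(2), repeat=2):
    for pw in itertools.product(powers, repeat=2):
        rates = [rate(gains[i][assign[i]], pw[i]) for i in range(2)]
        if all(r >= r_min for r in rates):      # QoS constraint
            total = sum(rates)                  # network throughput
            if total > best[1]:
                best = ((assign, pw), total)

print(best)
```

Even this two-device example shows why the full problem is MINLP: the binary association variables interact with the continuous (here discretized) powers, so the search space grows exponentially with the number of devices, motivating the outer approximation approach.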