In this work, we analyze a heterogeneous cellular network by means of stochastic geometry to estimate its energy and spectral efficiency. In practice, field assessment of the Signal-to-Interference-plus-Noise Ratio (SINR), the key physical-layer performance indicator, requires sophisticated test instrumentation that is not always available outside the lab environment. We therefore present a simpler test model built on the much easier-to-measure Bit Error Rate (BER): the BER degradation caused by various impairments is treated as if it were produced by additive white Gaussian noise (AWGN), so that any actual non-AWGN impairment is abstracted in terms of an equal BER degradation. We validated the derived analytical model for heterogeneous two-tier networks with the ns-3 simulator, whose test results fit the analytical estimates well; both indicate that small cells enable better energy and spectral efficiency than larger-cell networks.
In this paper, we analyze an arbitrary heterogeneous cellular network using stochastic geometry and propose a modified model for assessing network spectral and energy efficiency. In this regard, we recognize that, in practice, determining the Signal-to-Interference-plus-Noise Ratio (SINR), the key performance indicator, requires complex field test equipment that might not be available or affordable. Therefore, we propose a simple model based on the easily measurable Bit Error Rate (BER), whose degradation caused by various impairments is treated as if it were due to an equivalent amount of additive white Gaussian noise (AWGN), thus abstracting any specific non-AWGN distortion. The proposed analytical model is verified with the ns-3 network simulator, whose test results match the corresponding estimated values. The results indicate that both the spectral and the energy efficiency of small-cell networks are higher than those of larger-cell networks, and even more so for heterogeneous two-tier networks.
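As an illustration of the BER-based abstraction described in the two abstracts above, the following minimal sketch inverts the AWGN BER curve to obtain the equivalent SNR that would produce a measured BER. It assumes coherent BPSK/QPSK with Gray mapping; the modulation choice and the example BER value are illustrative assumptions, not taken from the papers themselves.

```python
# Minimal sketch: mapping a measured BER to an equivalent SNR under the
# AWGN abstraction. Assumes coherent BPSK/QPSK with Gray mapping (an
# illustrative assumption, not taken from the papers themselves).
import numpy as np
from scipy.special import erfc, erfcinv

def ber_awgn(snr_db):
    """BER of BPSK/QPSK over AWGN: Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (snr_db / 10)
    return 0.5 * erfc(np.sqrt(ebn0))

def equivalent_snr_db(ber):
    """Invert the AWGN BER curve: the SNR that would yield the measured BER."""
    return 10 * np.log10(erfcinv(2 * ber) ** 2)

measured_ber = 1e-3                      # hypothetical field measurement
print(equivalent_snr_db(measured_ber))   # ~6.8 dB equivalent Eb/N0
```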
For quite a while, it has been evident that homogeneous network architectures, based on cells with a uniform radiation pattern, cannot meet the ever-increasing demand of mobile users for capacity and service quality while still preserving spectrum and energy. However, only with the introduction of the Fourth Generation mobile networks, designed to handle the surging data traffic of multimedia applications, have smaller cells been widely used to break the service areas of macro base stations into multiple tiers, thus improving network performance, reducing traffic congestion, and enabling better management of spectrum and energy consumption in a macro network. In this paper, we present an analytical model for assessing the efficiency of bandwidth and energy usage, as well as of network deployment, taking into account overall network investment and maintenance costs. This paves the way to improved planning of network coverage, capacity, and reliability, thus preserving spectrum and energy, as well as the environment. The analysis considers the downlink of an arbitrary heterogeneous cellular network using tools of stochastic geometry, with the base stations distributed according to a Poisson Point Process. The proposed analytical model is verified by software simulations using the ns-3 network simulator. The obtained results closely match the theoretically predicted values and boundaries, clearly indicating that in all three analyzed aspects (spectral, energy, and deployment efficiency) small-cell networks outperform traditional large-cell networks, with a further gain for heterogeneous (two-tier in our tests) networks.
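The downlink model sketched above can be reproduced with a short Monte-Carlo experiment: base stations drawn from a homogeneous Poisson Point Process, a typical user at the origin attached to its nearest base station, Rayleigh fading, and power-law path loss. The density, path-loss exponent, and SINR threshold below are illustrative assumptions, not values from the paper.

```python
# Minimal Monte-Carlo sketch of the stochastic-geometry (PPP) downlink model.
# Density, path-loss exponent, and SINR threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def coverage_probability(lam=1e-5, alpha=4.0, thresh_db=0.0,
                         region=5000.0, n_trials=2000):
    """P(SINR > threshold) for a typical user in an interference-limited PPP network."""
    thresh = 10 ** (thresh_db / 10)
    covered = 0
    for _ in range(n_trials):
        n_bs = rng.poisson(lam * (2 * region) ** 2)        # PPP: Poisson count ...
        if n_bs < 2:
            continue
        xy = rng.uniform(-region, region, size=(n_bs, 2))  # ... uniform locations
        d = np.hypot(xy[:, 0], xy[:, 1])
        h = rng.exponential(1.0, n_bs)                     # Rayleigh fading power
        p_rx = h * d ** (-alpha)                           # received powers at origin
        k = np.argmin(d)                                   # nearest-BS association
        sinr = p_rx[k] / (p_rx.sum() - p_rx[k])            # noise neglected
        covered += sinr > thresh
    return covered / n_trials

print(coverage_probability())   # ~0.56 for alpha = 4 at a 0 dB threshold (known PPP result)
```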
The main OFDM drawbacks are Carrier Frequency Offset (CFO) and a large Peak-to-Average Power Ratio (PAPR), both of which degrade the Bit Error Rate (BER). Here, we assume that clipping or any other PAPR-reduction method is sufficient to prevent the nonlinear high-power amplifier from generating errors. Moreover, in small cells the signal-to-noise ratio is large, while the small time dispersion allows the OFDM cyclic prefix to prevent intersymbol interference. Consequently, the CFO alone determines the BER and vice versa, enabling indirect estimation of the CFO-induced phase distortion by simple BER testing. A particular problem, however, is measuring the very low BER values produced by such a small residual CFO, as the test may take a long time to accumulate a statistically sufficient number of errors. The test time can be reduced drastically if the noise margin is lowered in a controlled way by adding an interfering signal to each subcarrier at the receiver. This approach is shown to enable efficient and accurate short-term testing of the BER, and hence of the CFO phase error.
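The following minimal sketch illustrates the accelerated-test idea from the abstract above: a residual-CFO phase rotation degrades BPSK subcarriers, and deliberately adding a calibrated interfering signal (modeled here simply as extra AWGN) lowers the noise margin so that a measurable error count accumulates in far fewer symbols. The modulation, phase error, and SNR values are illustrative assumptions, not taken from the paper.

```python
# Sketch of short-term BER testing with a controlled margin reduction.
# Assumes BPSK subcarriers and models the added interference as extra AWGN.
import numpy as np

rng = np.random.default_rng(1)

def ber_bpsk(n_sym, snr_db, phase_err_rad, extra_noise_db=None):
    """Empirical BER of BPSK with a static phase error over AWGN."""
    bits = rng.integers(0, 2, n_sym)
    tx = 2.0 * bits - 1.0                          # BPSK mapping {0,1} -> {-1,+1}
    noise_var = 10 ** (-snr_db / 10)               # per-symbol noise power (Es = 1)
    if extra_noise_db is not None:                 # controlled margin reduction
        noise_var += 10 ** (-extra_noise_db / 10)
    noise = np.sqrt(noise_var / 2) * (rng.standard_normal(n_sym)
                                      + 1j * rng.standard_normal(n_sym))
    rx = tx * np.exp(1j * phase_err_rad) + noise   # CFO-induced rotation + noise
    return np.mean((rx.real > 0).astype(int) != bits)

phi = np.deg2rad(10)                               # hypothetical residual-CFO phase error
print(ber_bpsk(10**6, 15, phi))                    # native BER: too low to measure quickly
print(ber_bpsk(10**5, 15, phi, extra_noise_db=7))  # margin reduced: errors accrue quickly
```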
In this paper, we present an analytical model for planning and using network resources to improve network coverage, capacity, and reliability, reduce network investment and maintenance costs, and lower electrical power consumption. The analysis considers the downlink of an arbitrary heterogeneous cellular network using tools of stochastic geometry, with the base stations distributed according to a Poisson Point Process (PPP). To validate the analytical model, simulations based on the ns-3 network simulator were conducted, and their results accurately match the theoretical values and boundaries.