Abstract
Data throughput is an important metric in the performance evaluation of next-generation cellular networks such as Long-Term Evolution (LTE) and LTE-Advanced. The performance of these networks is usually evaluated with Monte Carlo simulation schemes. Such simulations do not provide the throughput of intermediate call states; instead, they give only the overall performance of the network. We propose a hybrid model combining analysis and simulation, whose benefit is that the throughput of any possible call state in the system can be evaluated. The probability of each possible call distribution is first obtained by analysis and then used as input to an event-driven simulator to calculate the throughput of a call state. We compare the throughput obtained from our hybrid model with that obtained from event-driven simulation. Numerical results are presented and show good agreement between the proposed hybrid model and the simulation: the maximum difference in relative throughput lies in the interval (0.04%, 1.06%) over a range of call arrival rates, mean holding times, and numbers of resource blocks in the system.
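The core idea of the abstract — obtain the steady-state probability of each call state analytically, then weight a per-state throughput by those probabilities rather than relying only on Monte Carlo runs — can be sketched as follows. This is a minimal illustration, not the paper's model: the `M/M/c/c` loss-system analysis, the equal-share throughput assumption, and all function names are assumptions introduced here.

```python
import math

def state_probabilities(arrival_rate, mean_holding_time, num_resource_blocks):
    """Steady-state probability of n active calls in an M/M/c/c loss system
    (a stand-in for the paper's analytical call-distribution step)."""
    a = arrival_rate * mean_holding_time  # offered load in Erlangs
    weights = [a**n / math.factorial(n) for n in range(num_resource_blocks + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def mean_throughput(probs, per_state_throughput):
    """Weight the throughput of every possible call state by its probability,
    mimicking how the hybrid model combines analysis with per-state results."""
    return sum(p * t for p, t in zip(probs, per_state_throughput))

# Example: 5 resource blocks; the cell capacity is assumed fully used
# whenever at least one call is active (a simplifying assumption).
probs = state_probabilities(arrival_rate=2.0, mean_holding_time=1.5,
                            num_resource_blocks=5)
capacity = 10.0  # Mbit/s, assumed total cell capacity
throughputs = [capacity if n > 0 else 0.0 for n in range(6)]
avg = mean_throughput(probs, throughputs)
```

In the paper itself, the per-state throughput values come from an event-driven simulator rather than the fixed list used here; the sketch only shows how analytical state probabilities replace averaging over full Monte Carlo runs.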
| Original language | English |
|---|---|
| Pages (from-to) | 62-70 |
| Journal | Journal of Engineering |
| Volume | 2014 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 Dec 2013 |
Keywords
- OFDMA