Performance comparison of LTE FDD and TDD based Smart Grid communications networks for uplink biased traffic

Publication Type:
Conference Proceeding
2012 IEEE 3rd International Conference on Smart Grid Communications (SmartGridComm 2012), 2012, pp. 276-281
LTE is a candidate wide-area communications network for the Smart Grid and can enable applications such as AMI, Demand Response and WAMS. We compare the uplink performance of the LTE FDD and TDD modes for a typical Smart Grid scenario involving a large number of devices sending small- to medium-sized packets, to understand the advantages and disadvantages of these two modes. An OPNET simulation model is employed to facilitate realistic comparisons based on latency and channel utilization. We demonstrate that there is a critical packet size above which there is a step increase in uplink latency, due to the nature of the LTE uplink resource scheduling process. It is shown that FDD leads to better uplink performance in terms of latency, while TDD can provide greater flexibility when the split between uplink and downlink data is asymmetrical (as it is expected to be in a Smart Grid environment). It is also demonstrated that the capacity of both FDD and TDD systems, in terms of the number of serviced devices, is control channel (PDCCH) limited for small infrequent packets, but TDD has the advantage that the capacity remains data channel (PUSCH) limited for smaller packet sizes and lower data burst rates than in an FDD system. © 2012 IEEE.
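The step increase in uplink latency noted above can be illustrated with a simplified model of the LTE uplink grant procedure (not taken from the paper): a UE first sends a Scheduling Request (SR), receives an uplink grant, and transmits; if the packet exceeds the first grant, a Buffer Status Report (BSR) triggers further grants, adding an extra round trip. All timing and grant-size values below are illustrative assumptions, not the paper's parameters.

```python
# Simplified sketch of LTE uplink scheduling latency vs packet size.
# Assumed values: SR opportunity period, eNodeB grant delay, and the
# number of bytes carried by each uplink grant are all hypothetical.

SR_PERIOD_MS = 10        # assumed SR opportunity period
GRANT_DELAY_MS = 3       # assumed eNodeB scheduling + grant delay
TTI_MS = 1               # LTE transmission time interval (1 ms)
FIRST_GRANT_BYTES = 100  # assumed payload capacity of one grant

def uplink_latency_ms(packet_bytes: int) -> float:
    """Mean uplink latency for one packet under this simplified model."""
    # Average SR wait + grant processing + first transmission TTI.
    latency = SR_PERIOD_MS / 2 + GRANT_DELAY_MS + TTI_MS
    if packet_bytes > FIRST_GRANT_BYTES:
        # Remainder is signalled via BSR: one extra grant round trip,
        # then additional TTIs for the remaining data.
        remaining = packet_bytes - FIRST_GRANT_BYTES
        extra_ttis = -(-remaining // FIRST_GRANT_BYTES)  # ceil division
        latency += GRANT_DELAY_MS + extra_ttis * TTI_MS
    return latency

for size in (50, 100, 101, 300):
    print(f"{size} B -> {uplink_latency_ms(size)} ms")
```

Under these assumptions the latency jumps as soon as a packet no longer fits in the first grant (here, above 100 bytes), which is the qualitative "critical packet size" behaviour the abstract describes.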