QoE and power efficiency tradeoff for fog computing networks with fog node cooperation
- Publication Type: Conference Proceeding
- Published in: Proceedings - IEEE INFOCOM, 2017
- Issue Date:
Files in This Item:
| File | Description | Size |
|------|-------------|------|
| QoE and power efficiency tradeoff for fog computing networks with fog node cooperation.pdf | Published version | 494.2 kB |
This item is closed access and not available.
© 2017 IEEE. This paper studies the workload offloading problem for fog computing networks in which a set of fog nodes can offload part or all of the workload originally destined for cloud data centers, further improving the quality-of-experience (QoE) of users. We investigate two performance metrics for fog computing networks, users' QoE and fog nodes' power efficiency, and observe a fundamental tradeoff between them. We then consider cooperative fog computing networks in which multiple fog nodes can help each other to jointly offload workload from cloud data centers. We propose a novel cooperation strategy, referred to as offload forwarding, in which each fog node, instead of always relying on cloud data centers to process its unprocessed workload, can forward part or all of that workload to neighboring fog nodes to further improve the QoE of its users. A distributed optimization algorithm based on the distributed alternating direction method of multipliers (ADMM) via variable splitting is proposed to achieve the optimal workload allocation that maximizes users' QoE under a given power efficiency. As a case study, we verify the performance of the proposed framework on a fog computing platform supported by a wireless infrastructure. Numerical results show that our proposed approach significantly improves the performance of fog computing networks.
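The coupled workload-allocation step the abstract describes can be illustrated with a minimal sharing-form ADMM sketch. This is not the paper's actual formulation: the quadratic per-node cost `c_i * x_i**2` (a simple congestion/delay proxy standing in for the QoE objective), the coefficients, the penalty `rho`, and the iteration count are all illustrative assumptions. It only shows the variable-splitting structure, where each node updates its own share locally and the nodes coordinate through a shared coupling constraint.

```python
# Hedged sketch of ADMM via variable splitting for workload allocation.
# Cost model, node count, and all parameter values are assumptions for
# illustration; they are not taken from the paper.

def admm_workload_allocation(c, total_workload, rho=1.0, iters=2000):
    """Split total_workload across fog nodes to minimize sum_i c[i]*x[i]**2
    subject to sum_i x[i] == total_workload, using sharing-form ADMM.

    x is the local workload share, z its split copy, u the scaled dual.
    Each x-update is purely local; the z-update is a projection onto the
    coupling constraint and needs only the sum of (x + u) across nodes,
    mirroring the distributed structure sketched in the abstract.
    """
    n = len(c)
    x = [0.0] * n
    z = [0.0] * n
    u = [0.0] * n
    for _ in range(iters):
        # Local x-update: argmin_x c_i*x^2 + (rho/2)*(x - z_i + u_i)^2
        x = [rho * (z[i] - u[i]) / (2.0 * c[i] + rho) for i in range(n)]
        # z-update: project x + u onto the hyperplane sum(z) = total_workload
        a = [x[i] + u[i] for i in range(n)]
        shift = (total_workload - sum(a)) / n
        z = [a[i] + shift for i in range(n)]
        # Scaled dual update
        u = [u[i] + x[i] - z[i] for i in range(n)]
    return x

# Three fog nodes with different congestion coefficients, 7 units of workload;
# the analytic optimum here is [4, 2, 1] (shares proportional to 1/c_i).
alloc = admm_workload_allocation(c=[1.0, 2.0, 4.0], total_workload=7.0)
```

Less congested nodes (smaller `c_i`) absorb more workload, which is the qualitative behavior one would expect from the QoE-maximizing allocation.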