A security framework in G-Hadoop for big data computing across distributed Cloud data centres

Publication Type:
Journal Article
Citation:
Journal of Computer and System Sciences, 2014, 80 (5), pp. 994 - 1007
Issue Date:
2014-01-01
MapReduce is regarded as a suitable programming model for large-scale data-intensive applications. The Hadoop framework is a well-known MapReduce implementation that runs MapReduce tasks on a cluster system. G-Hadoop is an extension of the Hadoop MapReduce framework that allows MapReduce tasks to run on multiple clusters. However, G-Hadoop simply reuses the user authentication and job submission mechanism of Hadoop, which was designed for a single cluster. This work proposes a new security model for G-Hadoop. The security model builds on established security solutions, such as public key cryptography and the SSL protocol, and is designed specifically for distributed environments. The security framework simplifies the user authentication and job submission process of the current G-Hadoop implementation with a single sign-on approach. In addition, the framework provides a number of different security mechanisms to protect the G-Hadoop system from traditional attacks. © 2014 Elsevier Inc.
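The single sign-on idea in the abstract can be sketched as follows: a user authenticates once, receives a signed token, and each participating cluster then verifies the token locally instead of re-authenticating the user. This is only an illustrative sketch, not the paper's actual protocol; the names `issue_token`, `verify_token`, and the `SECRET` key are hypothetical, and an HMAC shared key stands in here for the public-key signatures and SSL channels the paper describes.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared key; the paper's design uses public-key cryptography instead.
SECRET = b"demo-shared-key"

def issue_token(user: str, ttl: int = 3600) -> str:
    """Master node signs a token once; the user does not log in again."""
    payload = json.dumps({"user": user, "exp": int(time.time()) + ttl}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token: str) -> bool:
    """Any participating cluster checks the signature and expiry locally."""
    body, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(body.encode())
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    return json.loads(payload)["exp"] > time.time()

token = issue_token("alice")
print(verify_token(token))          # a valid, unexpired token verifies
print(verify_token(token + "0"))    # a tampered signature is rejected
```

Because verification needs only the (shared or public) key, each cluster can admit a submitted job without contacting a central authentication server, which is what makes the single sign-on approach attractive across distributed data centres.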