Cost-Effective Foliage Penetration Human Detection under Severe Weather Conditions Based on Auto-Encoder/Decoder Neural Network

Publication Type:
Journal Article
Citation:
IEEE Internet of Things Journal, 2019, 6 (4), pp. 6190-6200
Issue Date:
2019-08-01
File:
08516280.pdf (Accepted Manuscript Version, Adobe PDF, 824.06 kB)
© 2014 IEEE. Military surveillance and rescue activities are vital missions for the Internet of Things. To this end, foliage penetration for human detection plays an important role. However, although the feasibility of this mission has been validated, we observe that it still does not perform promisingly under severe weather conditions, such as rainy, foggy, and snowy days. Therefore, in this paper, experiments are conducted under severe weather conditions based on a proposed deep learning approach. We present an auto-encoder/decoder (Auto-ED) deep neural network that learns deep representations and performs classification concurrently. Owing to their cost-effectiveness, device-free sensing techniques are used to address human detection in our case. As we pursue a signal-based mission, two components are involved in the proposed Auto-ED approach. First, an encoder encodes the signal-based inputs into higher-dimensional tensors through fractionally strided convolution operations. Then, a decoder with convolution operations extracts deep representations and learns the classifier simultaneously. To verify the effectiveness of the proposed approach, we compare it with several machine learning approaches under different weather conditions. In addition, a simulation experiment is conducted by adding additive white Gaussian noise to the original target signals at different signal-to-noise ratios. Experimental results demonstrate that the proposed approach best tackles the challenge of human detection under severe weather conditions in the high-clutter foliage environment, which indicates its potential application value in the near future.
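The abstract only outlines the Auto-ED design (a fractionally strided convolutional encoder feeding a convolutional decoder with a classifier head) and the AWGN-at-varying-SNR simulation. The PyTorch sketch below is a minimal illustration of that structure under stated assumptions: the layer/channel sizes, kernel parameters, the binary human/no-human label set, the 1-D signal length, and the names AutoED and add_awgn are all hypothetical choices for clarity, not the authors' implementation.

```python
# Minimal sketch of an Auto-ED-style classifier for 1-D device-free sensing
# signals. All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class AutoED(nn.Module):
    def __init__(self, signal_len=256, num_classes=2):
        super().__init__()
        # "Encoder": fractionally strided (transposed) convolutions map the raw
        # signal into a higher-dimensional tensor, as described in the abstract.
        self.encoder = nn.Sequential(
            nn.ConvTranspose1d(1, 16, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(16, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
        )
        # "Decoder": ordinary strided convolutions extract deep representations
        # that feed the classifier head.
        self.decoder = nn.Sequential(
            nn.Conv1d(32, 16, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv1d(16, 8, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(8 * signal_len, num_classes)

    def forward(self, x):            # x: (batch, 1, signal_len)
        z = self.encoder(x)          # (batch, 32, 4 * signal_len)
        h = self.decoder(z)          # (batch, 8, signal_len)
        return self.classifier(h.flatten(1))


def add_awgn(signal, snr_db):
    """Corrupt a clean target signal with additive white Gaussian noise at a
    given SNR in dB, mirroring the simulation experiment in the abstract."""
    signal_power = signal.pow(2).mean()
    noise_power = signal_power / (10 ** (snr_db / 10))
    return signal + torch.randn_like(signal) * noise_power.sqrt()


if __name__ == "__main__":
    model = AutoED()
    clean = torch.randn(4, 1, 256)      # dummy batch of sensing signals
    noisy = add_awgn(clean, snr_db=5)   # noisier input, as in the SNR sweep
    print(model(noisy).shape)           # torch.Size([4, 2])
```

Training such a model end to end (e.g., with cross-entropy loss over labeled foliage recordings) would jointly learn the deep representation and the classifier, which is the concurrent learning behavior the abstract attributes to Auto-ED.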