Cost-Effective Foliage Penetration Human Detection under Severe Weather Conditions based on Auto-Encoder/Decoder Neural Network

Publication Type:
Journal Article
Citation:
IEEE Internet of Things Journal, 2018
Issue Date:
2018-01-01
Abstract:
Military surveillance and rescue activities are vital missions for the Internet of Things, and foliage penetration for human detection plays an important role in them. Although the feasibility of this mission has been validated, it still does not perform promisingly under severe weather conditions such as rainy, foggy, and snowy days. In this paper, we therefore conduct experiments under severe weather conditions based on a proposed deep learning approach. We present an Auto-Encoder/Decoder (Auto-ED) deep neural network that learns deep representations and performs the classification task concurrently. Because of their cost-effectiveness, device-free sensing (DFS) techniques are used to address human detection in our case. Since the mission is signal-based, the proposed Auto-ED approach involves two components. First, an encoder uses fractionally-strided convolution operations to encode signal-based inputs into higher-dimensional tensors. Then, a decoder uses convolution operations to extract deep representations and learn the classifier simultaneously. To verify the effectiveness of the proposed approach, we compare it with several machine learning approaches under different weather conditions. We also conduct a simulation experiment by adding Additive White Gaussian Noise (AWGN) to the original target signals at different Signal-to-Noise Ratios (SNRs). Experimental results demonstrate that the proposed approach best tackles the challenge of human detection under severe weather conditions in high-clutter foliage environments, which indicates its potential for practical application in the near future.
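The AWGN simulation mentioned in the abstract can be sketched as follows. This is a minimal illustration of adding white Gaussian noise to a signal at a target SNR, not the authors' actual experimental code; the function name `add_awgn` and its signature are assumptions for the example.

```python
import math
import random

def add_awgn(signal, snr_db, rng=None):
    """Add white Gaussian noise to `signal` so that the result has
    approximately the requested signal-to-noise ratio (in dB)."""
    rng = rng or random.Random()
    # Average signal power: mean of the squared samples.
    p_signal = sum(s * s for s in signal) / len(signal)
    # Noise power implied by the target SNR: SNR_dB = 10 * log10(Ps / Pn).
    p_noise = p_signal / (10 ** (snr_db / 10))
    sigma = math.sqrt(p_noise)
    # Add zero-mean Gaussian noise with the computed standard deviation.
    return [s + rng.gauss(0.0, sigma) for s in signal]
```

For example, `add_awgn(clean, 10.0)` returns a copy of `clean` whose measured SNR is close to 10 dB; sweeping `snr_db` reproduces the kind of degradation study the abstract describes.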