Preference Neural Network and its Applications

Publication Type:
Thesis
Issue Date:
2021
This thesis proposes a novel network for label ranking and classification problems. The Preference Neural Network (PNN) uses Spearman correlation gradient ascent and two new activation functions, the positive smooth staircase (PSS) and the smooth staircase (SS), which accelerate ranking by producing deterministic preference values. PNN is proposed in two forms: a fully connected network of simple layers, and the Preference Net (PN), the deep ranking form of PNN, which learns feature selection with a novel ranker kernel used in place of convolution to solve image classification problems. PNN achieves state-of-the-art results for label ranking, and PN achieves promising results on CIFAR-10 with high computational efficiency, opening a new research direction in image recognition. The thesis also introduces a new network architecture, the subgroup preference neural network (SGPNN), which combines multiple networks with different activation functions, learning rates, and output layers into one network to discover hidden relations between the subgroups' multi-labels. The SGPNN proposes a novel type of neuron, the multi-activation function neuron (MAFN), in which a neuron has more than one activation function, each serving a subgroup of labels. The PNN has many applications in image recognition and brain-computer interfaces (BCI), where the data is ambiguous.
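To illustrate the two ideas named in the abstract, the sketch below shows a generic staircase-shaped activation built from summed steep tanh terms and a standard Spearman rank-correlation score used as the quantity to be maximised. This is a minimal illustration under those assumptions, not the thesis's PSS/SS definitions or training procedure; the helper names (smooth_staircase, spearman_rho) and parameters (n_steps, steepness) are hypothetical.

```python
import numpy as np

def smooth_staircase(x, n_steps=3, steepness=100.0):
    """Illustrative staircase-shaped activation: a sum of steep tanh terms
    that plateaus near the preference values 0..n_steps. This is only a
    sketch of the general idea; the thesis defines its own PSS/SS forms."""
    # Each tanh contributes one step centred at b = 0, 1, ..., n_steps-1.
    steps = sum(np.tanh(steepness * (x - b)) for b in range(n_steps))
    # Shift and scale so the output plateaus at 0, 1, ..., n_steps.
    return 0.5 * (steps + n_steps)

def spearman_rho(true_rank, pred_rank):
    """Standard Spearman rank correlation between two tie-free rankings of
    m labels; in the thesis this correlation is maximised by gradient ascent."""
    m = len(true_rank)
    d = np.asarray(true_rank) - np.asarray(pred_rank)
    return 1.0 - 6.0 * np.sum(d ** 2) / (m * (m ** 2 - 1))

# Example: the activation snaps raw scores onto near-discrete preference
# values, which are then ranked and scored against the ground-truth ranking.
scores = smooth_staircase(np.array([-0.7, 0.2, 1.4, 2.9]), n_steps=3)
pred_rank = scores.argsort().argsort() + 1   # 1 = lowest preference
true_rank = np.array([1, 2, 3, 4])
print(scores, pred_rank, spearman_rho(true_rank, pred_rank))
```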