Privacy and evolutionary cooperation in neural-network-based game theory

Publisher:
Elsevier
Publication Type:
Journal Article
Citation:
Knowledge-Based Systems, 2023, 282, art. no. 111076
Issue Date:
2023-12-20
Filename: Privacy and evolutionary cooperation.pdf
Description: Accepted version
Size: 2.32 MB
Format: Adobe PDF
Abstract:
How cooperation evolves is one of the fundamental research problems in multi-agent systems. With a deeper understanding of the forces that promote cooperation, we could increase the proportion of cooperative agents. However, existing methods for cultivating cooperation share two common limitations. First, most do not take agents' privacy into account. Privacy preservation is essential in multi-agent systems because, without it, an adversarial agent can exploit other agents' private information to maximize its own payoff. Beyond the obvious black-hat implication, this is also detrimental because maximizing one's own payoff typically implies minimizing the payoffs of others, which tends to reduce the number of agents willing to cooperate. Second, most existing methods generalize poorly: their performance is usually highly dependent on specific circumstances, e.g., the system topology or the initial proportion of cooperative agents. To overcome these two drawbacks, we propose a novel method that combines differential privacy, which protects an agent's private information from adversaries, with a neural network architecture that optimizes agents' decision making across a wider range of situations. Through this joint implementation, each agent's privacy is guaranteed, yet agents are still encouraged to cooperate even when adversaries are present. One notable application example is federated learning (FL). In FL, our method can incentivize clients to cooperate actively with the central server by contributing high-quality model updates, while providing strong assurance of their data privacy.
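The paper itself is not reproduced on this page, but a minimal sketch can illustrate the combination the abstract describes: each agent releases its payoff through the Laplace mechanism (standard epsilon-differential privacy), and a small neural network maps the noisy payoffs observed from neighbours to a cooperate/defect decision. All names here (laplace_perturb, PolicyNet, the network shape, the payoff values) are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_perturb(payoff, epsilon, sensitivity=1.0):
    """Release a payoff under epsilon-differential privacy via the Laplace mechanism."""
    return payoff + rng.laplace(scale=sensitivity / epsilon)

class PolicyNet:
    """Tiny MLP mapping an agent's (noisy) observation vector to P(cooperate)."""
    def __init__(self, n_in, n_hidden=8):
        self.W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=n_hidden)
        self.b2 = 0.0

    def prob_cooperate(self, x):
        h = np.tanh(x @ self.W1 + self.b1)
        return 1.0 / (1.0 + np.exp(-(h @ self.W2 + self.b2)))

# One interaction round: the agent only ever sees noisy neighbour payoffs,
# so an adversary observing the same channel learns little about true values.
epsilon = 0.5  # privacy budget: smaller epsilon means stronger privacy, noisier signals
true_neighbour_payoffs = np.array([3.0, 1.0, 2.5])  # hypothetical payoffs
noisy_obs = np.array([laplace_perturb(p, epsilon) for p in true_neighbour_payoffs])

agent = PolicyNet(n_in=3)
p_coop = agent.prob_cooperate(noisy_obs)
action = "cooperate" if rng.random() < p_coop else "defect"
print(f"P(cooperate) = {p_coop:.2f} -> {action}")
```

In the paper's setting the network weights would be trained (e.g., by evolutionary updates or gradient methods) so that cooperation remains a good policy despite the injected noise; the sketch only shows the privacy-preserving observation channel feeding a learned decision rule.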