The challenge of trust: The Autonomous Agents '98 Workshop on Deception, Fraud and Trust in Agent Societies

Published online by Cambridge University Press:  01 May 1999

RINO FALCONE
Affiliation:
Group of "AI, Cognitive Modelling and Interaction", IP-CNR, Rome, Italy. Email: [email protected]
BABAK SADIGHI FIROZABADI
Affiliation:
Department of Computing, Imperial College of Science, Technology and Medicine, University of London, UK. Email: [email protected]

Abstract

Both the advent of large communication networks and the development of agent models with increasingly sophisticated (in some cases autonomous) behaviours are producing what can be called an Agent Society. Agent societies are constituted by both artificial and human agents (in the long run indistinguishable from each other) communicating with one another (though not necessarily each with all the others), with individual or collective tasks, different resources, different skills, and so on. In other words, these agent societies will become more and more similar to human ones. As in real societies, in these virtual societies it will be necessary to consider all the problems connected with secure and reliable interaction, with trusting other agents, and with the possibility of being defrauded, cheated or deceived in some exchange or relationship.

Type
Review Article
Copyright
© 1999 Cambridge University Press
