An autonomous agent is an intelligent agent that operates on an owner's behalf but without interference from that owner. A widely cited definition of an intelligent agent, originally from an IBM white paper that is no longer accessible, reads as follows:
Intelligent agents are software entities that carry out some set of operations on behalf of a user or another program with some degree of independence or autonomy, and in so doing, employ some knowledge or representation of the user's goals or desires.
Such an agent is a system situated in, and part of, a technical or natural environment. It senses some aspects of that environment's status and acts on it in pursuit of its own agenda, which evolves from drives (or programmed goals). By acting, the agent changes part of the environment or its status, thereby influencing what it will sense in the future.
Non-biological examples include intelligent agents, autonomous robots, and various software agents, including artificial life agents and many computer viruses. Biological examples have not yet been defined.
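The sense-act cycle described above can be illustrated with a minimal sketch (not from the cited sources; all names here, such as `Environment` and `ThermostatAgent`, are illustrative): an agent repeatedly senses part of its environment, acts in pursuit of a programmed goal, and its action changes what it will sense next.

```python
class Environment:
    """A toy environment: a room whose temperature drifts downward."""

    def __init__(self, temperature: float):
        self.temperature = temperature

    def step(self, heating: bool) -> None:
        # The agent's action changes the environment, which in turn
        # changes what the agent will sense on the next cycle.
        self.temperature += 1.0 if heating else -0.5


class ThermostatAgent:
    """An agent with a programmed goal: keep the temperature near a setpoint."""

    def __init__(self, setpoint: float):
        self.setpoint = setpoint

    def act(self, sensed_temperature: float) -> bool:
        # Sense part of the environment's status and act in pursuit
        # of the agent's own agenda (here, a fixed programmed goal).
        return sensed_temperature < self.setpoint


room = Environment(temperature=15.0)
agent = ThermostatAgent(setpoint=20.0)

for _ in range(20):                        # the sense-act loop
    heating = agent.act(room.temperature)  # sense, then decide
    room.step(heating)                     # act, changing the environment

print(round(room.temperature, 1))
```

The loop converges on the setpoint: the agent's heating decisions alter the very quantity it senses, which is the feedback relation the definition describes.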
Lee et al. (2015) raise a safety question: how does the combination of external appearance and internal autonomy shape human reactions to autonomous vehicles? Their study finds that a humanlike appearance and a high level of autonomy are strongly correlated with perceived social presence, intelligence, safety, and trustworthiness. Specifically, appearance has the greatest effect on affective trust, while autonomy affects both the affective and cognitive dimensions of trust, where cognitive trust is characterized by knowledge-based factors and affective trust is largely emotion-driven.
- Lee, Jae-Gil (Summer 2015). "Can Autonomous Vehicles Be Safe and Trustworthy? Effects of Appearance and Autonomy of Unmanned Driving Systems". International Journal of Human-Computer Interaction. 31: 682–691 – via Taylor & Francis Online.
- Franklin, Stan and Graesser, Art (1997). "Is it an Agent, or just a Program?: A Taxonomy for Autonomous Agents". Intelligent Agents III. Berlin: Springer-Verlag. pp. 21–35.
- Sun, Ron (September 1, 2001). Duality of the Mind: A Bottom-up Approach Toward Cognition. New Jersey: Lawrence Erlbaum. p. 304. ISBN 978-0-585-39404-6.