PEAS description of task environment with different types of properties
13 slides
Dec 15, 2021
About This Presentation
AI-related topics. PEAS: a task-environment specification that includes the performance measure, environment, actuators, and sensors. Agents can improve their performance through learning. This is a high-level presentation of agent programs.
Slide Content
PEAS description of task environment with different types of properties
Presented by: Md. Monir Ahammod, 16CSE061, Department of Computer Science and Engineering, BSMRSTU
Supervised by: Md. Nesarul Hoque, Assistant Professor, Department of Computer Science and Engineering, BSMRSTU
Contents
• What is PEAS?
• PEAS description
• Example of PEAS
• Properties of task environments
What is PEAS?
PEAS is a specification of the task environment. When we define a rational agent, we group the properties of its task environment under PEAS, the problem specification for that environment. PEAS stands for:
• Performance
• Environment
• Actuators
• Sensors
PEAS Description
• Performance measure: the criterion used to define the success of an agent.
• Environment: the surroundings of the agent at every instant.
• Actuators: the parts of the agent that deliver the output of an action to the environment.
• Sensors: the receptive parts of the agent that take in input for the agent.
Example of PEAS
What is PEAS for a self-driving car?
• Performance: safety, time, legal drive, comfort
• Environment: roads, other cars, pedestrians, road signs
• Actuators: steering, accelerator, brake, signal, horn
• Sensors: camera, sonar, GPS, speedometer, odometer, accelerometer, engine sensors, keyboard
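The PEAS specification above can be written down as a small Python record. This is a minimal sketch: the `PEAS` class and its field names are illustrative choices, not part of the slides.

```python
from dataclasses import dataclass

@dataclass
class PEAS:
    """A PEAS task-environment specification:
    Performance measure, Environment, Actuators, Sensors."""
    performance: list
    environment: list
    actuators: list
    sensors: list

# The self-driving-car example from the slide, encoded as data.
self_driving_car = PEAS(
    performance=["safety", "time", "legal drive", "comfort"],
    environment=["roads", "other cars", "pedestrians", "road signs"],
    actuators=["steering", "accelerator", "brake", "signal", "horn"],
    sensors=["camera", "sonar", "GPS", "speedometer", "odometer",
             "accelerometer", "engine sensors", "keyboard"],
)
```

Writing the specification as data makes it easy to compare agents: the same four fields describe a chess program, a vacuum cleaner, or a taxi.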
Properties of task environments (environment types)
• Fully observable vs partially observable
• Deterministic vs stochastic
• Episodic vs sequential
• Single-agent vs multi-agent
• Static vs dynamic
• Discrete vs continuous
Fully Observable vs Partially Observable
When an agent's sensors can access the complete state of the environment at each point in time, the environment is fully observable; otherwise it is partially observable.
Examples: Chess – the board is fully observable, and so are the opponent's moves. Driving – the environment is partially observable because what is around the corner is not known.
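The distinction above can be sketched as two percept functions over the same state. The state dictionary and function names are assumptions made for illustration, not from the slides.

```python
# A toy world state: everything there is to know at this instant.
full_state = {"board": "current chess position", "around_corner": "pedestrian"}

def fully_observable_percept(state):
    # Fully observable: the agent's sensors return the complete state (chess).
    return dict(state)

def partially_observable_percept(state, visible_keys):
    # Partially observable: the sensors cover only part of the state (driving).
    return {k: v for k, v in state.items() if k in visible_keys}
```

A chess program can plan from `fully_observable_percept` alone; a driving agent must maintain beliefs about the keys its sensors never see.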
Deterministic vs Stochastic
When the agent's current state and action completely determine the next state of the environment, the environment is deterministic. A stochastic environment is random in nature: the next state is not unique and cannot be completely determined by the agent.
Examples: Chess – there are only a limited number of possible moves for a piece in the current state, and these moves can be determined. Self-driving cars – the outcomes of a self-driving car's actions are not unique; they vary from time to time.
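The contrast can be shown with two toy transition functions. The numeric state and the noise model are illustrative assumptions, not anything specified in the slides.

```python
import random

def deterministic_step(state, action):
    # Deterministic: next state is a pure function of (state, action), as in chess.
    return state + action

def stochastic_step(state, action, rng):
    # Stochastic: the outcome also depends on chance the agent cannot
    # predict (road conditions, other drivers), as in self-driving.
    return state + action + rng.choice([-1, 0, 1])
```

Running `deterministic_step` twice with the same inputs always gives the same next state; `stochastic_step` may not, which is exactly what forces a driving agent to plan over distributions rather than single outcomes.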
Episodic vs Sequential
In an episodic environment, the next episode does not depend on the actions taken in previous episodes. In sequential environments, on the other hand, the current decision can affect all future decisions.
Examples: A part-picking robot is episodic. Chess and taxi driving are sequential.
Single-agent vs Multi-agent
An environment consisting of only one agent is said to be a single-agent environment. A person left alone in a maze is an example of a single-agent system. An environment involving more than one agent is a multi-agent environment. The game of football is multi-agent, as it involves eleven players on each team.
Static vs Dynamic
An environment that changes while the agent is deliberating or acting is said to be dynamic. A roller coaster ride is dynamic: it is set in motion, and the environment keeps changing every instant. An environment with no change in its state while the agent acts is static. An empty house is static, as there is no change in the surroundings when an agent enters.
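One way to see the difference is to ask what happens when the agent does nothing. The two classes below are an illustrative sketch; the names and the single `position` field are assumptions for this example.

```python
class DynamicEnvironment:
    """Changes on its own as time passes (the roller coaster)."""
    def __init__(self):
        self.position = 0

    def wait(self):
        # Even if the agent takes no action, the world moves on.
        self.position += 1

class StaticEnvironment:
    """Changes only when the agent acts (the empty house)."""
    def __init__(self):
        self.position = 0

    def wait(self):
        # Doing nothing leaves the state untouched.
        pass
```

In a dynamic environment, time spent deliberating has a cost, because the state the agent planned against is already gone.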
Discrete vs Continuous
If an environment has a finite number of distinct actions and percepts, it is said to be a discrete environment. The game of chess is discrete, as it has only a finite number of moves. An environment in which the possible actions cannot be enumerated, i.e. is not discrete, is said to be continuous. Self-driving cars are an example of continuous environments.
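The distinction can be made concrete by contrasting an enumerable action set with an action drawn from a real interval. The action names and the steering range are illustrative assumptions, not values from the slides.

```python
# Discrete: the actions can be listed exhaustively, as chess moves can.
discrete_actions = ["advance_pawn", "develop_knight", "castle_kingside"]

def is_valid_steering_angle(angle_degrees):
    # Continuous: any real value in the physical range is a distinct action,
    # so the action set cannot be enumerated (a self-driving car's steering).
    return -30.0 <= angle_degrees <= 30.0
```

A discrete agent can search over `discrete_actions` one by one; a continuous agent must instead optimise over the interval, e.g. with gradient or sampling methods.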