Perception
Perception allows enemies to detect players and other actors in the world using sight, hearing, or custom senses. Perception events are passed to the controller, which updates the AI's behaviour.
Overview
The Perception System enables enemies to sense and react to their surroundings. Built on Unreal Engine’s AI Perception Component, it allows enemies to detect players or other actors through different channels such as sight, hearing, or custom-defined senses. When a perception event occurs, the Enemy AI Controller updates the relevant blackboard keys, ensuring that the behaviour tree can respond immediately to new information.
This system is what drives transitions between states like patrolling, chasing, and searching. By adjusting parameters such as sight radius, field of view, or hearing range, you can fine-tune how aware enemies are of their environment. Combined with other systems, perception provides the foundation for making enemy behaviour feel responsive and dynamic.
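If you work in C++, the equivalent stock Unreal Engine settings look roughly like the sketch below. The toolkit configures these values for you through its enemy configuration asset, so the function here is purely illustrative of what each parameter controls:

```cpp
// A minimal sketch of tuning perception parameters with stock Unreal APIs.
// The toolkit exposes these values through its enemy configuration asset;
// this standalone function only illustrates what the parameters mean.

#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISenseConfig_Sight.h"
#include "Perception/AISenseConfig_Hearing.h"

void ConfigurePerception(UAIPerceptionComponent* Perception)
{
    // Sight: how far the enemy can see, and its field of view.
    UAISenseConfig_Sight* SightConfig = NewObject<UAISenseConfig_Sight>(Perception);
    SightConfig->SightRadius = 1500.f;                // start seeing targets within 15 m
    SightConfig->LoseSightRadius = 2000.f;            // keep tracking a seen target until 20 m
    SightConfig->PeripheralVisionAngleDegrees = 70.f; // half-angle of the vision cone

    // Hearing: how far away sound stimuli can still be perceived.
    UAISenseConfig_Hearing* HearingConfig = NewObject<UAISenseConfig_Hearing>(Perception);
    HearingConfig->HearingRange = 3000.f;

    Perception->ConfigureSense(*SightConfig);
    Perception->ConfigureSense(*HearingConfig);
    Perception->SetDominantSense(SightConfig->GetSenseImplementation());
}
```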
Included Senses
The Enemy AI - Toolkit automatically handles visual and sound perception. It also provides custom events so you can easily hook other perception senses into your AI.
Sight / Visual Sense
Allows the enemy to see. The target is updated based on an algorithm that selects the closest of all visually perceived actors.
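As an illustration, such a closest-target selection over the currently perceived actors might look like the following in C++. The toolkit implements this internally; the helper function here is hypothetical and uses only stock Unreal perception APIs:

```cpp
// A rough sketch of closest-target selection among visually perceived actors.
// The toolkit handles this for you; this hypothetical helper only shows the idea.

#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISense_Sight.h"

AActor* SelectClosestVisualTarget(const UAIPerceptionComponent* Perception, const FVector& EnemyLocation)
{
    TArray<AActor*> Seen;
    Perception->GetCurrentlyPerceivedActors(UAISense_Sight::StaticClass(), Seen);

    AActor* Closest = nullptr;
    float BestDistSq = TNumericLimits<float>::Max();
    for (AActor* Actor : Seen)
    {
        const float DistSq = FVector::DistSquared(EnemyLocation, Actor->GetActorLocation());
        if (DistSq < BestDistSq)
        {
            BestDistSq = DistSq;
            Closest = Actor;
        }
    }
    return Closest; // becomes the new target, or nullptr if nothing is visible
}
```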
Hearing / Sound Sense
Allows the enemy to hear sounds in its environment. Hearing a sound can trigger the alerted state, allowing the enemy to investigate the sound stimulus.
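Note that the hearing sense only reacts to sounds that are actually reported to the perception system. If your game emits noises from C++, the stock Unreal call looks like this (a sketch; the 'Footstep' tag is a made-up example of the kind of tag the sound filter tags described below can presumably match against):

```cpp
// Reporting a noise event so nearby enemies can hear it. This is the stock
// Unreal API; the toolkit listens for the resulting sound stimuli.

#include "Perception/AISense_Hearing.h"

void ReportFootstepNoise(AActor* NoiseMaker)
{
    UAISense_Hearing::ReportNoiseEvent(
        NoiseMaker->GetWorld(),
        NoiseMaker->GetActorLocation(),
        1.0f,               // loudness: scales the effective hearing range
        NoiseMaker,         // instigator of the noise
        0.0f,               // max range (0 = limited only by the listener's HearingRange)
        FName("Footstep")); // tag attached to the stimulus (hypothetical example)
}
```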
You can also extend the system with custom senses or tweak parameters such as sight radius, field of view, or hearing range to create more advanced and specialised behaviours. In addition, the controller provides an On Other Perception event (within the 'Perception Logic' Graph) that allows you to handle any non-visual and non-sound stimuli. Use it to implement custom perception logic (for example, smell, vibration, or magic detection) and update blackboard values or behaviour-tree states accordingly.
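In C++, equivalent logic might look like the sketch below. The toolkit exposes this as the On Other Perception Blueprint event instead; the controller class and the blackboard key names here are hypothetical:

```cpp
// A sketch of custom-stimulus handling. Sight and hearing are already handled
// by the toolkit; only other stimuli fall through to the custom branch.
// AMyEnemyAIController (an assumed AAIController subclass) and the blackboard
// key names are hypothetical.

#include "AIController.h"
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AIPerceptionSystem.h"
#include "Perception/AISense_Sight.h"
#include "Perception/AISense_Hearing.h"
#include "BehaviorTree/BlackboardComponent.h"

// Assumed to be bound elsewhere via:
// PerceptionComponent->OnTargetPerceptionUpdated.AddDynamic(this, &AMyEnemyAIController::HandlePerception);
void AMyEnemyAIController::HandlePerception(AActor* Actor, FAIStimulus Stimulus)
{
    const TSubclassOf<UAISense> Sense =
        UAIPerceptionSystem::GetSenseClassForStimulus(this, Stimulus);

    // Only custom stimuli (smell, vibration, magic, ...) reach this branch.
    if (Sense != UAISense_Sight::StaticClass() && Sense != UAISense_Hearing::StaticClass())
    {
        if (UBlackboardComponent* Blackboard = GetBlackboardComponent())
        {
            Blackboard->SetValueAsVector(TEXT("StimulusLocation"), Stimulus.StimulusLocation);
            Blackboard->SetValueAsBool(TEXT("IsAlerted"), true);
        }
    }
}
```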
Filter Tags
Filter tags are great! They allow you to define the tags that stimuli must have in order to be recognised as a perceivable source by your agent. They are defined in the enemy configuration asset and can be set separately for visual and sound perception.

Filter tags can be used for all sorts of things. For example, you could create different teams! Say you want two teams, "Red" and "Blue": create two otherwise identical configurations, one using the 'Red' tag and the other using the 'Blue' tag for its visual perception, then assign those configs to two character blueprints. Make sure the characters carry actor tags that match your teams (Red and Blue). An enemy with the 'Red' tag in its config will then attack all characters tagged 'Red', and vice versa, so give each team the config that filters for the opposing team's tag. In other words, filter tags let you specify what your agents should ignore or highlight. Get creative with that!
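Under the hood, a filter-tag check amounts to comparing the configured tags against the perceived actor's tags. A minimal sketch of that idea (the toolkit performs this check internally; the 'VisualFilterTags' parameter is a hypothetical stand-in for the tags set in the configuration asset):

```cpp
// An actor only counts as a perceivable source if it carries one of the
// configured filter tags. Illustrative only; not the toolkit's actual code.

#include "GameFramework/Actor.h"

bool PassesVisualFilter(const AActor* PerceivedActor, const TArray<FName>& VisualFilterTags)
{
    for (const FName& Tag : VisualFilterTags)
    {
        if (PerceivedActor->ActorHasTag(Tag)) // matches against the actor's Tags array
        {
            return true; // e.g. a 'Red'-filtering agent perceiving a 'Red'-tagged character
        }
    }
    return false; // no matching tag: the stimulus is ignored
}
```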