During the first week of the vertical slice, a significant amount of time was spent discussing the direction of the game. There was no shortage of ideas being tossed around, but we eventually nailed down the core mechanics that we would develop. A light narrative around the character's origins was also established to give context to the gameplay and level designs. The team decided to shift the focus more towards the stealth aspect of the game, rather than the concept of letting players steal everything in sight.
Due to the focus on stealth, I began working on the scripts that handle the NPC AI. I created a DLL plugin containing scripts that can be placed on any object; these scripts give the object an AI controller that lets it move around and react to the player.

So far, only a simple visual detection mechanic has been implemented. It works by calculating the direction vector from the AI agent to the player and comparing it to the agent's forward vector. If the angle between the two vectors is within the agent's field of view, a second check is carried out: a ray cast from the agent to the player character to determine whether any object is occluding it (a rough sketch of how the two checks fit together is at the end of this post). These simple checks will need further development to allow for more robust visual detection. Currently, the ray cast only targets the centre point of the player character, so the player could have almost half of their model exposed around a wall yet still be invisible to the AI agent because that central point is behind the wall. This will be developed to estimate an approximate percentage of how much of the player model is visible, which will in turn affect how quickly the detection level increases. The team can edit the values for each NPC's visual and auditory perception, as well as its field of view, in the inspector.

Another script in the DLL allows us to automatically create a detection meter on the object, which communicates with the AI controller script. The detection meter can be edited in the inspector without having to access its canvas. It fills up to show how aware an NPC is of the player, and when it is full the NPC will chase the player. The script has been set up so that the team can easily swap out the sprites used for the meter, and any sprite can be used. An NPC is not required to have a detection meter; it is merely a visual representation of the data held in the AI controller script (this split is also sketched at the end of the post).

I've also begun work on the path editor. I experimented with scriptable objects, custom editor windows, and tree view windows, but it was difficult to create an implementation that was intuitive and easy to use. During the second week, I will focus on making the tool simpler.
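For anyone curious how the two-stage visual check fits together, here is a minimal sketch in Unity-style C#. The class and member names (VisionSensor, fieldOfView, occluders, and so on) are placeholders I've made up for illustration and won't match the actual plugin; the values would be exposed in the inspector just like ours are.

```csharp
using UnityEngine;

// Minimal sketch of the two-stage visual detection check described above.
// Names and default values are illustrative only.
public class VisionSensor : MonoBehaviour
{
    [Range(0f, 360f)] public float fieldOfView = 90f; // total view cone angle in degrees
    public float viewDistance = 15f;                   // how far the agent can see
    public Transform target;                           // the player character
    public LayerMask occluders;                        // walls and other blocking geometry

    public bool CanSeeTarget()
    {
        // Stage 1: is the target within range and inside the field of view?
        Vector3 toTarget = target.position - transform.position;
        float distance = toTarget.magnitude;
        if (distance > viewDistance)
            return false;
        if (Vector3.Angle(transform.forward, toTarget) > fieldOfView * 0.5f)
            return false;

        // Stage 2: ray cast towards the target's centre point. If an occluder sits
        // between the agent and the player, the player is hidden. This is the check
        // that would later be replaced with a visibility-percentage estimate.
        return !Physics.Raycast(transform.position, toTarget / distance, distance, occluders);
    }
}
```

In practice, the single ray to the centre point would be swapped for several rays aimed at sampled points on the player's body, with the fraction of unblocked rays giving the visibility percentage mentioned above.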
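And a similarly hedged sketch of the detection meter idea: the awareness value lives in the AI controller, while the meter is an optional component that only displays it. Again, the names (AwarenessController, AwarenessMeter, fillSpeed) are invented for the example, and each MonoBehaviour would sit in its own file in a real project.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative controller that tracks awareness using the sensor sketched above.
public class AwarenessController : MonoBehaviour
{
    public VisionSensor sensor;      // the sketch component from the previous block
    public float fillSpeed = 0.5f;   // awareness gained per second while the player is visible
    public float drainSpeed = 0.25f; // awareness lost per second while hidden

    public float Awareness { get; private set; } // 0 = unaware, 1 = fully alerted

    void Update()
    {
        float delta = sensor.CanSeeTarget() ? fillSpeed : -drainSpeed;
        Awareness = Mathf.Clamp01(Awareness + delta * Time.deltaTime);

        if (Awareness >= 1f)
        {
            // Here the controller would switch the NPC into its chase behaviour.
        }
    }
}

// The meter is optional and purely visual: it mirrors the controller's data,
// so removing it does not change the AI behaviour at all.
public class AwarenessMeter : MonoBehaviour
{
    public Image fillImage; // any sprite set to "Filled" mode can be used here
    public AwarenessController controller;

    void Update()
    {
        fillImage.fillAmount = controller.Awareness;
    }
}
```

Keeping the display separate from the data is what makes it easy to swap sprites or leave the meter off an NPC entirely.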