This topic contains 36 replies, has 7 voices, and was last updated by  prime 2 years ago.

Viewing 15 posts - 16 through 30 (of 37 total)


    Thanks Aaron!
    OK, but before I do, I want to understand why!
    The detect already has a variable that's being set to true when the Sensor is triggered, and that is already being used by the constraints. So I don't understand why putting an expression right next to the detect is going to change that. Surely the detect will fire, the expression will set foundplayer == true, and that will get used in the constraints. How is that different?

    The sequencer? So it will detect, then, because it's a sequencer, move along one node to the expression and fire that, do the constraints, but NOT re-fire… because the sequencer kind of stops if repeating? So why don't I just set the detect to repeat "until success"? Doesn't that do the same thing? (Doesn't work, by the way…)

    I can try what you say, but I just don't get it…




    Wow.. that worked…
    Explanation someone.. please?



    Since the detect node is constantly running, it is constantly resetting your variable. If you put it in a sequencer and then assign a second variable after it, the second variable will be updated only when the sensor detect returns success (i.e., when it detects something). When the detect does not detect anything, the sequencer fails before it gets to the assign expression. In that case, the detect has reset its variable to null, but the second expression hasn't.
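    The explanation above maps to standard behavior-tree semantics. Here is a minimal Python sketch (this is NOT RAIN's actual API; the node functions and the memory dict are purely illustrative) of why the second variable only "sticks" on success:

```python
# Sketch of sequencer + detect semantics (illustrative, not RAIN's API).
# The detect node overwrites its variable every tick, success or failure;
# the assign node runs only when detect succeeds, because a sequencer
# stops at the first failing child.

SUCCESS, FAILURE = "success", "failure"

def detect(memory, sensed_aspect):
    # A detect node writes its result every tick, even on failure.
    memory["player"] = sensed_aspect          # None when nothing is sensed
    return SUCCESS if sensed_aspect is not None else FAILURE

def assign_foundplayer(memory):
    # Expression node: foundplayer = (player != null)
    memory["foundplayer"] = memory["player"] is not None
    return SUCCESS

def sequencer(memory, sensed_aspect):
    # Children run in order; stop at the first FAILURE.
    if detect(memory, sensed_aspect) == FAILURE:
        return FAILURE                        # assign never runs this tick
    return assign_foundplayer(memory)

memory = {}
sequencer(memory, "Player")   # detect succeeds -> foundplayer becomes True
sequencer(memory, None)       # detect fails: player reset to None,
                              # but foundplayer stays True
print(memory)                 # {'player': None, 'foundplayer': True}
```

    The key point: `player` flickers back to null every tick the sensor misses, while `foundplayer` keeps its last successfully-assigned value.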



    GREAT, thanks for the explanation, all is becoming clear!
    One small question: does this mean that my original variable (I think it was "player == null") is not used at all now? The second CON now sets the new variable, but the first still has that first one. Shall I remove it (I can try this of course), or is it still doing something, like causing the first CON to work WHILE the variable is still NULL?

    (I'll stay in this thread as it's all related to detect)

    SO now, when too close to the AI, my player triggers the Sensor, and the AI runs home.
    I have 3 improvements to do on this, I think 2 are trivial, but the 3rd is maybe a new ball-game?

    1. Restart the whole Behaviour when done. I want the AI to have gone home, wait for me to move away, and then (after a wait) come out of his house again and continue. Is this just a matter of RESETTING the foundplayer variable to NULL when he is home? If I'm still in the vicinity, the detect should immediately fire again, and he will stay home; if I'm further away, the variable stays NULL and he comes out again? Is this OK?

    2. I would like 2 levels of his running away. So at a distance of 50 meters, he walks slowly home, but may pause and go back to wandering; but if I follow him and get closer than 25 meters, he runs (as above). Is this just a matter of setting up 2 Sensors with bigger circles of influence? The outer one has him walk, the inner one has him run!

    3. (the trickier one) I have noticed that, by being sneaky, I can get my player to be BETWEEN my AI and his house, so when he gets scared (I trigger his detect) he runs straight at me, and on to his house. Is it possible for him to run AROUND me? Like my player having an auto nav mesh… I should NOT be able to touch him. Any way to do this?
    OR. Another way might be that although he is trying to get home, he will always run AWAY from me? Any way to have him alter his path dependent on MY location?

    That's probably enough to chew on, hope I'm not being annoying… my wife thinks I am.. she might not be the only one..



    Aaron Mueller


    Just a few random thoughts…

    1) You could set up an Expression node to set “foundplayer = false” to reset when the NPC reaches his “home”. I think you said your sensor was set to use line of sight. So, if the player chases him home, he shouldn’t be able to “see” him. I assume you meant FALSE when you said NULL.

    2) In a different thread, I recall @prime mentioned using two sensors to detect different distances to determine when to perform a melee attack behavior versus a ranged attack behavior. The new sensors are “passive” (from what I read elsewhere on these forums) so they should be less of a resource drain, but if you have a lot of NPCs with multiple sensors you may experience a performance drop (don’t quote me on that, I’m only creating a single NPC with a single sensor at the moment).

    If you want to code it, you could do a distance check between the player and the NPC: if it's < 25 meters, do one behavior, and if it's > 25 meters, walk slowly to the destination.
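    As a language-agnostic sketch of that distance check (Python here purely for illustration; in Unity C# you would compare `Vector3.Distance` between the two transforms against the same thresholds), using the 25 m and 50 m radii mentioned in the discussion:

```python
import math

# Thresholds taken from the discussion: run inside 25 m, walk inside 50 m.
RUN_RADIUS = 25.0
WALK_RADIUS = 50.0

def distance(a, b):
    # Straight-line distance between two (x, y, z) positions.
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def choose_behavior(npc_pos, player_pos):
    d = distance(npc_pos, player_pos)
    if d < RUN_RADIUS:
        return "run_home"      # player is very close: flee at full speed
    if d < WALK_RADIUS:
        return "walk_home"     # player is nearby: retreat casually
    return "wander"            # player is far away: normal behavior

print(choose_behavior((0, 0, 0), (10, 0, 0)))   # run_home
print(choose_behavior((0, 0, 0), (40, 0, 0)))   # walk_home
print(choose_behavior((0, 0, 0), (80, 0, 0)))   # wander
```

    A single distance check like this replaces the two overlapping sensors: one number, three behavior bands.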

    3) Maybe I didn’t set it up right, but I wasn’t able to get collision avoidance working using RAIN. I ended up using Unity’s built in NavMesh generation, NavMeshAgent and NavMeshObstacle. If you attach a NavMeshAgent to your Player and a NavMeshAgent to your NPC, when he gets close to you he should steer around you. NOTE: I haven’t tested this, but I noticed that if I run toward my NPCs, they avoid each other but I can pass through them like a ghost.

    The NavMeshObstacle (Unity Pro only) was something I actually needed in my current project, so for now Unity NavMesh is the option I have to use. I need to move around an obstacle (vehicle door).

    Anyway, just some thoughts. Hope that helps a little or at least gives you some ideas.




    Wow, that's a lot of help, thanks Aaron. The Expression being reset seems the way to go. I really need to keep an eye on performance, because I'm having terrible trouble with that already (kind of on the back burner to solve; I really need Unity Pro to use the Profiler…), so I'm wary of known hogs. However, I envisage only 3 or 4 characters with this kind of Behavior on them, so maybe it's gonna be OK?
    And I think I have to go another way than the nav mesh to have me avoid them; can I not just rotate them away from me? Er.. somehow?…

    >I need to move around an obstacle (vehicle door).

    Er, what's wrong with colliders?



    Aaron Mueller

    Since I am still wrestling with understanding behavior trees myself, I’d say keep it simple (haha, easy to say right??).

    I honestly don’t have a good way to benchmark performance. With Unity Pro I do have access to the Profiler, but I’m not ready to dive into that right now (bigger fish to fry). For the moment, I only have 8 NPCs. Only one is “heavy” with a sensor and a complex behavior tree. The rest use the same tree with 2 simple nodes.

    In your case, with 3 or 4 NPCs with 2 sensors each, I can’t say for sure but I don’t think would be too bad (maybe @prime can chime in on typical performance metrics for your scenario).

    Regarding “just rotating them away from me”, I don’t know. Maybe you could, but Unity’s NavMeshAgent has local steering. So, in my scenario, I aim two of my NPCs almost directly at each other. They kind of navigate around each other (kind of awkwardly like humans do sometimes).

    Regarding your last comment: colliders are no help in this case. The reason is that the vehicle and the player both have colliders on them, but when I tell my NPC to navigate to the passenger door, he walks through the vehicle (not very smart looking, eh?).

    One other "gotcha" I'll share. Unity's NavMeshAgent component attaches to a game object as a Cylinder. In Unity 3.5.7 it was a Capsule collider. So, for the case of the vehicle, the cylinder shape didn't quite fit right. I thought about adding multiple NavMeshAgents and moving them, but they are only applied at the center of the parent game object. For the vehicle, I'd have to increase the radius to reach from front to back, and then there would be invisible space on either side (not ideal). This is mostly fine if you use it with an NPC that's roughly humanoid. Not good for vehicles.

    The NavMeshObstacle feature (new in Unity Pro 4.3) allows you to attach something similar to a game object. I ended up creating an empty game object and adding the NavMeshObstacle component to it. Then I duplicated it so I could change the sizes and move them around. I ended up using a combination of multiple NavMeshObstacles with the "Carve" option checked. The "Carve" option tells Unity to cut holes in the NavMesh. Handy. So, practically speaking, I have to force my NPC to find a path that goes around the open door. It's not ideal, but with the right cuts in the NavMesh, I was able to get the NPC to walk around the door.

    EDIT: I just realized as I wrote this that by adding empty game objects and attaching the NavMeshAgents to those, I might have been able to achieve a combination that would rough out the shape of the vehicle, but that seems inefficient. Oh well…

    Of course, maybe there is a way to do this in RAIN (or it is planned on their roadmap), but if it is, I’m not aware. I have had enough of a struggle to get to this point.

    Good luck with your game.



    Sensors shouldn’t have a performance hit unless:
    1) You have hundreds of Aspects,
    2) You are using Line of Sight checks, or
    3) Many of those Aspects are within sensor range.

    We’re continuing to work to improve performance across RAIN. The next patch has a number of improvements for speed and memory efficiency.



    He continues to detect blue.

    Am I doing something wrong?



    Not sure what you are asking. Your highlighted detect node is set to repeat forever, so yes - it will continue to detect.



    Sorry, about the mask: here he has the mask set on blue, and he continues to detect him.



    The mask defines what can be detected, not what can’t be detected. Select all the layers that you want to include.
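    If the sensor mask works like a standard Unity LayerMask, it is a bitmask: each set bit enables one layer for detection. A tiny Python sketch (the layer numbers are hypothetical, for illustration only) of why putting blue IN the mask makes blue detectable rather than excluded:

```python
# A layer mask is a bitmask: bit N set means layer N IS detectable.
# Layer numbers below are hypothetical, just for illustration.
LAYER_RED, LAYER_BLUE, LAYER_GREEN = 8, 9, 10

# Include only the layers you WANT the sensor to detect:
mask = (1 << LAYER_RED) | (1 << LAYER_GREEN)   # blue deliberately excluded

def is_detectable(object_layer, mask):
    # An object is detectable when its layer's bit is set in the mask.
    return bool((1 << object_layer) & mask)

print(is_detectable(LAYER_BLUE, mask))   # False - blue is not in the mask
print(is_detectable(LAYER_RED, mask))    # True
```

    So to stop detecting blue, leave blue's layer OUT of the mask and select only the layers you do want.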



    Would you be so kind as to post your final BT? I'm trying to learn how to use Random properly, and I noticed that you are using it, plus Detects, so a final screenshot of it working would be really appreciated.



    Having some trouble, since the network I am on doesn't allow YouTube viewing, and your sample projects don't seem to unzip after downloading.
    I've just been trying to piece stuff together reading the forum here, but some info seems to be from older versions, which confuses me.

    This thread seemed most applicable to what I am trying to do, so hopefully the right place to post.

    I am trying to setup a visual detector where if the spider sees fire it will move towards it.
    I set up the Fire game object with the visual aspect “fire” and the mount point of the fire game object itself.
    I set up the spider AI with the visual sensor named “firesensor” and the mount point of the spider game object itself.

    In the spider's AI behavior editor I am keeping it simple till I can figure out how it works.
    I started with a Parallel decision branch, and placed a Detect inside it.
    The parallel is set to repeat forever.
    The Detect is set to repeat forever, sensor "firesensor" (in double quotes), aspect "fire" (in double quotes), and form variable firespotted (not in quotes).

    It would appear that it does not see it since when I run it, the variable firespotted shows up in the AI, but with a value of (null).

    Here is a screenshot of what I have so far.

    I am sure it's something simple.

    Thanks for the help,




    @narfi - Try clicking on another object in the scene hierarchy, then back on the AI. I have seen that the variables displayed in the AI in the Inspector are not always updated in real-time. You may be detecting it, but the value is just not showing. I did have this same issue.

