News Forums Updating to RAIN Sensor/Aspect system and ExpandingBoxSensor

This topic contains 6 replies, has 2 voices, and was last updated by  WiedemannD 9 months, 4 weeks ago.

Viewing 7 posts - 1 through 7 (of 7 total)



    First I have to say I very much like the new RAIN interface, and the new behavior tree debugging seems very promising.

    Of course there is a BUT:
    1) Before the upgrade I quite often used sensors that were not connected to any AI, for example on my projectiles. As it seems, this is not possible anymore (please correct me if I’m wrong). One way could be to add an AI to each projectile and then the needed sensor. But that seems like an awful lot of computing overhead when you just want to check whether the projectile hits an object with the aspect lalala. Another way, of course, is to go back to a “custom-made sensor” script and trigger colliders, or custom manual “collision” checking (for example, simply checking the distance against a threshold). For projectiles, where you don’t want a lot of overhead, the last approach seems more reasonable to me than adding an AI.
    So am I maybe wrong about this, or is there a performant new RAIN way to accomplish this?
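    The manual distance-threshold check mentioned above could look something like this minimal Unity sketch. The hitRadius value and the use of a tag in place of an aspect are illustrative assumptions, not anything from RAIN:

```csharp
// Hypothetical manual hit check for a projectile, with no AI or sensor
// involved: compare the distance to candidate targets against a threshold.
// The hitRadius value and the "Enemy" tag (standing in for an aspect) are
// illustrative assumptions.
using UnityEngine;

public class ProjectileHitCheck : MonoBehaviour
{
    public float hitRadius = 0.5f;
    public string targetTag = "Enemy";

    void Update()
    {
        foreach (GameObject target in GameObject.FindGameObjectsWithTag(targetTag))
        {
            if (Vector3.Distance(transform.position, target.transform.position) <= hitRadius)
            {
                // Hit: apply damage here, then remove the projectile.
                Destroy(gameObject);
                break;
            }
        }
    }
}
```

    Attached to the projectile prefab this stays completely AI-free; for large numbers of projectiles, caching the target list rather than calling FindGameObjectsWithTag every frame would scale better.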

    2) ExpandingBoxSensor: I had already heard from you guys that the new RAIN would not have an expanding sensor anymore, and it looks like that’s exactly the case. With the previous RAIN version I used an extended class of your ExpandingBoxSensor (adding yo-yo behavior and constraints) to check, for example, whether a character performing a melee attack is actually hitting another character. Is there still a way with the new RAIN to accomplish something similar, or would I again need to create my own expanding sensor?

    Last but not least, of course, thank you again for currently offering RAIN for free; it has been a huge help!




    Hi Daniel -

    I’m not sure I understand question #1. You are saying that with RAIN{indie} you would place Sensors on objects not connected to the AI? Were they still feeding information back to the AI, just not mounted on it? If that’s the case, you can still do it. Each Sensor has a MountPoint. You could easily set your projectile as the mount point, and then the Sensor would travel with it and send info back to the AI.

    For #2, the quickest way to create an expanding sensor would be to create a custom AI element that updates the Range of your sensor. Sensors are now passive and don’t have their own update call, but custom elements do. I would probably do it in the Act() callback. Just grab AI.Senses.Sensors, find your sensor, and update its Range.
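    A rough sketch of such an element, based on that description, might look like the following. Only Act(), AI.Senses.Sensors, and Range come from the post; the CustomAIElement base class name, the SensorName property, the sensor name, and the numbers are assumptions:

```csharp
// Sketch of an expanding-sensor custom AI element. Hedged: the
// CustomAIElement base class, SensorName property, and values are assumed;
// only Act(), AI.Senses.Sensors, and Range come from the description above.
using RAIN.Core;
using UnityEngine;

public class ExpandingSensorElement : CustomAIElement
{
    public float growthRate = 2f; // units per second (assumed)
    public float maxRange = 5f;   // expansion limit (assumed)

    public override void Act()
    {
        base.Act();

        foreach (var sensor in AI.Senses.Sensors)
        {
            // Grow only the melee sensor (hypothetical name) each tick.
            if (sensor.SensorName == "MeleeSensor")
                sensor.Range = Mathf.Min(sensor.Range + growthRate * Time.deltaTime, maxRange);
        }
    }
}
```

    Resetting Range back to its starting value when the attack ends would give the yo-yo behavior mentioned earlier in the thread.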



    Thank you for your answer, prime! I hope I can clear some things up this time.

    In 1) I really meant the sensor was not connected to, or relevant for, any AI at all. So no, they were not feeding information back to an AI; I used them completely separately. And I also think that’s the more logical approach: you fire a projectile from a gun, and after that it is completely self-sufficient.
    As I already had to use aspects on the characters, for example to identify them for actual AIs, and as sensors were already able to read that data and would only trigger when they came into “touch” with an object carrying the right aspect, it seemed only logical to use the same system for “dumb” objects like projectiles too. Don’t tell me nobody else ever did that before?

    And for 2): is there a way to let the expansion work only in specified directions (I guess not, as there are now only spherical ones?)? For a hit sensor it would not make sense to hit enemies behind you, for example, or ones standing to one of your sides. You would only want those in front of you to be hit. At least if it’s not a roundhouse kick.

    There is actually a 3) now: in the AI component there is a dropdown where I can select a sensor to create. Is there any way I can get a custom-built sensor into that list?

    I hope this helps you better understand how I have been using some of your technology in (I think not so absurd) cases before. Thanks again!



    To get a custom sensor in the list do the following:

    Create a new class that inherits either from RAINSensor or from one of the built-in sensors.

    Done. RAIN will pick it up and automatically add it to the list.
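    So the minimal case is just an empty subclass, for example (the namespace here is an assumption; check the shipped sensors for the real one):

```csharp
// Minimal skeleton: inheriting from RAINSensor (or a built-in sensor) is
// enough for RAIN to pick the class up and list it in the dropdown.
// Assumption: the namespace below may differ in your RAIN version.
using RAIN.Perception.Sensors;

public class MyExpandingSensor : RAINSensor
{
    // Custom fields and behavior go here.
}
```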

    BTW - we’ve purposefully built RAIN to support users like yourself creating (and sharing) custom elements. If you create something cool that you want to share with other people, and you want to make your creation available for sale (say in the Asset Store, or eventually through RAIN itself), we’re all for that too.



    Ok, wow, custom expansion sounds really cool and handy! Though is there a way for me to at least get a look into the source of the RAINSensor class (or a detailed description of the new sensor system)? Writing a new sensor without understanding how the original one works (which the API description doesn’t make clear) seems quite hopeless, at least if I don’t want to fall back to adding colliders again and overriding much of the original functionality, etc.

    But the possibility to create sharable classes/assets seems very cool!



    We’ve got a lot of work to do in the coming weeks and months to put out sample projects and videos that show the details of how to customize RAIN. I’ll mention that we are working on a new Developer program that will have additional information for both regular RAIN users as well as devs who want to create custom extensions, plugins, and content from RAIN.

    So, here’s a little info on how you could create a new Sensor:

    We moved away from using Unity’s collision system in the new RAIN. Performance was killing us, and we were constantly fighting with layer issues. The new solution uses the SensorManager class - a singleton you can access through SensorManager.instance. The SensorManager performs two functions:

    1) Entities and Aspects register themselves with the SensorManager whenever they are active
    2) Sensors can get a list of active Aspects by calling SensorManager.instance.GetAspect(string aAspectType)

    Every Aspect has an AspectType. For VisualAspects this is “visual”. For AudioAspects this is “audio”. So the VisualSensor finds VisualAspects by calling SensorManager.instance.GetAspect(“visual”)

    The next thing you need to know is that Sensors are now passive rather than active. That means they don’t normally do any work, except when you ask them to. Sensors do their work in the MatchAspect or MatchAspectName methods. MatchAspect(RAINAspect aAspect) is used to ask a Sensor whether it can detect a particular Aspect. For the VisualSensor, it does this by grabbing the list from the SensorManager, making sure the Aspect is in there, and then running a visibility test based on range, detection angles, and raycasting. MatchAspectName works similarly, but will return all Aspects that have the given AspectName. So, your primary job in creating a new Sensor is to implement those two methods.

    The last thing you need to know is that each Sensor must keep a Matches list. That’s a list of all the Aspects that have been detected in the most recent MatchAspect/MatchAspectName call. So when your Sensor processes a MatchAspectName call, for example, it should store the results in some sort of list (we just use a List<RAINAspect>). Then return that list in the Matches call (the call actually returns an IList<RAINAspect> - we return list.AsReadOnly()). We do things this way in order to reduce performance overhead.
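    Assembling the pieces described above, a custom sensor skeleton might look roughly like this. This is a sketch inferred from the post, not the shipped API: the namespaces, the RAINAspect.Position and sensor Position/Range members, and the exact return type of GetAspect are all assumptions.

```csharp
// Hedged sketch of a custom sensor: query the SensorManager for registered
// Aspects, filter them with our own test, and cache results in a Matches
// list. Namespaces, Position/Range members, and GetAspect's return type
// are assumptions inferred from the forum description.
using System.Collections.Generic;
using RAIN.Entities.Aspects;
using RAIN.Perception;
using RAIN.Perception.Sensors;

public class ProximitySensor : RAINSensor
{
    private List<RAINAspect> _matches = new List<RAINAspect>();

    // Read-only view of the most recent results, as described in the post.
    public override IList<RAINAspect> Matches
    {
        get { return _matches.AsReadOnly(); }
    }

    // Can this sensor currently detect the given Aspect?
    public override bool MatchAspect(RAINAspect aAspect)
    {
        _matches.Clear();
        if (aAspect.AspectType == "visual" && InRange(aAspect))
            _matches.Add(aAspect);
        return _matches.Count > 0;
    }

    // Return all registered Aspects with the given name that pass our test.
    public override IList<RAINAspect> MatchAspectName(string aAspectName)
    {
        _matches.Clear();
        foreach (RAINAspect aspect in SensorManager.instance.GetAspect("visual"))
        {
            if (aspect.AspectName == aAspectName && InRange(aspect))
                _matches.Add(aspect);
        }
        return Matches;
    }

    // Simple range test standing in for the visibility test the
    // VisualSensor performs (range, detection angles, raycasting).
    private bool InRange(RAINAspect aAspect)
    {
        return (aAspect.Position - Position).magnitude <= Range;
    }
}
```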

    I’m going to stop there for the moment and wait for questions, but there is one more related topic: Sensor Filters.



    Thank you prime for that detailed explanation!!

    That’s definitely an answer I can work with. I’ll let you guys know what I can come up with.

