Minimum main thread block with Async overload
There are three interesting methods on `EntityQuery` that have an overload with an `out JobHandle` parameter:
```csharp
public NativeArray<ArchetypeChunk> CreateArchetypeChunkArray(Allocator allocator);
public NativeArray<ArchetypeChunk> CreateArchetypeChunkArrayAsync(Allocator allocator, out JobHandle jobhandle);
public NativeArray<T> ToComponentDataArray<T>(Allocator allocator) where T : struct, IComponentData;
public NativeArray<T> ToComponentDataArrayAsync<T>(Allocator allocator, out JobHandle jobhandle) where T : struct, IComponentData;
public void CopyFromComponentDataArray<T>(NativeArray<T> componentDataArray) where T : struct, IComponentData;
public void CopyFromComponentDataArrayAsync<T>(NativeArray<T> componentDataArray, out JobHandle jobhandle) where T : struct, IComponentData;
```
What's the deal? Have you been using only the normal version?
Unity schedules and completes a job just to do that work for you
The existence of `out JobHandle` means that inside these methods Unity schedules a mini job to "gather" and build you the desired `NativeArray`. It completes the job immediately because the call is synchronous, so you get the `NativeArray` right away. (Don't forget to dispose it.)
Remember that these methods are called from the main thread, since they are methods of `EntityQuery`. You get worker thread utilization even when working with main thread things. It is really considerate! This is also why you could not use `Allocator.Temp`: the newly allocated `NativeArray` is used inside that mini job, which requires at least `Allocator.TempJob`.
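A minimal sketch of the synchronous flavor, assuming a hypothetical `Health` component (the system name is also made up for illustration):

```csharp
using Unity.Collections;
using Unity.Entities;

// Hypothetical component used for illustration.
public struct Health : IComponentData { public float Value; }

public class SyncGatherSystem : SystemBase
{
    protected override void OnUpdate()
    {
        EntityQuery query = GetEntityQuery(ComponentType.ReadOnly<Health>());

        // The normal overload: Unity schedules a gather job internally,
        // then completes it before returning, blocking the main thread.
        // Allocator.Temp is not allowed because the internal job uses
        // this array; at least Allocator.TempJob is required.
        NativeArray<Health> healths = query.ToComponentDataArray<Health>(Allocator.TempJob);

        // The copy is ready immediately... but don't forget to dispose it.
        healths.Dispose();
    }
}
```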
Unlocking the main thread
Where you kick off job stuff, you are probably working in the system's `OnUpdate`. It runs on the main thread for just a while. The part where I said "completes the job immediately" unfortunately blocks the main thread.
But you could make it return an incomplete `NativeArray` by using the `out JobHandle` overload. If you call `.Complete()` on that `JobHandle`, it would be equivalent to the normal overload and block the main thread. But the point is that you don't call complete; instead you use that `JobHandle` as a dependency for the next job, likely in the same system, directly below.
You can instantly use the (incomplete) `NativeArray` as an input for that job. Normally your `Entities.ForEach` or `Job.WithCode` will tie into the `Dependency` smart job handle system. Since you just did something that it couldn't apply to `Dependency`, you can do this below your `Async` call, and before performing `Entities.ForEach` or `Job.WithCode`:

```csharp
Dependency = JobHandle.CombineDependencies(Dependency, gatherJobHandle);
```

This ensures the best possible scenario:
- The `OnUpdate` code is blazing fast: it does not gather the `NativeArray` yet, but already gives you what will be the product of that gather, squeezing out the precious main thread milliseconds you need to get under 16.66 ms.
- That incomplete `NativeArray` can be given to the job right now. Thanks to dependency chaining with that `JobHandle`, you can ensure the `NativeArray` is completed by the time that job runs.
The `NativeArray` still needs to be deallocated, remember, which you can do with `.WithDisposeOnCompletion(_)` on the final job that uses it. At the moment the job is scheduled, `_` is still an incomplete `NativeArray`! But by the time the job finishes, it is already the complete one, ready to be deallocated.
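Putting the pieces together, a sketch of the whole async pattern might look like this (the `Health` component and the summing job are invented for illustration; the single-element `total` array is just a stand-in for whatever output you actually need):

```csharp
using Unity.Collections;
using Unity.Entities;
using Unity.Jobs;

// Hypothetical component used for illustration.
public struct Health : IComponentData { public float Value; }

public class AsyncGatherSystem : SystemBase
{
    protected override void OnUpdate()
    {
        EntityQuery query = GetEntityQuery(ComponentType.ReadOnly<Health>());

        // Returns immediately; the array is incomplete until the
        // internal gather job (gatherJobHandle) has run.
        NativeArray<Health> healths =
            query.ToComponentDataArrayAsync<Health>(Allocator.TempJob, out JobHandle gatherJobHandle);

        // Fold the gather into the automatic Dependency chain, right
        // below the Async call.
        Dependency = JobHandle.CombineDependencies(Dependency, gatherJobHandle);

        NativeArray<float> total = new NativeArray<float>(1, Allocator.TempJob);

        Job.WithCode(() =>
            {
                // By the time this runs, the gather has completed.
                float sum = 0;
                for (int i = 0; i < healths.Length; i++)
                {
                    sum += healths[i].Value;
                }
                total[0] = sum;
            })
            .WithDisposeOnCompletion(healths) // still incomplete when scheduled!
            .Schedule(); // picks up Dependency, which now includes the gather

        // total would be consumed elsewhere; dispose it after the job too.
        Dependency = total.Dispose(Dependency);
    }
}
```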
Power moves for each method
CreateArchetypeChunkArray
This method looks like the most lightweight operation of them all, but Unity still schedules a job to gather the chunks for you. (How considerate!)
Bringing `NativeArray<ArchetypeChunk>` into the job is of course for when you want the most tailor-made job possible:
- You want that raw, hardcore memory area to iterate/read/write (depending on whether your `ComponentTypeHandle` allows you to write or not).
- And maybe you want to ask the chunk something along the way (like `Has` or `DidChange`).
- Or maybe you are not satisfied with a one-archetype-based `EntityQuery` and you used `EntityQueryDesc` to get an assortment of chunks based on the unique `Any` criteria. (`All` and `None` are doable without it.)
You cannot get any more custom than this in a job. But don't just stop there if you have come this far: even the chunk gathering before the job can be another job!
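For instance, a sketch of a hand-rolled chunk job fed by the async gather (the `Health` component and the clamping logic are invented for illustration; the API names follow the 0.x-era signatures listed above):

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Entities;
using Unity.Jobs;

// Hypothetical component used for illustration.
public struct Health : IComponentData { public float Value; }

[BurstCompile]
struct ClampHealthJob : IJobParallelFor
{
    // Disposed automatically when this job completes.
    [DeallocateOnJobCompletion] public NativeArray<ArchetypeChunk> Chunks;
    public ComponentTypeHandle<Health> HealthHandle;
    public uint LastSystemVersion;

    public void Execute(int chunkIndex)
    {
        ArchetypeChunk chunk = Chunks[chunkIndex];

        // Asking the chunk something along the way: skip chunks whose
        // Health data did not change since this system last ran.
        if (!chunk.DidChange(HealthHandle, LastSystemVersion))
            return;

        NativeArray<Health> healths = chunk.GetNativeArray(HealthHandle);
        for (int i = 0; i < healths.Length; i++)
        {
            if (healths[i].Value > 100f)
                healths[i] = new Health { Value = 100f };
        }
    }
}

public class ChunkPowerSystem : SystemBase
{
    EntityQuery _query;

    protected override void OnCreate()
    {
        _query = GetEntityQuery(ComponentType.ReadWrite<Health>());
    }

    protected override void OnUpdate()
    {
        // The chunk gathering itself becomes a job.
        NativeArray<ArchetypeChunk> chunks =
            _query.CreateArchetypeChunkArrayAsync(Allocator.TempJob, out JobHandle gatherJobHandle);

        // The array's length is known immediately; only its contents
        // are filled in by the gather job.
        Dependency = new ClampHealthJob
        {
            Chunks = chunks,
            HealthHandle = GetComponentTypeHandle<Health>(),
            LastSystemVersion = LastSystemVersion,
        }.Schedule(chunks.Length, 1, JobHandle.CombineDependencies(Dependency, gatherJobHandle));
    }
}
```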
ToComponentDataArray
"To" methods means you get a new NativeArray
that is not linked with ECS database anymore to do as you please. (But dispose it too) If you change something inside it, nothing get updated. So it means you get a copy of data. You can feel that this might going to be a bit expensive. But you could alleviate it with out JobHandle
overload. And maybe stick that to run before an another job that use it. Probably you may want to add .WithDisposeOnCompletion
.
(So the product of this method is not the same as the `NativeArray` from `archetypeChunk.GetNativeArray`. That one is the real deal, a portal into the ECS database: if you change it, things change, *if your `ComponentTypeHandle` allows.)
By the way, jobs are usually `Entities.ForEach` and `Job.WithCode`, which automatically use `Dependency` when run with an empty `.Schedule()`. Instead of chaining into the `.Schedule`, I just combine with `Dependency` and assign the result back to overwrite `Dependency`, so the job picks it up. It makes your code easier to read, since you do it right after the `Async` call instead of having to scroll down to the bottom of your job.
CopyFromComponentDataArray
This one is meant to combo with `ToComponentDataArray`. If you did change the content in that array and want to apply it back, you can delay the apply with the `out JobHandle` overload. (Make sure the length of that `NativeArray` still matches the total entities from the `EntityQuery` you are applying back to; there is a throw to warn you if you try to plug an incompatible alien array back in.)
The mini job inside this method automatically knows which dependencies to wait on, because it is a method of the entity query, the thing full of types, whether you use the complete-immediately overload or not. So with the `out JobHandle` overload you avoid immediately completing those dependencies that it automatically knows about.
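A sketch of the round trip, copying out, modifying, then batch-applying back without blocking on the apply (the `Health` component and the +1 regen logic are invented for illustration):

```csharp
using Unity.Collections;
using Unity.Entities;
using Unity.Jobs;

// Hypothetical component used for illustration.
public struct Health : IComponentData { public float Value; }

public class RegenSystem : SystemBase
{
    EntityQuery _query;

    protected override void OnCreate()
    {
        _query = GetEntityQuery(ComponentType.ReadWrite<Health>());
    }

    protected override void OnUpdate()
    {
        // Copy out synchronously and modify the detached copy.
        NativeArray<Health> healths = _query.ToComponentDataArray<Health>(Allocator.TempJob);
        for (int i = 0; i < healths.Length; i++)
        {
            healths[i] = new Health { Value = healths[i].Value + 1f };
        }

        // Batch-apply back without blocking: the internal copy job is
        // scheduled, not completed. Throws if the array length no longer
        // matches the query's entity count.
        _query.CopyFromComponentDataArrayAsync(healths, out JobHandle copyBack);
        Dependency = JobHandle.CombineDependencies(Dependency, copyBack);

        // The array is still in use by the copy-back job; dispose it
        // only after that job has run.
        Dependency = healths.Dispose(Dependency);
    }
}
```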
Also... I could imagine insane hacking potential in this method. Currently, the `Set` variants on `EntityManager` work on a per-entity basis. (How else would it know which data goes to which `Entity`?) Except for this one method, which seems to be able to batch-apply values. Read more here.