At the end of the previous article, I anticipated I would be busy and would skip a week. Things happened, and I skipped five weeks instead. Annoying, but it is what it is.
Let's see if I manage to do a bit of extra work in these XMAS holidays, and make up for the lost time.
2024-12-23 - Hint particles setup
So, what was I doing before this forced break?
I started working on the smart particles, doing a bit of design and then introducing a subset of them called probe particles, which I linked to the portal opening/closing mechanic.
That last part wasn't completely done, but before going back to it I prefer to draft the other smart particle behaviours.
Let's copy the definitions of a subset of the smart particle states from the design done during week 29:
`captivate`: the particles should catch the player's eye. A basic way to do it is to move towards the centre of the player's field of view, at a certain distance, and idle around there. Input: the player state data.
`guide_move`: the particles should guide the player towards a certain spot. Input: the player state data and the destination spot (a location and a hexagonal slice identifier should suffice).
`guide_look`: the particles should invite the player to look towards a certain point. Input: the player state data and the destination point.
These states sound quite interconnected, because they all deal with catching the player's attention and guiding them.
So, I'm going to introduce another subset of smart particles, called hint particles, that will take care of these states, similarly to how the probe particles dealt with the portal interactions.
To guide the player towards a specific spot, or to have them look towards a certain point, I need to get their attention first. That's the idea behind the `captivate` state.
But thinking about it, I'm not sure I need an explicit state for that. Maybe I could implement it as part of the `guide_move` and `guide_look` states, with a nested state machine which makes sure that, before guiding the movement or the gaze of the player, the hint particles have their attention.
Anyway, I'm going to keep the state separate for now, which also facilitates testing and keeps things simple.
Let's say that, one way or the other, the hint particles have a "core", which is a sphere acting as a placeholder volume for the particles. We can assume that, if that sphere is in the player's field of view, they're seeing the particles too.
I'm going to put a sphere in the scene, and try to steer it in front of the player. I also want to add a check to see if the sphere is actually in the camera frustum at a certain instant.
Done! Here's the result, while testing in-editor:
Notice that when the sphere is totally visible, it's green; when it's partially visible, it's yellow; and when it's completely outside the frustum, it's red.
Here's how I implemented a simple check using the camera frustum planes and the sphere center and radius:
public enum EFrustumCheckRes {
    totallyInside,
    totallyOutside,
    partiallyInside
}

public static EFrustumCheckRes SphereFrustumCheck(
    Camera rCamera,
    Vector3 vSphereCenter,
    float fSphereRadius) {

    bool bIsTotallyInside = true;
    Plane[] planes = GeometryUtility.CalculateFrustumPlanes(rCamera);
    foreach (Plane plane in planes) {
        // Signed distance: frustum plane normals point inwards, so a
        // negative distance means the centre is outside this plane.
        float fDist = plane.GetDistanceToPoint(vSphereCenter);
        if (fDist < -fSphereRadius) {
            // The whole sphere is beyond this plane: completely outside.
            return EFrustumCheckRes.totallyOutside;
        }
        if (fDist < fSphereRadius) {
            // The sphere straddles this plane: at best partially inside.
            bIsTotallyInside = false;
        }
    }
    return bIsTotallyInside ?
        EFrustumCheckRes.totallyInside :
        EFrustumCheckRes.partiallyInside;
}
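As an aside, the colour coding in the capture above can be driven by a tiny debug component along these lines (a rough sketch with made-up names; I'm assuming the check above sits in a static utility class I'll call `GeomUtils` here):

using UnityEngine;

// Hypothetical debug-only component: tints the hint sphere based on the
// frustum check result (green/yellow/red, as described above).
public class HintSphereFrustumDebugBhv : MonoBehaviour {
    public Camera m_rCamera;
    public float m_fSphereRadius = 0.25f;
    private Renderer m_rRenderer;

    void Awake() {
        m_rRenderer = GetComponent<Renderer>();
    }

    void Update() {
        // GeomUtils is a placeholder for wherever SphereFrustumCheck lives.
        EFrustumCheckRes eRes = GeomUtils.SphereFrustumCheck(
            m_rCamera, transform.position, m_fSphereRadius);
        switch (eRes) {
            case EFrustumCheckRes.totallyInside:
                m_rRenderer.material.color = Color.green;
                break;
            case EFrustumCheckRes.partiallyInside:
                m_rRenderer.material.color = Color.yellow;
                break;
            default:
                m_rRenderer.material.color = Color.red;
                break;
        }
    }
}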
The floating behaviour steers the sphere towards a point two meters from the player, along their gaze vector. Notice that the sphere goes up and down too, following the "look" direction.
Basic but effective. The only problem is that, by looking down, I can send the sphere below the platforms, which doesn't make sense.
I'll fix it later by projecting the position onto the ground plane when it goes below a minimum height.
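For the record, the target computation behind that floating behaviour boils down to something like this (a minimal sketch reusing the accessors that appear in a later snippet, and assuming the head pose exposes a rotation; the min-height clamp is the fix I just mentioned, not something already in place):

// Sketch only: called with fDistance = 2 for the "two meters" behaviour.
public static Vector3 getCaptivatePos(float fDistance, float fMinHeight) {
    var headPose = BodyInputManagerBhv.i().getHeadPose();
    Vector3 vHeadPos = headPose.position;
    // Full gaze direction (pitch included), so the target follows the
    // player looking up and down.
    Vector3 vGazeDir = headPose.rotation * Vector3.forward;
    Vector3 vTargetPos = vHeadPos + vGazeDir * fDistance;
    // Planned fix: don't let the target sink below the platforms.
    if (vTargetPos.y < fMinHeight) {
        vTargetPos.y = fMinHeight;
    }
    return vTargetPos;
}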
2024-12-24 - Guiding towards a specific spot
Today, I'm going to take care of the `guide_move` state:
`guide_move`: the particles should guide the player towards a certain spot.
A long time ago, I defined `RTLocation`, a pair of identifiers (zone id and platform id).
Then, I have the `Hex.EDir` enumeration, which identifies each of the six slices of a hexagon:
public struct Hex : System.IEquatable<Hex> {
    // ...
    public enum EDir {
        NE,
        N,
        NW,
        SW,
        S,
        SE,
    };
    // ...
}
I think that, in terms of "guiding the player somewhere", I won't need to be more specific than "that slice of that platform". Additionally, I remember passing a `Hex.EDir` together with an `RTLocation` as parameters a few times already.
This tells me that it's a good idea to introduce a new simple type to indicate a specific platform slice. I could add an optional `Hex.EDir` to `RTLocation`, but that could get confusing.
So, let's define `RTSpot` (not much imagination with naming, today) pairing an `RTLocation` and a `Hex.EDir`.
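Something along these lines (the field names and the equality bits are my guess, mirroring the `Hex` struct above; the real definition probably differs a little):

// Hypothetical sketch of RTSpot: just a pairing of the existing types.
public struct RTSpot : System.IEquatable<RTSpot> {
    public RTLocation m_location;   // zone id + platform id
    public Hex.EDir m_eDir;         // which of the six slices of the platform

    public RTSpot(RTLocation location, Hex.EDir eDir) {
        m_location = location;
        m_eDir = eDir;
    }

    public bool Equals(RTSpot other) {
        return m_location.Equals(other.m_location) && m_eDir == other.m_eDir;
    }
}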
Quick and easy. For testing/debugging purposes, another thing I've often wanted is to have the platform slices labelled, so that I can quickly check that things are correctly positioned.
Let's take care of that too.
Platform labelling done! While I was at it, I not only labelled the edges but also the platforms themselves, using the string identifiers I set in the level definitions. This way, I will be able to easily recognize the "meaningful" platforms while testing.
Now it's time to add the logic to send the hint sphere towards a specific spot. Should be easy enough.
I added the zone and platform numeric identifiers to the platform label, so I can easily know what values I need to insert in the testing script if I want to send the sphere to a specific spot.
This will need a bit of tuning, but the essence is there!
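For reference, the essence of that logic is just resolving the `RTSpot` to a world position and hovering the hint sphere slightly above it, roughly like this (the helper resolving the spot is a made-up name, not the actual project code):

// Sketch only: getSliceCenter stands in for whatever resolves an RTSpot
// to the world-space centre of that platform slice.
public static Vector3 getGuideMovePos(RTSpot spot, float fHoverHeight) {
    Vector3 vSliceCenter = LevelManagerBhv.i().getSliceCenter(spot);
    // Keep the hint sphere just above platform height.
    return vSliceCenter + Vector3.up * fHoverHeight;
}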
2024-12-25 - Guiding the player gaze
Hey it's Christmas! Maybe I shouldn't work... but come on, just a little!
So, while guiding the player towards a specific spot only required sending the hint sphere there, just above platform height, I imagine guiding their gaze will be a little more complicated.
While in the `captivate` state the hint sphere goes to the middle of the player's field of view, in this case I should try to send it to the edge of that field of view, inviting the player to look that way.
You are probably familiar with this kind of visual hint if you have played any 3D space sim: you get some tiny arrows at the sides of the screen, pointing at enemy ships outside the field of view. If you rotate towards one of these arrows, you end up getting the enemy ship it was pointing at in front of you.
My case is slightly different, because I don't have to position UI elements at the edge of the screen, but the hint sphere in 3D space. Still, the core idea is the same.
I'm not sure if I'll ever need to have the player look up or down; that will depend on how crazy I go with the level design.
To limit the scope, I'm considering planar levels, so the hint sphere should basically push the player to rotate left or right (whichever is faster).
For example, let's assume the player head is vertical, and they're looking towards North, but the current hint is to look towards South-West: the hint sphere should go to the left edge of the player field of view until they're rotated to look towards SW (and the sphere would be at the center of the FOV).
I can imagine different ways to implement this - let's try the simplest one.
I tried implementing the straightforward approach I had in mind, but it doesn't work well. Bug or fundamentally wrong idea? I'll find out tomorrow...
2024-12-26 - Fixing the gaze hint
If you've been reading the DevLog for a while, you already know what I'm going to do now: add a bit of debug visualization to understand what's wrong.
I added a couple of extra spheres for debugging purposes: one to indicate the point we want the player to look at (the hint target), and another to indicate the position where we want the hint sphere to go, at the edge of the field of view.
I did an even simpler version of yesterday's code, which only considers the horizontal angle between the player forward direction and the target direction, because I felt that the error was in the calculation of the vertical rotation.
The resulting code worked pretty well, as can be seen in the following video:
The blue sphere is the target point, and the sepia sphere is the hint point, which the usual hint sphere continuously tries to reach. Notice that the sepia sphere sticks to one side of the frustum when the target sphere is outside the field of view, and lies on the invisible line between the player and the blue sphere otherwise.
So how does this work?
1. I get the forward vector of the player and the head position.
2. I calculate the target direction, which is the vector from the player's head to the target point, normalized.
3. I calculate the signed angle between these two direction vectors, which tells me the "yaw" needed to rotate the head towards the target.
4. I clamp this needed rotation angle to half the horizontal field of view.
5. I rotate the forward vector horizontally by the clamped rotation angle.
6. Finally, I calculate the point along the "rotated forward" at the desired distance from the head.
Here's the relevant code snippet:
public static Vector3 getGuideLookPos(Data rData) {
    // Current head position and (horizontal) forward direction of the player.
    Vector3 vHeadPos = BodyInputManagerBhv.i().getHeadPose().position;
    Vector3 vFwd = BodyInputManagerBhv.i().getPlayerFwd();
    Vector3 vUp = Vector3.up;

    // Direction from the head to the point we want the player to look at.
    Vector3 vTargetDir = (rData.m_lookAtPoint - vHeadPos).normalized;

    // Yaw needed to face the target, clamped to half the horizontal FOV
    // so the hint never leaves the visible frustum.
    float fFwdToTargetAngle =
        Vector3.SignedAngle(vFwd, vTargetDir, vUp);
    float fHalfHFOV = CamerasManagerBhv.i().getHorizFOV() * 0.5f;
    float fLookToTargetClampedAngle =
        Mathf.Clamp(fFwdToTargetAngle, -fHalfHFOV, fHalfHFOV);

    // Rotate the forward vector by the clamped yaw and place the hint
    // at the desired distance along it.
    Quaternion qHorizRot =
        Quaternion.AngleAxis(fLookToTargetClampedAngle, vUp);
    Vector3 vHintDir = qHorizRot * vFwd;
    Vector3 vHintPos = vHeadPos + vHintDir * Params.fLOOKAT_DIST;
    return vHintPos;
}
There are probably better solutions to this kind of problem; the one I implemented is essentially a 2D solution, but considering the initial assumption of planar levels, it should work pretty well.
Maybe I'll refine it or change it in the future!
2024-12-27 - Hint VFX placeholder
I'm tempted to draft the last remaining state of the smart particles, the `show_anim` state, which is supposed to teach the player a gesture by displaying an animation to perform.
But it's been quite some time since I implemented the core features I'm going to need (it's a good moment to go back and read the posts for weeks 16, 17, 18 and 19), and I feel like it's going to take more than one day to revisit all that stuff and make good use of it.
So, considering it's Friday, I'm going to postpone that to next week. What can I do instead?
It feels like a good moment to do a bit of clean-up and refactoring of the work done in the past days.
I'm going to add a placeholder VFX for the hint particles, and relegate the spheres I've been showing so far to the debug mode.
Ok, that went pretty well.
I put a placeholder VFX (basically the one I did for the fireball a long time ago, but set to yellow) in place of the hint sphere I've been using so far.
The visualization that was in place until now was relegated to the debug mode, so if I sense that something is off with the VFX placement, I can easily look behind the scenes.
While I was at it, I drafted a generic script for the "debug only" visualization of a subsystem, factoring out some logic that until now I had repeated multiple times in other debug behaviours.
I will progressively move other debug components to this new system. By isolating these elements, I will be able to easily strip them out of a build.
In the Unity scene, I have a separate subtree for the debug nodes. The smart particles debug node has a reference to the "non-debug" manager of that subsystem, which it uses as its data source. Only when the debug mode is active does it provide an alternate visualization (in this case, placing the three spheres we used yesterday).
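To give an idea of the shape of that generic script, here's a hedged sketch of the pattern (names and the debug-mode check are placeholders, not the actual project code):

using UnityEngine;

// Sketch only: a debug-only behaviour that references the "non-debug"
// manager it visualizes and does work only while debug mode is active.
public abstract class SubsystemDebugVisBhv<TManager> : MonoBehaviour
    where TManager : MonoBehaviour {

    [SerializeField] protected TManager m_rManager;  // data source: the real subsystem manager

    void Update() {
        if (!isDebugModeActive()) {
            hideVisualization();
            return;
        }
        updateVisualization();
    }

    protected abstract bool isDebugModeActive();
    protected abstract void updateVisualization();
    protected abstract void hideVisualization();
}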
Week 31: done!