Sixteen days with nothing else on my calendar, ideas distilled through years, and the thrill of an empty project repository: #prototyping for the next game by BinaryCharm starts now! #vr #indieDev #gameDev
(https://twitter.com/darioscarpa/status/1687762932623380480)
As a first step, I did my development environment setup on both systems I regularly use (PC and notebook). When I start a new project, I usually install the latest non-beta version of the software I know I'm going to use, and then stay on those versions without updating anything unless it's needed for a specific reason.
For the prototype, I'm going to use Unity, because it's the engine I have the most experience with and I want to be quick, but I plan to consider alternatives for the full-fledged game.
So, I installed/updated to:
- Unity 2022.3.6f1
- Visual Studio 2022
- Meta Quest Developer Hub 3.7.0
- Oculus Integration SDK v55
I've been doing plenty of VR development for PCVR (since the Oculus DK2!), but haven't shipped anything for Quest, so I just followed the latest Meta "first app" tutorial to make sure I use the recommended Unity settings and features.
Just to add a little twist to it, I generated a procedural hexagonal platform (which will be a staple of Particular Reality, unless I change my mind) instead of using the classic Unity cube.
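For the curious: generating a hexagonal platform procedurally mostly comes down to placing six corners on a circle. A minimal sketch in Python of that idea (the function name and the Unity-style XZ ground plane are my assumptions for illustration, not the actual project code):

```python
import math

def hexagon_vertices(radius, y=0.0):
    """Six corners of a flat-topped hexagon centred at the origin,
    lying in the XZ ground plane (Unity convention: Y is up).
    Illustrative sketch, not the actual project code."""
    return [
        (radius * math.cos(math.radians(60 * i)),  # X
         y,                                        # height
         radius * math.sin(math.radians(60 * i)))  # Z
        for i in range(6)
    ]
```

In Unity terms, feeding these corners (plus a centre point) into a Mesh as a simple triangle fan is enough to get the platform on screen.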
I tried "Build And Run" to get it running natively on my Quest 2, and it worked fine.
To iterate quickly, though, it would be crazy to make and push Android builds every time. So, I made sure the project setup also allowed the application to run on the device when pressing "Play" in the Unity editor, via Wireless Link, as I usually do when developing for PCVR. To get this working, I had to flag the Oculus checkbox in the "standalone" tab of the Unity XR Plug-in Management (the guide I had followed only took care of the Android tab).
And that's a good moment for the first commit to the Git repository I had previously initialized, setting up proper gitignore/gitattributes files recycled from other Unity projects.
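For reference, a typical Unity .gitignore keeps the generated folders out of the repository; something along these lines (a generic template, not necessarily the exact file used here):

```
# Unity-generated (regenerated when the project is opened)
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
[Bb]uilds/
[Ll]ogs/
[Uu]serSettings/

# IDE clutter
.vs/
*.csproj
*.sln
```

The .gitattributes file typically marks binary assets (textures, models, audio) as such, so Git doesn't try to diff or merge them as text.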
As I often do, after committing, I cloned the project on the notebook and checked that it also worked properly there. It's not bulletproof, but catches lots of silly mistakes (like forgetting to commit a new script).
The first thing I want to get done is the locomotion system that I have in mind. This involves teleporting between platforms.
As a test setup, a grid of platforms will be fine, so I quickly added the code to generate a bunch of them, correctly spaced.
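The spacing for a hex grid follows from the standard axial-coordinate layout. A sketch in Python of the math (flat-topped hexagons of circumradius `radius`; the function name and indexing scheme are mine, not the project's actual generator):

```python
import math

def hex_grid_centres(radius, cols, rows):
    """Centre positions (x, z) for a grid of flat-topped hexagons
    of circumradius `radius`, indexed by axial coordinates (q, r).
    Illustrative sketch, not the project's actual generator."""
    centres = {}
    for q in range(cols):
        for r in range(rows):
            x = radius * 1.5 * q                       # columns step by 3R/2
            z = radius * math.sqrt(3) * (r + q / 2.0)  # rows step by sqrt(3)R
            centres[(q, r)] = (x, z)
    return centres
```

With this layout, every pair of adjacent platform centres ends up exactly sqrt(3) x radius apart, which is what "correctly spaced" boils down to.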
The basic locomotion idea of Particular Reality is that you move by teleporting between platforms, while staying in the same "physical" hexagon of your room space.
Different levels will feature all kinds of platform placements, but, to get started, a uniform planar grid will be fine.
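The teleport itself can then be as simple as translating the XR rig by the vector between platform centres, so the player's position inside the "physical" hexagon is preserved. A hypothetical sketch (2D tuples for platform centres, a 3D tuple for the rig; my naming, not actual project code):

```python
def teleport_rig(rig_pos, source_centre, target_centre):
    """Shift the rig by the offset between the source and target
    platform centres (XZ only; rig height is left untouched).
    Hypothetical sketch of the rig-translation approach."""
    dx = target_centre[0] - source_centre[0]
    dz = target_centre[1] - source_centre[1]
    return (rig_pos[0] + dx, rig_pos[1], rig_pos[2] + dz)
```

Because the whole play space moves rigidly, the player's room-scale offset from the hexagon centre survives the jump.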
To end the day, I read a bit of documentation about the Meta Interaction SDK, and watched a video about its features.
I've been doing a lot of hand-based interaction, starting in 2007 with my bachelor thesis (I worked with a VR glove), and after that using both the Leap Motion controller/SDK and the Microsoft MRTK (both on HoloLens and on PCVR) for industrial and heritage applications... but this is the first time I'm going to use the Meta Quest hand tracking as a developer and not only as a user, so I wanted to see what the "high level" SDK offered before getting started.
The Interaction SDK looks well made and useful, but I think I'm not going to need it for the kind of gameplay I have in mind.
It might come in useful for UI interaction, if I end up needing an options menu or things like that, but that's probably out of the scope of the prototype. What I care about now is checking that the basic gameplay loop I have imagined works well when implemented. Fingers crossed (which was hard for hand tracking not so long ago!).
Next step, tomorrow: handling basic input and teleporting between adjacent platforms.