Hydra: Motion Control Right Now
Though many look at emerging VR technology and see head-mounted displays as the obvious starting point (and understand what that will mean for VR as a technology), it's only now, as HMD technology matures, that it's becoming equally apparent that motion control is imperative for creating an immersive metaverse. That's why you see technologies like Lighthouse coming from Valve, and why there have been rumors that Oculus is working on a controller of its own.
I learned a long time ago not to throw my support behind any one technology at the expense of others, if for no other reason than that technology evolves so quickly you can end up entrenched in an outdated (or merely suboptimal) ecosystem. So, even though Valve and others will doubtless provide excellent control options, I've been exploring alternatives in the meantime. The peripheral I'm most anticipating is Sixense's STEM system, a wireless multi-tracker rig that lets users customize how they provide input (with tracker packs and controllers), but while I wait I turned to Sixense's first project: the Hydra.
Most people are probably aware that, although Razer produced the Hydra, all the tech under the hood came from Sixense. You can even use the Motion Creation Editor software Sixense put out for it in place of the older Razer profiles. This means that getting comfortable with Hydra control now sets you up to move to STEM input later, since the control profiles will be quite similar. If you've had as much interest in motion control as I have, most of this is probably old hat, so let's get into the details of the system.
The Hydra tracks two controllers in space by positioning them relative to its base station. It can tell how each controller is oriented, when it tips or rolls, and whether you move it forward or back; each controller also carries seven buttons and an analog stick. They do a surprisingly good job of tracking, too. The best example of this is Half-Life VR, which supports them: it lets you hold your guns freely and position them in space at whatever angle you wish. Want to shoot around corners? No problem. Throw grenades variable distances using Half-Life's famous physics? You absolutely can. But where it really shines is how it manipulates the game UI. With the full Razer/Oculus setup you can actually check your health and armor by looking at the back of your right hand. And this is where the game, so to speak, changes.
Since the invention of the modern computer we have been, essentially, bound to our keyboard (and, later, the mouse). One of the primary goals of the metaverse is to transport consciousness/perception directly into content, and having to WASD around a room will always limit your sense of presence. With motion controllers, however, you are freed from the keyboard and the conventions of interaction and movement that come with it. Days are coming when we will need to redefine the principles of user experience for the internet, games, and even movies. The passive viewer isn't going to last, and controllers like the Razer Hydra are trumpeting the downfall.
Motion control is a hard thing to describe if you can't try it. Since we, as human organisms, are accustomed to interacting with our world through reach and touch, that part has a very gentle learning curve, and with gaming's growing accessibility more and more people are becoming comfortable with controller-style input. That leaves practice as the only outstanding piece required to make fluent motion-control users, and that will come with time.
If you haven't looked at the STEM, I think it's one to watch, and if you have some disposable income to flex I recommend the Hydra. A lot of experiences support it, and it can be used like a gamepad in the ones that don't. VR motion control is an inevitability, so consider getting comfortable with it as soon as possible.