User Interface


  • To what extent can we modify the user interface for custom input controls?

    I have a "Neural Interface Controller" which can read thought commands to control a computer, and I would like to see what can be done there. I'm also expecting a Dev Kit for the FOVE eye-tracking VR headset in a few months, and one thing I'd like to be able to do is separate the pointer object from the view direction: the mouse pointer would be attached to the eye tracker and the camera object attached to the head motion. From there I would use a simple thought command (assuming I can get the Emotiv to fit under the VR headset) to handle "click" events. For movement it could be haptic feedback, a joystick, or even an omnidirectional treadmill.

    With this setup there would be no need to press a button to switch between modes, but that depends on how much I can change. If there is an "object: visibility" control for menu objects, it might even be possible to keep menus hidden until the user looks directly at where they should be, for less peripheral clutter and a more immersive experience.

    Being able to separate those commands would also be useful when using a gamepad, since one joystick could handle movement and the other could handle the mouse (see the rough sketch at the end of this post).

    I probably won't have much time until the end of this semester to play around with mods (I have like 3 programming assignments a week), but I wanted to see what could be done.
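    To make the separation concrete, here's a rough Python sketch of the routing I'm picturing. Everything in it is hypothetical: InputMapper, GameStub, and all the method names are placeholders I made up for illustration, not the actual mod API.

    ```python
    # Rough sketch of the input routing described above. Nothing here is
    # the real mod API -- InputMapper, GameStub, and every method name
    # are placeholders for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Vec2:
        x: float
        y: float

    class GameStub:
        """Stand-in for whatever hooks the mod API actually exposes."""
        def set_pointer(self, gaze): print(f"pointer -> ({gaze.x}, {gaze.y})")
        def set_camera(self, yaw, pitch): print(f"camera -> ({yaw}, {pitch})")
        def click_at_pointer(self): print("click at pointer")
        def move_player(self, stick): print(f"move -> ({stick.x}, {stick.y})")

    class InputMapper:
        """Routes each device to its own channel instead of tying the
        pointer and the camera to the same source."""
        def __init__(self, game):
            self.game = game

        def on_eye_tracker(self, gaze: Vec2):
            # FOVE gaze drives only the mouse pointer
            self.game.set_pointer(gaze)

        def on_head_motion(self, yaw: float, pitch: float):
            # head tracking drives only the camera
            self.game.set_camera(yaw, pitch)

        def on_thought_command(self, command: str):
            # a trained Emotiv command (e.g. "push") fires a click
            # at the current pointer position
            if command == "push":
                self.game.click_at_pointer()

        def on_gamepad(self, left_stick: Vec2, right_stick: Vec2):
            # same separation on a gamepad: left stick = movement,
            # right stick = pointer
            self.game.move_player(left_stick)
            self.game.set_pointer(right_stick)

    mapper = InputMapper(GameStub())
    mapper.on_eye_tracker(Vec2(0.4, 0.6))  # look at something
    mapper.on_thought_command("push")      # "click" it
    ```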


