HoloLens Gaze, Gesture, and Source Input Control

Time:09-24

When working with HoloLens gesture recognition and input sources, there are three topics we will inevitably need, or want, to use: [Gaze: the cursor], [Gesture: gesture input], and [InteractionManager: the input source controller].
Below is a summary of these three topics.
Gaze: the cursor
On PC, mobile, Xbox and other platforms we are used to a whole series of familiar input devices: the cursor, finger touch, remote controllers and so on. On the HoloLens we suddenly leave those familiar devices behind and switch to gestures, made in mid-air in front of the device, which the HoloLens captures. If the input already feels novel, the output is even more remarkable: the HoloLens blends its display into your real environment, presenting what amounts to a large screen in your field of view, with the application's images laid over reality. It suddenly reminded me of the science-fiction scenes we saw more than a decade ago, now plainly visible in our real world.
Gaze plays the role the cursor plays on the PC: we aim at a target by rotating the device (turning our head), the device reports back which target we are currently looking at, and that result is then combined with the gesture recognition or input source control described below to build a whole chain of interaction logic. It is just like using a mouse with its left and right buttons: first we move the cursor to the area we want to control, then we left-click to trigger an operation, right-click for a context action, and so on.
In short, Gaze's main job is to determine the current target and to tell us when that target is lost.
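To make the cursor analogy concrete, here is a minimal gaze sketch (my own illustration, not code from the classes mentioned at the end of this post): it casts a ray from the head along the view direction with Physics.Raycast and parks a cursor object on whatever it hits. The field names (cursor, maxDistance) are just placeholder choices.

using UnityEngine;

// Minimal gaze sketch: cast a ray from the head position along the view
// direction, move a cursor object to the hit point and remember the target.
public class SimpleGaze : MonoBehaviour
{
    public Transform cursor;          // any small quad/sphere used as the visual cursor
    public float maxDistance = 15f;   // how far the gaze ray reaches

    public GameObject CurrentTarget { get; private set; }

    void Update()
    {
        Transform head = Camera.main.transform;
        RaycastHit hit;

        if (Physics.Raycast(head.position, head.forward, out hit, maxDistance))
        {
            CurrentTarget = hit.collider.gameObject;
            cursor.position = hit.point;
            cursor.rotation = Quaternion.LookRotation(hit.normal);
        }
        else
        {
            // Nothing under the gaze: park the cursor in front of the user and report a lost target.
            CurrentTarget = null;
            cursor.position = head.position + head.forward * maxDistance;
            cursor.rotation = Quaternion.LookRotation(-head.forward);
        }
    }
}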

Gesture: Gesture input
We compared Gaze to the cursor and mentioned the left and right mouse buttons above. If Gaze can be understood as the cursor, then Gesture is our left and right mouse buttons: by registering for the events below we can listen for some of the gesture input.
public event GestureErrorDelegate GestureErrorEvent;                        // [abnormal input / error]
public event HoldCanceledEventDelegate HoldCanceledEvent;                   // [hold input]
public event HoldCompletedEventDelegate HoldCompletedEvent;                 // [hold input]
public event HoldStartedEventDelegate HoldStartedEvent;                     // [hold input]
public event ManipulationCanceledEventDelegate ManipulationCanceledEvent;   // [movement related]
public event ManipulationCompletedEventDelegate ManipulationCompletedEvent;
public event ManipulationStartedEventDelegate ManipulationStartedEvent;
public event ManipulationUpdatedEventDelegate ManipulationUpdatedEvent;
public event NavigationCanceledEventDelegate NavigationCanceledEvent;       // [movement related]
public event NavigationCompletedEventDelegate NavigationCompletedEvent;
public event NavigationStartedEventDelegate NavigationStartedEvent;
public event NavigationUpdatedEventDelegate NavigationUpdatedEvent;
public event RecognitionEndedEventDelegate RecognitionEndedEvent;           // [unclear]
public event RecognitionStartedEventDelegate RecognitionStartedEvent;       // [unclear]
public event TappedEventDelegate TappedEvent;                               // [tap / click]
These events live on the GestureRecognizer class in the UnityEngine.VR.WSA.Input namespace.
GestureRecognizer is used to listen for gesture input, but it does not cover everything we want from it. There is no way to detect the raw finger press and finger release themselves; the closest thing is Hold, but Hold is not a real press/release listener. It is a delayed hold: keep the gesture unchanged for a while and it produces its callbacks. So Gesture gives us part of the signal input, but not all of it. To get the instantaneous press and release detection we are after, we also need to borrow another piece, [InteractionManager], the input source controller.
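Before moving on, here is a minimal GestureRecognizer sketch (my own illustration, written against the old UnityEngine.VR.WSA.Input API named above): it registers the Tap and Hold events and simply logs them.

using UnityEngine;
using UnityEngine.VR.WSA.Input;   // UnityEngine.XR.WSA.Input in newer Unity versions

// Minimal sketch: listen for tap and hold gestures and log them.
public class GestureListener : MonoBehaviour
{
    GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap | GestureSettings.Hold);

        recognizer.TappedEvent += (source, tapCount, headRay) =>
            Debug.Log("Tapped " + tapCount + " time(s)");

        recognizer.HoldStartedEvent += (source, headRay) =>
            Debug.Log("Hold started");

        recognizer.HoldCompletedEvent += (source, headRay) =>
            Debug.Log("Hold completed");

        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}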
InteractionManager: input source controller
This controller provides callbacks for an input source being detected, pressed, released, updated and lost, and it does not only watch hand gestures; it also reports other input sources such as voice. Let's look at what it can monitor:
public static event SourceEventHandler SourceDetected;   // [input source detected / activated]
public static event SourceEventHandler SourceLost;       // [input source lost]
public static event SourceEventHandler SourcePressed;    // [input source pressed]
public static event SourceEventHandler SourceReleased;   // [input source released]
public static event SourceEventHandler SourceUpdated;    // [input source moved or changed state]
These events live on the InteractionManager class in the UnityEngine.VR.WSA.Input namespace.
By registering for these events and inspecting the input source state, we can detect gesture press, gesture release and a whole series of other state changes ourselves.
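Here is a minimal InteractionManager sketch (my own illustration, not the control classes mentioned below): it subscribes to the five events above and logs the instantaneous press/release that GestureRecognizer does not expose.

using UnityEngine;
using UnityEngine.VR.WSA.Input;   // UnityEngine.XR.WSA.Input in newer Unity versions

// Minimal sketch: subscribe to the low-level input source events.
public class SourceListener : MonoBehaviour
{
    void Awake()
    {
        InteractionManager.SourceDetected += OnSourceDetected;
        InteractionManager.SourceLost     += OnSourceLost;
        InteractionManager.SourcePressed  += OnSourcePressed;
        InteractionManager.SourceReleased += OnSourceReleased;
        InteractionManager.SourceUpdated  += OnSourceUpdated;
    }

    void OnDestroy()
    {
        InteractionManager.SourceDetected -= OnSourceDetected;
        InteractionManager.SourceLost     -= OnSourceLost;
        InteractionManager.SourcePressed  -= OnSourcePressed;
        InteractionManager.SourceReleased -= OnSourceReleased;
        InteractionManager.SourceUpdated  -= OnSourceUpdated;
    }

    void OnSourceDetected(InteractionSourceState state) { Debug.Log("Source detected: " + state.source.kind); }
    void OnSourceLost(InteractionSourceState state)     { Debug.Log("Source lost"); }
    void OnSourcePressed(InteractionSourceState state)  { Debug.Log("Finger pressed"); }
    void OnSourceReleased(InteractionSourceState state) { Debug.Log("Finger released"); }

    void OnSourceUpdated(InteractionSourceState state)
    {
        // Hand position is only available while the hand is being tracked.
        Vector3 pos;
        if (state.properties.location.TryGetPosition(out pos))
            Debug.Log("Hand position: " + pos);
    }
}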
Below are a few control classes I wrote; take a look if you are interested. Of course, this is only a preliminary design and there are bound to be mistakes or unreasonable parts, so please point them out.

CodePudding user response:

It's 65,999 on JD.com; has anyone actually played with one?

CodePudding user response:

All I can say is that ordinary players should wait for the price to come down, while deep-pocketed companies are already getting in ahead of time.