vrpn 07.33
Virtual Reality Peripheral Network
vrpn_Tracker_ViewPoint.h
//
// Name: vrpn_Tracker_ViewPoint.h
//
// Author: David Borland
//
// EventLab at the University of Barcelona
//
// Description: VRPN server class for the Arrington Research ViewPoint EyeTracker.
//
// The VRPN server connects to the eye tracker using the VPX_InterApp DLL.
// Any other control software used to connect to the eye tracker (e.g. the
// ViewPoint software that comes with the tracker) to perform calibration, etc.
// should link to the same copy of the DLL so that information can be shared.
//
// -------------------------------------------------------------------------------
//
// Tracker:
//
// The tracker has two sensors, as the ViewPoint optionally supports binocular
// tracking. In the case of monocular tracking, only sensor 0 (EYE_A) carries
// valid information. Whether smoothed or raw tracking data is retrieved is
// controlled by the smoothedData parameter.
//
// Position: The (x,y) gaze point in gaze space (smoothed or raw).
//
// Rotation: The (x,y) gaze angle as a quaternion (smoothed or raw).
//
// Velocity: The x- and y- components of the eye movement velocity in gaze space
// (always smoothed).
//
// -------------------------------------------------------------------------------
//
// Analog:
//
// A lot of additional data can be retrieved from the tracker. These values are
// always calculated from the smoothed gaze point. Currently, the following are
// sent as analog values, but more can be added as needed. Please see the
// ViewPoint documentation for what other data are available.
//
// Because each channel needs to be duplicated in the case of a binocular
// tracker, the first n/2 values are for EYE_A, and the second n/2 values are
// for EYE_B.
//
// EYE_A:
//
// Channel 0: The pupil aspect ratio, from 0.0 to 1.0. Can be used to detect
// blinks when it falls below a given threshold.
//
// Channel 1: The total velocity (magnitude of eye movement velocity). Can be
// used to detect saccades.
//
// Channel 2: The fixation seconds (length of time below the velocity criterion
// used to detect saccades). 0 if a saccade is occurring.
//
// EYE_B:
//
// Channels 3-5: See EYE_A.
//

#ifndef VRPN_TRACKER_VIEWPOINT
#define VRPN_TRACKER_VIEWPOINT

// Make sure ViewPoint EyeTracker is being used
#include "vrpn_Configure.h" // IWYU pragma: keep
#ifdef VRPN_USE_VIEWPOINT

#include "vrpn_Tracker.h"
#include "vrpn_Analog.h"

class vrpn_Tracker_ViewPoint : public vrpn_Tracker, public vrpn_Analog {
public:
    // Constructor
    //
    // name: VRPN tracker name
    //
    // c: VRPN connection to use
    //
    // smoothedData: Get smoothed data or raw data for tracker values.
    vrpn_Tracker_ViewPoint(const char* name, vrpn_Connection* c, bool smoothedData = true);
    ~vrpn_Tracker_ViewPoint();

    virtual void mainloop();

protected:
    virtual void get_report();

    virtual void get_tracker();
    virtual void get_analog();

    virtual void send_report();

    bool useSmoothedData;
};

#endif // VRPN_USE_VIEWPOINT
#endif // VRPN_TRACKER_VIEWPOINT