CS184 Lecture 9 summary

Interaction: Sensors and Routes

VRML, like most interactive systems, handles interaction using events. Events can be user actions, like moving or clicking the mouse or typing, or they can be system-generated, like timer events.

The VRML event model is particularly clean and simple. Each node can send or receive events. The nodes are "wired up" into circuits by sending events from the output of one node to the input of another. The route command creates these connections.

In many window toolkits, events are low-level actions such as mouse-left-button-press, mouse-right-button-release, and so on. VRML's primitives are adapted for manipulating 3D geometry, and it is tedious to do this with such low-level events, so VRML provides high-level primitives called sensors for accepting user input.

eventIns, eventOuts and exposedFields

Each field of a node is either an eventIn, an eventOut, an exposedField, or just a field. The first three of these can send or receive events: eventIns receive events, eventOuts send events, and exposedFields can do both. So an exposedField acts like both an eventIn and an eventOut.
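For example, the translation exposedField of a Transform can appear on either side of a route, because it implicitly has both a set_translation eventIn and a translation_changed eventOut. A sketch (the node names A, B, and C are hypothetical):

```vrml
DEF A Transform { }
DEF B Transform { }
DEF C Transform { }

# B's translation exposedField receives an event from A...
ROUTE A.translation_changed TO B.set_translation
# ...and the same exposedField also sends one on to C.
ROUTE B.translation_changed TO C.set_translation
```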

Example

We used a PlaneSensor node last time to move a tower of cubes. The PlaneSensor syntax is as follows:

PlaneSensor {
  enabled TRUE # exposedField SFBool
  autoOffset TRUE # exposedField SFBool
  offset 0.0  0.0  0.0 # exposedField SFVec3f
  maxPosition -1.0  -1.0 # exposedField SFVec2f
  minPosition 0.0  0.0 # exposedField SFVec2f
  isActive # eventOut SFBool
  translation_changed # eventOut SFVec3f
  trackPoint_changed # eventOut SFVec3f
}

The PlaneSensor senses pointer events when the pointer is over a shape in the group containing the PlaneSensor. Transform nodes have the following fields:

Transform {
  children [ ] # exposedField MFNode
  translation 0.0  0.0  0.0 # exposedField SFVec3f
  rotation 0.0  0.0  1.0  0.0 # exposedField SFRotation
  scale 1.0  1.0  1.0 # exposedField SFVec3f
  scaleOrientation 0.0  0.0  1.0  0.0 # exposedField SFRotation
  bboxCenter 0.0  0.0  0.0 # field SFVec3f
  bboxSize -1.0 -1.0 -1.0 # field SFVec3f
  center 0.0  0.0  0.0 # exposedField SFVec3f
  addChildren # eventIn MFNode
  removeChildren # eventIn MFNode
}

An exposedField of a node, such as the translation field of the Transform node, implicitly defines an eventIn called set_translation and an eventOut called translation_changed for that node.

In this sample program, there are two shapes in the group with the PlaneSensor, and dragging on either of them causes the PlaneSensor to output translation_changed events. The route command specifies that those events should be sent to the translation exposedField of the Transform node.
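The wiring in such a program can be sketched like this (the node names Mover and Dragger are illustrative, not taken from the sample source):

```vrml
#VRML V2.0 utf8
DEF Mover Transform {
  children [
    DEF Dragger PlaneSensor { }   # senses drags over both shapes below
    Shape { geometry Box { size 1 1 1 } }
    Transform {
      translation 0 1.5 0
      children [ Shape { geometry Sphere { radius 0.5 } } ]
    }
  ]
}
# Send the sensor's output to the Transform's translation exposedField
ROUTE Dragger.translation_changed TO Mover.set_translation
```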

The types of the two events joined by a route must match (both are SFVec3f in the example). Other kinds of connections are possible: we could, for example, change the scale field of the Transform node instead, as in this example.
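Since scale is also an SFVec3f exposedField, the same eventOut can legally drive it. A sketch (again with hypothetical node names, not the actual example file; the offset keeps the initial scale at 1):

```vrml
DEF Stretcher Transform {
  children [
    DEF Dragger PlaneSensor { offset 1 1 1 }
    Shape { geometry Box { } }
  ]
}
# translation_changed is SFVec3f, and so is scale, so the types match
ROUTE Dragger.translation_changed TO Stretcher.set_scale
```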

A SphereSensor allows you to change the orientation of a shape with two degrees of freedom. Sensors support a direct-manipulation metaphor: the PlaneSensor feels like sliding a point on a plane, while the SphereSensor feels like rolling a point on a sphere. Here is the blocks example with a SphereSensor replacing the PlaneSensor.
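A minimal SphereSensor circuit looks much like the PlaneSensor one, except that the eventOut and the target field are SFRotation rather than SFVec3f (node names are illustrative):

```vrml
DEF Spinner Transform {
  children [
    DEF Roller SphereSensor { }
    Shape { geometry Cone { } }
  ]
}
# SphereSensor outputs an SFRotation, matching Transform's rotation field
ROUTE Roller.rotation_changed TO Spinner.set_rotation
```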

A sensor normally responds to all pointer events over shapes in its group. But shapes may be grouped into hierarchies, and it is sometimes desirable to move subparts, so sensors lower in the hierarchy override those higher up. Look again at the source code for the blocks example from last time. The ASensor would normally respond to events over any of the shapes, but because the subgroups have sensors attached, it does not.
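The override rule can be sketched as follows (a simplified stand-in for the blocks source, with hypothetical node names): the outer sensor handles drags on the Box, but drags on the Sphere go to the inner sensor because it sits lower in the hierarchy.

```vrml
DEF Whole Transform {
  children [
    DEF WholeSensor PlaneSensor { }    # handles drags on the Box below
    Shape { geometry Box { } }
    DEF Part Transform {
      translation 0 2 0
      children [
        DEF PartSensor PlaneSensor { } # overrides WholeSensor for the Sphere
        Shape { geometry Sphere { } }
      ]
    }
  ]
}
ROUTE WholeSensor.translation_changed TO Whole.set_translation
ROUTE PartSensor.translation_changed TO Part.set_translation
```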