To understand how gesture detection works, first look at how a touch is detected. The MSP430FR2633 measures the change in capacitance on the CAPTIVATE-BSWP panel caused by a finger touch on a button, slider, or wheel sensor. The CapTIvate library firmware includes algorithms that determine whether a sensor is touched and, if the sensor is a slider or wheel, the position of the finger on the sensor. The MSP430FR2633 has a dedicated 16-bit CapTIvate timer that is set by default to generate a periodic capacitive touch measurement interrupt every 20 milliseconds, or 50 times a second; this rate is user configurable. During each interrupt, the three sensors are measured and then gesture processing runs, using the periodic sensor sampling rate as the time base for finger touch and release events. Because the CapTIvate technology has a dedicated timer, none of the general-purpose 16-bit timers on the MCU are used, leaving them available for the main application.
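As an illustration of this flow, the following minimal sketch shows how the periodic measurement loop is typically structured with the CapTIvate software library. The CAPT_appStart() and CAPT_appHandler() calls follow the library's standard application framework; the include name and the omitted low-power details are assumptions, not taken from this design's source code.

```c
#include "captivate.h"   /* CapTIvate touch library (assumed include name) */

void main(void)
{
    /* Device, clock, and GPIO initialization omitted for brevity. */

    /* Start the CapTIvate touch layer. The dedicated CapTIvate timer
     * requests a measurement every 20 ms (50 Hz) by default. */
    CAPT_appStart();

    while (1)
    {
        /* Each time the CapTIvate timer fires, CAPT_appHandler() measures
         * the enabled sensors and then runs the registered sensor
         * callbacks, which is where the gesture processing described in
         * this section executes. */
        CAPT_appHandler();

        /* Enter low power until the next 20-ms measurement interval
         * (low-power mode entry omitted in this sketch). */
    }
}
```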
NOTE
Important Concept: Gesture timing is based on the sensor sampling rate.
Determining a specific gesture requires one or two attributes. The first attribute is time, measured by counting the number of sensor measurement samples between two events, such as a finger touch followed by a release. For example, when sampling a sensor every 20 milliseconds, a touch that lasts 10 sample periods represents a touch of 200 milliseconds. The time attribute applies to buttons, sliders, and wheels. The second attribute is distance, which is how far the finger has moved and applies only to slider and wheel sensors.
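As a simple worked example of the time attribute, the conversion between sample counts and milliseconds can be written as shown below. The 20-millisecond period matches the default scan rate mentioned above; the helper names are illustrative only.

```c
#include <stdint.h>

/* Sensor sampling period in milliseconds (default scan rate used here). */
#define SAMPLE_PERIOD_MS   20u

/* Convert a number of measurement samples into elapsed time.
 * Example: 10 samples * 20 ms = 200 ms of continuous touch. */
static inline uint16_t samplesToMs(uint16_t samples)
{
    return samples * SAMPLE_PERIOD_MS;
}

/* Convert a time requirement into an equivalent number of samples,
 * which is how gesture timing rules are counted internally. */
static inline uint16_t msToSamples(uint16_t ms)
{
    return ms / SAMPLE_PERIOD_MS;
}
```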
By assigning parameters to these time and distance attributes, rules can be created that define each gesture. Why is this important? Gesture duration and speed vary from user to user, so gesture parameters help improve gesture repeatability and detection accuracy. Each sensor type has its own gesture parameters, which are configurable in software so that a specific "user touch and feel" can be tailored to the application.
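One possible way to hold such parameters is a per-sensor-type structure like the hypothetical sketch below; the structure, field names, and example values are placeholders that simply mirror the time and distance attributes described above, not the parameter set used by this reference design.

```c
#include <stdint.h>

/* Hypothetical gesture parameter set, expressed in measurement samples
 * (time) and sensor position counts (distance). */
typedef struct
{
    uint16_t minTapSamples;      /* shortest touch accepted as a tap      */
    uint16_t maxTapSamples;      /* longest touch still reported as a tap */
    uint16_t swipeTimeSamples;   /* time window for a slider/wheel swipe  */
    uint16_t swipeMinDistance;   /* minimum travel (slider/wheel only)    */
} tGestureParams;

/* Example tuning for a slider sampled every 20 ms: taps between 80 ms
 * (4 samples) and 600 ms (30 samples), and swipes that travel at least
 * 10 position counts within 300 ms (15 samples). */
static const tGestureParams g_sliderGestureParams =
{
    .minTapSamples    = 4u,
    .maxTapSamples    = 30u,
    .swipeTimeSamples = 15u,
    .swipeMinDistance = 10u,
};
```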
Because each sensor can have different gesture behaviors, gesture processing is specific to each sensor type. For example, in this reference design there are wheel gestures, slider gestures, and button gestures assigned to the corresponding sensors. In software, the gesture detection for a sensor is essentially a state machine that executes on every measurement sample as part of the sensor's callback function and uses the sensor's timing and distance parameters to control the processing.
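As a sketch of how such a state machine is hooked into the measurement cycle, the CapTIvate library lets application code register a callback that runs after each scan of a sensor. CAPT_registerCallback() and the tSensor type are part of the library, while the sensor instance and handler names below are placeholders.

```c
#include "captivate.h"   /* CapTIvate touch library (assumed include name) */

/* Placeholder name for a sensor instance generated by the
 * CapTIvate Design Center. */
extern tSensor sliderSensor;

/* Runs once per measurement sample; advances this sensor's gesture
 * state machine using its timing and distance parameters. */
static void sliderGestureHandler(tSensor *pSensor)
{
    (void)pSensor;
    /* Gesture state machine (see the tap example after Figure 7). */
}

void registerGestureProcessing(void)
{
    /* Ask the library to call the handler after every scan of the sensor. */
    CAPT_registerCallback(&sliderSensor, &sliderGestureHandler);
}
```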
Figure 7 shows that a tap gesture is a momentary touch followed by a release and is common to all three sensor types. Notice that gesture detection begins only after the touch has been held continuously for at least the first four samples, and a tap gesture is reported only when the finger is released within the specified time window. If the finger is released outside of the time window, the gesture is ignored.
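A minimal sketch of that tap rule is shown below. Only the overall structure (count samples while touched, report a tap only if the release falls inside the time window) and the four-sample minimum come from the description above; the state names, the isTouched input, and the upper window limit are illustrative placeholders.

```c
#include <stdint.h>
#include <stdbool.h>

typedef enum { GESTURE_IDLE, GESTURE_TOUCHED } tGestureState;

static tGestureState g_tapState = GESTURE_IDLE;
static uint16_t      g_touchSamples = 0;

/* Tap window in 20-ms samples: at least 4 samples (80 ms) of continuous
 * touch, released before 30 samples (600 ms). The upper limit is a
 * placeholder value. */
#define TAP_MIN_SAMPLES   4u
#define TAP_MAX_SAMPLES   30u

/* Call once per measurement sample with the sensor's current touch state.
 * Returns true on the sample where a tap gesture is detected. */
bool processTapGesture(bool isTouched)
{
    bool tapDetected = false;

    switch (g_tapState)
    {
    case GESTURE_IDLE:
        if (isTouched)
        {
            g_touchSamples = 1;
            g_tapState = GESTURE_TOUCHED;
        }
        break;

    case GESTURE_TOUCHED:
        if (isTouched)
        {
            g_touchSamples++;               /* still held; keep counting */
        }
        else
        {
            /* Released: report a tap only if the touch duration fell
             * inside the window; otherwise ignore the gesture. */
            tapDetected = (g_touchSamples >= TAP_MIN_SAMPLES) &&
                          (g_touchSamples <= TAP_MAX_SAMPLES);
            g_tapState = GESTURE_IDLE;
        }
        break;
    }

    return tapDetected;
}
```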
NOTE
The specific time and distance parameter values in the following figures represent the parameters used for this reference design demonstration. These parameters are user configurable and can be tuned for any application.