Recipe: Live Touch Feedback
Have you ever needed to record a demo for an iOS app? There's always compromise involved. Either you use an overhead camera and struggle with reflections and the user's hand blocking the screen, or you use a tool like Reflection (http://reflectionapp.com), but then you see only what's directly on the iOS device screen. Such recordings lack any indication of the user's touch and visual focus.
Recipe 1-13 offers a simple set of classes (collectively called TOUCHkit) that provide a live touch feedback layer for demonstration use. With it, you can see both the screen you're recording and the touches that create the interactions you're presenting. The kit lets you compile your app for either normal or demonstration deployment without changing your core application: a single toggle selects which kind of build you produce.
To demonstrate this, the code shown in Recipe 1-13 is bundled in the sample code repository with a standard Apple demo, showing how you can roll the kit into nearly any standard application.
Enabling Touch Feedback
You add touch feedback by switching on the TOUCHkit feature, without otherwise affecting your normal code. To enable TOUCHkit, you set a single flag, compile, and use that build for demonstration, complete with touch overlay. For App Store deployment, you disable the flag. The application reverts to its normal behavior, and there are no App Store–unsafe calls to worry about:
#define USES_TOUCHkit 1
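If you'd rather not edit source between builds, the same toggle can be driven from build settings instead. This variation is not part of the TOUCHkit sources; it assumes you add USES_TOUCHkit=1 to the Preprocessor Macros build setting of a dedicated demonstration configuration in Xcode and give the flag a safe default in code:
// Hypothetical variation: default the flag to off unless a build
// configuration defines it (e.g., via Xcode's Preprocessor Macros
// setting in a "Demo" configuration)
#ifndef USES_TOUCHkit
    #define USES_TOUCHkit 0
#endif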
This recipe assumes that you’re using a standard application with a single primary window. When compiled in, the kit replaces that window with a custom class that captures and duplicates all touches, allowing your application to show the user’s touch bubble feedback.
There is one key code-level change you must make, but it’s a very small one. In your application delegate class, you define a WINDOW_CLASS to use when building your iOS screen:
#if USES_TOUCHkit
    #import "TOUCHkitView.h"
    #import "TOUCHOverlayWindow.h"
    #define WINDOW_CLASS TOUCHOverlayWindow
#else
    #define WINDOW_CLASS UIWindow
#endif
Then, instead of declaring a UIWindow, you use whichever class has been set by the toggle:
WINDOW_CLASS *window;
window = [[WINDOW_CLASS alloc]
    initWithFrame:[[UIScreen mainScreen] bounds]];
From here, you can set the window’s rootViewController as normal.
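Pulling these pieces together, a minimal application delegate method might look like the following sketch. ViewController is a hypothetical placeholder for your application's root view controller; everything else is standard UIKit delegate boilerplate:
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // WINDOW_CLASS resolves to TOUCHOverlayWindow or UIWindow,
    // depending on the USES_TOUCHkit flag
    window = [[WINDOW_CLASS alloc]
        initWithFrame:[[UIScreen mainScreen] bounds]];

    // ViewController is a hypothetical stand-in for your
    // application's root view controller
    window.rootViewController = [[ViewController alloc] init];
    [window makeKeyAndVisible];
    return YES;
}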
Intercepting and Forwarding Touch Events
The key to this overlay lies in intercepting touch events, creating a floating presentation above your normal interface, and then forwarding those events on to your application. A TOUCHkit view lies on top of your interface. The custom window class grabs user touch events, presents them as circles in the TOUCHkit view, and then forwards them as if the user were interacting with a normal UIWindow. This recipe accomplishes that with event forwarding.
Event forwarding is achieved by calling a secondary event handler. The TOUCHOverlayWindow class overrides UIWindow’s sendEvent: method to force touch drawing and then invokes its superclass implementation to return control to the normal responder chain.
The following implementation is drawn from Apple's Event Handling Guide for iOS. It collects all the touches associated with the current event, allowing Multi-Touch as well as single-touch interactions; dispatches them to the TOUCHkit view layer; and then redirects them to the window via the normal UIWindow sendEvent: implementation:
@implementation TOUCHOverlayWindow
- (void)sendEvent:(UIEvent *)event
{
    // Collect touches
    NSSet *touches = [event allTouches];
    NSMutableSet *began = nil;
    NSMutableSet *moved = nil;
    NSMutableSet *ended = nil;
    NSMutableSet *cancelled = nil;

    // Sort the touches by phase for event dispatch
    for (UITouch *touch in touches)
    {
        switch ([touch phase])
        {
            case UITouchPhaseBegan:
                if (!began) began = [NSMutableSet set];
                [began addObject:touch];
                break;
            case UITouchPhaseMoved:
                if (!moved) moved = [NSMutableSet set];
                [moved addObject:touch];
                break;
            case UITouchPhaseEnded:
                if (!ended) ended = [NSMutableSet set];
                [ended addObject:touch];
                break;
            case UITouchPhaseCancelled:
                if (!cancelled) cancelled = [NSMutableSet set];
                [cancelled addObject:touch];
                break;
            default:
                break;
        }
    }

    // Create pseudo-event dispatch
    if (began)
        [[TOUCHkitView sharedInstance]
            touchesBegan:began withEvent:event];
    if (moved)
        [[TOUCHkitView sharedInstance]
            touchesMoved:moved withEvent:event];
    if (ended)
        [[TOUCHkitView sharedInstance]
            touchesEnded:ended withEvent:event];
    if (cancelled)
        [[TOUCHkitView sharedInstance]
            touchesCancelled:cancelled withEvent:event];

    // Call normal handler for default responder chain
    [super sendEvent:event];
}
@end
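For completeness, the window subclass needs no public API beyond its sendEvent: override. A minimal header, sketched under that assumption, is simply:
// Minimal TOUCHOverlayWindow.h -- a sketch, assuming the class
// adds nothing to the inherited UIWindow interface
#import <UIKit/UIKit.h>

@interface TOUCHOverlayWindow : UIWindow
@end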
Implementing the TOUCHkit Overlay View
The TOUCHkit overlay is a single clear UIView singleton. It’s created the first time the application requests its shared instance, and the call adds it to the application’s key window. The overlay’s user interaction flag is disabled, allowing touches to continue past the overlay and on through the responder chain, even after processing those touches through the standard began/moved/ended/cancelled event callbacks.
The touch processing events draw a circle at each touch point, creating a strong pointer to the touches until that drawing is complete. Recipe 1-13 details the callback and drawing methods that handle that functionality.
Recipe 1-13 Creating a Touch Feedback Overlay View
// File-scope shared instance backing the singleton accessor
static TOUCHkitView *sharedInstance = nil;

@implementation TOUCHkitView
{
    NSSet *touches;
    UIColor *touchColor;
}

+ (instancetype)sharedInstance
{
    // Create shared instance if it does not yet exist
    if (!sharedInstance)
        sharedInstance = [[self alloc] initWithFrame:CGRectZero];

    // Parent it to the key window
    if (!sharedInstance.superview)
    {
        UIWindow *keyWindow = [UIApplication sharedApplication].keyWindow;
        sharedInstance.frame = keyWindow.bounds;
        [keyWindow addSubview:sharedInstance];
    }

    return sharedInstance;
}

// You can override the default touchColor if you want
- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self)
    {
        self.backgroundColor = [UIColor clearColor];
        self.userInteractionEnabled = NO;
        self.multipleTouchEnabled = YES;
        touchColor = [[UIColor whiteColor] colorWithAlphaComponent:0.5f];
        touches = nil;
    }

    return self;
}

// Basic touches processing
- (void)touchesBegan:(NSSet *)theTouches withEvent:(UIEvent *)event
{
    touches = theTouches;
    [self setNeedsDisplay];
}

- (void)touchesMoved:(NSSet *)theTouches withEvent:(UIEvent *)event
{
    touches = theTouches;
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)theTouches withEvent:(UIEvent *)event
{
    touches = nil;
    [self setNeedsDisplay];
}

- (void)touchesCancelled:(NSSet *)theTouches withEvent:(UIEvent *)event
{
    // Treat cancellation like a normal end of touch
    touches = nil;
    [self setNeedsDisplay];
}

// Draw touches interactively
- (void)drawRect:(CGRect)rect
{
    // Clear
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextClearRect(context, self.bounds);

    // Fill see-through
    [[UIColor clearColor] set];
    CGContextFillRect(context, self.bounds);

    float size = 25.0f; // based on 44.0f standard touch point

    for (UITouch *touch in touches)
    {
        // Create a backing frame
        [[[UIColor darkGrayColor] colorWithAlphaComponent:0.5f] set];
        CGPoint aPoint = [touch locationInView:self];
        CGContextAddEllipseInRect(context,
            CGRectMake(aPoint.x - size, aPoint.y - size,
                2 * size, 2 * size));
        CGContextFillPath(context);

        // Draw the slightly smaller foreground touch,
        // centered within the backing frame
        float dsize = 1.0f;
        [touchColor set];
        aPoint = [touch locationInView:self];
        CGContextAddEllipseInRect(context,
            CGRectMake(aPoint.x - (size - dsize),
                aPoint.y - (size - dsize),
                2 * (size - dsize), 2 * (size - dsize)));
        CGContextFillPath(context);
    }

    // Reset touches after use
    touches = nil;
}
@end
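This implementation references a file-scope sharedInstance variable and a touchColor instance variable, as shown above. A matching header, sketched under the assumption that only the shared-instance accessor is exposed publicly, might read:
// Minimal TOUCHkitView.h -- a sketch assuming only the
// shared-instance accessor is public
#import <UIKit/UIKit.h>

@interface TOUCHkitView : UIView
+ (instancetype)sharedInstance;
@end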