Events in iOS are divided into three categories: touch events, accelerometer (motion) events, and remote-control events. Only objects that inherit from UIResponder — called "responder objects" — can receive and handle events. UIApplication, UIViewController, and UIView all inherit from UIResponder, which provides the following methods for handling events:
Touch events: touchesBegan, touchesMoved, touchesEnded, touchesCancelled
Accelerometer events: motionBegan, motionEnded, motionCancelled
Remote control events: remoteControlReceivedWithEvent
UIView's touch-handling methods look like this:
/**
 *  Called when a finger starts touching the view
 */
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

/**
 *  Called when a finger moves on the view
 */
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

/**
 *  Called when a finger leaves the view
 */
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

/**
 *  Called when the touch is interrupted by a system event
 */
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}
A complete touch action calls touchesBegan, possibly touchesMoved, and finally touchesEnded (or touchesCancelled if the touch is interrupted).
Before discussing these touch methods, you need to know about the UITouch object. When a finger touches the screen, the system creates a UITouch object associated with it — one finger corresponds to one UITouch object. This object stores information about the touch, such as its location, time, and phase. When the finger moves, the system updates the same UITouch object so that it always holds the finger's current position. When the finger leaves the screen, the system destroys the corresponding UITouch object.
@interface UITouch : NSObject

@property(nonatomic,readonly) NSTimeInterval timestamp;
@property(nonatomic,readonly) UITouchPhase phase;
// touch down within a certain point within a certain amount of time
@property(nonatomic,readonly) NSUInteger tapCount;

// majorRadius and majorRadiusTolerance are in points
// The majorRadius will be accurate +/- the majorRadiusTolerance
@property(nonatomic,readonly) CGFloat majorRadius NS_AVAILABLE_IOS(8_0);
@property(nonatomic,readonly) CGFloat majorRadiusTolerance NS_AVAILABLE_IOS(8_0);

@property(nullable,nonatomic,readonly,strong) UIWindow *window;
@property(nullable,nonatomic,readonly,strong) UIView *view;
@property(nullable,nonatomic,readonly,copy) NSArray<UIGestureRecognizer *> *gestureRecognizers NS_AVAILABLE_IOS(3_2);

// Get the current location in the given view
- (CGPoint)locationInView:(nullable UIView *)view;
// Get the location of the previous touch point in the given view
- (CGPoint)previousLocationInView:(nullable UIView *)view;

// Force of the touch, where 1.0 represents the force of an average touch
@property(nonatomic,readonly) CGFloat force NS_AVAILABLE_IOS(9_0);
// Maximum possible force with this input mechanism
@property(nonatomic,readonly) CGFloat maximumPossibleForce NS_AVAILABLE_IOS(9_0);

@end
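As a quick illustration of these properties, touchesBegan: can log the phase, tap count, and location of a touch. This is a minimal sketch for a UIView subclass, not code from the original article:

```objectivec
// Minimal sketch: logging UITouch information in a UIView subclass.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    // tapCount distinguishes single taps from double taps, etc.
    NSLog(@"phase: %ld, tapCount: %ld, location: %@",
          (long)touch.phase, (long)touch.tapCount, NSStringFromCGPoint(p));
}
```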
e.g. making a view follow the finger as it moves:
/**
 *  Called when a finger moves on the view
 */
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
    // Get the UITouch object
    UITouch *touch = [touches anyObject];
    // Get the current location of the touch
    CGPoint curP = [touch locationInView:self];
    // Get the location of the previous touch point
    CGPoint preP = [touch previousLocationInView:self];
    // Calculate the x offset
    CGFloat offsetX = curP.x - preP.x;
    // Calculate the y offset
    CGFloat offsetY = curP.y - preP.y;
    // Move the view by the offset
    self.transform = CGAffineTransformTranslate(self.transform, offsetX, offsetY);
}
The implementation relies on the location information stored in the UITouch object.
Generation and delivery of events:
When a touch occurs, the system adds the event to an event queue managed by UIApplication. UIApplication takes the first event from the queue and dispatches it to the application's main window. The main window then searches the view hierarchy for the most suitable view and calls its touches methods to handle the event. A touch event is passed from parent control to child control: if a parent control cannot receive touch events, its child controls cannot receive them either.
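A practical consequence of this delivery chain: a subview that sticks out beyond its parent's bounds normally cannot be tapped in the overflowing region, because hit-testing stops at the parent. Overriding hitTest:withEvent: in the parent can forward such touches. This is a sketch under assumed class and property names (ContainerView, overflowButton), not code from the original article:

```objectivec
// Sketch (assumed names): a container view whose hitTest: is overridden so a
// child button extending beyond the container's bounds can still be tapped.
@interface ContainerView : UIView
@property (nonatomic, weak) UIButton *overflowButton; // assumed outlet
@end

@implementation ContainerView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *result = [super hitTest:point withEvent:event];
    if (result == nil) {
        // Convert the point into the button's coordinate system and test it directly
        CGPoint buttonPoint = [self convertPoint:point toView:self.overflowButton];
        if ([self.overflowButton pointInside:buttonPoint withEvent:event]) {
            return self.overflowButton;
        }
    }
    return result;
}
@end
```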
How is the most suitable control found? The view:
1. Checks whether it can receive touch events at all.
2. Checks whether the touch point lies within its own bounds.
3. Iterates over its subviews from back to front (the most recently added first) and repeats the two checks on each.
4. If no subview qualifies, the view itself is the most suitable control and handles the event.
The hitTest:withEvent: method finds the most suitable view, and the pointInside:withEvent: method determines whether a given point lies inside the method's caller, i.e. the control.
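Because pointInside:withEvent: decides whether a point "belongs" to a view, overriding it changes the view's effective hit area. A sketch (not from the original article) of a UIView subclass that only responds to touches inside an inscribed circle:

```objectivec
// Sketch: restrict the hit area to a circle inscribed in the view's bounds.
// hitTest: will then return nil for touches in the corners.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    CGFloat radius = MIN(self.bounds.size.width, self.bounds.size.height) * 0.5;
    CGPoint center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
    CGFloat dx = point.x - center.x;
    CGFloat dy = point.y - center.y;
    return (dx * dx + dy * dy) <= radius * radius;
}
```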
The underlying implementation of the hitTest method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Check whether the current control can receive touch events
    if (self.userInteractionEnabled == NO || self.hidden == YES || self.alpha <= 0.01) {
        return nil;
    }
    // Check whether the touch point is on the current control
    if ([self pointInside:point withEvent:event] == NO) {
        return nil;
    }
    // Traverse the subviews from back to front
    NSInteger count = self.subviews.count;
    for (NSInteger i = count - 1; i >= 0; i--) {
        UIView *childView = self.subviews[i];
        // Convert the point from the current control's coordinate system to the subview's
        CGPoint childPoint = [self convertPoint:point toView:childView];
        // Recursively call hitTest to find the most suitable view
        UIView *fitView = [childView hitTest:childPoint withEvent:event];
        if (fitView) {
            return fitView;
        }
    }
    // The loop ended and no subview is more suitable, so return self
    return self;
}
However, monitoring touches with the touches methods has drawbacks — for example, it requires subclassing the view — so starting with iOS 3.2 Apple introduced gesture recognition via UIGestureRecognizer. UIGestureRecognizer is an abstract class; each concrete subclass handles a specific gesture.
There are several concrete gesture recognizers:
// Tap gesture
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Long-press gesture (triggered twice by default: began and ended)
UILongPressGestureRecognizer *longP = [[UILongPressGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Swipe gesture (the default direction is to the right)
UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Rotation gesture
UIRotationGestureRecognizer *rotation = [[UIRotationGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Pinch gesture
UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Pan (drag) gesture
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];
Practical application:
@interface ViewController () <UIGestureRecognizerDelegate>
@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self setUpPinch];
    [self setUpRotation];
    [self setUpPan];
}

#pragma mark - Gesture delegate methods
// Whether the gesture is allowed to begin
//- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
//{
//    return NO;
//}

// Whether multiple gestures are recognized at the same time; the default is NO
// Return YES to support multiple simultaneous gestures
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

// Whether to accept the finger's touch point
//- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
//    // Get the current touch point
//    CGPoint curP = [touch locationInView:self.view];
//    if (curP.x < self.view.bounds.size.width * 0.5) {
//        return NO;
//    } else {
//        return YES;
//    }
//}

#pragma mark - Tap gesture
- (void)setUpTap {
    // Create a tap gesture
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tap:)];
    tap.delegate = self;
    [_imageView addGestureRecognizer:tap];
}

- (void)tap:(UITapGestureRecognizer *)tap {
    NSLog(@"%s", __func__);
}

#pragma mark - Long-press gesture
// It is triggered twice by default (began and ended)
- (void)setUpLongPress {
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPress:)];
    [self.imageView addGestureRecognizer:longPress];
}

- (void)longPress:(UILongPressGestureRecognizer *)longPress {
    if (longPress.state == UIGestureRecognizerStateBegan) {
        NSLog(@"%s", __func__);
    }
}

#pragma mark - Swipe gesture
- (void)setUpSwipe {
    // The default swipe direction is to the right
    UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipe)];
    swipe.direction = UISwipeGestureRecognizerDirectionUp;
    [self.imageView addGestureRecognizer:swipe];

    // A swipe gesture only supports one direction, so to support swipes in
    // multiple directions you must create multiple swipe recognizers.
    UISwipeGestureRecognizer *swipeDown = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipe)];
    swipeDown.direction = UISwipeGestureRecognizerDirectionDown;
    [self.imageView addGestureRecognizer:swipeDown];
}

- (void)swipe {
    NSLog(@"%s", __func__);
}

#pragma mark - Rotation gesture
- (void)setUpRotation {
    UIRotationGestureRecognizer *rotation = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(rotation:)];
    rotation.delegate = self;
    [self.imageView addGestureRecognizer:rotation];
}

// The rotation angle passed in is relative to the initial position by default
- (void)rotation:(UIRotationGestureRecognizer *)rotation {
    // Get the rotation angle of the gesture
    NSLog(@"%f", rotation.rotation);
    self.imageView.transform = CGAffineTransformRotate(self.imageView.transform, rotation.rotation);
    // Reset so each callback delivers an incremental angle
    rotation.rotation = 0;
}

#pragma mark - Pinch gesture
- (void)setUpPinch {
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinch:)];
    pinch.delegate = self;
    [self.imageView addGestureRecognizer:pinch];
}

- (void)pinch:(UIPinchGestureRecognizer *)pinch {
    self.imageView.transform = CGAffineTransformScale(self.imageView.transform, pinch.scale, pinch.scale);
    // Reset so each callback delivers an incremental scale
    pinch.scale = 1;
}

#pragma mark - Pan (drag) gesture
- (void)setUpPan {
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    [self.imageView addGestureRecognizer:pan];
}

- (void)pan:(UIPanGestureRecognizer *)pan {
    // Get the touch point of the gesture
    // CGPoint curP = [pan locationInView:self.imageView];

    // Get the translation of the gesture, which is also relative to the initial position
    CGPoint transP = [pan translationInView:self.imageView];
    self.imageView.transform = CGAffineTransformTranslate(self.imageView.transform, transP.x, transP.y);
    // Reset so each callback delivers an incremental translation
    [pan setTranslation:CGPointZero inView:self.imageView];
    // NSLog(@"%@", NSStringFromCGPoint(curP));
}

@end
The above is an introduction to iOS touch events and gestures. I hope it is helpful to everyone learning iOS programming.