Completion block of animation is performed immediately


I'm trying to remove a custom view from its superview in the animation's completion block, but the completion is called immediately, so the view disappears abruptly instead of animating. I managed to work around it in a not very good way: by adding a delay before removing the view.

Here is the function for animating the view:

private func animatedHideSoundView(toRight: Bool) {
    let translationX = toRight ? 0.0 : -screenWidth
    UIView.animate(withDuration: 0.5) {
        self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
    } completion: { isFinished in
        if isFinished {
            self.soundView.removeFromSuperview()
            self.songPlayer.pause()
        }
    }
}

The problem is with this line: self.soundView.removeFromSuperview()

When I call this function from the switch recognizer.state statement, the completion block executes early; when I call it from anywhere else, everything works correctly.

@objc private func soundViewPanned(recognizer: UIPanGestureRecognizer) {
        let touchPoint = recognizer.location(in: view)
        switch recognizer.state {
        case .began:
            initialOffset = CGPoint(x: touchPoint.x - soundView.center.x, y: touchPoint.y - soundView.center.y)
        case .changed:
            soundView.center = CGPoint(x: touchPoint.x - initialOffset.x, y: touchPoint.y - initialOffset.y)
            if notHiddenSoundViewRect.minX > soundView.frame.minX {
                animatedHideSoundView(toRight: false)
            } else if notHiddenSoundViewRect.maxX < soundView.frame.maxX {
                animatedHideSoundView(toRight: true)
            }
        case .ended, .cancelled:
            let decelerationRate = UIScrollView.DecelerationRate.normal.rawValue
            let velocity = recognizer.velocity(in: view)
            let projectedPosition = CGPoint(
                x: soundView.center.x + project(initialVelocity: velocity.x, decelerationRate: decelerationRate),
                y: soundView.center.y + project(initialVelocity: velocity.y, decelerationRate: decelerationRate)
            )
            let nearestCornerPosition = nearestCorner(to: projectedPosition)
            let relativeInitialVelocity = CGVector(
                dx: relativeVelocity(forVelocity: velocity.x, from: soundView.center.x, to: nearestCornerPosition.x),
                dy: relativeVelocity(forVelocity: velocity.y, from: soundView.center.y, to: nearestCornerPosition.y)
            )
            let timingParameters = UISpringTimingParameters(dampingRatio: 0.8, initialVelocity: relativeInitialVelocity)
            let animator = UIViewPropertyAnimator(duration: 0.5, timingParameters: timingParameters)
            animator.addAnimations {
                self.soundView.center = nearestCornerPosition
            }
            animator.startAnimation()
        default: break
        }
    }

I want the user to be able to swipe this soundView off the screen.

That's why I check where the soundView is while the user is moving it, so that if they move it near the edge of the screen, I can hide it with an animation.

Maybe I'm doing it wrong, but I couldn't think of anything else, because I don't have much experience. Could someone give me some advice on this?

I managed to solve it this way, but I don't like it:

private func animatedHideSoundView(toRight: Bool) {
    let translationX = toRight ? 0.0 : -screenWidth
    UIView.animate(withDuration: 0.5) {
        self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
    }
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        self.soundView.removeFromSuperview()
        self.songPlayer.pause()
    }
}


You can see and run all code here: https://github.com/swiloper/AnimationProblem

CodePudding user response:

Couple notes...

First, in your controller code, you are calling animatedHideSoundView() from your pan gesture recognizer every time you move the touch. It's unlikely that's what you want to do.

Second, if you call animatedHideSoundView(toRight: true), your code:

private func animatedHideSoundView(toRight: Bool) {
    let translationX = toRight ? 0.0 : -screenWidth
    UIView.animate(withDuration: 0.5) {
        self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
    } completion: { isFinished in
        if isFinished {
            self.soundView.removeFromSuperview()
            self.songPlayer.pause()
        }
    }
}

sets translationX to zero ... so when you then try to animate the transform, the animation takes no time because x isn't actually changing, and the completion block runs immediately.
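
To see that in isolation, here is a minimal playground-style sketch (standalone, nothing from your project): animating to a transform the view already has gives UIKit nothing to animate, so the completion closure runs right away instead of half a second later.

import UIKit

let box = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))

UIView.animate(withDuration: 0.5) {
    // same as the transform box already has -> no animatable change
    box.transform = CGAffineTransform(translationX: 0.0, y: 0.0)
} completion: { isFinished in
    // fires almost immediately, not after 0.5 seconds
    print("completion ran, isFinished =", isFinished)
}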

Third, I strongly suggest that you start simple. The code you linked to cannot be copy/pasted/run, which makes it difficult to offer help.

Here's a minimal version of your UniversalTypesViewController class (it uses your linked SoundView class):

final class UniversalTypesViewController: UIViewController {
    
    // MARK: Properties
    
    private lazy var soundView = SoundView(frame: CGRect(x: 0, y: 0, width: 80, height: 80))
    private let panGestureRecognizer = UIPanGestureRecognizer()
    private var initialOffset: CGPoint = .zero
    
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemYellow
        panGestureRecognizer.addTarget(self, action: #selector(soundViewPanned(recognizer:)))
        soundView.addGestureRecognizer(panGestureRecognizer)
    }
    
    private func animatedShowSoundView() {
        // reset soundView's transform
        soundView.transform = .identity
        // add it to the view
        view.addSubview(soundView)
        // position soundView near bottom, but past the right side of view
        soundView.frame.origin = CGPoint(x: view.frame.width, y: view.frame.height - soundView.frame.height * 2.0)
        soundView.startSoundBarsAnimation()

        // animate soundView into view
        UIView.animate(withDuration: 0.5, delay: 0.0, options: .curveEaseOut) {
            self.soundView.transform = CGAffineTransform(translationX: -self.soundView.frame.width * 2.0, y: 0.0)
        }
    }
    
    private func animatedHideSoundView(toRight: Bool) {
        let translationX = toRight ? view.frame.width : -(view.frame.width + soundView.frame.width)
        UIView.animate(withDuration: 0.5) {
            self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
        } completion: { isFinished in
            if isFinished {
                self.soundView.removeFromSuperview()
                //self.songPlayer.pause()
            }
        }
    }
    
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // if soundView is not in the view hierarchy,
        //  animate it into view - animatedShowSoundView() func adds it as a subview
        if soundView.superview == nil {
            animatedShowSoundView()
        } else {
            // unwrap the touch
            guard let touch = touches.first else { return }
            // get touch location
            let loc = touch.location(in: self.view)
            // if touch is inside the soundView frame,
            //  return, so pan gesture can move soundView
            if soundView.frame.contains(loc) { return }
            // if touch is on the left-half of the screen,
            //  animate soundView to the left and remove after animation
            if loc.x < view.frame.midX {
                animatedHideSoundView(toRight: false)
            } else {
                // touch is on the right half of the screen,
                //  so animate soundView out to the right and remove it after the animation
                animatedHideSoundView(toRight: true)
            }
        }
    }
    // MARK: Objc methods
    
    @objc private func soundViewPanned(recognizer: UIPanGestureRecognizer) {
        let touchPoint = recognizer.location(in: view)
        switch recognizer.state {
        case .began:
            initialOffset = CGPoint(x: touchPoint.x - soundView.center.x, y: touchPoint.y - soundView.center.y)
        case .changed:
            soundView.center = CGPoint(x: touchPoint.x - initialOffset.x, y: touchPoint.y - initialOffset.y)
        case .ended, .cancelled:
            ()
        default: break
        }
    }
    
}

If you run that, tapping anywhere will animate soundView into view at bottom-right. You can then drag soundView around.

If you tap away from the soundView frame, on the left half of the screen, soundView will be animated out to the left and removed after the animation completes.

If you tap away from the soundView frame, on the right half of the screen, soundView will be animated out to the right and removed after the animation completes.

Once you've got that working, and you see what's happening, you can implement it in the rest of your much-more-complex code.


Edit

Take a look at this modified version of your code.

One big problem in your code is that you're making multiple calls to animatedHideSoundView(). When the drag gets near the edge, your code calls that... but then it gets called again because the drag is still "active."

So, I added a var isHideAnimationRunning: Bool flag so that the positioning done while dragging and the positioning done by the "hide" animation don't conflict.
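
If you'd rather not track a Bool yourself, here's an alternative sketch (not what the code below does): keep a reference to the hide animator in the view controller and check its isRunning state. The hideAnimator property is an assumption of this sketch, and it reuses the screenWidth and soundViewSide values defined in the code below.

// Hypothetical alternative to the isHideAnimationRunning flag.
// `hideAnimator` is an assumed stored property on the view controller.
private var hideAnimator: UIViewPropertyAnimator?

private func animatedHideSoundView(toRight: Bool) {
    // bail out if a hide animation is already in flight
    guard hideAnimator?.isRunning != true else { return }

    let targetX: CGFloat = toRight ? screenWidth + soundViewSide : -soundViewSide

    let animator = UIViewPropertyAnimator(duration: 0.5, curve: .easeInOut) {
        self.soundView.center.x = targetX
    }
    animator.addCompletion { position in
        // .end means the animation reached its target (it wasn't cancelled)
        if position == .end {
            self.soundView.removeFromSuperview()
        }
        self.hideAnimator = nil
    }
    hideAnimator = animator
    animator.startAnimation()
}

Either approach works; the important part is that dragging and hiding never fight over soundView's position at the same time.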

A few other changes:

  • instead of mixing Transforms with .center positioning, get rid of the Transforms and just use .center
  • I created a struct with logically named corner points - makes it much easier to reference them
  • strongly recommended: add comments to your code!

So, give this a try:

import UIKit

let screenWidth: CGFloat = UIScreen.main.bounds.width
let screenHeight: CGFloat = UIScreen.main.bounds.height

let sideSpacing: CGFloat = 32.0
let mediumSpacing: CGFloat = 16.0

var isNewIphone: Bool {
    return screenHeight / screenWidth > 1.8
}

extension CGPoint {
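    /// Straight-line (Euclidean) distance between this point and `point`.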
    func distance(to point: CGPoint) -> CGFloat {
        return sqrt(pow(point.x - x, 2) + pow(point.y - y, 2))
    }
}

// so we can refer to corner positions by logical names
struct CornerPoints {
    var topLeft: CGPoint = .zero
    var bottomLeft: CGPoint = .zero
    var bottomRight: CGPoint = .zero
    var topRight: CGPoint = .zero
}

final class ViewController: UIViewController {

    private var cornerPoints = CornerPoints()
    
    private let soundViewSide: CGFloat = 80.0
    private lazy var halfSoundViewWidth = soundViewSide / 2
    
    private lazy var newIphoneSpacing = isNewIphone ? mediumSpacing : 0.0
    
    private lazy var soundView = SoundView(frame: CGRect(origin: .zero, size: CGSize(width: soundViewSide, height: soundViewSide)))
    
    private lazy var notHiddenSoundViewRect = CGRect(x: mediumSpacing, y: 0.0, width: screenWidth - mediumSpacing * 2, height: screenHeight)
    
    private var initialOffset: CGPoint = .zero
    
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .yellow

        // setup corner points
        let left = sideSpacing + halfSoundViewWidth
        let right = view.frame.maxX - (sideSpacing + halfSoundViewWidth)
        let top = sideSpacing + halfSoundViewWidth - newIphoneSpacing
        let bottom = view.frame.maxY - (sideSpacing + halfSoundViewWidth - newIphoneSpacing)
        
        cornerPoints.topLeft        = CGPoint(x: left, y: top)
        cornerPoints.bottomLeft     = CGPoint(x: left, y: bottom)
        cornerPoints.bottomRight    = CGPoint(x: right, y: bottom)
        cornerPoints.topRight       = CGPoint(x: right, y: top)

        let panGestureRecognizer = UIPanGestureRecognizer()
        panGestureRecognizer.addTarget(self, action: #selector(soundViewPanned(recognizer:)))
        soundView.addGestureRecognizer(panGestureRecognizer)
        
        // for development, let's add a double-tap recognizer to
        //  add the soundView again (if it's been removed)
        let dt = UITapGestureRecognizer(target: self, action: #selector(showAgain(_:)))
        dt.numberOfTapsRequired = 2
        view.addGestureRecognizer(dt)
        
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
            self.animatedShowSoundView()
        }
    }
    
    @objc func showAgain(_ f: UITapGestureRecognizer) {
        // if soundView has been removed
        if soundView.superview == nil {
            // add it
            animatedShowSoundView()
        }
    }
    
    private func animatedShowSoundView() {
        // start at bottom-right, off-screen to the right
        let pt: CGPoint = cornerPoints.bottomRight
        soundView.center = CGPoint(x: screenWidth + soundViewSide, y: pt.y)
        
        view.addSubview(soundView)
        soundView.startSoundBarsAnimation()
        
        // animate to bottom-right corner
        UIView.animate(withDuration: 0.5, delay: 0.0, options: .curveEaseOut) {
            self.soundView.center = pt
        }
    }
    
    // flag so we know if soundView is currently
    //  "hide" animating
    var isHideAnimationRunning: Bool = false
    
    private func animatedHideSoundView(toRight: Bool) {
        
        // only execute if soundView is not currently "hide" animating
        if !isHideAnimationRunning {
            // set flag to true
            isHideAnimationRunning = true
            
            // target center X
            let targetX: CGFloat = toRight ? screenWidth + soundViewSide : -soundViewSide
            
            UIView.animate(withDuration: 0.5) {
                self.soundView.center.x = targetX
            } completion: { isFinished in
                self.isHideAnimationRunning = false
                if isFinished {
                    self.soundView.removeFromSuperview()
                    //self.songPlayer.pause()
                }
            }
        }
        
    }
    
    @objc private func soundViewPanned(recognizer: UIPanGestureRecognizer) {
        let touchPoint = recognizer.location(in: view)
        switch recognizer.state {
        case .began:
            // only execute if soundView is not currently "hide" animating
            if !isHideAnimationRunning {
                initialOffset = CGPoint(x: touchPoint.x - soundView.center.x, y: touchPoint.y - soundView.center.y)
            }
        case .changed:
            // only execute if soundView is not currently "hide" animating
            if !isHideAnimationRunning {
                soundView.center = CGPoint(x: touchPoint.x - initialOffset.x, y: touchPoint.y - initialOffset.y)
                if notHiddenSoundViewRect.minX > soundView.frame.minX {
                    animatedHideSoundView(toRight: false)
                } else if notHiddenSoundViewRect.maxX < soundView.frame.maxX {
                    animatedHideSoundView(toRight: true)
                }
            }
        case .ended, .cancelled:
            // only execute if soundView is not currently "hide" animating
            if !isHideAnimationRunning {
                let decelerationRate = UIScrollView.DecelerationRate.normal.rawValue
                let velocity = recognizer.velocity(in: view)
                let projectedPosition = CGPoint(
                    x: soundView.center.x + project(initialVelocity: velocity.x, decelerationRate: decelerationRate),
                    y: soundView.center.y + project(initialVelocity: velocity.y, decelerationRate: decelerationRate)
                )
                let nearestCornerPosition = nearestCorner(to: projectedPosition)
                let relativeInitialVelocity = CGVector(
                    dx: relativeVelocity(forVelocity: velocity.x, from: soundView.center.x, to: nearestCornerPosition.x),
                    dy: relativeVelocity(forVelocity: velocity.y, from: soundView.center.y, to: nearestCornerPosition.y)
                )
                let timingParameters = UISpringTimingParameters(dampingRatio: 0.8, initialVelocity: relativeInitialVelocity)
                let animator = UIViewPropertyAnimator(duration: 0.5, timingParameters: timingParameters)
                animator.addAnimations {
                    self.soundView.center = nearestCornerPosition
                }
                animator.startAnimation()
            }
        default: break
        }
    }
    
    private func project(initialVelocity: CGFloat, decelerationRate: CGFloat) -> CGFloat {
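        // Mirrors UIScrollView's deceleration: e.g. with .normal (rawValue 0.998)
        // and an initial velocity of 1000 pt/s, this projects
        // (1000 / 1000) * 0.998 / (1 - 0.998) = 499 pt of additional travel.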
        return (initialVelocity / 1000) * decelerationRate / (1 - decelerationRate)
    }
    
    private func nearestCorner(to point: CGPoint) -> CGPoint {
        var minDistance = CGFloat.greatestFiniteMagnitude
        var nearestPosition = CGPoint.zero
        
        for position in [cornerPoints.topLeft, cornerPoints.bottomLeft, cornerPoints.bottomRight, cornerPoints.topRight] {
            let distance = point.distance(to: position)
            if distance < minDistance {
                nearestPosition = position
                minDistance = distance
            }
        }
        return nearestPosition
    }
    
    /// Calculates the relative velocity needed for the initial velocity of the animation.
    private func relativeVelocity(forVelocity velocity: CGFloat, from currentValue: CGFloat, to targetValue: CGFloat) -> CGFloat {
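        // UISpringTimingParameters treats initialVelocity as distance-per-second
        // relative to the total animation distance, so the raw gesture velocity
        // (points/second) is divided by the remaining distance to the target.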
        guard currentValue - targetValue != 0 else { return 0 }
        return velocity / (targetValue - currentValue)
    }
}