TL;DR: Auto Layout constraints appear to break on a push segue and return to the view for custom cells
Edit: I have provided a GitHub example project
I had the exact same problem. The table view had several different cell classes, each of which was a different height. Moreover, one of the cell classes had to show additional text, meaning further variation.
Scrolling was perfect in most situations. However, the same problem described in the question manifested: having selected a table cell and presented another view controller, on return to the original table view the upward scrolling was extremely jerky.
The first line of investigation was to consider why data was being reloaded at all. Having experimented, I can confirm that on return to the table view the data is reloaded, albeit not via reloadData.
See my comment on "ios 8 tableview reloads automatically when view appears after pop".
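If you want to confirm the implicit reload yourself, a couple of temporary log statements are enough. This is a minimal sketch, assuming the view controller is the table view's data source; the cell identifier, log messages and configuration are placeholders, not the real project code:

```swift
// Temporary logging only, to observe the behaviour on return from the pushed view controller.
override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    print("viewWillAppear – back on the table view")
}

func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
    print("cellForRowAtIndexPath for row \(indexPath.row)")  // fires again after the pop, with no reloadData call
    let cell = tableView.dequeueReusableCellWithIdentifier("Cell", forIndexPath: indexPath)  // "Cell" is a placeholder identifier
    // ... configure the cell for your model here ...
    return cell
}
```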
With no mechanism to deactivate this behaviour, the next line of approach was to investigate the jerky scrolling.
I came to the conclusion that the values returned by estimatedHeightForRowAtIndexPath are used as an up-front precalculation. Log the estimates to the console and you'll see that the delegate method is queried for every row when the table view first appears, before any scrolling.
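Something along these lines makes that obvious. It's illustrative only: the constant stands in for whatever your own estimate logic returns.

```swift
func tableView(tableView: UITableView, estimatedHeightForRowAtIndexPath indexPath: NSIndexPath) -> CGFloat {
    let estimate: CGFloat = 80.0  // placeholder – substitute your own estimate logic
    print("estimate requested for row \(indexPath.row): \(estimate)")  // logged for every row before any scrolling
    return estimate
}
```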
I quickly discovered that some of the height estimate logic in my code was badly wrong. Resolving this fixed the worst of the jarring.
To achieve perfect scrolling, I took a slightly different approach to the answers above. The heights were still cached, but the values used were the actual heights captured as the user scrolls downwards:
var myRowHeightEstimateCache = [String: CGFloat]()
To store:
func tableView(tableView: UITableView, didEndDisplayingCell cell: UITableViewCell, forRowAtIndexPath indexPath: NSIndexPath) {
    // Cache the height the cell actually had on screen, keyed by row
    myRowHeightEstimateCache["\(indexPath.row)"] = CGRectGetHeight(cell.frame)
}
Using from the cache:
func tableView(tableView: UITableView, estimatedHeightForRowAtIndexPath indexPath: NSIndexPath) -> CGFloat
{
    if let height = myRowHeightEstimateCache["\(indexPath.row)"]
    {
        // Cached: return the height the cell actually had when it was last on screen
        return height
    }
    else
    {
        // Not in cache – fall back to your own estimate logic
        // (80.0 is just a placeholder value here)
        return 80.0
    }
}
Note that in the method above you will still need to return some estimate, as it will of course be called before didEndDisplayingCell has had a chance to populate the cache.
My guess is that there is some sort of Apple bug underneath all of this. That's why this issue only manifests in an exit scenario.
Bottom line: this solution is very similar to those above, but I avoid any tricky calculations and rely on the UITableViewAutomaticDimension behaviour, simply caching the actual row heights reported via didEndDisplayingCell.
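For completeness, all of this assumes the table view is configured for self-sizing cells in the usual way; a sketch, with an arbitrary initial estimate value:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    tableView.rowHeight = UITableViewAutomaticDimension  // let Auto Layout determine the real height
    tableView.estimatedRowHeight = 80.0                  // rough initial estimate; arbitrary placeholder
}
```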
TL;DR: work around what is most likely a UIKit defect by caching the actual row heights, then query that cache first in the estimation method.