Interesting post. It feels quite closely analogous to the virtual-DOM diffing approaches that have dominated client-side JS frameworks lately.
Is there any history of this technique being used in game engine design? Or is that domain more concerned with optimizing for the worst case (thou shalt not drop frames) than for the average case?
So I'll be that person: if it can be enabled in the 58 beta, does that mean it's on track for being included in that or another specific release? Or is it not planned yet?
That second graph and discussion is really frustrating. The claim is that latencies from 38-50 get shifted down into other buckets, and that the increase in the 51+ bucket is an artifact of the lower aggregate numbers.
Maybe the data supports this, but there's no way to know by looking at the graph. The histogram should be scaled against total counts rather than the subcounts (just zoomed in appropriately for the dataset). As shown, it's bad statistics at best, and a disingenuous lie at worst.
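To make the point concrete, here's a minimal sketch (with made-up illustrative numbers, not data from the post) showing why normalizing each histogram by its own total matters: a bucket's *share* can rise even while its *absolute count* falls, purely because the aggregate shrank.

```python
# Hypothetical latency bucket counts; all numbers are invented for illustration.
before = {"0-37": 800, "38-50": 150, "51+": 50}  # total = 1000
after  = {"0-37": 600, "38-50": 55,  "51+": 45}  # total = 700

def normalize(buckets):
    """Scale each bucket by the dataset's own total count so two
    distributions remain comparable when aggregate volume differs."""
    total = sum(buckets.values())
    return {k: v / total for k, v in buckets.items()}

before_share = normalize(before)["51+"]  # 50 / 1000 = 0.050
after_share  = normalize(after)["51+"]   # 45 / 700  ~ 0.064

# Absolute count of 51+ latencies fell (50 -> 45), yet its share of the
# total grew -- the kind of artifact the normalized view exposes.
print(before_share, after_share)
```

Plotting shares (or raw counts against a shared y-axis) would let readers check the "artifact" claim directly instead of taking it on faith.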
Will this improve remote X11 display performance for Firefox?
Previously on HN: item?id=16112283