
Yes and no.

You are right that changes to drawing induced by the original iPhone are responsible for at least part of the widgetization of CocoaTouch. The first iPhone(s) had a really, really slow CPU but a somewhat decent GPU, so moving more rendering work to the GPU made sense.

Originally, Cocoa, as well as its NeXTstep predecessor, did essentially all drawing on the CPU (some blitting on the NeXTdimension notwithstanding). And this was usually fast enough. At some point, window compositing was moved to the GPU (Quartz Compositor). With the phone, animations were both wanted for "haptics" and needed in order to cover for the slowness of the device (distract the monkey by animating things into place while we catch up... g), and the CPU was also rather slow.

So instead of just compositing the contents of windows, CocoaTouch (via CoreAnimation) now could and would also composite the contents of views. But that's somewhat in conflict with the drawing model, and the conflict was never fully resolved.

> texture upload is too slow

First, you don't have to have separate textures for every bit of text. You can also just draw the text into a bigger view.
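For instance, a minimal Swift/UIKit sketch (the class and its layout are illustrative, not from any particular app): one view draws all of its text lines in a single draw(_:) pass, so there is one backing store for the whole block of text instead of one texture per string.

    import UIKit

    // One view, one backing store, many lines of text.
    final class TextBlockView: UIView {
        var lines: [String] = [] {
            didSet { setNeedsDisplay() }  // mark the whole view damaged when the text changes
        }

        override func draw(_ rect: CGRect) {
            let attributes: [NSAttributedString.Key: Any] = [.font: UIFont.systemFont(ofSize: 14)]
            var y: CGFloat = 0
            for line in lines {
                (line as NSString).draw(at: CGPoint(x: 8, y: y), withAttributes: attributes)
                y += 18
            }
        }
    }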

> redrawing your text each frame

Second, Cocoa does not redraw the entire screen each time, and does not have to redraw/reupload the texture each time (if it is using textures). It keeps track of damaged regions quite meticulously and only draws the parts that have changed, down to partial-view precision (if the views co-operate). Views that intersect the damage get their drawRect: method invoked, and that method is handed the damaged area (and can query the finer-grained list of damaged rects) so it can also optimise its drawing.
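In Swift/UIKit terms the drawing side looks roughly like this (rowCount, rowFrame(_:) and drawRow(_:in:) are hypothetical helpers of the view); on AppKit the view can additionally ask for the finer-grained list of damaged rects via getRectsBeingDrawn:count:.

    // Only content that intersects the damaged region gets redrawn.
    override func draw(_ dirtyRect: CGRect) {
        for index in 0..<rowCount {
            let frame = rowFrame(index)            // hypothetical per-row geometry
            guard frame.intersects(dirtyRect) else { continue }
            drawRow(index, in: frame)              // hypothetical row renderer
        }
    }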

Now if you actually have a texture living on the GPU and you are incapable of drawing into that texture, then you must replace the texture wholesale and the rectangle/view-based optimisations won't work. However, I know that they do work, at least to some extent, because we were able to optimise animations in an iOS app by switching from layer-based drawing to a view with drawRect: and carefully computing and honouring the damage rect. It went from using 100% CPU at 2-6 fps to 2% CPU at 60 fps. (Discussed in more detail, with other examples, in my book: iOS and macOS Performance Tuning: Cocoa, Cocoa Touch, Objective-C, and Swift, https://www.amazon.com/gp/product/0321842847/ref=as_li_tl?ie...)
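The invalidation side of that kind of optimisation looks roughly like this (a sketch with illustrative names, not the actual app code): each animation step marks only the union of the moving element's old and new frames as damaged, so draw(_:) touches a tiny fraction of the view per frame.

    // Inside the custom-drawn view; driven by a CADisplayLink.
    @objc func animationStep(_ link: CADisplayLink) {
        let oldFrame = elementFrame
        elementFrame = animatedFrame(at: link.targetTimestamp)  // hypothetical animation curve
        // Only the region that actually changed is marked damaged.
        setNeedsDisplay(oldFrame.union(elementFrame))
    }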

Third, if your text does change, you have to redraw everything from scratch anyway, so a cached texture buys you nothing in that case.

Fourth, while the original iPhone was too slow for this and lots of other things, modern phones and computers are easily capable of that sort of drawing. Performance is sometimes better with a pure texture approach and sometimes (much) better with a more drawing-centred approach (see above).
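Roughly, the two styles side by side (Swift sketch; contentView, renderContent(in:) and changedRect are placeholders): the "pure texture" route renders the content once into an image and hands it to the layer to be composited by the GPU, while the drawing-centred route keeps using draw(_:) plus partial invalidation.

    // "Pure texture": render once, then let the GPU composite the cached image.
    let renderer = UIGraphicsImageRenderer(bounds: contentView.bounds)
    let snapshot = renderer.image { ctx in
        renderContent(in: ctx.cgContext)      // hypothetical drawing routine
    }
    contentView.layer.contents = snapshot.cgImage

    // Drawing-centred: re-render only the damaged part whenever something changes.
    contentView.setNeedsDisplay(changedRect)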



