"It's too complicated," moaned George. He continued:
"Mr Main has asked me to deliver all of these invites to the birthday party, but I keep forgetting which people I've given invitations to. I gave Sam three invites and Polly hasn't got one yet. It's no use! I can't remember which invite is which!"
"Don't worry," said Ivy, "let's work together to make this simpler! I'll remember which invitation we're giving out and which one is next, so all you have to do is deliver the invite!"
George beamed, "Really? You can do that for me?"
"Sure thing George! They don't call me Ivy Iterator for nothing!"
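The story above is the iterator pattern: one object owns the bookkeeping (which item is next) so the caller only consumes items. A minimal Python sketch, with all names invented for illustration:

```python
# "Ivy" holds the bookkeeping (which invitation is next),
# so "George" only has to deliver.
class InvitationIterator:
    def __init__(self, names):
        self._names = names
        self._index = 0   # Ivy remembers the position, not George

    def __iter__(self):
        return self

    def __next__(self):
        if self._index >= len(self._names):
            raise StopIteration          # every invitation delivered
        name = self._names[self._index]
        self._index += 1
        return name

# George just loops; he never tracks who already has an invite.
delivered = list(InvitationIterator(["Sam", "Polly", "Mr Main"]))
```

Because the class implements `__iter__` and `__next__`, it plugs straight into `for` loops and `list()` like any built-in iterable.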
My favorite was this nice redesign I did where the form fields had no labels. Instead, the designer cleverly used placeholders on the input elements to identify them. Moreover, each form had a great deal of customization to make it look nice and compact. At first glance this looked fantastic, until you tried to add an error message to it.
I pointed this out to them, and their first inclination was to highlight the borders of the fields with errors in red. Great, except for the large minority of users who are color-blind. Also, a red border doesn't indicate what the error is, just that there is one. On top of that, form-wide (as opposed to field-specific) errors still have no place to go.
This drives me crazy, because it's not like I can logically extend their design and say "let's put the errors here. What do you think?" Their design was so tight (in a good way I suppose) that it left no room for this kind of thing.
My suggestion is to make your version so ugly that it cannot be overlooked. Put the errors in size 72 font in neon green right over the fields. No way for it to go into production that way.
> In what freezing fucking hell is a dual-core, 1 GHz computer with gigabytes of RAM and tens of gigabytes of storage and 3D acceleration that can fit in my pocket memory-starved and CPU-starved?
Sounds to me like you've never developed seriously on an ARM chipset. These devices are worlds apart from your standard desktop; there is a reason both Android and iOS dropped Adobe Flash. It's partly the hardware and partly shitty ARM code; it's not really much to do with the specs. I can do things much more easily on an underpowered x86 than on an overpowered ARM.
The original iPhone, back when the decision was made not to support Flash, used an ARM1176 processor underclocked to 412 MHz. That was a single-issue, in-order core without SIMD. Consider the Cortex-A15 and the latest Qualcomm parts: they're at least three-wide, fully out-of-order cores supporting 128-bit SIMD with fused multiply-add, clocked at 1.5-2 GHz in two- and four-core configurations. They are far more similar to a low-voltage Core 2 package than to the ARMv6 processors of the first iPhone and Android devices (and in some ways they're actually nicer to program for), and would easily be capable of handling a Flash runtime.
> Sounds to me like you've never developed seriously on an ARM chipset.
You're selling cucumbers to the gardener: I was actually one career choice away from designing chips, and I wrote ARM assembly before there was any such thing as a tablet.
A mobile phone is slow in comparison to a desktop, but not slow enough to excuse the lagging in most of today's mobile applications. If a Facebook client, a mail application, a simple 2D game or a music player lags on such a phone, it does so because it's a piece of crap.
I don't think that's a fair assessment. The challenge is that developing for mobile is nowhere near _as_ _easy_ as developing for desktop. So we have a legion of desktop devs coming over to mobile and getting lost. Their code doesn't "suck"; it's fine for desktop, it's just not good mobile code.
Also, power management.
> their code doesn't "suck"; it's fine for desktop, it's just not good mobile code
IMHO, code that is not adequate for a platform on which it is intentionally deployed, by definition, sucks. There's no such thing as a good application that is fine for every computer except the ones it actually runs on.
As a person who used an LG "smart"phone that couldn't handle the bare Android OS without lagging (and turning on Wi-Fi would freeze it to death), I agree wholeheartedly. Intentionally selling crap that doesn't work is evil in my book.
> If a Facebook client, a mail application, a simple 2D game or a music player lags on such a mobile phone, it does so because it's a piece of crap.
Fair point. It has more to do with the way these apps are cobbled together out of heterogeneous chunks of code, just to make them look "cool". The native frameworks are lacking in terms of their ability to easily customize the controls, so people start applying crazy hacks just to mimic some functionality seen in another app, without any regard for the performance. It just has to "work".
That's my understanding, too. Developers are using high-level frameworks that generate an absurd number of redraws. In many cases, the bottleneck isn't even the CPU or the memory, but simply pushing too many pixels to the screen. It doesn't help that we expect much snappier response from touch interfaces than from 10-year-old desktops. Two seconds to open a new screen was somewhat acceptable in a VB6 application. Try to do that in an Android app, and see the kind of rating you get.
Yep. Who wants to make bad ports easier? Their goal is to keep their users hooked, and people are cheap (or stupid if you prefer) and approach value from the wrong direction, money spent instead of value gained.
Even on the slow (by smartphone standards) Nexus One, Flash ran quite decently. The Nexus 5 is a good magnitude faster in every way (I can't find specific parity-version benchmarks, so perhaps I'll fire both of them up and give it a go), more so in some ways like the GPU, and is as powerful as some desktops that people still use in business settings. It would of course have no problem with Flash.
Flash failed because a good percentage of existing content relied upon the accouterments of a desktop, namely the keyboard and the mouse. Minus these it just made for a frustrating experience for a lot of users. Add the fact that sites were loaded with obnoxious, taxing Flash ads, and it just gave users of Flash-enabled devices a very negative experience. It also reflected poorly on the product as a number of popular tech sites compared Android and iOS web page loading times, the former seriously hindered by the loading and overhead of Flash.
There were later changes to make it activate on click, but that just made it even more of a usability burden.
In a way, at the time this whole debate was raging, iOS users got to enjoy essentially a free "adblock" in the absence of Flash.
I have to entirely disagree with any notion that smartphones are underpowered: I am currently working on a very intense real-time image processing system and with each iteration I'm finding that I'm increasing the scope and featureset because the performance continually blows me away. Even when I work on "older" devices like the Galaxy S3, with good code and good parallelization (incl. the GPU), it is just a ridiculous platform.
:D. Why is anyone even treating FP vs OO as an either/or choice? I've used functional patterns within OO before, and I'm sure the opposite is possible.
Both have merit depending on the skill-set and staff available to you, scale of project and existing infrastructure.
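The "functional patterns within OO" point can be sketched in a few lines of Python; the class and data here are hypothetical, chosen only to show the mix:

```python
# An ordinary class that borrows FP habits: immutable state,
# and behaviour passed in as a first-class function.
class Ledger:
    def __init__(self, amounts):
        self._amounts = tuple(amounts)       # immutable snapshot

    def total(self, keep=lambda amount: True):
        # the caller supplies the filtering behaviour as a function
        return sum(filter(keep, self._amounts))

ledger = Ledger([10, 25, 40])
large_only = ledger.total(lambda a: a > 20)
```

The object-oriented shell (a class with state) and the functional core (pure functions over immutable data) coexist without any friction.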
I was, however, very disappointed that the author bundled interfaces into the mix as if they were the same as the rest. An interface lets you support many different types of behaviour and lets you structure your libraries in different ways. It's a separate concern from classes themselves.
Genuinely I don't think shit like this is healthy. Smacks of religion.
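To make the interface point above concrete, here is a hedged Python sketch (the `Renderer` names are invented for illustration): the interface is what lets unrelated implementations be swapped behind one contract, independently of any class hierarchy debate.

```python
from abc import ABC, abstractmethod

class Renderer(ABC):
    """The interface: callers depend on render(), not on any concrete class."""
    @abstractmethod
    def render(self, text: str) -> str: ...

class PlainRenderer(Renderer):
    def render(self, text):
        return text

class ShoutingRenderer(Renderer):
    def render(self, text):
        return text.upper() + "!"

def publish(renderer: Renderer, text: str) -> str:
    # publish() works with any implementation of the interface
    return renderer.render(text)
```

`publish` never names a concrete class, so new behaviours can be added in other modules without touching it.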
What the hell is up with that attitude? You're completely missing the point of a welcoming community by closing the door with an RTFM sign. He's writing his own shit but is wondering whether there's a library out there that already does it better.
The point of a community is to be helped and then help. You make it sound like you got to where you are by yourself and that's the only _correct_ way. To be frank this attitude has no place in a community.
"Does this thing belong here?"
NO. "Are we helping each other?" That should always be the purpose of any community site, and the reciprocation of help is something that has driven every newsgroup, IRC channel and community dev site for DECADES.
"Do it for me" is not a question. It is a demand, and a lazy one.
> You make it sound like you got to where you are by yourself
Hardly. I ask questions. RTFM doesn't answer everything, nor does it explain, but you can actually ask intelligent questions, and be in a position to understand the answers, if you did some work beforehand. Of course, the referenced post wasn't looking for an answer to understand; they wanted 'use libfoo.'
> "Are we helping each other?" That should always be the purpose of any community site
I have to write a script to install this thing here at work, would you mind doing it for me? That would be a BIG HELP to me.
See the problem yet? Technically it is a question, and it would be a help to me. I suppose a community site would be more than happy to go about that, then.
Oh and the response on any newsgroup for decades would be "Do your own damn homework, come back when you have a question."
I know what you're talking about but I think you just failed to read the second paragraph of the post.
> I'm using OpenCV for detecting the faces and a rough Eigenfaces Algorithm for the recognition now. But I thought there should be something out there with a better performance then a self written Eigenfaces Algorithm.
Does this sound like a "plz send me teh codez" to you?