They throw around words like "polynomial," and I've asked people with comp sci degrees to define what it means. I get back replies like "We learned it as a polynomial and never got a definition of it." OK look, here is what it means, and some people with a comp sci PhD cannot even tell me: poly means many, right, and nomial means something like numbers or formulas, or rather, in computer science, an algorithm. You have to study how algorithms work. I asked this on Hacker News, and many tried to bury the story because most couldn't even answer it; thanks to those who did. I am glad to see someone else bring it up, and I hope it sheds more light on the subject.
Consider this: you really cannot get a good estimate of how long any algorithm will run, because there are other factors in play. You have multicore CPUs now, and some of the work is offloaded to GPUs as well (like Bitcoin mining programs using GPUs; now algorithms can use GPUs for some of the processing). Plus you have caches on processors that you didn't have in 1971. Not only that, but the laws of thermodynamics apply to computers and electronics, and many people forget that: if your CPU is overheating it could slow down, and if you have a variable fan it could cool the CPU down so it runs faster for a bit, then heats up and slows down again. Most operating systems are multitasking, so the program that runs the algorithm might take longer if an Apache HTTPD server with heavy traffic is running on the same machine. Not only that, but Anonymous might have launched a DDoS attack against the server the algorithm is running on. I hope those with a PhD in computer science figure those in as factors when they explore this issue.
Excuse me, but it seems you have no idea what you're talking about.
>Consider this, that you really cannot get a good estimate of how long any algorithm can run because you got other factors in play
That has absolutely nothing to do with the complexity of the algorithm. And it _can_ be estimated, too.
I wonder if you're trolling or just clueless.
Given the curious interpretation of ‘polynomial’ by means of an only somewhat correct translation, I’d tend to the latter.
Is Wikipedia not correct on this issue? On one hand I was told to look stuff up on Wikipedia, and on the other I found out it is not always correct.
A polynomial function, in this context, means a function that is, or can be, expressed as a sum of powers of its input variables with appropriate prefactors, usually restricted to a finite number of addends. For example,

f(x) = 1

g(x) = 1 + 2x + 3x²

are polynomials, whereas

exp(x) = 1 + x + 1/2 x² + 1/6 x³ + …

is a sum of powers with infinitely many addends, and thus not a polynomial in the usual, restricted sense.
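To make the "sum of powers with prefactors" idea concrete, here is a minimal Python sketch (the function name and coefficient convention are just for illustration, not from the thread):

```python
def poly(x, coeffs):
    """Evaluate a polynomial given its prefactors.

    coeffs[k] is the prefactor of x**k, so coeffs = [1, 2, 3]
    means f(x) = 1 + 2x + 3x^2 -- a finite sum of powers.
    """
    return sum(c * x**k for k, c in enumerate(coeffs))

# f(x) = 1 is a polynomial with a single constant term:
print(poly(5, [1]))        # 1

# g(x) = 1 + 2x + 3x^2, evaluated at x = 2:
print(poly(2, [1, 2, 3]))  # 1 + 4 + 12 = 17
```

Any finite list of prefactors gives a polynomial; something like exp(x) would need an infinite list.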
Furthermore, you didn’t only get downvoted for misinterpreting ‘polynomial’, but also for the strange mixing of theoretical CS, which is concerned with the complexity of the algorithm at hand, and real-world effects such as slowdowns due to other running programs, CPU caches, etc., which are rarely taken into account when deciding whether a given algorithm is remotely feasible to implement (runs in time, say, n^10 or faster) or just too slow.
So a polynomial doesn't need to have many terms; it can be just one. The "poly" in the name threw me off, I admit. I would have assumed that one term was a monomial and two of them a binomial, but apparently "polynomial" is used even for only one or two? I hope you understand my confusion there. Thank you for clearing that up.
In science in general, including computer science, "theoretical" doesn't mean fringe or unexplored, it means foundational and abstract. CS theory uses the word "theory" in the same sense as music theory. CS theory deals with well-defined, rigorous analysis of computation in abstract or mathematical terms.
>I would have assumed that one term was a monomial and two of them a binomial, but apparently "polynomial" is used even for only one or two?
A two-termed polynomial is also called a binomial. "Binomial" and "monomial" are special cases of the more general term "polynomial".
The relevant point is that it usually doesn’t matter whether you have one, five or 500 terms in a polynomial, as the largest one will certainly dominate for sufficiently large input sizes and all terms in a polynomial essentially behave the same way (being differentiable, having no poles etc.).
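A quick way to see the dominance claim numerically (the particular polynomial is made up for illustration): as n grows, n² + 5n + 500 becomes indistinguishable from plain n².

```python
# The largest power dominates: compare n^2 + 5n + 500 against n^2.
# As n grows, the ratio heads toward 1, so the smaller terms
# stop mattering for sufficiently large inputs.
for n in [10, 100, 10_000]:
    ratio = (n**2 + 5 * n + 500) / n**2
    print(n, ratio)
```

At n = 10 the extra terms still inflate the value several-fold, but by n = 10,000 the ratio is already within a tenth of a percent of 1.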
 The only thing wrong with homosexuality is smashing a Greek prefix onto a Latin root.
 Latin: uno-, bi-, pauc-, multi-, Greek: mono-, di-, oligo-, poly-
 cf. http://www.wolframalpha.com/input/?i=x%5E20%3B+x%5E20%2Bx%5E...
Please explain it further, then, and tell me how you can estimate the time when a DDoS attack is hitting the system, or the CPU is overheating. I'd really like to know how you estimate that. Apparently I'm clueless and in need of many clues.
Specifically, asymptotic complexity is an expression of the number of abstract operations or units of memory an algorithm will require. The meaning of "operation" or "unit of memory" depends on the algorithm. In the case of comparison sorting algorithms, for example, the operation in question is the comparison between two elements.
Even if the CPU time allotted to a running algorithm changes due to system load or overheating, the algorithm still performs the same number of operations. Actual running time under system load can be calculated, but this is unrelated to big-O notation or complexity.
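To illustrate (a throwaway sketch, not anything from the thread): here is a selection sort instrumented to count its comparisons. Run it on an idle machine or a machine under a DDoS and the wall-clock time will differ, but the comparison count never changes:

```python
def selection_sort(items):
    """Selection sort that also counts comparisons.

    The comparison count depends only on the input size,
    not on CPU speed, caches, or system load.
    """
    items = list(items)
    comparisons = 0
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):
            comparisons += 1
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items, comparisons

# For n items, selection sort always makes n*(n-1)/2 comparisons,
# no matter what else the machine is doing:
print(selection_sort([3, 1, 2])[1])        # 3
print(selection_sort([9, 8, 7, 6, 5])[1])  # 10
```

That fixed count, as a function of n, is what big-O notation describes.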
Regarding your explanation, it is excessively wordy and uses awkward phrasing. Your subsequent complaint about the terminology used in responses to your previous questions about P=NP demonstrates a serious lack of prerequisite knowledge. You learn the definition of "polynomial" in algebra; teaching you algebra is far beyond the scope of a discussion on HN.
As someone who is primarily self-taught but also finished most of a college CS education, I can understand your frustration. In order to understand one thing you need to learn three other things, and there's no map that leads you from what you know now to what you want to know. Until someone makes that map, or you know enough to make it up as you read Wikipedia, you'll have to learn things in the order they're taught in school.
So here's what you should do if you want to understand what people are talking about on HN:
1. (Re)learn basic algebra from start to finish. Use Khan Academy. You absolutely need this to understand what a polynomial is.
2. Accept Wikipedia as a source of information. Use a dictionary and other Wikipedia articles to look up terms you don't understand.
3. Related to #2, try to study cognitive biases, critical thinking, and logical fallacies. Studying these concepts will help your brain process information, such as difficult-to-understand Wikipedia articles. Check out lesswrong.com.
4. Study basic CS. Look for an Introduction to Computer Science course on Coursera or elsewhere.
5. Study algorithms. Take an algorithms course on Coursera or elsewhere.
I had a TA for College Algebra who couldn't teach it, and the professor was nowhere to be found. They made Catch-22 jokes that he was like Major Major: you could only see him if he wasn't in his office. It made me feel better but did not help me learn. I was in the Delta Tau Delta fraternity, and they helped me out, but they claimed the TA had messed up some of the problems, which could only be solved with calculus, which I hadn't learned yet. To make matters worse, I got caught up in the party-college thing and underage drinking. I quit eventually. But at least I didn't get forced to meet "Alice" like some unfortunate souls. I still have my Intro to Pascal and Pascal with Data Structures books in my basement; I should re-read them and try things with Free Pascal.
In 2005 I had a car accident and almost died, was in a coma, and lost some memories as a result. I may just have to relearn everything all over again. Since I am on disability, free online classes and sources are my only hope. Thanks.
Anyway, we are talking here about the complexity of the algorithm, not its actual speed or performance, which will depend on a lot of factors. Complexity tells you, for example, how many steps it will take the algorithm to solve the problem. The algorithm can be used by humans or supercomputers, so performance can vary from days to microseconds, but the complexity remains the same.
OK, complexity. Then why is "polynomial time" used to describe it? I know some compilers can optimize machine code better than others; how does that affect the complexity? If performance can vary from days to microseconds, well, that is a very big gap. Video gamers always complain about 'lag', and days, well, that is one heck of a 'lag'! :)
Let's try a crap example I just made up. Imagine you want to go with your friends to see a movie. Here's a bad way to organise that: you drive to your first friend's house and ask what they want to see. Then you drive to each of your other friends' houses and tell them what friend 1 wants to see. Then you drive to friend 2's house and ask them, and then drive to each of your friends' houses (including friend 1) to tell them all what friend 2 wants to see.
Can you see how with a large group of friends, that would be insane? It really doesn't matter if you have a fast or a slow car. It's just crazy.
A better plan would be to drive to each friend's house, ask what they want to see, and write it down on a piece of paper. Then at the end -- you can drive around again and tell everyone what movie you are all going to see.
If you've got 2 friends, in the first plan you visit 4 houses, and in the second plan you also visit 4 houses. With 3 friends it's 9 and 6. With 4, it's 16 and 8. With 5, 25 and 10. With 10, 100 and 20.
Now let's say the person doing plan 1 discovers this amazing thing called the telephone. The plan 2 person is still using a car. Person 1 can do things so much faster: they don't have to leave home and can call in a matter of seconds.
But let's think it through.
4 calls vs driving to 4 people. 25 calls vs driving to 10 people. Still a big win for plan 1 at the moment. But how about 10,000 calls vs 200 car trips? 1,000,000 calls vs 2,000 car trips?
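The trip counts above can be sketched in a few lines of Python (a toy model of the example, nothing more). Swapping the car for a telephone makes each step cheaper, but the step counts stay n² versus 2n, so plan 2 always wins eventually:

```python
# Toy cost model for the movie-plan example:
# plan 1 contacts every friend once per friend (n * n contacts),
# plan 2 does one asking round and one telling round (2 * n contacts).
def plan1_contacts(n_friends):
    return n_friends * n_friends

def plan2_contacts(n_friends):
    return 2 * n_friends

for n in [2, 3, 5, 100, 1000]:
    print(n, plan1_contacts(n), plan2_contacts(n))
```

The faster "hardware" (phone vs. car) changes the cost per contact, but not how the number of contacts grows with n; that growth is the complexity.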
Does that help?