This article is essentially an encouragement and a reminder of our ability to do experimental mathematics (https://en.wikipedia.org/w/index.php?title=Experimental_math...): there's even a journal for it, and the Wikipedia article on it is worth reading (https://en.wikipedia.org/w/index.php?title=Experimental_Math...). See also (I guess I'm just reproducing the first page of search results here) this article (https://www.maa.org/external_archive/devlin/devlin_03_09.htm...), these two in the Notices of the AMS (https://www.ams.org/notices/200505/fea-borwein.pdf, http://www.ams.org/notices/199506/levy.pdf), this website (https://www.experimentalmath.info), this post by Wolfram (https://blog.stephenwolfram.com/2017/03/two-hours-of-experim...), and there's even a book by V. I. Arnold (besides a couple by Borwein and Bailey, and others).
Especially in number theory and probability, simple explorations with a computer can suggest deep conjectures that are yet to be proved.
Thank you so much for pointing this out! Experimental mathematics feels like a missing puzzle piece, with which so much more makes sense.
Quotes are from the wiki article you linked.
> As expressed by Paul Halmos: "Mathematics is not a deductive science—that's a cliché. When you try to prove a theorem, you don't just list the hypotheses, and then start to reason. What you do is trial and error, experimentation, guesswork. You want to find out what the facts are, and what you do is in that respect similar to what a laboratory technician does."
I wish there were books in which people described their complete process (not only the final proof) of how they figured things out.
> Mathematicians have always practised experimental mathematics. Existing records of early mathematics, such as Babylonian mathematics, typically consist of lists of numerical examples illustrating algebraic identities. However, modern mathematics, beginning in the 17th century, developed a tradition of publishing results in a final, formal and abstract presentation. The numerical examples that may have led a mathematician to originally formulate a general theorem were not published, and were generally forgotten.
Why is this the case? It seems like it doesn't benefit us other than saving some paper.
> The following mathematicians and computer scientists have made significant contributions to the field of experimental mathematics:
--> This is so awesome, it also sheds some light on how these people think.
Not in a research paper, but it is described in some nice decades-old training videos, https://www.youtube.com/playlist?list=PL926EC0F1F93C1837
((ln n)/(ln ln n))(1 + o(1))
with high probability (i.e., probability 1 - o(1)), where ln denotes the natural logarithm (log to base e). Empirically, the maximum frequency for n = 1000, 10000, 100000 often seems to be, respectively, 5; 6 or 7; and 7 or 8.
This problem has applications in studying hash tables etc., and can be found under terms like "maximum load" with balls in bins; proving the bound doesn't seem to be very easy. As the post says, “the solution is likely not as trivial as it first looks”. The analysis may be hard, but these days, if faced with a problem like this in the real world (e.g. we have a hash table of size M that will receive N entries, and we're curious about the likely maximum load), we can likely just experiment to find out. Even when the numbers are too large to run simulations directly, an in-between solution is to find a tractable expression (e.g. a recurrence relation evaluated with dynamic programming) in place of a closed form, and write a program to compute it.
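As a sketch of the "just experiment" approach (all function names here are my own, and the number of trials is arbitrary), one can draw n numbers uniformly from 1..n, count the maximum frequency, and compare against the leading term (ln n)/(ln ln n):

```python
import math
import random
from collections import Counter

def max_load(n, rng=random):
    """Draw n numbers uniformly from 1..n and return the highest frequency."""
    counts = Counter(rng.randrange(1, n + 1) for _ in range(n))
    return max(counts.values())

def predicted(n):
    """Leading-order prediction (ln n)/(ln ln n), ignoring the (1 + o(1)) factor."""
    return math.log(n) / math.log(math.log(n))

for n in (1000, 10000, 100000):
    rng = random.Random(0)  # fixed seed for reproducibility
    samples = [max_load(n, rng) for _ in range(20)]
    print(n, min(samples), max(samples), round(predicted(n), 2))
```

Since the (1 + o(1)) factor decays slowly, the observed maxima sit noticeably above (ln n)/(ln ln n) at these small n, which is itself a nice illustration of why the empirical values (5 to 8) don't match the leading term directly.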
2. How many numbers do we have to draw from 365 so that there is a greater than 50% chance that at least two of them are the same?
3. How many numbers do we have to draw from X so that there is a greater than Y% chance that at least Z of them are the same?
I think X, Y, Z are enough parameters:
Drawing from X=1000 numbers, what is the chance Y that Z=(5,6,7,8...) of them are the same?
Sorry, I'm not a mathematician, just some breakfast ideas ;)
Edit: Inspired by http://datagenetics.com/blog/february72019/index.html
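Question 2 is the classic birthday problem (the Z=2 case), and it admits a quick exact computation; the function names below are my own, and the general Z ≥ 3 case is easiest to attack by simulation instead:

```python
def p_collision(k, days=365):
    """Exact probability that at least two of k uniform draws from `days` values coincide."""
    p_distinct = 1.0
    for i in range(k):
        p_distinct *= (days - i) / days  # i-th draw avoids the previous i values
    return 1.0 - p_distinct

def smallest_k(days=365, threshold=0.5):
    """Smallest number of draws giving a collision probability above `threshold`."""
    k = 1
    while p_collision(k, days) <= threshold:
        k += 1
    return k

print(smallest_k())  # 23 -- the classic birthday-problem answer
```

With 23 draws the collision probability is about 50.7%, while 22 draws give only about 47.6%, which is where the famous answer of 23 comes from.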
Interactive visualisations are often impractical. They don’t work in publications, presentations or documents. Generally speaking, visualisations that make everything clearly visible without requiring interaction are superior to visualisations that require extra interaction.
I found that it always paid off to do some extra thinking on how to reshape my plots so they don’t need to be interactive. I had very few cases where I needed the plots to be interactive, and ironically, on those occasions only matplotlib worked for me. Those were cases where I wanted to show and play audio snippets that belonged to data points in a dimensionality-reduction plot. It was quite hard to get matplotlib to do what I wanted, but I didn’t even get anywhere near a result with plotly et al.
Here is the relevant documentation: https://plot.ly/python/static-image-export/
Again, my disappointment is that we should be encouraging a mentality of flexible learning, and I find this post a regression. I’m a bit disappointed at the unpopularity of this comment, but maybe that’s why I’ve moved up to earn an enormous salary.
a) customize every aspect of the chart, from the fonts to the length of the axis ticks to the legend placement, etc. (see the full list of thousands of available customization attributes here: https://plot.ly/python/reference/)
b) export to raster or vector formats for publication (https://plot.ly/python/static-image-export/)
c) use high-level grammar-of-graphics-inspired tools like https://plotly.express/ to create complex charts in a single line of code.
What kind of shitty reasoning leads to this? "Oh, let's introduce this elementary mathematics to the illiterate masses by writing it as a Python script. Now everyone will understand it!" This is a lack of respect for the agency of the readers.
There is value in static typing, but there are many instances where that cost is not worth the reward.
I would expect a post entitled “exploring mathematics with python” to have a whole lot more python code (inline with the text and better explained instead of an uncommented blob at the end) and a whole lot more mathematics.
A more accurately descriptive title for this post might be “counting the repetitions among randomly chosen positive integers”.... which of course isn’t going to get as many clicks or as many reflexive upvotes from non-readers as a post promising “exploring mathematics with python” because it doesn’t sound (and frankly isn’t) all that interesting to most readers. (It might make a decent short project for middle school students though.)
Personally I flagged the post for its misleading title.
It's not the author's fault if you bring your own baggage to the words.
This is needlessly dismissive and frankly offensive.
I prefer having errors from a compiler(or static analysis, or...) because it helps me. Not because I am a better programmer but because it helps me be a better programmer.
> Compiler: Hey that type doesn't work there
Oh! Thank you! I meant to use this type instead.
> Compiler: This value is freed here but used right afterward here
I meant to clone it. Whoops. That would have been embarrassing to debug in production!
And so on.
I absolutely recognize that it's a barrier to entry, but it's not one erected to keep people out, it's there to catch your mistakes for you so that you spend less time debugging and more time writing your actual application.
That's true. If I start doing that, please call me out on it. From the HN Guidelines:
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
I'm very much not a fan of meaningful whitespace, but I use Python occasionally, and regularly help my friends who are learning it grasp this or that topic. Except where pointing them to a specific library or other tool, I've never said "You shouldn't do that in Python, do it in this other language instead".
I was responding specifically to the content that I quoted, which came from the post that I replied to.
> It seems you just meant to criticize the title, or...?
I should note that I am not the person who started this comment thread.
They were criticizing the post title as they did not believe that the contents within were specific to Python.
They were not criticizing Python either. Rather they were saying that the word Python in the title appears to be used to attract viewers who might otherwise be intimidated by the contents, instead of being relevant to the contents of that article.