
As an experienced programmer, I have spent enough time trying to understand and make sense of some aspects of pure/higher math, specifically logic, that I feel qualified to say this:

- Experienced programmers are better trained at formality and precision than mathematicians, in some respects, and are able to ask questions that make experienced mathematicians go "why are you asking this question?" or "just get used to the idea (because that's what everyone does)"

- Much of higher math study advice (such as the one posted here) is aimed at laymen. And experienced programmers are no laymen.

- Mathematicians are laymen in many aspects compared to experienced programmers. An experienced programmer will have an easier time learning and using a proof assistant. Mathematicians (most of them) run away as fast as they can the moment they hear the phrase 'proof assistant.'


You don't touch type, I assume?

Touch typing is not only about typing faster. It gives you this other ability of "running your control panel" from the home keys alone (four fingers of each hand resting on a,s,d,f and j,k,l,;). When you have to lift one of your hands off the home keys and move it to the mouse to do anything, and I mean anything, it's super annoying and frustrating.

Those who never learned to touch type are not likely to grok the appeal of mouseless work.


"Those who never learned to touch type are not likely to grok the appeal of mouseless work."

I think this is the problem. If it can't be explained, it makes me doubt there are objective improvements.


Touch typing is not that hard. You can learn, and then try yourself and see if you like mouseless work.


As always, path tracing is not ray tracing.


They’re both overloaded terms, but with today’s common usage, path tracing is built on top of ray tracing, so path tracing in a sense is ray tracing (but not the other way around). So while it’s true that they’re not exactly the same, if that’s what you meant, your sentence as written can be easily misinterpreted.

Is there a reason to make this distinction here? The site & book are doing ray tracing, so the title is accurate. It mentions path tracing once at the very beginning, because the end goal is a path tracer. I think the goal is achieved, so in this case the book is both ray tracing and path tracing.


> Is there a reason to make this distinction here?

The original method, by Whitted, which looked much better than any previous method and yet looks horrible by today's standards, is ray tracing. It had no global illumination.

The method based on solving, explicitly or implicitly, James Kajiya's rendering equation, the one with built-in global illumination, is path tracing.
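
To make the distinction concrete, here is a rough sketch of the two recursion shapes in C++ (not from either paper or the book; the scene/material helpers are made-up stubs, and only the control flow is the point):

    // Minimal stand-in types and stubbed helpers so this compiles;
    // only the shape of the recursion matters here.
    struct Vec3 { double x = 0, y = 0, z = 0; };
    Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 operator*(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }

    struct Ray { Vec3 origin, dir; };
    struct Hit { bool ok = false; bool mirror = false; };

    Hit    intersect(const Ray&)                   { return {}; } // nearest surface hit
    Vec3   direct_light(const Hit&)                { return {}; } // shadow ray to a light
    Ray    mirror_ray(const Ray& r, const Hit&)    { return r; }  // perfect reflection
    Ray    random_bounce(const Ray& r, const Hit&) { return r; }  // e.g. cosine-weighted sample
    double albedo(const Hit&)                      { return 0.5; }

    // Whitted-style ray tracing (1980): recurse only along the perfect
    // mirror/refraction direction, plus direct lighting. Diffuse
    // interreflection (global illumination) is simply not modeled.
    Vec3 whitted_trace(const Ray& r, int depth) {
        Hit h = intersect(r);
        if (!h.ok || depth == 0) return {};
        Vec3 c = direct_light(h);
        if (h.mirror) c = c + whitted_trace(mirror_ray(r, h), depth - 1);
        return c;
    }

    // Path tracing (Kajiya, 1986): estimate the rendering equation by
    // following a random bounce at every hit, so indirect (global)
    // illumination falls out of the recursion itself.
    Vec3 path_trace(const Ray& r, int depth) {
        Hit h = intersect(r);
        if (!h.ok || depth == 0) return {};
        Vec3 c = direct_light(h);
        c = c + path_trace(random_bounce(r, h), depth - 1) * albedo(h);
        return c;
    }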

The distinction is not important to buzzword-hijacking hacks, marketing gimmickers, and snake-oil salesmen.


I could broadly agree with your first two sentences, even though it's not actually true that ray tracing didn't have global illumination before the Rendering Equation came along. Recursive ray tracing, diffuse inter-reflections, distribution ray tracing: all these things existed before the term path tracing came along. Path tracing is more of a unifying formalism than a distinction between global illumination and direct lighting.

I'm completely stumped by your last sentence, though; I have no idea where your anger is coming from. But it fails to explain why you want to make the distinction in this thread, since, as I already pointed out, "Ray Tracing in One Weekend" is doing both (for example, chapters 1-7 are the mechanics of ray tracing; no path tracing concepts are used until chapter 8). Are you referring indirectly to specific products whose marketing you don't like? Your story can't explain people who use the term "ray tracing", since they're being honest about not necessarily doing path tracing. So are you talking about someone who uses the term "path tracing" when they're not doing global illumination?


What snake oil is being sold here with a free book? Who exactly is a hack? Sure, there are some technical differences between the two terms, but outside of the research community they are used interchangeably.


Well. Looks like the hacks, gimmickers, and salesmen have succeeded wonderfully.


You missed one comparison: C infra (no C++).

(Well, strictly speaking, you could additionally have hand-written assembly infra.)


You can write C-style code in C++ and still get access to features that are genuinely useful in the problem domain: constexpr functions, typesafe enums, static assertions, and destructors come to mind. No need to even touch a template or inheritance if you don't want to.

The main benefits of C per se are ABI stability and availability of compilers.
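
As an illustration of that (nothing here is from any particular codebase; the names are made up), a C-style C++ file can use all four of those features and nothing else from the language:

    #include <cstdio>
    #include <cstdlib>

    // Type-safe enum: no implicit conversion to int, no name collisions.
    enum class Mode { Read, Write };

    // constexpr function: evaluated at compile time in constant expressions.
    constexpr int buffer_size(int pages) { return pages * 4096; }

    // Static assertion: checked by the compiler, zero runtime cost.
    static_assert(buffer_size(4) == 16384, "unexpected buffer size");

    // Destructor-based cleanup (RAII), no templates or inheritance involved.
    struct File {
        std::FILE* handle = nullptr;
        File(const char* path, Mode m)
            : handle(std::fopen(path, m == Mode::Read ? "rb" : "wb")) {}
        ~File() { if (handle) std::fclose(handle); }
    };

    int main() {
        File f("data.bin", Mode::Read);  // closed automatically at scope exit
        return f.handle ? EXIT_SUCCESS : EXIT_FAILURE;
    }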


Machine learning is nowhere close to replacing the human coding activity.

Nonetheless, machine learning is creeping up into newer and newer areas of application in surprising, unpredictable, and unprecedented ways, and ignoring it would put you at peril as a tech worker. This is what I mean when I use the term 'software 2.0'.


Can't you spin up your own software in a Docker instance on the cloud? What does official support mean here?

(I'm new to the cloud.)


This is talking about the managed database service.

There is nothing stopping you from renting a VM and running whatever you want.


You can. This is for Google's fully managed SQL service, "fully managed" meaning that you don't have to directly manage Docker instances or VMs yourself.


But you do have to specify CPU and memory, and pay for those while the database is running, even if you're not calling it.

What I'd like is to pay for what I use, when calls are being made: an SQL interface provided as a service, rather than a whole personal copy of a database actually running.



Yup, except that's got a cold start time of 25 seconds. So no use for a low usage system. (One, for instance, that gets used a couple of times a day, but really needs to be available when it is.)



Yes, the way to avoid a cold start is to keep it constantly running. Which rather undoes the point of only being charged when you're using it.


Well, you can scale it 64x on demand...seems like a good tradeoff.


Oh yes - if you are operating at the middle to high end it's a great system.

But if you want "read a tiny table once an hour, carry out a simple action if the circumstances are right", then paying for a database to be taking up a core 24/7 is not going to work for you.

So I wish it scaled down as well as up.


True... something like Amazon Lightsail might be better for that (starts at $3.50/month if you roll your own, and $15/month for a managed DB). Or you can do some sort of hybrid model...


Just out of curiosity, what would an acceptable cold start time be? I feel like there’s a SQLite + function-as-a-service opportunity here.


Near zero. If my data is only a few kB then an already running process should be able to read it from disk and read/write the data almost instantly. They should be working on a version of MySQL that can do this without spawning a whole new running instance on a whole new server.
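
For what it's worth, the "near zero" intuition checks out at the code level: a process that is already warm only has to open a few-kB file and run one statement. A rough sketch using the sqlite3 C API (the file name, table, and handler are made up for illustration):

    #include <sqlite3.h>
    #include <cstdio>

    // Hypothetical warm request handler: the process is already running,
    // so the per-request cost is opening a tiny file and one query.
    int handle_request() {
        sqlite3* db = nullptr;
        if (sqlite3_open("tiny.db", &db) != SQLITE_OK) return -1;

        sqlite3_stmt* stmt = nullptr;
        int count = -1;
        if (sqlite3_prepare_v2(db, "SELECT COUNT(*) FROM events;", -1,
                               &stmt, nullptr) == SQLITE_OK &&
            sqlite3_step(stmt) == SQLITE_ROW) {
            count = sqlite3_column_int(stmt, 0);
        }
        sqlite3_finalize(stmt);
        sqlite3_close(db);

        std::printf("rows: %d\n", count);
        return count;
    }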


Maybe that's more of a BigQuery use case then where you pay for usage?


Unfortunately it’s not a relational database, so no constraints or any of the other features.


MySpace was massive when Facebook was a year or two old. I had accounts on both and MySpace looked way more attractive compared to FB (it had the feel of Instagram).

So no, FB did not take over overnight. And even a few years into FB, no one (yes, no one) knew that FB would go past MySpace and that there would simply be no comparison between the two 10 or 15 years down the road.

When FB took over MySpace a full FOUR years in [0], investors knew FB had momentum, but no one (yes, no one) had any idea why.

Not only that, many investors had no second thoughts about MySpace, and it felt like a competitive space rather than winner-take-all. There was still an opinion that MySpace would retake the top spot "any minute now."

So why did FB take over MySpace? We don't know. Yes, we don't know in 2020, 16 years later.

(I'm not dismissing hard work, marketing, execution, commitment. But those were there for both FB and MySpace; they're a prerequisite. You think the MySpace folks slacked off and that's why FB got ahead? Think again.)

Stop Monday morning quarterbacking.

[0] https://www.google.com/search?q=when+did+facebook+overtake+m...


I wish I could downvote you, but as a direct reply I'll have to settle for replying instead. You seem to have completely misunderstood my argument. I'm not "Monday morning quarterbacking", and I in no way think that Facebook's success was guaranteed.

The point I was making is that the notion that social media was seen early on as a "toy", or that it would never be profitable, or that people pooh-poohed the idea of Facebook in its early days, is not accurate at all. Indeed, the example of MySpace shows that it was already well known that there would be a scramble for dominance in the social media realm.

> So why did FB take over MySpace? We don't know. Yes, we don't know in 2020, 16 years later.

Uhh, I think we have a pretty good idea. First and foremost, Facebook was always focused on your real, offline identity, which was rather new at the time. MySpace and the early social networks were primarily based on different online personas (e.g. online usernames, no real-name policy, etc.). MySpace tried to get people to add their real names and identities later, but it was basically too late by then. This isn't hypothetical, either; one of the founders of MySpace told me as much in a conversation about a decade ago.


I just want to point out one thing, for the record.

I'm sorry "Stop Monday Morning Quarterbacking (MMQ)" isn't directed at you per se, but the general direction of whoever is reading my comment.

MMQ is, unfortunately, practiced widely in many prediction/retrospection circles like startups, finance, economics, politics.

I'm speaking from the point of view of scientific rigor, or at least quantitative data analysis. No one does that when it comes to opining about the cause and effect of an event in the past. If you're "the winner", anything you say about why you won is taken as gospel: "the winner is always right." Rigorous analysis is very hard, and costly, and to what end? Just so you could say "my reasoning is based on analysis"? That's a very boring thing to say. Unless rigorous analysis finds utility in applications like decision-making for future startups, and is shown to work over and over again (maybe we'll need AI for that), no one is going to bother with it.


You must have had a very different experience if you think "MySpace looked way more attractive compared to FB"... MySpace pages were ugly as hell, with tiled backgrounds, autoplay music, and mismatched themes. Facebook was CLEAN, and everyone's page looked uniform and consistently styled. This lack of customization, I think, is what made it so popular. That, and the feed; MySpace never had the concept of taking posts from all your friends and putting them into a list that you could scroll through. You had to visit each page to see what was on it, and that meant seeing the horrible styling and hearing the autoplay music.


I attribute the success more to creating a tight and more complete personal network thanks to the campus focus early on. I do think the features of MySpace were a novelty that was also wearing off. And FB provided a cleaner, bug-free interface.


That helped but FB was exclusive to universities at first. So everyone wanted in.


Hm, I think it's pretty well known why Facebook won. Far deeper networks, far better engagement. Look up the work the Facebook Growth team did.


> This is Guido van Rossum. If I were him and asked to solve puzzles, I'd tell the hiring company to fuck off.

Not sure what you're trying to get at:

The macOS Homebrew creator is an effin' nobody compared to Guido, therefore he should "know his place", "get in line", invert a binary tree on the whiteboard, and act like the obedient tech interview candidate that he really is?

OR

The macOS Homebrew creator should've told Google to fuck off?


Imagine a university asking a Physics Nobel Laureate to solve QM problems from an undergrad textbook in order to get hired as a professor. It would be the height of lunacy and incredibly insulting.


Guido is not a Physics Nobel Laureate. Going with the physics analogy, Python is more like an overgrown master's-level project, not Nobel prize level by far. He did a good job of growing the Python community, and this is a great achievement! It requires certain personal traits not everyone has. But at the technical level, he made many beginner's mistakes when designing Python, which he tried to fix later, but not always successfully.


An overgrown master's-level project, eh? You could probably say the same thing about the founding of the United States!

"The US constitution is like an overgrown enlightenment dissertation. The founding fathers did a good job at growing the United States, and this is a great achievement! It requires certain personal traits not everyone has. But at a technical level, they made many beginner's mistakes when drafting the constitution, which the country tried to fix later, but not always successfully."

:-P


It may come as a surprise to you, but for an outside observer who hasn't been indoctrinated at school by the religion of American exceptionalism, the US might not be a very good example. Think of the American military-industrial complex that apparently defines the country's foreign policy.


My analogy was meant to cut both ways, and serves as much as a criticism of the US constitution as it does a compliment!

You'd certainly be wrong to assume that Americans are all in favor of our foreign policy, to say the least. Aside from that, some of the downsides to the US constitution that inspired my comment include the way the founding fathers totally failed to anticipate that the nation would be completely polarized by a two party system, and that this polarization would happen along geographic lines.

Admittedly, this compromise, whereby rural states with lower populations enjoy disproportionate political representation, is baked into how the constitution got agreed to in the first place (the 3/5ths compromise being relevant here as well). As we've moved to more direct democracy, with things like the electoral college being bound to the (local) popular vote and the direct election of senators, the original intention of having a federation of mostly autonomous states becomes more and more anachronistic, while still fueling an increasingly polarized electorate that pits high-tax-revenue, high-population centers like SF and NY against the low-tax-revenue, low-population centers that make up most of the country.

The United States is a large geographic region with a heterogeneous economy. I don't know if a parliamentary form of government would have served this kind of country well, but certainly most friends from abroad who have spoken to me on the subject have implied that proportional representation is far more sensible than FPTP voting, and that parliamentary forms of government avoid the gridlock and polarization that our de facto two-party system engenders.


Maybe.


> Python is more like an overgrown master's-level project, not Nobel prize level by far.

Please find me another "Masters Project" that has 8 Million+ Users and drives the backend for thousands of tech companies worldwide. Good Luck.


oh friend, say no more, how about left-pad? https://www.npmjs.com/package/left-pad


Funny guy.

If you think a single function is equivalent to an entire language, you should get your head checked.


The number of users is not a metric that makes something Nobel prize-worthy. The exaggeration of the comparison with left-pad was to make this point clearer to you; if you cannot see it yourself, that's sad. Also, going with "funny guy" and "get your head checked" is not a good argument, by the way.


I love Homebrew – but it really isn't that much of an achievement compared to damn Python, and I also have no idea who its creator is.


Homebrew is 11 years old. I'm willing to bet just as many (likely fewer) people knew who Guido was in 2002, when Python was 11 years old, or even in 2005, when Google hired him.

And I'm willing to bet when Google hired Guido in 2005, they didn't put him through a coding challenge humiliation clown show day.


> The macOS Homebrew creator should've told Google to fuck off?

Yes, that's exactly what he should have done.


I agree with what you say, while also pointing out that the Unix home directory has become a complete mess. Anyone (any installed software) can do whatever they like, there is no mechanism of enforcement, and advice in the form of constructive critique or comment is not even a drop in the bucket towards fixing the problem.


I find that I want to use a project _more_ if they follow best practices like using ~/.config/{app_name}. Attention to details like that usually indicate a higher quality piece of software overall.
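
For reference, the convention being praised here is the XDG base-directory one: configuration goes in $XDG_CONFIG_HOME/{app_name}, falling back to ~/.config/{app_name}. A minimal C++ sketch of resolving that path (the last-resort fallback at the end is my own choice, not part of the spec):

    #include <cstdlib>
    #include <string>

    // Resolve an app's config directory per the XDG base-directory convention:
    // $XDG_CONFIG_HOME/<app> if set, otherwise $HOME/.config/<app>.
    std::string config_dir(const std::string& app) {
        if (const char* xdg = std::getenv("XDG_CONFIG_HOME"); xdg && *xdg)
            return std::string(xdg) + "/" + app;
        if (const char* home = std::getenv("HOME"); home && *home)
            return std::string(home) + "/.config/" + app;
        return "./." + app;  // last-resort fallback for odd environments
    }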


> YC application needs one to clearly articulate who ones target customers are, what they do today, why is it such a pain, and what one is building to solve that pain-point ...

Which in my opinion is the hardest part. If you don't know what you're doing, it doesn't matter if you're selling or not, it doesn't matter if you're building or not. That comes later.

But that's not it. How do you know what you're doing, or what you're going to do? It doesn't happen by sitting down with buddies or co-founders and having a brainstorming session. This is where "you have to be at the right place at the right time" comes in, but also "you have to have spent the right amount of time thinking about it."

The right amount of time could be 1 week, or it could be 10 years. Yes, sometimes you need 10 years to get to the point where, when you found a company, you start generating sales within a matter of months.


What you intend to do can also change over time. We got started as a Shopify plugin, but wanted to go big later, and this is when things went south.

