
I still don't get it. So you make this graph, with a random number on each edge. Then you duplicate that graph, stack the copies like a bunkbed, and connect every lower vertex to the one above it. Then you delete all edges with a number below some threshold. Wouldn't the upper and lower graphs still be identical, since you had the same edge weights in both? So the route from any upper node X to any upper node Y should be 1 shorter than the route from X to the point beneath Y.
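
Concretely, here's a rough sketch of the construction I'm picturing (Python with networkx; the graph and the numbers are made up, and the deletion step is exactly the part I'm unsure about):

    import random
    import networkx as nx

    # An arbitrary graph G with a random number on each edge.
    G = nx.Graph()
    G.add_edges_from([("a", "b"), ("b", "c"), ("c", "d"), ("a", "d")])
    for u, v in G.edges:
        G.edges[u, v]["p"] = random.random()

    # Duplicate it, stack the copies, and join each lower vertex
    # to the one above it.
    B = nx.Graph()
    for u, v in G.edges:
        p = G.edges[u, v]["p"]
        B.add_edge((u, 0), (v, 0), p=p)  # lower bunk
        B.add_edge((u, 1), (v, 1), p=p)  # upper bunk, same number as its twin
    for u in G.nodes:
        B.add_edge((u, 0), (u, 1))       # vertical "post" edges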

What did I miss?

There is no thresholding; you delete each edge at random based on the probability assigned to it.
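
In code terms, a minimal sketch (plain Python; I'm treating each edge's number p as its probability of being kept, which is an assumption about the convention):

    import random

    def random_subgraph(edges_with_p):
        # edges_with_p: iterable of (u, v, p) triples.
        # Each edge gets its own independent coin flip; there is no
        # shared threshold, so twin edges can meet different fates.
        return [(u, v) for (u, v, p) in edges_with_p if random.random() < p]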

Thanks, that's an important distinction.

You missed the part where the conjecture is about subgraphs: after forming the "bunk bed", we start randomly deleting edges, each according to its assigned probability, so there may no longer be a direct path from X to Y in what was originally the upper graph.

That seems irrelevant, since then there would be no path in the lower graph either. That must be so because deletions are based on probability, and corresponding edges in the upper and lower graphs are required to have the same probability.

Having the same deletion probability doesn't mean the two copies will necessarily share the same fate.

Ah. The Wikipedia description of the conjecture does not say that each edge is independently deleted (or kept) at random based on the probability assigned to it. I just assumed the probabilities were the weights in a weighted graph and had some other effect (e.g., the likelihood of successfully traversing that edge, or the comparative likelihood of choosing that edge vs. another when traversing the graph).

Same probability, but still independently sampled.

Each edge is independent of every other edge: you don't roll one die and then say "4: remove every edge whose number is 4 or less"; you make a separate roll for each edge.
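
A toy illustration of that (plain Python): two copies of an edge with the same retention probability, sampled independently, frequently disagree.

    import random

    p = 0.5                      # same probability for an edge and its twin
    trials = 10_000
    differ = sum((random.random() < p) != (random.random() < p)
                 for _ in range(trials))
    print(f"twin edges disagreed in {differ}/{trials} trials")  # ~50% when p = 0.5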


