Hacker News | HamSession's comments

You would get tit for tat, and the US would tax any offshore development products.


A lot of it is caused by the forwarding services that mask the numbers. Really, each should be held liable for any damages.


The calls are one aspect; how they get money out is another. The gift card business needs some revamping. It should not be so easy to transfer billions every year with no accountability.


I think for that to be meaningful, there would need to be some mechanism to prevent fly-by-night operations formed to isolate the liability.

Maybe requiring them to post a large bond (proportional to the liability) and/or have insurance?


Yeah. Robocaller fines are almost never paid. https://arstechnica.com/tech-policy/2019/03/fcc-fined-roboca...


Could someone explain why the reply from @calimac was killed?

I'm guessing it has something to do with the snowshoe-spamming reference, but I'm totally in the dark.


The user is probably blacklisted due to bad past behavior. You can vouch for individual comments, though.


Banned by sctb on Oct 16, 2016 for political ranting.


Wouldn't being banned prevent them from posting?

Or does it just insta-kill all their posts?


It just marks all of their posts dead. If you have 'showdead' turned on in your profile, you can see them and choose to vouch for individual posts if they add value to the conversation.


That’s a good thought. I wonder if there is an aggressive US attorney who could tie the forwarders to the scam with some kind of RICO charges.

(My entire understanding of law comes from Law & Order, so take what I said with a grain of salt.)


I've talked to some neuroscience friends and was interested to learn that lucid dreaming hasn't been studied much, if at all, under lab conditions. I welcome such research and hope more people are inspired to study it further.


Not at all true, there's been a ton of study on lucid dreams. Check out a summary of the research by Jennifer Windt: https://mitpress.mit.edu/books/dreaming


I use Zotero with its Firefox plugin and Word plugin. I used to use Mendeley, but I like the open-source nature of Zotero.


There are a lot of factors that can go into heart disease. Some of these factors are:

1. Under/over working out; especially for white men, overworking out can lead to increased shear stress on the arteries.

2. Genetic familial hypercholesterolemia, aka high cholesterol. This is caught by a blood test and fixed with statins.

3. High blood pressure; again, exams and medication.

4. Size of LDL particles; this one is new.

5. Genetic markers, especially 9p21.3; sadly, about a third of people have this marker. How it works we don't entirely know, but the best guess I've come up with is that it has to do with interleukin-6 signaling, which signals inflammation. Solutions to this might be taking curcumin, lowering weight, or a Mediterranean diet.

6. Low HDL, again mostly genetic; niacin may help some, as may exercise.

From my research into the matter, it seems that the initial damage is done via HBP, shear stress, or cell repair failure. Then inflammatory markers attach to the site, causing white blood cells and LDL to infiltrate between the inner and outer walls. Eventually the plaque grows and the inner wall continues to thin until one of two scenarios occurs: 1) a hard cap forms (calcification) and the plaque grows until it restricts enough blood to cause a heart attack, or 2) the much more dangerous scenario of the inner wall thinning and breaking, causing a clot and an instant heart attack. The bad part about 2) is that it's undetectable except by angiography or a nuclear stress test (a treadmill test can catch it if the heart rate goes high enough and the doctor is skilled at interpretation). Some hope for 2) is that statins seem to encourage calcification of those thin walls, thus preventing the heart attacks.

This is just what causes the heart attack; the actual fix (stenting and balloon angioplasty) has its own issues. The only surefire fix after a heart attack is a bypass, but if more than 2 plaques are found you need to harvest veins, which don't last as long. What the entrepreneur world needs to focus on, in my opinion, is easier at-home detection of coronary plaques (sound, radar, electrical, magnetic, etc.). If the government got involved, nanorobotics and biocompatible hearts could be explored, along with angiogenesis via external stimulation, but each of those would be a moonshot program.


There is a faulty assumption that higher cholesterol leads to higher risk of heart disease, but in fact the opposite is true.

http://www.zoeharcombe.com/2016/11/familial-hypercholesterol...

A few of the other facts/assumptions you make are outdated. Statins do very little to improve mortality. Inflammation can be reduced by cutting out carbs and sugar. Eating more saturated fat and omega-3 also reduces your risk of heart disease.

In fact, the whole body of dietary advice given over the last 60 years is responsible for the epidemic we now have of obesity, diabetes, and heart attacks. They can't come out and say they were wrong, but they are slowly changing it a bit at a time.


I wouldn't put much faith in the non-LDL hypothesis; we have extensive studies showing that lower cholesterol (total and LDL) causes fewer myocardial infarctions.

[1] https://www.nejm.org/doi/pdf/10.1056/NEJMoa1405386 People with a gene that naturally produces lower LDL levels have fewer hard CHD events.

[2] https://www.thelancet.com/journals/lancet/article/PIIS0140-6... Statins prevent even more heart disease and strokes than originally thought.

[3] https://www.thelancet.com/journals/lancet/article/PIIS0140-6... Even in people with low risk factors, cutting LDL with statins leads to a lower number of events.


[1] This study makes the assumption that lower ldl is less risk and then looks for mutations in genes that lower ldl. It's putting the cart before the horse.

> In each study, we estimated the odds ratio for disease among carriers of any NPC1L1 inactivating mutation, as compared with noncarriers. We then calculated the summary odds ratios and 95% confidence intervals for coronary heart disease among carriers, using a Mantel–Haenszel fixed-effects meta-analysis without continuity correction

[2] This starts off by saying that if we gave statins to 10,000 people who are at risk, it would help only 1,000 and 500 people. In addition, 100 people might have adverse effects. If you ask me, an expensive drug that barely helps and only postpones death by about 5 days on average isn't worth it, and it comes with a chance of adverse effects. If we just eat foods high in saturated fat and raise our HDL, that is far better.

[3] is a meta-analysis of randomized trials. It says 11 out of 1,000 patients had a reduction in major vascular events. So that is saying that out of 90 or so people who take statins, only 1 of them will see any benefit.
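That "1 in 90" claim is just the number needed to treat (NNT) implied by the 11-in-1,000 figure quoted above; a quick sanity check of the arithmetic:

```python
# Number needed to treat (NNT): how many patients must take the drug
# for one of them to avoid a major vascular event.
events_avoided = 11   # per 1,000 patients, the figure quoted from [3]
patients = 1000

nnt = patients / events_avoided
print(round(nnt))  # ~91, i.e. roughly 1 in 90 patients sees the benefit
```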

Basically, statins are poison that gives hardly any benefit and treats an assumption that lower LDL leads to fewer heart attacks.

Try this one https://bmjopen.bmj.com/content/5/9/e007118.full

> The median postponement of death for primary and secondary prevention trials were 3.2 and 4.1 days, respectively.

The median life extension from taking statins is about 4 days!


It would be cool if you open-sourced it; I would happily add my NLP expertise, as I have the same issues.


It's still coarse, but once I'm done it'd be awesome!


It is good to note that Landauer's principle is a statistical argument about the degrees of freedom in a system. The bound comes from a statistical averaging of this effect, meaning you will sometimes be able to store a little more information.

I think your calculations for the amount of energy required to store a human brain are off. Assume 8.6 x 10^10 neurons (a more accurate number, as it's a simple density measurement), each holding a 64-bit weight, which gives us 5.5 x 10^12 bits. Multiplying by the lower bound at 1 K, i.e. k ln 2 or 9.56 x 10^-24 J per bit, gives us 5.3 x 10^-11 J, which is lower than the energy required to type a letter on this keyboard. Even if we raise this to body temperature, 37 °C or 310.15 K, we get only about 1.6 x 10^-8 J. This is of course a lower bound on the problem, but even if you pump up the number of neurons or the weight size, you still get less than a joule of energy required.
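The estimate above can be checked numerically. This is a quick sketch of the Landauer bound, using the same toy assumption as the comment (one 64-bit weight per neuron); `landauer_bound` is just a name for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(bits, temperature_kelvin):
    """Minimum energy in joules to (re)write `bits` bits at a given temperature."""
    return bits * K_B * temperature_kelvin * math.log(2)

neurons = 8.6e10
bits = neurons * 64  # toy assumption: one 64-bit weight per neuron

print(landauer_bound(bits, 1.0))     # ~5.3e-11 J at 1 K
print(landauer_bound(bits, 310.15))  # ~1.6e-8 J at body temperature
```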


>each holding 64 bits

There is no way in hell you're going to store the entire physical state of a neuron in anything near 64 bits. The question is not "information stored in the brain". The question is "information required to recreate the brain". In fact, a single neuron has many more than 64 synapses, each of which itself has a highly intricate structure. All of the information describing this structure must be retained intact in order for the imaged brain to be equivalent to the substrate.


This is a rolling lawsuit waiting to happen. Those who invested must really be hungry for something unique.


This is very cool stuff. I was just wondering: what do you see as the biggest technical problem with this technology? I understand that the hardest part currently is transferring the modified genome into all cells; is this still correct?

If I'm interested in this technology, how do you recommend learning the required techniques? As a machine learning engineer I know math more than biology, but I want to contribute to the open-source movement. Where do you see software/machine learning engineers making the biggest impact on the open-source biology movement?


The biggest technical problem in general for synthetic biology is predictive design. Most of the technology we are building is to enable massive numbers of designs to be tried in parallel b/c the complexity of the cell rebuffs predictive CAD models. This will change as we get more data which is what foundries like ours generate - lots of data.


I'd love to hear a reply to this.


I agree with your suggestion of human assessment because of the problem posed by food calories varying significantly with different preparations.

Take soda, for example: if I gave you a picture of soda in a glass, could you tell whether it was diet or regular? You might scoff at such an edge case, but it quickly becomes more common when you look into food preparation techniques. This is why caloric estimation is really difficult and causes restaurants to not list their calories, as the calories of a meal do not equal the sum of its parts.

All these solutions are common, as people want a signal to tell them to stop eating, but they are insufficient because people will simply ignore the signal due to hunger cravings (as happens on diets). Any nutritionist service, in addition to detailing calories, would need to incentivize the patient to recognize the need to lose weight or set up a helpline.


> causes restaurants to not list their calories as the calories of a meal do not equal the sum of its parts.

In my area, restaurants list the calories for each menu item, exactly to the extent required by law (chain restaurants are required to publish calorie counts), plus some promotional "under 500 calories" or what-have-you for diet-targeting places.


How is it that "the calories of a meal do not equal the sum of its parts"?

That seems to violate the laws of thermodynamics.


He may be talking about cooking. This could involve chemical reactions, or facilitate or slow the absorption of nutrients, thereby changing the effective caloric intake from various ingredients.

The thermodynamics should add up once you account for heating, cooling, evaporation, enzymes, waste, etc.


Cooking changes the caloric content of food. Not all of every ingredient makes it all the way to the dinner plate.

Biology is not thermodynamics. Humans are not perfect combustion engines.

