I'm a serious advocate of open access, but it should be noted that money otherwise earmarked for research is now being poured into the publishing industry in the form of author fees. The last article I published cost nearly $2000.
I'm a PhD student in the US, and the author's view of PhD students, at least at R1 institutions, is not accurate for most disciplines. In most of these programs, a student is guaranteed a salary for 5 years, given health insurance and tuition benefits, and in many cases quite a lot of freedom.
The guarantee of funding doesn't mean that you won't have to work for your income. In my field, it is common for theorists to teach for much of their graduate career in order to support themselves.
Critical phenomena, like phase transitions, are essentially defined by fractal-like properties. Scale invariance in systems near a phase transition inspired the technique now called the renormalization group. It remains a central tool in statistical physics, but isn't really in vogue at the moment. Kenneth Wilson won a Nobel Prize in 1982 for his work on RG.
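The flavor of the idea can be seen in the textbook decimation RG for the 1D Ising model (a toy calculation, not Wilson's full momentum-shell construction; the starting coupling below is an arbitrary illustrative value). Summing out every other spin maps the dimensionless coupling K = J/kT to a new coupling K' with tanh(K') = tanh(K)^2, and iterating the map shows the coupling flowing to zero, i.e. no finite-temperature transition in 1D:

```python
import math

def rg_step(K):
    # Decimation of alternate spins in the 1D Ising chain:
    # tanh(K') = tanh(K)**2
    return math.atanh(math.tanh(K) ** 2)

K = 1.0          # illustrative starting coupling J/kT
flow = [K]
for _ in range(20):
    K = rg_step(K)
    flow.append(K)

# The only fixed points are K = 0 (disordered) and K = infinity;
# any finite K flows to 0, so 1D has no finite-T phase transition.
print(flow[-1])
```

In two or more dimensions the analogous recursion develops a nontrivial fixed point, which is where the machinery becomes genuinely powerful.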
I agree, but very little of this work seems exciting. While advances like hardware support for transactional memory and, hopefully, some unified memory support for CPU/GPU operations will have a tremendous impact on the ease with which programmers compute in parallel, I tend to hear fewer new ideas about what can be done with parallel computing.
I don't disagree that there are fewer ideas, but that's because we need new ideas, including new architectures. Having spent quality time on a number of novel supercomputer architectures that seem to have been forgotten in the rush to turn GPUs into short-vector, SIMD-ish successors to the Cray X-MP/Y-MP, I don't think we've looked at all the viable alternatives, at least not in a long time. For example, I'm clear on why memory-to-memory vector machines like the Cyber 205 faded at the time, but it's not clear the same limitations fundamentally still exist (especially with L2/L3 caches now bigger than the 205's main store). I think that's interesting...YMMV.
This is actually a heavily criticized animation within the computational biophysics community. Our intuition for physical determinism breaks down at the nanoscale. One major fault of this video is that, at low Reynolds number, there's essentially no inertia. So the representation of kinesin taking deterministic, 100% processive steps is an absurd idealization. The motion is dominated by thermal fluctuations (which, by the way, are occurring on a nanosecond timescale).
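The "no inertia" claim is easy to make quantitative with a back-of-the-envelope Reynolds number for a kinesin-scale object in water (all numbers below are rough order-of-magnitude estimates, not measured values):

```python
# Re = rho * v * L / mu for a ~10 nm object moving at ~1 um/s in water.
rho = 1.0e3    # water density, kg/m^3
mu = 1.0e-3    # water viscosity, Pa*s
L = 1.0e-8     # ~10 nm characteristic length
v = 1.0e-6     # ~1 um/s transport speed

Re = rho * v * L / mu
print(Re)  # ~1e-8: viscous forces dwarf inertia by eight orders of magnitude
```

At Re ~ 10^-8, motion stops essentially instantly when the force stops, which is why the smooth, coasting "power stroke" picture in the animation is misleading.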
At the same time, even as a biophysicist, I think these animations are superb learning tools. Those beautifully rendered animations describe the mechanism in far greater detail than, say, textbook diagrams. One should just be aware that fluctuations dominate at these scales.
Indeed, in computational chemistry there seem to be two classes of theorists. The first group attempts to simulate a realistic system using high-performance computing, typically achieving agreement with existing experiments. The second group tries to think through problems and hopes to provide explanations that don't need the brute-force approach. The former approach often lacks both creativity and quantitative accuracy. The latter might be impossible for some chaotic or complex systems. I have to agree that "theorists" who are simply trying to compute some quantity to match existing data must do so very carefully.
Actually, the "algorithms" are usually fairly straightforward and well-documented integrators for Newtonian dynamics, usually some form of Verlet integration. The most popular and efficient molecular dynamics packages are open source, some GPL (e.g., NAMD, Gromacs). The technology is nowhere near full in silico drug design, but we're converging on useful tools to measure important quantities like free energies of binding. Pharmaceutical companies have some interest in this technology but, at this point, don't direct too much effort into atomistic simulations.
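To show how simple the core integrator is, here's a minimal velocity Verlet sketch for a 1D harmonic oscillator (illustrative only; production codes like NAMD and Gromacs add force fields, thermostats, constraints, and much else, and all parameters here are arbitrary):

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Integrate Newton's equations with the velocity Verlet scheme."""
    a = force(x) / mass
    traj = [(x, v)]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass              # force at the new position
        v = v + 0.5 * (a + a_new) * dt       # velocity update (averaged accel.)
        a = a_new
        traj.append((x, v))
    return traj

k, m = 1.0, 1.0                              # toy harmonic force F = -k*x
traj = velocity_verlet(1.0, 0.0, lambda x: -k * x, m, dt=0.01, steps=1000)

def energy(x, v):
    return 0.5 * m * v * v + 0.5 * k * x * x

e0, e1 = energy(*traj[0]), energy(*traj[-1])
print(abs(e1 - e0))  # energy drift stays tiny
```

The near-perfect energy conservation over long runs is the reason Verlet-family (symplectic) integrators dominate MD, rather than any sophistication in the update rule itself.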
Ah, thanks. My only experience with this was going to a lecture back in school when they brought in someone from Pfizer's in-house modeling team, but the presentation was a bit light on references/citations...
This probably isn't any different from consuming anything that contains animal products from a country with flexible regulation on farming conditions. The description doesn't strike me as being any different from the conditions described for chickens in the U.S.
Markets that suddenly emerge often have strange consequences. The demand for quinoa and acai berries, for example, has had serious economic and environmental effects in South America.
There's not too much that's unique about this case, other than the product. Poor conditions for animals and exploitation appear to be the norm.
Thanks for the summary. Per capita data would probably help put this in perspective. It's also worth noting that the U.S. likely has a much higher Facebook usage rate than, for example, many Asian countries.