The company threatened to sue me and the university threatened me as well. Neither has followed through. The company wants to keep selling their rubbish magnetic health gadget, and I assume the university wants nobody looking into how positive results for the product came out of their institution. All in all, an education in how the real world works.
I do think that modern technology can revamp antiquated systems of peer review and appeals to authority with more robust and deterministic / objective measures of quality - an evolving consensus repository of the sum of human knowledge where each pull request is a debate, each experiment a commit to a branch that may become a new foundation or a footnote. Yet we still see fit to define knowledge as property, ownership as power, power as success. Fear of the impression of weakness, or the threat of starvation and poverty seems to override all other priorities and for the most part, we toe the line.
It is what it is I guess.
Write an article and send it to the papers?
This kind of thing needs to be shamed and punished. While I agree that the institutions can be expected to cover their asses, there must be a course of action.
Should we have that expectation? If a person does it we call it concealing evidence, intimidation, and perverting the course of justice. On the other hand, if they confess right away, we treat them leniently. Why should institutions differ?
Large systems are more than the sum of their parts, but not nearly as much as those parts want you to believe.
If not, what's the difference to any other organisation, such as, in an absolute extreme form, an army or a "terror" group, apart from the agenda?
Really curious, not trying to construct a strawman!
Yes, but the point being made is that in the case of organizations the whole doesn't necessarily have the properties of its constituent parts.
> If not, what's the difference to any other organisation, such as, in an absolute extreme form, an army or a "terror" group, apart from the agenda?
Not sure what you mean by this. Difference from what to any other organization, and why is that relevant?
You find immoral, illegal, etc. activity in institutions/organizations across a broad spectrum of areas.
This is because the "constituent parts" - members of the organization - are often just terrible human beings that hide behind the concept of an institution to shield themselves from the cognitive dissonance caused by their own unethical actions.
If an institution does something unethical it can always be traced to its employees.
As an example, all the researchers involved in promoting this fake health product ought to be fired and academically blackballed. It really is that simple. That's how we can deal with institutional rot.
Wait for one really good case and let the papers instead discuss how a century old institution collapsed.
Not that there's any way in our current culture to assemble a truly impartial reputation tracker, but surely we can do better than Yelp.
Um what? It's basic ethical behavior. What has the world come to that someone can defend the unethical, unprincipled behavior of an educational institution because "preventing legal and financial liabilities is the #1 priority for institutions ...", with the implication that unethical behavior should therefore be expected?
Admitting your mistakes is a strength, not a weakness. We should not expect others to self-censor their shortcomings, because if we do, we already live in a fake world. I mean, it happens, but whatever happened to honesty, the greater good, long-term goals? Your take favors short-sighted, short-term benefits that will eventually come back to bite them.
Jumping through hoops to construct some alternative theory and make it sound plausible (which often comes across as a bad joke, and insulting to the victim as well) should be honored less than admitting to a crime, showing remorse, and demonstrating that you learned from your mistake.
Admitting your deed, showing remorse, and demonstrating that you learned from your mistake is important for the victim (and/or their peers), as well as for society in general. We as a society should reward such behavior, but indeed, more often than not our legal systems fail here. However, at some point the evidence is so overwhelming that the theater described above is only harmful. How can we lower the incentive for such theater?
A) Why not?
B) If not, then you can and should expect the people within the institution to implicate it: everything is a decision, even the decision not to make a decision. For them not to make the decision to become a whistleblower is for them to make the decision not to become a whistleblower -- i.e., to condone research fraud. If you really, really want to insist that institutions can't be blackballed, then we'll just have to start blackballing the people within them. (If the world starts to do this diligently and systematically, I think you'll rather quickly see a lot of people denouncing this kind of shit at their institutions, probably just as swiftly followed by a developing consensus that maybe you should expect institutions to police themselves after all.)
The most entertainingly written such blog to my taste is Leonid Schneider's:
Most university administrators can see that avoiding coverage like this is a good thing:
Why does _this_ not pose a legal liability?
Institutions should have nothing to fear if they've followed due process, and shouldn't be in a position of being implicated in dodgy science to begin with!
I would take a pharmaceutical/drug-related test (administered through the state), and it could be comprehensive, if it meant I could renew my long-term prescriptions.
The pharmacist wants to see a blood panel before refilling, fine, I should be able to order one.
(Only for long-term medications (blood pressure, diabetes, and most psychiatric drugs, within reason), and never antibiotics.)
Right now American law states you need to see a doctor once a year in order to procure medications.
We all know how many times patients are brought in just to get that script. They (MDs) love the six-week interval. It pays for the lifestyle.
I'm lucky if my blood pressure is checked.
And the colonoscopy is something that might be mentioned while you're walking out the door.
My point is, if you have a conscientious doctor and good insurance, by all means take advantage of it.
Many Americans don't have good insurance, but need medication.
We shouldn't have to come in for pricey office visits (basically a stare-down or, if he's peppy, some dubious advice) just to get a medication we have been on for years.
Now, that's in extreme cases; 2-6 weeks is the average. But levothyroxine!
A basic medication for lifelong illness with no addictive or stimulant properties.
But hey, it's free-ish. If you ignore the "potentially being without it for a few weeks" part, which can be debilitating.
Particularly for my low-income patients who end up coming into the ER for a refill (or a complication of having run out of their meds), I’ll often write a 3 month supply and 3 refills, so giving them one year of access to their meds without needing a doctor. I still strongly encourage them to get a primary doctor and see that doctor in order to get help with their medical conditions, but the idea that we should effectively coerce patients into seeing a doctor by withholding life-saving medications (“it’s for their own good!!!”) seems grossly unethical to me, and doesn’t take into account the problems poorer people have in getting good affordable healthcare. But I’m pretty sure my position is an outlier among emergency medicine docs.
I’m hazy on the details; I just developed an app for a bank that had to do some validation on clients. But I think it is an EU rule (or perhaps Hungary-only, if someone were to know).
> never antibiotics
Despite the common wisdom, the issue there wasn't (mis)use in humans but rather certain agricultural practices, which I understand (in the US at least) to have largely been curbed at this point. Antibiotics that were still clinically relevant in humans were being added to animal feed en masse because "reasons" (i.e., cost-cutting of various sorts). Evolutionarily, that went about as well as you'd expect.
What I'd look into instead is how it's legally classified. My heuristic is: if it works in any meaningful way, it's classified as a drug or medical equipment. Might be OTC, but it's clear about its status. Stuff that's not classified like that can at best correct some nutritional deficiencies, but typically does nothing at recommended doses (though it can sometimes still hurt you if you severely overdose).
There are substances gated behind a prescription that I wish people would have easier access to - but I understand the need for regulatory control. If people selling all these fraudulent cures can dupe so many regular folks, imagine what would be happening if they were allowed to put medically relevant quantities of an active compound in them.
Aspirin and fluoride toothpaste are each pretty darn effective.
In summary, if it works, it can also kill you. That is why most medicines that work are also controlled.
I think at least the university should reprimand these people. How did you get in contact with the people threatening to sue? As I'm currently a grad student (in an unrelated field), I guess I could have some fun pushing this as academic discourse.
The University of Salzburg also has an "Ethikkommission" https://www.plus.ac.at/service/uni-administration/gesamtuniv... and a "Kommission zur Sicherung guter wissenschaftlicher Praxis" https://www.plus.ac.at/service/uni-administration/gesamtuniv...
The company only says there is a "positive trend" which there indeed is.
"the lactate measurement shows a difference"
"The lower lactate value with Powerinsole means longer performance and a later onset of muscle fatigue."
"Even with the first application of the power insole, a lower skin conductance is evident compared to the placebo. This means that the power insole can help reduce stress levels."
Nowhere are they talking about statistical significance.
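The gap between a "positive trend" and statistical significance is easy to show with a toy calculation. The numbers below are invented for illustration (the study's actual data aren't shown here); a minimal sketch of Welch's two-sample t-test in plain Python:

```python
import math

# Hypothetical lactate readings (mmol/L) for 6 athletes with the
# insole vs. 6 with a placebo -- values invented for illustration.
insole  = [3.1, 3.4, 2.9, 3.3, 3.0, 3.2]
placebo = [3.2, 3.5, 3.1, 3.4, 3.0, 3.3]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):  # sample variance
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Welch's t statistic for two independent samples
se = math.sqrt(var(insole) / len(insole) + var(placebo) / len(placebo))
t = (mean(insole) - mean(placebo)) / se

print(f"difference of means: {mean(insole) - mean(placebo):.3f}")
print(f"t statistic: {t:.2f}")
```

With |t| well under the critical value (roughly 2.23 at ten degrees of freedom for p < 0.05), a gap like this is exactly the kind of "trend" that marketing can tout while the data show nothing.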
Austria has a weird thing with pseudoscience scams, and it needs to be dealt with.
 The current Minister of Economy was working as an "energizer" https://www.diepresse.com/5395317/wirtschaftsministerin-schr...
 During the construction of a hospital in Vienna, an "energetic" ring was built around the construction site for 95000 Euro. https://www.derstandard.at/story/2000076199184/krankenhaus-w...
and were installed into hospitals in Austria at great cost.
Unsurprisingly, the owner of Powerinsole, a Mr. Martin Masching, is also involved in this enterprise via
My poor dog would have probably been blind or worse if I had taken the advice. But they don’t care, they just want to move products, whatever the cost.
If I can suggest something to look into (i.e. not recommend!):
I use NeilMed sinus rinse which is just saline and bicarbonate (so pH balanced). Use distilled water to mix it. You can buy it at most drugstores.
This is not sterile unless you sterilize everything, but at least there are no funky chemicals.
After I had checked five pharmacies, with nobody having heard of it, the vet took pity on me and gave me a handful out of their supply closet.
This company also makes
These products, including Powerinsole, are marketed by a local TV station and program https://www.puls4.com/2-minuten-2-millionen/staffel-8 which pretends to be a startup investment show but is really just an infomercial for woo products.
I think it's a good thing, since it's a pretty good guarantee that you actually get what you buy.
You can get ibuprofen and paracetamol at any pharmacy in Austria for a few euros without a prescription. And in every district there's always a pharmacy that is open 24h.
Fake or bad goods are not an issue in developed countries without these extreme rules.
But that's not my point. My point is that such extreme rules stand in stark contrast to the sale of snake oil.
Because in Hungary I no longer see them around.
I'm pretty sure that's common to many of the elites. You don't get to be elite by being honest. Schools one tier down (e.g. large state schools) tend to be a lot better, but they're declining too.
Whether or not that lawsuit has any merit is irrelevant.
With certain exceptions based on subject matter (e.g. SLAPP) or jurisdiction, it still costs money to hire a lawyer to defend against the suit.
That’s not entirely true. In some jurisdictions persistent baseless legal action, or even empty threats of legal action can themselves be considered a cause for action.
Otherwise I'd expect people to leak trade secrets as a series of questions, leak top secret info as a series of questions, etc.
I kinda hope it's the underlying substance, not how you phrase it, that determines this sort of thing :)
That said, if you know the legal situation, demonstrating that you're not afraid of the legal threats is often the way to make them go away.
For security research, if the goal of the legal threat is to prevent you from publishing your results for the first time... publishing the results can significantly reduce the incentive to cause you further trouble because the goal is no longer achievable.
> In 2018, Yaniv filed discrimination complaints with the British Columbia Human Rights Tribunal against multiple waxing salons alleging that they refused to provide genital waxing to her because she is transgender. Yaniv's case was the first major case of alleged transgender discrimination in retail in Canada. Yaniv was seeking as much as $15,000 in damages from each beautician. In their defence, estheticians said they lacked training on waxing male genitalia and they were not comfortable doing so for personal or religious reasons. They further argued that being transgender was not the issue for them, rather having male genitalia was. Yaniv rejected the claim that special training in waxing male genitalia was necessary, and during the hearings equated the denial of the service to neo-Nazism. Respondents were typically working from home, were non-white, and were immigrants who did not speak English. Two of the businesses were forced to shut down due to the complaints.
Just one example..
But besides that, it doesn't answer the question you were responding to. Asking someone's birth sex does not tell you what genitalia they have. It doesn't even really tell you what genitalia they had at birth. Since those salon owners say they specifically objected to the woman's current genitalia rather than the woman's status as transgender, birth sex is the wrong question to ask.
> In October 2019, the Tribunal ruled against Yaniv and ordered her to pay $6,000 in restitution... The ruling was critical of Yaniv... stating that she "targeted small businesses, manufactured the conditions for a human rights complaint, and then leveraged that complaint to pursue a financial settlement from parties who were unsophisticated and unlikely to mount a proper defence."
This is a person who deliberately does all sorts of weird shit for the sole purpose of using her trans identity to sue people when they react negatively to it.
For example, she apparently called her local fire department "dozens of times" for "help getting out of the bath" (when in reality she needed no assistance) and "subjected Fire Department staff to "inappropriate and lewd conduct"".
To be clear, the beauticians explicitly didn't care (per their claims) what the birth sex of the customer was, so I'm not sure how that would be a relevant question here? Per their claims, they'd have had the same issue with a trans-man who'd had bottom surgery.
Then you’d probably want to ask “What kind of genitalia do you currently possess”. Sex assigned at birth is not a reliable indicator of that, for reasons very similar to why current gender identity isn't.
You might want to rethink that phrasing, because it’s a very reliable indicator. That’s the entire problem. <1% of the population have genitalia that doesn’t match their sex assigned at birth.
Asking someone “what genitalia they currently possess” is more offensive because it sounds like you’re talking about an accessory that people swap out on a whim. “Which will you be carrying with you today? The penis or the vagina?”
> You might want to rethink that phrasing, because it’s a very reliable indicator
In the context where current gender identity is an insufficiently reliable indicator, which is the context of the supposed need to ask the question, no, it is not.
If you need to probe beyond current gender identity because your personal sensitivities about genitalia can tolerate no error, then you need also to bypass indirect proxies entirely and ask the question you are actually concerned about regarding genitalia.
This is not strictly related to what you write and more about research in general, but most researchers seem to avoid submitting negative results. Disproving something can be just as important as proving something, but it is seen as a failure in most cases and can negatively impact your future funding.
This leads some researchers to just hide their "failures" and some go as far as doctoring the results.
You combine these three factors (and likely a few more) and the entire thing is a rotten stinking mess that exists on a binary state between religious deniers and religious zealots.
What's a researcher to do? Tell the truth? Ha! Only if you want your career completely destroyed as well as never seeing even a hint of a grant. Going against these forces is a sure path to having more PhD's driving taxis.
I have to say, I have become very cynical about what we call "science" these days. It seems you have to be very guarded about accepting anything you are told, because the forces at play could be beyond anyone's imagination in scale, breadth and reach. The problem is that the general voting public is ill-equipped to take an intellectual stab at what they are being told, which means they are easily duped and herded like cattle in any direction that might benefit the puppet masters in politics.
There's no doubt groupthink happens in academia on many issues, but the need to displace fossil fuels really is very important. Not just for climate change reasons, but overall human health. For instance, air pollution from fossil fuels kills tens of thousands of people every year.
There's no honesty anywhere - or rather, the honest people get squeezed out of the field. Which is why the issue ends up so polarised.
> There's no doubt groupthink happens in academia on many issues, but the need to displace fossil fuels really is very important. Not just for climate change reasons, but overall human health. For instance, air pollution from fossil fuels kills tens of thousands of people every year.
Changing the subject like this is a huge red flag that you're using motivated reasoning, as you'd probably have noticed yourself if this wasn't such a political issue. It's a very small step from "it's important to displace fossil fuels even if not for the reasons I originally said" to "I'll overstate the effects of climate change to ensure that we abandon fossil fuels as quickly as possible to save thousands of lives", and once you do that all hope of finding the truth is lost.
Well that's an unsupported conjecture that I see no reason to accept.
> Changing the subject like this is a huge red flag that you're using motivated reasoning
Where is the motivated reasoning in acknowledging that climate change is not the only reason to replace fossil fuels?
I base it on having friends who were in the field; of course that won't be particularly convincing to you. (I'm pretty sure others in these comments had examples of people being pushed out for reaching the "wrong" conclusions, but I have to be honest that I'm trusting the people I know personally rather than anything else).
> Where is the motivated reasoning in acknowledging that climate change is not the only reason to replace fossil fuels?
Suggesting that climate change is likely to be true because you have non-climate-change reasons to want to replace fossil fuels is motivated reasoning. The fact that you brought up non-climate-change problems with fossil fuels in a thread about whether climate change is occurring suggests that you're doing it.
Except I didn't do that.
> The fact that you brought up non-climate-change problems with fossil fuels in a thread about whether climate change is occurring suggests that you're doing it.
The thread isn't strictly about whether climate change is real, it was about climate change alarmism, about whether climate change was reducible to a single variable, and whether we should be motivated by the available evidence to make drastic changes to potentially avoid the predicted outcomes. The additional point I made is perfectly in line with that.
No one funds research into problems that aren't that big a deal. There has to be a crisis NOW for climate scientists to get more funding and social status. The news media knows there has to be a crisis for people to tune in; no one would watch programming on a problem that won't occur in their lifetimes. And politicians love to take the side of causes that give them the moral high ground. They also get to be sanctimonious and pretend they are "on the side of science" when they have no more grasp than anyone else.
To wit, if politicians were actually interested in reducing CO2 emissions they could just tax it straight up. They could combine that with lowering taxes on other things so as not to hurt the economy. The market would then implement green energy solutions of all kinds on its own.
But that wouldn't give the politicians any leverage for donations from green tech for future campaigns. So they pick and choose winners and losers based on donations and the revolving door. Fossil fuel isn't that unhappy because they get to set up a Nat Gas plant with new wind and solar installs to fill the gap when it's cloudy and calm.
That's false. Mundane, basic research gets funding all the time.
> the need to displace fossil fuels really is very important.
Why? I don't necessarily disagree. But reality isn't a problem managed through a single variable. The things you list are not singularly caused by fossil fuels.
In fact, a very solid argument could be put forth about just how much uglier things might be without fossil fuels.
Here's the basic math someone would have to do before making the assertion that the elimination of fossil fuels --as a single causally-connected variable-- would make things better:
The simplest (well, not so simple) calculation is that, while we might eliminate fossil fuels we do not eliminate the need for the energy they provide. In other words, in rough terms, you still have to explain how we would generate, harness, create, transport and distribute a certain amount of energy per unit time (hour, day, week, month, year, whatever).
In fact, I think we can, in historical terms, state that energy requirements increase over time, they do not decrease.
The next element of the story is how we are going to replace the massive number of byproducts of fossil fuels that modern life pretty much depends on. We know that making complex hydrocarbons any other way is in a range between highly inefficient (which would increase the aforementioned energy requirement) and impossible.
My point --in stressing that reality is a rather complex multivariate problem-- is that, while it would be nice to think of a desirable reality without fossil fuels, in the real reality (just go with it) this is much more of an aspirational thing than an attainable objective.
The same is the case with electric vehicles. I have yet to see someone do the math on the total daily energy requirement of the installed fossil-fuel based vehicle fleet and explain how on earth (literally) we are going to generate that much energy without causing even more problems. Our current electrical grid is designed for current energy requirements (and power requirement, which is equally important). The current system, in any country I know of, doesn't magically have an extra 100% in power/energy generation capacity to support every vehicle going electric.
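The back-of-envelope math for the vehicle fleet is worth actually doing. A rough sketch using approximate US-scale figures (every input below is a round approximation, not authoritative data):

```python
# Rough US-scale estimate of the extra electricity a fully electric
# light-duty fleet would demand. All inputs are approximations.
vehicles       = 280e6    # registered light-duty vehicles (approx.)
miles_per_year = 12_000   # average annual mileage per vehicle (approx.)
kwh_per_mile   = 0.30     # typical EV consumption incl. charging losses
grid_twh       = 4_000    # annual US electricity generation, TWh (approx.)

# kWh -> TWh conversion: divide by 1e9
extra_twh = vehicles * miles_per_year * kwh_per_mile / 1e9

print(f"added demand: ~{extra_twh:.0f} TWh/yr "
      f"(~{100 * extra_twh / grid_twh:.0f}% of current generation)")
```

By this rough estimate full electrification adds on the order of a quarter of current US annual generation. That is substantial, and it is before accounting for peak power demand and distribution capacity, which are arguably the harder parts of the problem.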
Reality: A multivariate problem. You push here and it pulls there. Not so simple.
> For instance, air pollution from fossil fuels kills tens of thousands of people every year.
Fair enough. Containerships, as a simple example, burn bunker fuel, one of the nastiest things you can burn. They are singularly responsible for more pollution along certain vectors than the entirety of the ground transportation industry. And yet we do nothing about it.
I can only guess. Part of it has got to be a case of "well, what we have works". The other issue --which I think is very real-- is that bunker fuel is, quite literally, the bottom of the barrel. It is what is left after you extract everything else from petroleum.
So, next Monday we stop using bunker fuel everywhere in the world. No problem. Right?
You see, all the other oil byproducts are still needed. Which means that the bottom of the barrel...the bunker fuel...would still be produced in absolutely massive quantities. Except now we are not using it, because we want to clean-up the planet.
Wait a minute. What do we do with it?
Well, we likely have to bury the stuff, dump it somewhere, make huge mountain-sized piles out of it. We would now use massive amounts of fuel (yes, everything is "massive") to run the machines that have to haul and manipulate this stuff. We also have to devote massive (sorry) resources, land and ecosystems to burying what we are not using. Where it goes from there I cannot even guess.
Once again, reality isn't a single variable problem. Bunker fuel == bad? Yes, no, maybe, hard to say. Because the alternative could be worse, far worse.
This is precisely what I don't see treated fairly these days. Imagine a politician taking the time and making the effort to fully analyze and understand the bunker fuel ecosystem and also taking the time to present this analysis to the voting public. Good luck. It is far easier to say "bunker fuel == bad", get votes, stay in office and move on. It's easy to show how horrible the stuff is (and it is!). It is impossible to show how much worse things could be if we don't fully understand what reality looks without it.
I'll overstay my welcome and give another example from real life.
A number of years ago a well-intentioned yet mathematically-challenged "science" teacher at my kid's school showed the kids a gut-wrenching video animation that pretty much says humans are a pile of shit destroying the planet. The thing is as close as you can get to an ignorant, politically-motivated pile of lies.
She was receptive to having a conversation. I asked if we could go through a simple exercise where we would try to understand what our small town would look like if we did not use the products of evil industrialized society. Petroleum is a favorite, of course.
I won't bore you with the details. Before we were done we had destroyed every forest in sight, had piles of human excrement the size of mountains, all the fields where you could grow something in the region were dead, sources of water were polluted (by human waste and the other byproducts of inefficiently sourcing everything), and more. At the extreme we were using horses to get around, etc. A town of a few tens of thousands of people relying on horses has a serious manure problem. We would burn trees for heat and cooking, etc.
As we extrapolated this from a town of tens of thousands to cities with millions and regions with tens to hundreds of millions of people, it became very obvious that modern life (or more accurately, modern population levels) would quickly become unsustainable if we demanded that humanity abandon how we got here and embrace everything "natural" and "sustainable". She was certainly surprised to understand the scale of the problem.
Once you start thinking at scale --planetary scale-- "natural" and "sustainable" quickly end-up with razed forests, depleted marine life, polluted water sources and a sky blackened with thick pollution.
Not to end on a depressing note. Yes, we are doing better, have been so for decades. We just have to be careful that we don't reduce reality to single variable problems, because that isn't reality, it's a fantasy, and a dangerous one at that.
Climate change is one of those. It is hard to find truth that is being discussed with honesty in the mainstream.
It's not: How do we maintain our quality of life?
It's: What quality of life can we achieve?
The basis of your argument needs to be a reality in which life continues, which limits all other considerations.
Where is the proof of that?
Not trying to be obtuse at all. I am also not suggesting that things are changing towards a future with potent climate events. I am not challenging any of that.
You see, the dark future everyone is selling is one where we all die. Everything dies. Mass extinction of all kinds of living things. Another two degrees and we are done for.
In the face of this we are supposed to have the technological prowess TO ACTUALLY TAKE CONTROL OF PLANETARY-SCALE PROBLEMS and magically bend the curve to where WE want it to be, not where THE PLANET wants it to be. (Upper case for emphasis; I'm not yelling at you.)
Do you have any idea of the scale of this thing? Planetary. Yes. What does it mean?
In other words, we purport to have the power to change the entire ecological balance of the planet (hence "planetary scale"), and, at the same time, we can't deal with the purported effects of global warming?
I applied "purported" because, once again, climate change is being treated as a single variable problem where NOTHING ELSE changes. In other words, "CO2 bad -> CO2 ppm rising -> Existential threat".
The planet has been dealing with this kind of stuff since long before humanity was a thing. It adjusts to atmospheric CO2 through weather. Specifically, storms, hurricanes, cyclones, rain, etc. Water. And growing vegetation. Yes, at a planetary scale. We have data on this, reliable data, going back at least 800K years.
Is CO2 bad?
Well, yeah, taken as a single variable, sure. Yet, that isn't the entire story, is it?
Have you heard of indoor farming? This is where food is growing in controlled indoor environments rather than outdoors.
Do you know what they do in indoor farms to promote plant growth?
They inject CO2.
Yup. They actually have CO2 tanks delivered to the farm and CO2 is metered by a computerized system in order to raise the level and promote plant growth as well as other characteristics.
When you leave the "CO2 bad -> CO2 ppm rising -> Existential threat" myopic view of the universe being pushed and start to consider that reality is a complex multivariate problem, a more complete picture starts to surface.
Have you ever walked around your home, family and neighbor's homes and your neighborhood with a CO2 meter?
Levels in my home and my neighbors are in the range of 500 to 600 ppm. No, we don't live right next to a highway. Outside, about the same range. Some of the office environments I frequent, about the same.
In the car? It can reach 1100 ppm. No, that isn't with me breathing directly into the meter. If the ventilation system is set to forcefully ingest outside air it comes down to about 700 in neighborhood streets and spikes back up to 800 to 1000 on the highway (which makes sense).
My point is that this "CO2 bad -> CO2 ppm rising -> Existential threat" scenario is one that, very likely, billions of people have been living in for decades, maybe more. Care to guess what indoor environments looked like 100 to 200 years ago? I have no clue, but I cannot imagine them being better than what we have today.
And cars? How much time do billions of people spend in their cars at 700 to 1100 ppm CO2 every day? Hours.
Has the sky fallen?
WHY ARE WE NOT QUESTIONING WHAT WE ARE BEING TOLD THEN?
C'mon folks. This isn't about denying our influence in increasing atmospheric CO2. However, this is, very much so, about gaining a sense of proportion and putting what we are being told to scrutiny.
Yes, we absolutely managed to increase atmospheric CO2 through the burning of highly dense hydrocarbon fuels. No question about that. Is the inescapable conclusion "CO2 bad -> CO2 ppm rising -> Existential threat"? I don't know. Somehow I don't think so.
For example, increased levels of atmospheric CO2 might promote more efficient growing of food in indoor farms. Controlled environment farming is more efficient than outdoor farming, uses less water, delivers higher quality food and reduces damage to the land. More importantly, controlled environment farming can bring food production to places that could not consider it before, like the desert.
How about all the storms, rain, etc. that are a part of the planet reacting to CO2 levels? Well, this will, among other things, promote vegetation growth everywhere.
So, is "CO2 bad -> CO2 ppm rising -> Existential threat" real? I, for one, after devoting a non-trivial amount of time to truly looking at this as a complex multivariate problem rather than that silly statement being pushed around, do not believe so. I think this is a silly and damaging reduction to an absurd conclusion.
Will we have to adapt to potential changes? This is likely. However, we are already living in a 700 to 1100 ppm environment (homes, offices, inside cars) and we haven't all turned into a pile of goo on the ground.
Somehow I don't think the threat is existential as much as it is evolutionary. What I mean by this is that we likely have to evolve how we live, where we live, how we grow food and, yes, of course, how clean we are about our affairs. I am all for reducing CO2 emissions and being clean, just not because of a potentially flawed conclusion but rather due to the fact that, yes, humanity should pollute as little as humanly possible. This is a good goal. Yet we should not be hysterical about it. The sky isn't falling.
They do, and all energy needs can be met with solar, wind and grid energy storage. Or nuclear if you don't want to invest in energy storage for whatever reason.
> The next element of the story is how we are going to replace the massive number of byproducts of fossil fuels that modern life pretty much depends on. We know that making complex hydrocarbons any other way is in a range between highly inefficient (which would increase the aforementioned energy requirement) and impossible.
Burning fossil fuels is the biggest immediate problem. Other fossil fuel products may or may not be a problem. But you don't ignore the heart attack because you just noticed a rash that may be flesh eating bacteria. Triage is key.
> You see, all the other oil byproducts are still needed.
"All" is overselling. Some are arguably useful, but for example, most product packaging is likely superfluous and a product of our current economic incentives. For instance, why do we have disposable containers for each unit of cleaning product we buy rather than reusing containers that you get refilled at the store? These choices are driven by market incentives that prioritize convenience over sustainability.
Some products may never get rid of their plastic packaging, perhaps something like sterilized vacuum packed needles that hospitals use. Those would be the exceptions but not the rule.
> We just have to be careful that we don't reduce reality to single variable problems, because that isn't reality, it's a fantasy, and a dangerous one at that. Climate change is one of those.
Climate change isn't a single variable problem, and I don't think anyone serious is pushing it as such. If you look into the IPCC report on climate change, you'll see all sorts of factors being accounted for including cloud cover, contrails, methane, water vapour, CO2 and more.
We only have so much influence over some of these factors, but the biggest and most obvious factor for which we have alternatives, is CO2 emissions. Do you deny that?
> Once you start thinking at scale --planetary scale-- "natural" and "sustainable" quickly end-up with razed forests, depleted marine life, polluted water sources and a sky blackened with thick pollution.
You and I clearly have different understanding of what "sustainable" means.
Not true. Not even close. Particularly if you move away from optimal solar regions.
In addition to that, manufacturing the grid energy storage capacity required to service planetary scale requirements will result in unspeakable consumption of natural resources (mining), pollution and environmental challenges. Not to speak of the amount of energy required to produce, ship and install such storage systems.
> Or nuclear
Nuclear is the ONLY viable solution. It makes excellent use of the existing infrastructure and we pretty much know how to do it right.
> if you don't want to invest in energy storage for whatever reason.
You are confusing things here. It isn't about me, or someone like me, not wanting to "invest in energy storage for whatever reason". Such loaded words too, "not wanting to", implying negative intent, and "invest", implying a benefit that clearly might not be there. In other words, fabricating a conclusion while, at the same time, attempting to diminish the other person's standing.
I must repeat myself here. Too much of what we discuss these days boils down to the magical waving of a single variable that will solve all of our problems (or cause all the harm). "Energy storage", this case.
Well, this single variable solution to all of our problems isn't, in fact, a solution to all of our problems. At all. Start digging into what "investing" in this stuff actually means and you might come out of it thinking that coal and natural gas look pretty good in the comparison.
If we are going to invest in anything it should be nuclear plants. Manufacturing batteries with ten to twenty year useful lifespan at a global scale is bound to cause more harm than good. Of course, some prefer to look the other way and think it is "green" because it seems clean at the application level.
Here's an example of how not "green" these things can be. Do you know what you'd have to do to grid-scale battery-based storage in the winter in Nebraska or Alaska in order not to lose capacity like crazy (or, even worse, have the plant shut down)? Heat up the batteries and keep them warm. If you think this is going to happen with solar energy...in the dead of winter...with snow and blizzards. Well.
> Burning fossil fuels is the biggest immediate problem. Other fossil fuel products may or may not be a problem. But you don't ignore the heart attack because you just noticed a rash that may be flesh eating bacteria. Triage is key.
Well, the analogy is nonsensical to begin with. Setting that aside, you don't get the petroleum byproducts without making fuel. This is a highly optimized production system. Every layer of it, almost literally, extracts a different useful substance. So, you can't say let's stop producing diesel and gasoline and keep producing the myriad industrial and commercial products that share that manufacturing pipeline.
I mean, if you want plastic tubing, syringes, equipment and almost everything you find in a modern hospital, you have to start with oil and derive everything else. A modern hospital would be reduced to medieval times without the outputs of this process. Manufacturing a modern home, car, food, clothing would also be impossible.
It is critical to understand this fact before continuing to push this fantastical idea that we can magically stop using petroleum. We cannot. It isn't that simple.
Here's a review of how fuels and lubricants are made:
If you don't understand this, you are missing a key element of the perspective required to have this discussion. Facts do matter.
From the article:
"By pulling the condensing liquid from the column at different heights, you can essentially separate the crude oil based on molecular size. The smallest of the hydrocarbons (5 to 10 carbon atoms) will rise to the very top of the column. They will be processed into products like gasoline.
Condensing just before reaching the top, the compounds containing 11 to 13 carbon atoms will be processed into kerosene and jet fuel. Larger still at 14 to 25 carbon atoms in the molecular chain, diesel and gas oils are pulled out.
Those compounds with 26 to 40 carbon atoms are a tribologist’s main concern. This is the material used for the creation of lubricating oil. At the bottom of the column, the heaviest and largest of the hydrocarbons (40-plus carbon atoms) are taken and used in asphaltic-based products."
Translation: You don't make lubricants and a bunch of other critically-needed oil byproducts without making gasoline and the lighter hydrocarbons first. Or, put a different way: You can stop using the lighter hydrocarbons but you still need the other stuff, which means that you are going to fill entire lakes with gasoline-type hydrocarbons that you are going to choose not to use, which makes no sense at all.
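The carbon-number ranges from the quoted article can be captured in a small lookup. The ranges are taken straight from the quote; the helper function is just an illustration of how the fractions partition the barrel:

```python
# Distillation fractions by carbon-atom count, per the quoted article.
# Each range condenses at a different height in the column.
FRACTIONS = [
    ((5, 10), "gasoline"),
    ((11, 13), "kerosene / jet fuel"),
    ((14, 25), "diesel / gas oils"),
    ((26, 40), "lubricating oil"),
    ((41, None), "asphaltic products"),  # 40-plus carbon atoms
]

def fraction_for(carbon_atoms):
    """Return the product a hydrocarbon of this chain length ends up in."""
    for (lo, hi), product in FRACTIONS:
        if carbon_atoms >= lo and (hi is None or carbon_atoms <= hi):
            return product
    return "refinery gas (too light for the column)"

print(fraction_for(8))   # gasoline
print(fraction_for(30))  # lubricating oil
```

The point the lookup makes visible: the lubricant fraction (26-40 carbons) sits in the same column run as the gasoline fraction (5-10 carbons); one barrel of crude yields both whether you want both or not.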
> "All" is overselling. Some are arguably useful, but for example, most product packaging is likely superfluous and a product of our current economic incentives.
I am sorry, it is clear you don't know much about the industrial production of, well, anything. Plastic bags are the last thing anyone with this knowledge would even remotely mention. Factories would come to a grinding halt (in some cases literally) without oil byproducts. To continue with my medical example, none of the equipment, drugs and supplies a hospital needs to save lives can be made without myriad oil byproducts.
> Climate change isn't a single variable problem, and I don't think anyone serious is pushing it as such. If you look into the IPCC report on climate change, you'll see all sorts of factors being accounted for including cloud cover, contrails, methane, water vapour, CO2 and more.
That is not what I am talking about. I am talking about things like us "saving the planet" by reducing CO2 emissions. Single variable. Complete bullshit.
And when I say that, it isn't an opinion. This is supported by 800K years of atmospheric data. If you have an honest interest in truly understanding what's going on I'll point you in the direction of the data. This assumes you have a modicum of scientific training; high school science and math is enough. No need to have a PhD at all. This is actually very simple and easy to understand, but you have to be willing to leave the bullshit you've been told behind and honestly consider the data and nothing else. When you do that it is very difficult to find support for what we are being told.
This isn't about what I say. This is about what the data says.
I know of only one paper --reputable paper, by a major organization-- that finally admitted the unavoidable conclusion, that, in effect, we are being sold a pile of bullshit. Again, if you are honestly interested in digging deeper and are able to leave preconceived notions behind, I'd be happy to provide you with a link to this paper.
> You and I clearly have different understanding of what "sustainable" means.
Of course, you are thinking "sustainable" at the utopian local level. I have taken the time to have a look at these problems at a planetary scale. You can make "sustainable" work at a local level in selected areas with very careful design and management. It almost never pays off, but we still do it. The reasons are sometimes good, we should clean up our act, no question about that.
Once you look at this as a "solution" that must address the needs of over seven billion people at a planetary scale it is hard to impossible for "sustainable" to actually be sustainable. In a lot of cases you create more damage than benefits in order to create the illusion of being "green". One example of this is the case of not using fuels, such as gasoline, while still having a need for the massive number of heavier hydrocarbons that come out of the distillation process. Great, no gasoline. Now, where do you put that shit while you only, literally, use the bottom of the barrel? Ah, you destroy the planet. Got it.
BTW, I don't have the answers other than to warn that we really need to stop reducing reality to single variables for each problem. That is simply not the way we are going to solve anything at all.
There exist lubricants, such as PAGs and others, that can be produced from non-fossil sources. Likewise, there exist many plastics and synthetic rubbers that are not petro-based; however, ethylene, which is a petrochemical byproduct, is still required for a lot of plastic uses. This doesn't matter much: it is entirely possible to produce these things without vast GHG emissions.
Lubricants that are not burned do not contribute much to greenhouse gas emissions. It is fully possible to produce industrial petrochemicals such as plastics and lubricants without large GHG emissions from burning. Most lubricants are recycled; a lot of that recycling is done by burning, but with research these could instead become industrial feedstock for the production of new lubricants and plastics.
Diesel and other fuels can also be produced from non-petro sources; this has been done in people's garages. Diesel engines can burn corn, peanut, canola and waste vegetable oils; some of them can run on Dexron III (a petroleum product) and on diesel created in a garage reactor from plant sources. Biofuels are not fossil fuels. Electrification of vehicles removes vast sums of GHG from vehicle emissions, and the production source can be any other non-fossil source.
The thesis of your comment, I gather, is essentially:
>that case of not using fuels, such as gasoline, while still having a need for the massive number of heavier hydrocarbons that come out of the distillation process. Great, no gasoline. Now, where do you put that shit while you only, literally, use the bottom of the barrel? Ah, you destroy the planet. Got it.
The economics of gasoline and diesel as byproducts of petrochemical production make a rather favorable economic rationale for fossil fuel use, but it does not follow that the byproduct must therefore be burned. CO2 reduction will be required to prevent greenhouse heating and climate destabilization. Once energy production has transitioned away from fossil sources, smaller amounts can be used without the full-scale GHG emissions we are currently creating.
Not using gasoline does not mean you "destroy the planet"; you could, for example, return it to the fossil fuel reservoir from which it came. Seems pretty stable.
Sure. However, the problem with this idea is that it is easy to pick a few things here and there that can be done without petroleum. Once you take in the entirety of industry, though, the range and scale of products we derive from petroleum, or products that very much directly depend on petroleum and derivatives, is truly massive. The world would come to a grinding halt if we stopped using petroleum.
This stands to reason. In an effort to extract more and more value out of the stuff we have become very creative and efficient at using as much of that black goo as possible for all kinds of things. We are talking about hundreds of years of research and technological evolution. It is only reasonable that, given that, humanity is as dependent as can be on the oil ecosystem.
This is what I talk about when I say that we need to stop this business of reducing reality to single variables. The consequences of reacting to a single variable while ignoring the dependency tree could result in far worse outcomes than anyone could possibly imagine.
Sorry to quote in this way, it isn't meant to editorialize, only to organize my response:
> [...]the problem with this idea is that it is easy to pick a few things here and there that can be done without petroleum. [...] The world would come to a grinding halt if we stopped using petroleum.
It's super useful and there's no reason to quit using it entirely. We don't even have to; there are many other means of doing what we need to do without releasing ancient carbon stores into the atmosphere.
Everything bound up in the petro supply chain is clearly beyond the scope of my comment here, and of education in general, but the only way to approach reasoning about this issue is with single examples. It's too easy for somebody to Gish gallop about the huge scale of industry involved and quickly lose sight of the fact that CO2 has to be the target. NMOG, methane and nitrogen oxides as well, but mainly CO2.
Nobody will clear the table with a single "we can just X" or "we can't because we'll lose X". It's definitely much more involved, just as you say. One thing I strongly agree with you about is that nuclear energy is likely a critical help to this endeavor.
The largest global sources are, unsurprisingly, electricity/energy, transportation, and manufacturing, but also consider the global hegemony that for the most part turns on controlling and using this strategic resource, ever since WWI. That is all about burning it in large part.
Maybe we need a "Bretton Woods" for carbon, as triggering and possibly unpopular as that may be. I'm certainly not betting on that turn of events, but we may just end up resorting to a different type of nuclear power if we just keep the blinders on ...
That's all I am trying to convey. I am mostly sick and tired of the reduction of reality to a single variable or culprit and then pounding on that ad nauseam as if that is actually how we are going to solve anything. This is how one gets to stupid ideas like seeding the ocean with chemicals to promote CO2 capture. What? I don't care what anyone says, that's far more likely to kill all life on earth than save it.
Our history is full of unintended consequences of sure "solutions", like that island in Australia, where they introduced one species to get rid of another. The end result is that they swapped one plague for another that might actually be worse. We can't even do something like that successfully --because reality isn't a single variable problem-- and we actually dare to suggest we can modify climate at a planetary scale? The hubris in this kind of thinking is thick and dangerous.
The single variable reduction is how we get idiot politicians like AOC pushing absolute nonsense day after day. Because it is simple, they are successful at linking <variable> to "bad" and, after that, position their harebrained idea as the savior. This kind of thing should inspire projectile vomiting, not a following.
> One thing I strongly agree with you about is that nuclear energy is likely a critical help to this endeavor.
Nuclear has been the elephant in the room for decades. OK, I get it, plants built in the 1960s might not have been optimal. We can say that about cars, planes and even ballpoint pens. That's the history of humanity. Well, I think we can build them to be very safe these days. Don't build them where a tsunami can hit them, etc. To paraphrase, there's a list for that (or there should be).
What's interesting about nuclear is that you can simply (OK, not so simply) connect a plant to the grid and your energy and power delivery capacity instantly increases, 24/7/365. Build a 1 GW class plant and you have 1 GW, rain or shine.
Nuclear, from my perspective, is the ONLY way we can support the conversion of the entire ground transportation fleet to electric power.
Here are the results of a simple model I threw together trying to answer a simple question:
How much power do we need to support the entire US fleet of cars going electric?
The simplest assumption is one where 100% of the fleet uses 8 hour long charge cycles:
daily charge energy 50,000 Wh
cars 300,000,000 cars
long charge 8 hours
fast charge 0.5 hours
Portion charging long 100%
Portion charging fast 0%
% of long-chargers charging simultaneously 100%
% of short-chargers charging simultaneously 0%
Total daily energy requirement 15,000 GWh
Cars long-charging simultaneously 300,000,000 cars
Cars short-charging simultaneously 0 cars
Power for simultaneous long charging 1,875 GW
Power for simultaneous short charging 0 GW
Total power requirement 1,875 GW
If every hour we have, say, 1/8 of the entire fleet plug in for eight hours to charge, what's the maximum number of vehicles that will be charging simultaneously at any point in the day? The assumption is that each car will charge for eight hours and be off charge for 16.
Well, eight hours into the day we will, in fact, have 300 million cars charging simultaneously. After a full 24 hours from the start of this approach, the minimum number of cars charging simultaneously will be 187.5 million and the maximum 300 million.
So, yes, at peak utilization we will have 300 million cars, requiring that we deliver 50 kWh in 8 hours, which means a peak requirement of 1,875 GW.
This means we need nearly two thousand giga-watt class nuclear power plants to support a fleet where 100% of the vehicles will slow charge.
What happens when some percentage of the fleet needs to fast charge? I am defining fast charging as delivering 50 kWh in 30 minutes:
daily charge energy 50,000 Wh
cars 300,000,000 cars
long charge 8 hours
fast charge 0.5 hours
Portion charging long 80%
Portion charging fast 20%
% of long-chargers charging simultaneously 100%
% of short-chargers charging simultaneously 20%
Total daily energy requirement 15,000 GWh
Cars long-charging simultaneously 240,000,000 cars
Cars short-charging simultaneously 12,000,000 cars
Power for simultaneous long charging 1,500 GW
Power for simultaneous short charging 1,200 GW
Total power requirement 2,700 GW
TWO THOUSAND SEVEN HUNDRED gigawatt-class nuclear power plants.
Even if I am off by a factor of ten (I threw this together and it is very simplistic), that means nearly 300 nuclear power plants to be built in, say, 30 years. We have to build ten per year and we had to get started yesterday.
This is the kind of thing I look at when I talk about not reducing reality to single variables. The amount of energy we deliver by using petroleum is of a scale that is hard to imagine. To go electric we have to find alternative means to deliver some percentage of that energy (because electric cars are more energy-efficient than IC vehicles) to every car on the road every day. This task is far from simple. Beyond that, the unmitigated mess that US politics has become over the last few decades virtually guarantees we cannot build a single nuclear power plant, much less ten, fifty or a hundred.
Frankly, I have no clue how this could even be possible. I think we are going to have some number of people driving electrics and, in the hubris of it all, we are going to ignore the fact that we are going to have to burn twice or three times more coal to charge those cars every day. It has all the potential to be a larger mess than what we currently have.
I would love for someone to take the time to develop and publish a better model than my mindlessly-simple quick calculation. I know a lot of subtlety could be introduced. That said, I somehow don't think we can escape physics.
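In that spirit, here is the same back-of-the-envelope calculation as a short script, so anyone can rerun it with their own numbers. The inputs mirror the tables above (50 kWh per car per day, worst-case simultaneity); it is the same deliberately simplistic arithmetic, not a serious grid model:

```python
# Back-of-the-envelope EV fleet charging model, reproducing the
# two scenarios in the tables above. Crude worst-case assumptions.

def fleet_power_gw(cars, long_share, fast_share,
                   long_simul, fast_simul,
                   kwh_per_day=50.0, long_hours=8.0, fast_hours=0.5):
    """Peak grid power (GW) under crude simultaneity assumptions."""
    long_cars = cars * long_share * long_simul   # slow-chargers on at once
    fast_cars = cars * fast_share * fast_simul   # fast-chargers on at once
    long_gw = long_cars * (kwh_per_day / long_hours) / 1e6  # kW -> GW
    fast_gw = fast_cars * (kwh_per_day / fast_hours) / 1e6
    return long_gw + fast_gw

# Scenario 1: 100% slow charging, all charging at once.
print(fleet_power_gw(300e6, 1.0, 0.0, 1.0, 0.0))   # -> ~1875 GW

# Scenario 2: 80% slow (all at once), 20% fast (20% at once).
print(fleet_power_gw(300e6, 0.8, 0.2, 1.0, 0.2))   # -> ~2700 GW
```

Tweaking the simultaneity fractions is where a better model would earn its keep; smart staggering of charge windows pulls the peak down considerably, but the total daily energy (15,000 GWh) stays the same.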
On the plus side, moving combustion from vehicles to centralized sources does a lot for efficiency. There is a lot of variation in the efficiency of vehicles that is nearly impossible to control for. Centralized sources can be much more easily managed than aging ICE vehicles all over the place.
That definitely doesn't address your main point that we are unprepared to convert.
Thanks for the work you've done preparing your analysis, I've read it entirely. I can sense your frustration with the general ignorance of more or less everybody wrt what we actually need to do. It's something I also feel. When examining the problem the conclusion I have immediately is that the entire industrialized world and probably the rest of the world, basically all of human society, is very tangled in the business of burning petroleum.
You've mentioned a lot of things that ring very true to me, such as the scale of the problem, the political boondoggle (I don't know of another way to say clusterfuck, but that doesn't really capture either) and the nearly complete lack of functional solutions as well as a tendency for incumbent forces to prevent implementing what solutions we do have available.
There are also a lot of vested interests who frankly make a lot of money doing what we do now. The US has also benefited greatly from the status quo of fossil fuel in a geopolitical sense; it's pretty much been the center of global foreign policy since WWI. "Developing" nations such as China (they are definitely developED) are using it too; they're on the same page of the usage story, and I don't see anything at all different there.
The only thing that is really gonna do the heavy lifting are economic needs, because as much as I hate to admit it, it's the only language that is useful or understandable at all anymore.
Our conversation is really about nuclear power it seems. I recognize that, but I don't have any solutions to that impasse that we are experiencing. The one thing that I see helping that cause is, weirdly enough, alternative energy like solar, geothermal and hydro electric. Don't lose it on me yet please, I'm not trying to change the subject.
If we do end up taking seriously the prospect of implementing as much non-nuclear, non-fossil electrical generation as we can, it will have a positive economic effect on the usage of electrical power vs fossil power. This doesn't melt the enormous capacity iceberg that you have very well pointed out, but providing additional economies of scale for this kind of electric power will allow economic forces to begin to favor it vs fossil fuel.
If electrical storage becomes more necessary, we might be in a position to create a demand for additional electric resources including nuclear power. Additional development of alternative sources will drive more innovation, dollars, research and political interest into the usage and creation of this kind of energy.
When everybody wants to have solar panels which are looking more and more economically desirable, they may also invest in storage technologies that allow them to use it more effectively. This kind of thing can augment baseline electrical demand in a variety of dimensions: Politically, it will be much more desirable to create electrical sources, economically it will be easier to achieve because of greater scaling, and the technology will improve as investment increases with the demand. I suspect nuclear energy will be a better sell in a world where there is more understanding of electrical needs.
I don't know how to give nuclear energy a better PR campaign... people just don't understand why it's desirable, but it's easy to imagine how it could be undesirable. By the same token, people just live their lives with whatever is there, and that's gasoline and whatever makes electricity for them now, or whatever they feel culturally comfortable with. As well, there is a clear fact that oil producing corporations have a lot of power, politically and economically, with which to do their own PR, but nuclear energy does not have giant multinationals pushing for its development and use.
It doesn't look good, that's for sure.
I appreciate the effort you have expended making your point, it has benefited my thought process.
What we need more than anything else are honest conversations about all of this. Sadly the mixing of political forces (which only exist for the benefit of the political class) and industrial/business/financial forces (which, of course, exist in support of their goals) makes this nearly impossible to address, at least on the time scale of one or a couple of human generations. I think this is a multi-generation problem, meaning, somewhere in the one to two century range.
BTW, I designed and built a 13 kW ground-mounted solar array three years ago. By this I mean, I purchased all the components and physically built the structure and wired it all. I have about three years of minute-by-minute data on solar production. No batteries yet, they just don't make sense in terms of ROI, at all. Eventually, maybe.
I'll just say the solar experience has been "interesting". Homes around mine don't have nearly this size system and they likely spent two to three times the money to install them. I have spoken to a few neighbors who are actually sorry they put money into solar because the size of their systems was calculated based on rates at that time. As rates have gone up they find themselves paying to lease their solar systems as well as paying a bundle for electricity.
Going back to honesty in discussing some of the issues of our time. Climate change and the issues regarding atmospheric CO2 concentration often lead to the idea that we have to act immediately to "save the planet" or we are all going to die in twenty years (or whatever nonsense politicians are pushing). This is objectively false and it is amazing to me that the scientific community does not riot against such dishonesty.
Furthermore, the idea that we just can't do anything about atmospheric CO2 accumulation can be verified armed with very basic high school math and critical thinking.
The first thing you do is look at the graphs we have from reliable and accurate atmospheric CO2 concentration data from the past 800,000 years. Here's that graph:
And the data:
You then fit straight lines to the graph in order to determine the rate of change of both atmospheric CO2 accumulation and decline. Here are my lines for the decline portion of the data:
Looking at it in broad strokes, it looks like it took, on average, somewhere around 25,000 years for a 100 ppm increase and, say, 50,000 years for a corresponding 100 ppm decrease. In some cases it took twice that time; I am just trying to generalize.
The planet did this entirely on its own...because we were not around or we were insignificant during this time period.
This is extremely valuable data and an equally valuable conclusion because it establishes an important baseline:
If humanity LEFT THE PLANET tomorrow, it would take about 50,000 years for a reduction of about 100 ppm in atmospheric CO2.
I'll repeat that: If we left the planet and all of our technology was shut down, you are looking at a minimum of 50,000 years for a meaningful "save the planet" change in CO2 concentration.
At this point the question becomes glaringly obvious:
How does anything LESS than leaving the planet even make a dent on CO2 at a human time scale?
This is important. 50K years for 100 ppm is not a human time scale. We could very well be extinct by that time due to a virus or collective stupidity. I am going to define "human time scale" to mean a century or less. In other words, something we can wrap our brains around. That also means making plans and taking action today for something that will not deliver results for, say, 50 to 100 years. Imagine the world making decisions in the 1920s for us to benefit from today. That's pretty much ridiculous on the face of it.
And yet, that isn't the problem, is it?
Because of the baseline revealed by this data we know, without any doubt, that anything less than leaving the planet cannot possibly deliver a faster rate of change, a faster decline, than 100 ppm in 50,000 years.
Solar panels all over the planet? How is that MORE than leaving the planet?
A billion electric vehicles? Same question.
No more fossil fuels? Nope.
In fact, Google Research boldly set out to show the world that a full migration to renewable energy sources could address the issue. To their credit, when they discovered just how wrong they were, they published the data. In this charged environment these researchers deserve a ton of respect. They went in --and say so themselves-- with a position of believing that renewables could save the planet. What they discovered instead was precisely what I understood through the simple exercise on this graph, that this is an impossibility. Their methodology was different from mine, the result was the same.
Here's that paper, it is well worth reading:
From the paper:
"we had shared the attitude of many stalwart environmentalists: We felt that with steady improvements to today’s renewable energy technologies, our society could stave off catastrophic climate change. We now know that to be a false hope"
"Trying to combat climate change exclusively with today’s renewable energy technologies simply won’t work"
"if all power plants and industrial facilities switch over to zero-carbon energy sources right now, we’ll still be left with a ruinous amount of CO2 in the atmosphere. It would take centuries for atmospheric levels to return to normal"
"<snip> to see whether a 55 percent emission cut by 2050 would bring the world back below that 350-ppm threshold. Our calculations revealed otherwise. Even if every renewable energy technology advanced as quickly as imagined and they were all applied globally, atmospheric CO2 levels wouldn’t just remain above 350 ppm; they would continue to rise exponentially due to continued fossil fuel use."
"Suppose for a moment that <snip> we had found cheap renewable energy technologies that
could gradually replace all the world’s coal plants <snip> Even if that dream had come to pass, it still wouldn’t have solved climate change. This realization was frankly shocking"
Well worth reading. Like I said, these guys deserve a ton of respect for effectively saying "we were wrong, and here's why".
Why aren't we talking about this AT ALL? This is reality. Not what we are being told by politicians and zealots. Climate change has become a religion or a cult and science has been left far behind. Here are two ways to come to the same general conclusion. One uses a super-simple look at 800,000 years of atmospheric CO2 data. The other took a detailed look at mathematical climate and other models. The conclusion was the same: We can cover the planet with renewable energy sources and do NOTHING to atmospheric CO2, or worse.
I've been trying to elevate this to some level of consciousness here on HN any time the topic comes up. It is often met with a pile of downvotes and attacks. Because, of course, they "know", even though none of the detractors bothered to devote even 1% of the time I have to trying to understand actual reality in a sea of nonsense.
Frankly, I am not sure what else to do. In this charged political climate it is actually dangerous to stick your neck out too far. I think you understand now that this is not --I am not-- denying climate change, I am simply saying "the emperor has no clothes" to all the nonsense we seem to be told to focus on.
I think we need to learn to live with whatever is coming. We can't do a thing about it. New industries will sprout to help us manage it. The planet will deal (and is dealing) with CO2 as it has for millions of years.
And that's the other set of questions that the graphs and some research can answer:
How did CO2 increase when humanity was not around to muck it up?
Continental scale forest fires burning for 25,000 years as well as other sources of CO2.
How did the planet bring it down?
Rain, storms, cyclones, hurricanes, and the regrowth of vegetation over 50,000+ years.
So, we have to learn to deal with changing weather patterns and perhaps start helping the planet a tiny bit by planting trees. Judiciously though, because more trees could also mean more fuel to burn. In other words, we could, if not careful, actually increase CO2 if we plant a billion trees and create the conditions for the mother of all forest fires.
Like I keep saying, this is not a single variable problem, is it?
Convenient that you just skipped over one of the options that doesn't require "optimal" solar regions.
> In addition to that, manufacturing the grid energy storage capacity required to service planetary scale requirements will result in unspeakable consumption of natural resources (mining), pollution and environmental challenges. Not to speak of the amount of energy required to produce, ship and install such storage systems.
You seem to be making a lot of assumptions that grid storage would be some kind of electrochemical battery. That's unwarranted. There are in fact many storage options available.
> You are confusing things here. It isn't about me, or someone like me, not wanting to "invest in energy storage for whatever reason". Such loaded words too, "not wanting to", implying negative intent and "invest", implying a benefit that clearly might not be there.
No, what I'm doing is telling you that there is a benefit there. Our current grid design is frankly terrible, speaking as an electrical engineer. Grid storage would solve so many problems, make the grid far more robust and simplify the overall design, even if we still burned fossil fuels. You have no idea how much cost, overhead and difficulties there are in actively managing the grid that would just disappear with grid storage. Australia benefitted immensely from the Tesla storage, for example.
> Well, this single variable solution to all of our problems isn't, in fact, a solution to all of our problems. At all. Start digging into what "investing" in this stuff actually means and you might come out of it thinking that coal and natural gas look pretty good in the comparison.
I disagree 100%. I'm aware of the costs and benefits here.
> Setting that aside, you don't get the petroleum byproducts without making fuel.
Even if that were the case, making fuel does not entail burning fuel.
> That is not what I am talking about. I am talking about things like us "saving the planet" by reducing CO2 emissions. Single variable. Complete bullshit.
Not complete, but I agree it's not the full story of the problems we have. The error bars on what problems climate change will cause are wide, but the worst case is truly horrible. All indications are that we're doing worse than we should be, so it's not looking promising.
> Here's an example of how not "green" these things can be. Do you know what you'd have to do to grid scale battery-based storage in the winter in Nebraska or Alaska in order not to lose capacity like crazy (or, even worse, have the plant shut down)? Heat up the batteries and keep them warm. If you think this is going to happen with solar energy...in the dead of winter...with snow and blizzards. Well.
There are plenty of options for using batteries in cold climates that avoid most of the problems you describe, but I'll steelman your position and suppose that any kind of grid storage is completely unusable in Alaska and that it requires fossil fuels.
Are you really saying that because Alaska requires fossil fuels, therefore we should continue to use fossil fuels everywhere else? Because I've already acknowledged that displacing them 100% is probably not an option, but we almost certainly could get to >95%.
Then please, pretty please, with sugar on top: You have the scientific and mathematical training to actually understand that what you are being told is complete bullshit. However, you have to be willing to apply some of that critical thinking, engage in some simple research, do a little math and try to understand. However, you have to be willing to leave this single variable view of the world behind during the process.
Grid storage = good?
Current grid design = terrible?
Well, grid storage today means batteries. It does not mean Star Trek dilithium crystals, hydroelectric plants, compressed underground air, spinning masses or the myriad interesting-but-unrealizable ideas people have floated. In the US, in particular, it would take a hundred years to build just one new hydroelectric plant, much less a couple of hundred of them.
Everything sounds great until you start considering scale. A town of 30K people, in an optimal solar region, full solar with batteries at every home? Sure.
I have a 13 kW system at my home that I built myself. I might add batteries at some point in the future, when and if it makes sense. Would I recommend everyone in my neighborhood do the same? No. It's crazy. This was likely the single worst investment I have ever made.
Scale IS a problem. Most definitely.
Scale means that for everyone to have what I have (or will have, if I ever install batteries) you have to be willing to do things like destroy thousands of miles of marine ecosystem to mine the very materials we need to make something like this happen at scale.
Scale is the problem. You have the background to be able to explore and understand this.
Electric vehicles. Calculate the energy we deliver into gas tanks every day through gasoline. Now calculate what that means in terms of total energy delivery EVERY DAY across the US. Make an accounting of our current energy generation capacity. Which, BTW, does not happen to have been built to support a 100% or 200% demand increase. Now translate the total energy delta demanded by a fleet of, say, 300 million electric vehicles, into, say, 1 GW nuclear power plants. The conclusion, if I remember correctly, is that we need somewhere in the order of 100 new 1 GW class nuclear power plants spread across the nation in order to support a full shift into electric vehicles.
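That back-of-envelope calculation can be sketched in a few lines. Every input here is a rough public estimate I'm supplying, not a figure from the comment above, so treat the result as an order-of-magnitude check only:

```python
# Rough estimate: generation needed to electrify the US light vehicle fleet.
GALLONS_PER_DAY = 370e6   # approx. US gasoline consumption (assumed)
KWH_PER_GALLON = 33.7     # thermal energy content of gasoline
ICE_EFFICIENCY = 0.25     # tank-to-wheels efficiency, typical ICE (assumed)
EV_EFFICIENCY = 0.80      # grid-to-wheels efficiency, typical EV (assumed)
CAPACITY_FACTOR = 0.90    # typical nuclear capacity factor (assumed)

useful_kwh_per_day = GALLONS_PER_DAY * KWH_PER_GALLON * ICE_EFFICIENCY
grid_kwh_per_day = useful_kwh_per_day / EV_EFFICIENCY
avg_power_gw = grid_kwh_per_day / 24 / 1e6   # kWh per day -> average GW
plants = avg_power_gw / CAPACITY_FACTOR
print(f"~{avg_power_gw:.0f} GW average, ~{plants:.0f} one-GW plants")
```

With these assumptions the answer lands in the 100-200 plant range described above; changing any input shifts it considerably, which is rather the point about scale.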
The other problem is POWER rather than energy. In other words, you have to be able to deliver a certain amount of energy per unit time simultaneously across geographic areas. If you have 100K or a million cars charging simultaneously you are going to have to build additional POWER delivery capacity in order to service that demand, something that today (in terms of energy) we deliver using a liquid fuel.
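The power point is even easier to illustrate; the charger rating here is my assumption, not the commenter's:

```python
# Illustrative simultaneous-charging demand.
cars = 1_000_000
kw_per_charger = 7.2   # typical Level 2 home charger (assumed)
demand_gw = cars * kw_per_charger / 1e6
print(demand_gw)       # 7.2 GW of instantaneous demand
```

That is several large power plants' worth of capacity that must be deliverable at the same moment, on top of the energy totals above.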
When you start considering scale and get beyond single variable reduction of reality, things look very different. Coincidentally, the Los Angeles Times published an article about the mess we are walking into due to our drive to move into electric cars:
"A mining permit pushed through in the last week of the Trump administration allows the Canadian company Lithium Americas Corp. to produce enough lithium carbonate annually to supply nearly a million electric car batteries. The mine pit alone would disrupt more than 1,100 acres, and the whole operation — on land leased from the federal government — would cover roughly six times that. Up to 5,800 tons of sulfuric acid would be used daily to leach lithium from the earth dug out of a 300-foot deep mine pit."
And this is for JUST a million batteries per year. I assume they mean cells rather than the full car battery. Whatever the case may be, at scale, this is horrific. Yes, I said "at scale" because a million per year is not scale. We need many times that, thousands of times that amount if we are going to electrify the global transportation fleet. If we also want to use the same materials for grid energy storage the problem, again, at scale, quickly reaches apocalyptic proportions.
All I am asking you to do is to invest the time to really get into the details of the issue and use the math and science you understand to develop a true sense of proportion.
My guess is that the current path to electric energy storage is, at scale, a seriously flawed idea. I am not a chemist, so I can't propose an alternative that would be more benign at scale other than to say two things:
First, we need to be very careful and not allow politicians and various interests to lead us by the nose into causing a global disaster of unimaginable proportions.
Second, I have a sense --and the hope-- that a bright young scientist might just discover a path to store and deliver energy in liquid form in a way that will not have us resort to such things as strip mining the oceans and dumping millions of tons of sulfuric acid into mines to leach lithium out of them.
I don't have the answers. I just know we are probably being led down a path that could be far uglier than pumping petroleum out of the ground. We are making dumb decisions, like cancelling the XL pipeline...which means oil will have to be trucked...which will burn millions of gallons of refined diesel fuel to move petroleum at a horrific loss in efficiency. In the meantime, we have no problem dumping thousands of tons of sulfuric acid into mines to get lithium for "clean" electric cars and storage.
Does this really make sense to you? At scale?
I hope not.
A mixed energy economy with a variety of energy storage solutions can address the various needs, and you must also take into account the evolution of technologies and economic incentives.
For instance, the energy delivered each day to cars by chemical means is indeed huge, so you naturally conclude that we can't possibly deliver that much energy using known batteries and renewable technologies. Let's suppose that's true, there are some factors that will affect how this actually plays out:
a) ICE are very inefficient while electric vehicles are much more efficient, and so the total energy to deliver is significantly lower than that delivered by fossil fuels now,
b) vehicle utilization is also inefficient, in the sense that there's no real need for everyone to have their own cars and drive point to point; improved public transit is one solution, but so is something like Uber pool, thus improving the utilization of the delivered energy and thus reducing demand further
c) as for storage alternatives, solid state batteries are very close, and numerous other storage options already exist and are deployed, with improved variants being tested around the world already; I think the wikipedia page on this is decent, in particular if you pay attention to the tech that's already deployed and running [1,2].
I agree that this is a huge problem because fossil fuels are so heavily embedded in our economy in so many ways, but I don't think it's insurmountable. If enough of us can get on the same page that some outcome needs to happen, we can get pretty far pretty fast. We need a Manhattan project level commitment to this, and the country that does this well will have a huge first mover advantage to sell this to other countries. Vested fossil fuel interests are preventing this though.
10 years ago I would have been 100% with you that nuclear power was the option we should be investing in, but given the precipitous fall in the cost of renewables, I don't think nuclear is a slam dunk anymore. Nuclear still has its place in the move away from fossil fuels, and the renewed interest and new reactor designs seem promising.
I really recommend, if you want a level headed, conservative, independent look at the issue, that you check out some of these videos by Potholer54, a retired geologist and science reporter.
If the author of the most cited paper in a field is going to get all the grants, and a standard "useful" paper is going to get no continued funding - then researchers will push to make their work sensational. Eventually the professors and everyone left in the field is fighting sensational with either outright fraud or alternate funding sources.
Same goes for VC funding of startups, employees at companies, and government programs. The baseline "useful" work is rotting away in favor of aiming to be the top ~5-10%.
In other words if your goal is to craft a paper that draws attention, that's harder if you're being meticulous about limiting yourself to saying to what evidence shows is clearly true.
How is that relevant to my comment?
Disagreeing with status quo is given lip service, but in my particular subdiscipline, I saw trendy theories come and go. It certainly was more work to get published if your paper suggests the trend is not true (more likely to argue with referees, etc). This is common when experimental data is not available.
> You suggest how scientists will lose grants if they produce inconvenient science
Again, not a direct connection in my discipline, but trendy work is easier to get funding, and if you counter the trend, you'll have trouble getting papers published in good journals, which affects your funding.
In my time there, the common advice to new faculty members was "Do fun/risky research only after you get tenure."
I am not saying what I said in a vacuum; this came out of conversations with actual climate science researchers and scientists that I had the opportunity to meet during the course of my work in aerospace. The context was a project where we were developing various systems and payloads for both the International Space Station and, eventually, the Artemis mission to the moon.
The message was quite clear: In a politically charged environment you have to be very careful not to ruffle feathers. Anything from your career to your funding could be at stake.
No, you are not going to find google-searchable interviews with such folks, again, that would be professional suicide.
You can find information that helps understand some of the forces at play. While not a full picture, it's enough to support the plausibility of my claim. You don't have to believe me, of course. It's your prerogative.
Use google ngrams to look for "climate science". It basically doesn't occur before around 1982.
The other risk is that it's an act that can easily be abused. It is very easy to level charges against someone without proof; somehow we tend to believe the first salvo (myself included). In this case it sounds relatively straight-forward, but it really is irresponsible to take a stranger's word for it.
So, if you really care, you might reach out to the OP and get the details. That eliminates the downside risk to the OP and acts as a shibboleth that ensures only people who actually care enough to look into it know the details.
I actually bought one of the devices and took it apart on video. There is nothing inside except 4 magnets and a plastic printed card. No active circuits and no power source and no components such as resistors or capacitors and certainly there is no microchip inside.
edit: below are links to pictures of the device and the accompanying text as originally posted on Facebook
Hi Powerinsole, I ordered one of your power chip devices and took a detailed look at it. My analysis is as follows.
1. There is no battery and no system for harvesting energy. All electronic circuits require an energy source and a lack of an obvious system for powering the device is a problem.
2. There are no components such as integrated circuits, transistors, capacitors, diodes or inductive devices that would be required to create a "circuit" or "chip". A "chip" is not just a random configuration of tracks. The tracks are there to transfer electricity between components that shape and switch the electric current according to purpose, but given that there are no components, what are the tracks for?
3. There are 4 magnets. Probably neodymium. They produce a constant magnetic field. They do not generate "frequencies". The device sticks to a metal wall like a fridge magnet and doesn't vibrate.
4. The tracks are configured in such a way that even if components were attached at the "solder points" nothing would happen, because the tracks are all shorted together. Electricity always takes the easiest path; if all the tracks are shorted, the components will receive no energy input.
5. After testing with a multimeter I found that the tracks on the "circuit board" do not conduct electricity. If the tracks do not conduct electricity there is no possibility of transferring energy to components. (There are no components anyway.)
6. The magnets are isolated from the "tracks" and from each other by a plastic layer and glue. It is not clear what relationship, if any, the position of the magnets has to the tracks.
7. There is no NVRAM, magnetic storage, optical storage, ROM or other known system for storing information, so PowerInsole's claim that they load information onto the device is difficult to comprehend.
8. There is no crystal or LC circuit to drive an oscillator. Even if there were, there is no battery to drive it, the tracks are all shorted, and the tracks do not conduct electricity.
Given the above observations I find it difficult to believe that the device can function as advertised. What you essentially have is 4 small magnets on a printed card in a gel cushion for 69 euros.
If I am wrong about any of the above I would be happy to have a respectful and open discussion about your technology.
It involved a research project that they stopped funding, but didn't want to let go. The only researcher who continued to work on the project wanted to take the project to a different institution where the project could get funding, but Uni Salzburg refused and said it's their project. They would rather have the project be abandoned than let it thrive somewhere else. If their name wasn't on the project anymore, they would rather have it die.
And not to forget, Uni Salzburg was also home to our most famous case of scientific misconduct where Robert Schwarzenbacher fabricated measurements using simulation software. The handling of that case was also interesting (they terminated his employment after verifying the accusations of fraud, but one guy from the union tried to convince someone from HR to delay some paperwork so they could later challenge the termination... crazy stuff)
Furthermore, universities tend to require tons of publications to promote you. Things are spinning out of control. I know a few EU countries where the written norm is to need > 100 publications to qualify for a full professorship, with equally ridiculous requirements for associate and assistant positions.
Obviously, this encourages and rewards completely broken practices. Many associate and full professors in my area only care about stamping their names into as many journal articles as humanly possible. Some of them are already beyond 500, with many of these in top tier journals (Nature, Science, Cell, NEJM). Obviously, they hardly ever contribute anything. Their serfs do all the work. Their job is basically to plot in order to stay on top of their neofeudal shire.
In addition to this, funding bodies do nothing after fraud has been proven. ERC only terminates grants on rare occasions. https://forbetterscience.com/ discusses many cases of serial fraudsters who keep getting funded despite having retracted 10 or 15 articles in major journals.
Over time though the last author came to mean "the more senior person" and then "the person whose idea it really is". So being last went from a thing that no one wanted to a thing that people would kind of argue over. In the more manipulative cases, people would casually say "oh, I can be last author", realizing the gains from that position.
It seems when a more junior person is doing all the work and is first author, an unscrupulous senior researcher will claim that "it's the idea that counts"; when that senior researcher is first author, it's "ideas are a dime a dozen, it's getting it done that matters."
Just list everyone alphabetically. Problem solved.
This describes my lab's head perfectly. At first I found it strange that he was so angry about a side-project paper I wrote alone, quickly, in my free time and asked to publish at a conference. Then I understood why: in his view, every minute I spend on my projects is one I don't spend on his projects. The guy approved my first journal paper submission, which had his name on it, without even reading it. That was obvious from the lack of comments, and from the fact that a few days later during a lab meeting he asked to change half the content of the paper...
I'm not against putting the names of people who contributed to the research on a paper, even slightly and informally, but at this point this is pure leeching and exploitation. Then he wonders why my thesis isn't progressing (hint: because when I had no say in the topic, method or experimental setting, I'm not really motivated to work on it).
This is the most screwed up part.
This is compounded by publishing negative/null results being disincentivized. Knowing what doesn't work can be as important as knowing what works.
When I was an economics RA, literally half of the econ professors didn't work Fridays and barely worked summers. It was incredible: they got paid 150k on that kind of schedule.
I used to think it was a great gig too, since most professors had one or more small businesses on the side. Then I realized they have those businesses and consulting companies because that means they can also apply for small business grants (which they use to subcontract the research out to the university) in addition to the normal academic research grants. If you also count teaching, then that means those professors are working three jobs for one salary.
I made the decision that I'd rather make 50% more working in industry doing easier (if boring) work.
In CS I saw more of a mix though. It's feasible to fund a small research group without busting your ass, and it also seemed to me that putting time into coursework, writing books, etc. was culturally a more acceptable use of time in that department than it was in biology.
I knew a few CS PIs that actually purposely scaled back their research once they got tenure because they were more excited about teaching and some of the educational initiatives the school was working on. That's not the norm of course, but I literally can't imagine that ever happening in a bio department lol.
This is basically how we ended up in the replication crisis to begin with. There's little to no accountability and people assume they're working hard and being honest.
5% of all degree granting universities are R1 or R2 research universities, e.g. Harvard, Stanford, etc. The vast majority of professors aren't obligated to conduct research or apply for grants.
Of course, government funded motives are just as tainted by selfish motives, if not more. Even worse, the people who make the funding decisions aren't spending their own money, so they have little reason to care.
At least with privately funded research, the people providing the money aren't going to fund bullshit fake research. This is why market systems work better than government systems.
There is also "sponsored" research as others have pointed out, that is more of a bought study that a business hopes they can use for marketing. These have a big conflict.
I agree that government is probably the worst system in most cases. It's the same kind of "picking winners" that doesn't work in corporate funding. I'm from Canada, where our tech industry basically runs on subsidies, and very little escapes the bubble of trying to get more government funding and actually becomes self sustaining.
Personally, I have seen there is a legit appetite for corporate funded research that advances the company's goals. As an academic, I would rather seek out companies for funding, knowing that I'm working in something that someone wants, and not trying to optimize for government priorities. I'm coming at this from a hard science perspective. I imagine the dynamics are very different for drug trials or other efficacy type studies, which are maybe more relevant to this discussion.
A prestigious reputation is like glass - easy to break, very hard to put back together.
You'd think this would work with government funding, too. But it appears it does not. It could be because one's "reputation" is based on how many papers are published and how many cites. This is like rating a programmer on how many lines of code written.
It is not a measure of quality at all.
Unless gamed, it _is_ a useful measure.
But if your programming ability were measured by how much your libraries are used (e.g. in hiring, determining salary, seniority...), there would be every incentive to aim for your library to be used as much as possible, even when it is superfluous.
That is what has happened in academia, basically.
If/when the perception of government funded researchers finally aligns with the reality, businesses would stop doing that because there'd be no reputational misalignment to exploit.
I'm talking about incentives here, and people do things almost entirely on selfish impulses. Money is a powerful motivator, and people are strongly motivated to not spend their own money on bullshit. That motivation is absent when government funds things - but other motivations remain.
They absolutely are if it helps them promote something. Cigarettes and asbestos industries helped produce plenty of fake safety studies.
The problem is that research has been marketized; you have to "sell" your proposal to get funding, so naturally you big it up as much as possible. And thus the incentive to fake results.
I seriously doubt it. Any more than you'd continue taking your car to an auto shop that took your money but didn't fix it.
Citation needed, first of all, but governments are at minimum accountable to voters. Private money is in no way accountable.
Voting on how government spends money is in no conceivable way like you deciding how to spend your money.
> Private money is in no way accountable
It's accountable to the people who are providing the funds out of their own pockets. People do not like wasting their own money.
I bet you look at your own budget. You have to; otherwise you'll be in jail for bouncing checks and tax evasion. I also bet you've never looked at your city, county, state or federal budget. It's other people's money, so who cares!
I have. Not in great detail though. The problem is I can't really do anything about it. Even if I find something bad and by lucky chance get people to care (there are plenty of slow news nights), there are far more bad things in the budget than I can expose before people get tired of the corruption and give up listening. I try to elect politicians who will do something about it, with low success: people who benefit from any specific spending are more powerful than people who are just against waste in general. That is assuming I can get my person on the ballot in the first place (low odds), and that they don't realize once elected that reelection (read: power) comes from handing out pork to those who want some specific waste. There are more things that make it hard - I just scratched the surface.
Pork is hard to figure out. Is spending money to repair something that isn't broken good money or bad? I've seen perfectly good buildings get needless remodels, and I've seen perfectly good buildings suffer because they were never maintained. I've seen towns put in sewer systems they don't need, and other towns fail to put in a sewer system until it was an expensive emergency. Flint had 40 years to replace the lead pipes in their water system - or they could have invested in water treatment chemicals that keep lead from leaching out of the pipes, for much less money even over 40 years (you can pick anywhere from 60 to 30 years ago as the date when it became known that lead is bad - 40 was my somewhat arbitrary pick).
If you wanted a mcburger but 51% voted for mcnuggets, you got mcnuggets.
If you knew in advance that 75% were going to vote for mcnuggets, you likely wouldn't even bother to vote. You knew you had no choice at all.
Governments are at a minimum accountable to the people willing to use force against the government if they are sufficiently displeased. They may also be accountable to voters qua voters, depending on whether they have voting at all and, if they do, what options are presented to voters and how fairly votes are counted - all axes on which governments vary considerably, with many falling into ranges resulting in little or no accountability to voters.
This only happens because we allow it to happen.
Not being innovative enough isn't the root cause though. The real issue is there isn't enough funding to go around, and so the bar is higher than it needs to be. Available research funding in the US is a paltry sum considering the aggregate ROI of discoveries and technologies that originate in universities. Funding rates can be as low as 10-20%, with thousands of researchers competing for the same grants. They need to all paint a tortured story of how their idea will be the next big invention.
The problem with our system is that we put public money into research, which is then commercialized by corporations and sold to consumers, and corporations/universities end up capturing the profits. Those profits are then invested in ways that yield short-term returns instead of being reinvested in research.
Some of those profits are supposed to come back to the government and reinvested in research, but more and more corporations (and I consider universities to be a kind of corporation with the way they act like hedge funds that do education as a side hustle) are figuring out how to keep as much of those profits as possible, despite those profits only being made possible in the first place due to publicly funded research.
What if we increased funding for research? VCs are willing to pour millions into ridiculous or tenuous ideas because they know a single success will more than make up for the duds. Lower the stakes, make funding more available to researchers, and then maybe we won't need to squeeze every bit of "innovation" out of every research dollar. Make room for research that fails or yields a negative result. This is important work that is valuable and needs to be done, but there's no funding for it. We could double the amount of funding for e.g. the NSF and it would still be a drop in the federal government's proverbial bucket.
Part of the reason we have the problem you're mentioning is not that there isn't enough money to go around, it's that universities (at least in the US) now depend on inflated costs to function. The costs of research are kicked down the road to the federal government, and the research itself is seen in terms of profits rather than discovery. So if you have all these universities essentially telling researchers their jobs are on the line if they don't bring in profits, you're going to have everyone scrambling to bring in as much money as they can. It's not just postdocs or untenured research professor lines, it's tenured professors as well, whose income can be brought down below some necessary standard of living, or who can have salaries frozen or resources cut.
I was thinking about this the other morning. I had a grant proposal that the program officer was really excited about. This program of research could probably be conducted for almost nothing because it involved archival data analysis. However, if you put a dollar amount on the time, it might realistically cost around 250k USD, maybe 500k max, pretty generous in terms of staff effort. However, the university managed to inflate the budget ask to around 2 million for the sole purpose of indirect funds.
When you have that kind of monetary incentive (carrot or stick), of course you're going to have thousands of people applying for each opportunity. It's what led to the graduate student ponzi scheme, inflated numbers of surplus graduates, etc and so forth.
It all trickles down too, in terms of research claims, p-hacking, etc and so forth.
There's a place for profit, but there's also some realms where it does nothing but corrupt.
Universities and grants are this firehose of tax money being sprayed everywhere without even the slightest bit of accountability in how it's used. The government effectively "loses" all of it in accounting terms, but because it's tax it doesn't matter. The buyer is blind and doesn't even bother looking at the papers they've paid for, let alone caring about the quality.
Now go look at the results coming out of corporate labs when the corporates actually want to use the tech. You get amazing labs that are consistently re-defining the state of the art: Bell Labs, DeepMind, Google Research, FAIR, Intel, ARM, TSMC etc. The first thing that happens when the corporate labs get interested in an area is that universities are immediately emptied out because they refuse to pay competitive wages - partly because being non-profit driven entities they have no way to judge what any piece of research is actually worth.
This is definitely not true, recipients of grants are heavily restricted on what kind of things they can spend that money on. I can't even fly a non-domestic carrier using grant money without proving no other alternatives exist.
Do research projects sometimes fail to deliver? Yeah. But that's just the reality of doing research. The problem I see is people expect research to be closer to development, with specific ROIs and known deliverables years ahead of time. Sometimes in the course of research you realize what you said you were going to do is impossible, and that's a good result we need to embrace, instead of attaching an expected profit to everything.
> Bell Labs, DeepMind, Google Research
I don't know so much about all the labs you listed, but just taking these three, they certainly don't have a good feeling for what their research is worth either. Do you think Bell Labs fully comprehended the worth of the transistor? For all the research Google does, ad money still accounts for 80% of their revenue. DeepMind is a pretty ironic choice because Google has dropped billions into them and it's still not clear where the profit is going to come from. So it's not clear that anyone, even those with a profit motive, has any way to judge what any piece of research is actually worth.
But that's not to say there's anything wrong with that... that's just how research works. You don't know how things are going to turn out, and sometimes it takes a very long time to figure that out. This is why massive corporations like AT&T, Intel, Google, Xerox, MS etc. are able to run such labs.
> The first thing that happens when the corporate labs get interested in an area is that universities are immediately emptied out because they refuse to pay competitive wages
I've seen this happen first hand. In my experience these researchers usually go on to spend their time figuring out how to get us to click on more ads or to engage with a platform more. In one instance, I remember one of my lab mates being hired out of his PhD to use his research to figure out which relative ordering and styling of ads on a front page optimized ad revenue for Google. They paid him quite a lot of money to do that, and I guess it made Google some profit. But is the world better off?
They are restricted in trivial ways that are easy for a bureaucracy to mechanically enforce, as is true of employees at every institution.
What I meant by accountability is deeper: people are not accountable for the quality or utility of their work, hence the endless tidal wave of corrupt and deceptive research that pours out of government funded 'science' every day. These researchers probably filled out their expenses paperwork correctly but the final resulting paper was an exercise in circular reasoning, or the data tables were made up, or it was P-hacked or whatever. And nobody in government cares or even notices, because nobody is held accountable for the quality of the outputs.
Whilst DeepMind is not especially interested in profit it's true, and is just doing basic research, Google itself is an excellent example of how to seamlessly integrate fundamental research with actual application of that research. That's what profit motivated research looks like: just this endless stream of innovative tech being deployed into real products that are used by lots of people, without much drama.
We have come to take this feat so much for granted that you're actually asking if someone working on ads is leaving the world better off. Yes, it is. Google ads are clicked on all the time because they are useful to people who are in the market to buy something. Those ads are at the center of an enormous and very high tech economic engine that powers staggering levels of wealth creation. If I understand correctly, a lot of academic papers are never cited by anyone - a researcher who optimises search ads by just 1% will have a positive impact on the world orders of magnitude greater than that.
Have you ever received and administered a grant? I have to ask. You seem pretty certain about how it works, but it just doesn't match with my experience.
You say that there's no accountability in results and this leads to people committing fraud. In my experience, fraud happens when there is too high of an expectation that researchers can't meet. Let's say you get a $5 million grant to do X, and in the course of doing X you find out it's not possible. You have a negative result. First of all, good luck publishing a negative result. Without that publication, good luck getting the next grant.
High expectations for results incentivize fraud. There should be room for researchers to come up short with their research and still be able to progress in their careers. But when grants dry up, the publications dry up and then your career is derailed by failing to get tenure.
The fact is that not everyone can be researching world-changing technologies. That's just not a realistic expectation. Even Google can't do that, as much as you laud a profit motive (how many Google projects are in the trash right now?). But that's what we expect of everyone who gets grant money, precisely for the reason that people have an expectation that an immediate and tangible ROI must be demonstrated.
I don't know if you consider people at funding agencies like the NSF as part of the government, but they do notice when projects fall short, and they do deny 80% of grant applications (I would consider that accountability).
The NSF denies 80% of grant applications because there is a radical oversupply of people who want to be scientists and the NSF has a finite budget. That by itself doesn't create accountability any more than the fact that lots of people want to be movie stars creates accountability for actors. That's not how accountability works.
Accountability means people being held to account for illegitimate acts. If it existed it would look like this: we (the government) gave you money to deliver some genuine research, yet you delivered a paper that simply modelled your own beliefs, cited a retracted paper and cited another paper that actually disagrees with the claimed statement, used an input dataset too small to achieve statistical significance, looks suspiciously P-hacked, you then misrepresented your own findings to the press and by extension the government, and then to top it all off it doesn't replicate. We will therefore prosecute you for research fraud and failure to meet the terms of your contract.
What actually happens is this: nothing. Journals will happily publish papers with the problems I just listed, universities praise them, the 'scientists' who do this stuff proceed to get lots of citations and the government proceeds to award even more grant money because who are they to argue with citations.
As you admit, fraud is everywhere in science, supposedly due to "high expectations for results". But so what? Lots of people, non scientists included, have high expectations for results placed upon them. The right mechanisms and incentives to stop fraud are not simply having low expectations of scientists, that's absurd and wouldn't be seriously proposed as a solution in any other area of society. It'd be like saying the answer to fraudulent CEOs fiddling the books is to simply stop expecting them to turn a profit, or like the solution to shoplifting is to just stop expecting shoplifters to pay for things.
Expectations on scientists are already rock bottom: large chunks of the research literature don't even seem to replicate, other large chunks are not even replicable by design, and nobody seems to care. You can't get much lower expectations than "We don't even care if your claims replicate" and yet this ridiculous suggestion that the solution to fraud is to give fraudsters even more money keeps cropping up on HN and elsewhere. The solution to fraud is tighter contracts to ensure the rules are clear, and systematic prosecutions of people who break them.
It's relevant because you are criticizing the process but you don't seem to understand how it actually works. Your original characterization was that grant money is "this firehose of tax money being sprayed everywhere without even the slightest bit of accountability in how it's used." The reality is, when I get grant money I need to account for how every dollar is spent, and if there are ever any questions about spending, I better have the receipts to back it up. The other reality is, I only get to spend a small fraction of a grant award, as the University takes most of it off the top, and my students take almost all the rest in the form of a tuition remission and a stipend, leaving whatever is left over for equipment and conference costs. Then there are strict conflict of interest regulations which come with their own reporting requirements. I don't even get all of the money at once; I'll get some up front and then I have to show significant midterm progress in order to get more. There's accountability at multiple layers by multiple organizations.
> The NSF denies 80% of grant applications because there is a radical oversupply of people who want to be scientists and the NSF has a finite budget.
It's accountability in the form of: if you didn't do what you promised you'd do, then you don't get any more money and your career is derailed. Isn't that what you want? Anyway, do we have an oversupply of scientists relative to the amount of science that needs to be done? I don't think so. The NSF budget is finite, but it's also embarrassingly minuscule given the upside of research that has come out of NSF funded projects.
> If it existed it would look like this ... We will therefore prosecute you for research fraud and failure to meet the terms of your contract.
Fraud is one thing. I'm not going to say we shouldn't prosecute fraud. But treating a grant proposal like a contract with positive deliverables (no negative results allowed) is the exact problem. Research is not development. Research implies failure. You can't do one without the other. Failure to meet stated objectives shouldn't be met with prosecution. That just further incentivizes fraud.
If there's a replicability crisis, it just means we need to spend money on replication research. Researchers know no one is going to bother replicating their study because there is no grant money available for redoing someone else's research. Grant agencies don't pay for that kind of thing, and you can't make a career doing that kind of research. No one gets tenure this way. If we want studies to be replicated, we need to allocate money to replicate them, and we need to incentivize people to do so by making it a viable career for a Ph.D.
> Lots of people, non scientists included, have high expectations for results placed upon them. The right mechanisms and incentives to stop fraud are not simply having low expectations of scientists, that's absurd and wouldn't be seriously proposed as a solution in any other area of society.
I didn't say we should have low expectations, I said we should have realistic expectations, and yes, that does involve lowering expectations from where they are right now. Because the current expectation is this: you have to write a proposal that has a <20% chance of getting funding. If you can't get that funding your career is basically over, so you better promise the world, because everyone else is. In this proposal you need to lay out a research plan for the next 3-5 years and you have to convince the funding agency that your research is going to change the world as we know it. If within that time you fail to meet your stated objectives, you will find funding hard to come by, and your tenure will be threatened, meaning you will probably lose your job and have to move your family. On top of that you want to add potential federal prosecution to the stakes, thinking that will make things better.
> Expectations on scientists are already rock bottom .. The solution to fraud is tighter contracts to ensure the rules are clear, and systematic prosecutions of people who break them.
Okay, run with this idea: exactly what rules need to be clearer and exactly how do the contracts need to be tightened? Because there are already clear rules and tight contracts, yet the problem persists. Will clearer rules and tighter contracts fix it? How?
I'll tell you what I think will happen with this system: you'll chase out all of the public scientists because the stakes are too high. Already the pay is too good on the corporate side, and now you add potential federal prosecution to the list if I don't meet positive deliverables? No thanks. I'll go work for Microsoft where my research will be privatized. You might be okay with this as you pointed out you believe a profit motive is good for research, but you know who wouldn't be good with this? Microsoft. And Google. And all the other tech companies who were (or will be) built on top of technologies that started as government funded research. All this does is make Microsoft stronger. Is that what we want? What about the next Microsoft or Google? Where will they come from?
I'll give you a concrete example of where your idea fails: the 2004 DARPA Grand Challenge. Millions were spent trying to bootstrap autonomous cars, and what was the result? They all crashed, no one completed the race. What should the response have been, to prosecute everyone involved? No, they tried again and gave everyone more money. Next time around in 2005 more succeeded (mostly because they relaxed the expectations).
Then in 2007 we saw the first real demonstration of autonomous cars in the DARPA Urban Challenge. Today, everything Tesla, Google, GM, Ford, et al. are doing with driverless cars is based on the research that happened in 2004-2007. Without government funded autonomous car research, there would be no Tesla or Waymo today. That's how research works, you try, you fail, you try again, and you have no idea how far your impact will be, and really no one does. If we try to control this process toward producing only successes with contracts and positive deliverables, like it's an engineering project (with prosecution of failure and all), it just means we're going to lose dynamics like the Grand Challenges, and the broader economy will suffer for it.
Take all that money you want to invest in prosecutors, courts, lawyers, and prisons, and invest that in a system where replication studies are well funded and a viable career path for scientists. Increase funding into the NSF and other grant funding agencies to hire more people to consider grants, and increase grant throughput. I guarantee you you'll fix a lot of the problems you're identifying.
> when I get grant money I need to account for how every dollar is spent
Yes I know, but that's not what I mean by accountability. Again: nobody is upset with academics because of expenses scandals or taking too many expensive flights. Well, except maybe for climatologists who supposedly take more flights than the average academic, but that's due to the perception of hypocrisy rather than concern over cost.
People are getting upset because when they download and read papers, the papers turn out to be bad and there are no visible consequences for that. Even just getting a clearly fraudulent paper retracted is reported to be a nightmare, according to people who hunt for scientific fraud as a hobby, like Elisabeth Bik. And I've read endless reams of terrible papers that were useless or outright deceptive; I tried reporting a few and nobody ever cared.
Now, you're arguing that there is accountability of the following form:
> It's accountability in the form of: if you didn't do what you promised you'd do, then you don't get any more money
This is true given that scientists are promising the NSF to publish papers, not strictly speaking to do research, and therefore by implication promising to come up with interesting claims, not necessarily true claims. But that's not what we want.
This is an inevitable problem with government funding of research. The buyer, the government, cannot really check if the claims they're buying from scientists are true, so they need proxies like did it get published, did it get cited, etc. But those aren't the same things. Corporate research doesn't have this problem because the corporation will try to apply the research at some point and if it was fraudulent they will discover it then, and of course they're strongly incentivized to ensure it never gets to that point in the first place.
In theory the government could write grants in such a way that money is awarded independent of what claims end up being made, instead awarding money for the quality of work done. That's what you're arguing for here. And indeed corporate labs write contracts in this exact way. Scientists get a salary in a corporate lab, they don't have to write grants. They do have to convince their management chain that the research is worth funding, but there are many different ways to do that which don't involve continually publishing astonishing claims in scientific journals.
You're asking me to propose how science should work instead but, indeed, you already know my answer: eliminate the NSF completely, and stop subsidizing student loans. All science should be funded by companies. They have already solved the problems you're treating as novel / intractable above. Scientists are awarded salaries and promotions by firms on a more flexible basis than the NSF. Importantly, they are rewarded for doing research not producing claims. Companies can do this because they have management structures sufficiently well staffed to closely monitor what scientists are doing. That means if a firm is truly committed to research then the scientists will get paid even if their programme has some dry years. Plus there's a huge body of law handling fraud and corruption in the workplace.
At the same time, firms are incentivized to eliminate the research that is probably always going to be nearly useless. Outside of firms selling books or self help courses I doubt many would subsidize sociology or gender studies for example, and it's also unclear that would be a loss.
Your argument about who it would or wouldn't be good for seems a bit contradictory and I struggled to follow it. You're arguing it would be both bad for Google and Microsoft yet also make them stronger. I disagree with both possibilities: I think they would hardly notice the difference and it wouldn't affect how powerful they are. Having worked for one of those companies and also worked at a startup where we often read research papers in a certain subfield of CS with views to maybe applying them, my view is that even in the relatively good field of computer science, most academic output is useless and has no impact. These firms do not rely heavily on government funded research:
- The web was very briefly funded for a couple of years as a side project of CERN, but then R&D was taken over by the private sector where it remained ever since. Page & Brin never even finished their PhD before moving their research into the private sector! It's hardly a mystery where the next Google will come from - probably the same place the previous one did, a garage in Silicon Valley.
- What government funded tech was Microsoft built on? The internet? Microsoft is still with us in spite of the internet, not because of it! Or are you going back to military computers in World War 2? Military R&D is different, governments can fund that semi-effectively because they actually use the outputs.
- Neural networks were a backwater until Jeff Dean resurrected the field using the resources of the private sector, academia has been chasing to catch up ever since.
There are a lot of other examples. The DARPA Grand Challenge is not an example of what I'm talking about because:
1. DARPA is military research and therefore structured differently to how the NSF does things. The very structure of it as a Grand Challenge is a clue here: the output of the programme was cars (not) going round a track, not papers and citations.
2. I'm not arguing for prosecution of researchers who end up with null results!
> Your argument about who it would or wouldn't be good for seems a bit contradictory and I struggled to follow it. You're arguing it would be both bad for Google and Microsoft yet also make them stronger.
What I meant is, if e.g. Page and Brin in 1998 had no access to government funding and research because it was privatized by e.g. AOL, there wouldn't be a Google today. But if we were to privatize all research, the Google of today would certainly like that insofar as it strengthens their market position (just like the AOL of 1998 would like the situation), but it also means they have to start funding more research because now they can't get any from the public.
> The web was very briefly funded ... What government funded tech was Microsoft built on? ... Neural networks were a backwater
But the point is that it all started with government funding, so we need to be very careful about the consequences of privatizing it all. Today, ideas start out funded by the government, they gain legs in academia, move out into corporations, and are productized and disseminated to the public in the form of consumer goods. This is the progress pipeline, and it's proven extremely effective and enduring at driving innovation.
You want to cut out the beginning of the process because you think corporations can handle that part, but I don't think you've really demonstrated that. Can you point to any tech product out there that is exclusively built on in-house, private research? I certainly can't think of one.
For example, you bring up the origin of Page & Brin. Yes, they never finished their Ph.D., but the fact is they did meet in grad school while they were doing NSF funded work. Brin was at Stanford on an NSF fellowship. They built the first prototype of Google on an NSF grant. They were mentored by academics who also were funded by the NSF as professors and graduate students themselves. You take that funding away, and maybe these two people never meet, maybe they never learn what they need to get that spark of insight. So I agree with you that the next Google will come from the same place the previous one did - a government-funded research lab in Silicon Valley. The garage is where they moved their operation only after they had already used a lot of NSF money to get their start.
> 1. DARPA is military research and therefore structured differently to how the NSF does things. The very structure of it as a Grand Challenge is a clue here: the output of the programme was cars (not) going round a track, not papers and citations.
The processes of getting grants from NSF and DARPA are very similar, and in most cases the deliverable is a paper. The Grand Challenges are the exception of DARPA funding, not the rule.
> Military R&D is different, governments can fund that semi-effectively because they actually use the outputs.
Yes and no. DARPA would like to use the fruits of its funded research, but it funds projects on a very long timescale, so what it funds may or may not be used in the long term. Sometimes the research is not to strengthen the military per se, but to strengthen American interests through creating domestic tech sectors. E.g. I'm sure the military would like to use autonomous vehicles, but what's even better is for America to have its own domestic autonomous car sector that can produce those vehicles.
> most academic output is useless and has no impact.
You've tried to make the case that we should optimize toward useful research, and companies are better at identifying useful research because they have a profit motive, but I still think it's difficult to say today what research will be important 30-40 years down the line. DARPA recognizes that it's very hard to tell how useful research will be ahead of time, and that corporations don't like to engage in foundational research when there is no obvious short-term path to profit. This was the entire point of the Grand Challenge series, and it worked out well -- they wanted to bootstrap the autonomous car industry, so they paid researchers to get them rolling and now look where we are. If the government hadn't gotten involved, there probably wouldn't be an autonomous car sector in the US today.
There are plenty of cases in our history where some technology that seemed useless initially turned out to be bigger than anyone could have imagined. We need to be careful not to squelch those ideas too quickly because they don't return an immediate profit. Things like the Internet and neural networks come to mind. A lot of people, particularly large corporations, thought the Internet was a toy when it first was introduced. Neural networks seemed like a dead end and then found new life. But the fact is they started in academia. The Deepmind arcade paper and essentially the entire deep reinforcement learning field today is based on decades-old research funded by the UK government. What if that research was locked away in a UK corporation? Would Deepmind even exist? That research was a toy for 30 years, until it wasn't.
The whole point of DARPA and other government funding agencies is that they don't know what the winners are ahead of time, and I don't think corporations can know this either. (if they could, why didn't they do more to fund RL research 30 years ago?). Therefore we shouldn't try to optimize for obvious winners because we'll miss out on non-obvious winners, which bring the biggest upsides. This means we have to fund losers and research that ends up not being useful, and we should be okay with that, because things have turned out pretty well over all.
> 2. I'm not arguing for prosecution of researchers who end up with null results!
Sorry, I thought you were, with this:
> We will therefore prosecute you for research fraud and failure to meet the terms of your contract.
After I typed all this I realized I failed at my pledge to not give you a wall of text. Oops!
What I mean by prosecution is that if a research body signs a contract with a scientist to do research, then those contracts would need to specify what research actually is, and that is the first step towards penalizing people who aren't really doing it. Indeed, the process of pushing more research into the private sector would automatically eliminate a lot of the grey-area fraud that is so prevalent, because it would force a lot more people to write down what precisely they mean by "doing research", as well as continually evaluate that definition via normal management techniques. For example, is a simple modelling exercise "research"? It's often treated as such by e.g. banks, but the big tech labs we're talking about don't engage in much of that, unless you count AI, but I think that's sufficiently beyond the sort of modelling you find in most science that it's best to treat it separately.
At the moment governments fund science but have no working definition of what science is, which breeds a lot of cynicism of the type I display above w.r.t. sociology. Is gender studies "science"? Most people would say no, but the government says yes. A more subtle example is epidemiology. A close examination of their papers will reveal that it's just plugging public CSV files into a bunch of very over-simplified simulations, and publishing the outputs. Is that science? If it is, can I get paid to play Cities: Skylines all day as long as I write a paper at the end? It sounds like a stupid suggestion but actually yes I can:
In my view this type of thing is not science, but my guess is at this point opinion on the science-y-ness of epidemiology or urban planning would split 50/50, or most people would just go with the government's definition of "they receive grants and call themselves scientists, therefore they're scientists".
Would Google exist without the NSF? The specific company maybe not, but there were plenty of search engines around before Google, and Page in particular was already keen on creating a tech company when he was very young so would likely have ended up a startup founder sooner or later. An example competitor was Inktomi, which had already started doing pay-per-click ads. It's all forgotten now but Google nearly didn't survive its early years because they got sued over 'stealing' the PPC ad concept. They were able to argue that their own elaborations on the idea were sufficiently different that it wasn't infringement. It's very plausible that one of these other firms would have struck upon the idea of PageRank; they were certainly incentivized to do so especially once Inktomi had realised that PPC ads were a way to monetize search engines.
"The Deepmind arcade paper and essentially the entire deep reinforcement learning field today is based on decades-old research funded by the UK government. What if that research was locked away in a UK corporation? Would Deepmind even exist?"
Well DeepMind is a difficult example to debate here for both of us, because of course DeepMind is or was a UK corporation, and they do the exact opposite of locking up their research; if anything, they're famously publicity- and paper-hungry. Google/DeepMind are actually a strong counterpoint to the idea we need academia for long range research: DeepMind is nothing but long range research (of unclear utility!) and of course self-driving cars have been driven by Google for the last decade, pun totally intended.
If I were arguing in your shoes I'd be trying to argue Google is the exception that proves the rule and/or trying to distract attention from it, because it shows that companies can and will do long range research. Microsoft Research is another example, although it's less "pure" because it's more or less a little recreation of academia inside of Microsoft. I prefer the Google approach where science and technology are fully integrated.
Now the wider issue of governments needing to fund long range research is one I used to fully agree with. It sounds right and it's easy to find examples where you can sort of link them to government funded research. But as you can see, I changed my mind over time and no longer find myself in that camp, because:
1. Government funded basic research isn't free. We have to weigh up costs and benefits. How much of a contribution does government grant money make to the technological successes we take for granted today? For examples like PageRank, self-driving or DeepMind the initial contribution was quite small and mostly in the form of logistics (grand challenges) or theory work (which is cheap). And how much of a cost does it impose?
2. The costs are not just financial. I guess this is what mostly changed my mind. I concluded a big part of the "cost" of government funded research is actually in terms of intellectual pollution of the literature. If you have to wade through 50 useless, deceptive or outright fraudulent papers to find 1 good one because governments aren't paying attention to what they fund, then that imposes an externalized cost on everyone who wants to benefit from research. Moreover this work has to be endlessly duplicated because journals are loath to retract anything, so everyone who wants to push technology forward in a certain area has to do this work within their own small group because there's no coordination mechanism ... or just give up and ignore the literature entirely (this is what eventually happened to me).
I think a stronger argument for government funded research than the "it would never have happened" approach is that government funded science is usually un-patented and freely accessible. But even this argument is kind of weak because universities do patent the results of tax funded science, maybe not in computer science but it happens a lot in other fields, and also because the results of the research are often behind paywalls too! Although that's been getting better with time and is usually not a problem in CS (which IMHO is definitely one of the better fields).
But overall, to me it's just not clear that the benefits of buying papers en masse outweigh the costs, in dollar terms, in time, and of course the inevitable costs when people put bogus research into production and things go wrong.
That is pure corruption: the grant is funneling money from you to a domestic airline. If it was about accountability, you would have to prove the flight was really needed in the first place, and then that you found the best price. (Though the grant should allow you to ignore the "skip maintenance and pilot training to give you a lower price" airline; but if that best price happens to be foreign, it shouldn't matter to the grant unless there is corruption involved.)
Friend, at a certain point the overhead of administering these kinds of checks is more costly than just letting people buy tickets to go to conferences. And at that point the corruption isn't in the university; it takes the form of handouts to large corporations.
No, grants are not a firehose of tax money with no accountability--not even close. You completely made that up.
How do we start over in some sane way?
Peer review should happen out in the open and not just be limited to academics.
The fact that it's not out in the open is somewhat complicated. You're perhaps right that it would lead to better outcomes, but it's also important that researchers feel free to speak openly.
This also fixes the problem of incompetent peer review, because it will be called out as such and the reviewer's reputation will suffer.
I don't think it's controversial though, isn't it commonly believed that increased transparency means less corruption? It might not be true, but if it's the prevailing belief then the burden of proof is in fact on you.
Scientists themselves have a hard time "following the science". Add to it the observation that when an issue is getting lots of attention outside of academia, then there are usually some really strong incentives (profit, prestige) associated with doing the science and applying it (e.g., epidemiological science during a global pandemic).
The question seems not to be about how can normal people "follow the science" but rather, why should normal people trust at all that any touted science is anything more than bullshit spouted by highly-motivated sophists?
In the current climate, frankly I think it's absurd that we're putting so much trust in science, or rather what it has become.
The fundamental problem is that science as in the method is absolutely worth putting your trust in, but a lot of what's sold as Science^TM has diverged from it far enough to be worthless. However, it still bears the same name and borrows its credibility. There are countless examples even from the places one would think to be the most trustworthy.
What science as in the method hinges on, as opposed to Science^TM, is verifiability. Disciplines that aren't easily verified suffer from the replication crisis to the point where the field is basically synonymous with it. I would go as far as arguing that unless something has been verified several times, it should be nothing more than a hypothesis. Note how popular science media are basically living off doing the opposite (though I don't think much better can be expected from the media honestly.)
Math and social sciences form the two ends of the verifiability (and reproducibility) scale. CS is close enough to math that it's not a dumpster fire like psychology but I would say we're still suffering a lot of BS research. To fix this we need actual rigor, more openness about the methods, and frankly, motivation to reproduce results.
We need good politicians to negotiate a consensus on how we move forward in light of human desires and modern thinking about cause and effect. Pretending that "science" provides us with a way forward is abusing science for something it is not designed to do nor capable of doing.
I have read a lot of papers. I generally think science can be a force for good. I understand analytic methods developed by or used in papers from my field of interest. I generally believe that those methods are capable of answering important and interesting questions.
In my view, the problem is that you can't know if an article is good or bullshit until you sit with it for, say, at least 2 or 3 hours (some papers even more). And that is for someone with my background. I tried to do this same thing when I had an undergraduate level of education and it (a) took me a lot longer (at least 10x), and (b) I missed a lot of the mistakes/scams/lies that I would not miss now. (I'm sure I am not able to detect some bullshit even still.)
We should follow the good science. We should not follow the bullshit science. This sounds hard because science, being more technical, is harder to vet. But upon further reflection, it seems that society hasn't figured out how to deal with much simpler lies, either.
There's also no way to really know if the researcher entirely left out 10 other tests they tried that failed. Sometimes you can guess that it's a stretch because of the stupid categories they use (I'm reminded of those ESPN graphics that say things like "most home runs on a rainy Tuesday in June"). But it's harder to detect if someone straight up removes data points, repeats tests and reports only the nicest, etc.
So at some point you basically need to be an insider in the field so you hear the gossip about what doesn't replicate. Or if you have access to thousands of dollars to blow you could try a dozen different variations to try getting it to replicate yourself.
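The "10 other tests they tried that failed" problem above is just the multiple-comparisons effect, and it's easy to quantify with a toy simulation. This is a minimal sketch (not anyone's actual methodology): if a researcher runs several independent tests on pure noise at the usual 0.05 significance threshold and only reports the "hits", the chance of at least one spurious positive grows quickly.

```python
import random

def chance_of_false_positive(n_tests, alpha=0.05, trials=100_000, seed=0):
    """Estimate the probability that at least one of n_tests run on
    pure noise comes out 'significant' at the given alpha level."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Each null-effect test is 'significant' with probability alpha.
        if any(rng.random() < alpha for _ in range(n_tests)):
            hits += 1
    return hits / trials

print(chance_of_false_positive(1))   # ~0.05
print(chance_of_false_positive(10))  # ~0.40, i.e. 1 - 0.95**10
```

So a paper reporting one significant result out of ten quiet attempts has roughly a 40% chance of being reporting noise, which is exactly why the unreported attempts matter.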
I think for something like COVID that is actively affecting many people, there should be funding explicitly for replicating studies, and some slots reserved in a prestigious journal for the findings of the replications. I get that it is not feasible to be replicating everything in science, but I don't see why we can't have ~one lab per relevant university department that specializes in replicating important studies. If you make that a path towards becoming a tenured prof I think that could change the culture surrounding replication studies in general.
Outside of your field, how many of the BS papers can you catch? I know enough about computers that I could probably figure out at least some of it in that field (after spending 10x longer than someone who actually reads papers regularly), but give me a paper in something else and I'm not so sure.
I still occasionally see things like "hanging a potato on your wall will cure your child's flu" being debated by friends of friends on Facebook. You'd need to take a time machine several hundred years back for it to be within the realm of genuine scientific debate.
That’s why therapy is a must - the “buttons” drugs can press are simply not fine-grained enough on their own to manage depression.
I have seen physicians at top hospitals. Nobody cares about the reaction to the drug, only about treating the current symptoms. I have heard "maybe it was the drug, maybe it wasn't." My reaction was instant, during a hospital visit. It makes me feel terribly uneasy; getting the word out there doesn't help me in any way, it helps to protect other people, which frankly the government should be doing.
I honestly don't know how nobody thought maybe we should look at the data we have: look at the medications patients took, and then whether they suffered from an illness within a certain period of time.
There is a group who met with senators on fluoroquinolone antibiotics.
The FDA has updated the "black box" warning on these drugs multiple times: first for tendonitis issues, then depression/anxiety, and now for possible permanent nervous system damage. Yet doctors remain completely uninformed, and the drug is given out for 'suspected' UTIs. The EMA in Europe now recommends these drugs be used only for life-threatening infections.
Recently, a physician submitted a request to the FDA to require written patient consent before taking the medication, due to possible side effects. The FDA said that due to COVID they are unable to review it at this time.
I would suggest adding a feature to your algorithm: before taking a drug, also look for groups of sufferers. If there are many groups, you might want to take something older and safer.
The first "big lie" I experienced is the food pyramid. This was a big government push in the schools that told us all to eat carbs like crazy. Turns out it was just pure corruption, paid for by the grain industry. They have killed literally millions of us with this lie alone. And there were no consequences for this. No one went to prison. At some point you have to ask yourself: "How many millions of people does the government/industry have to kill before we stop believing them?" For me, it was the first million who died of diabetes and other obesity-related diseases.
This is a clear misrepresentation of what science is and is not. Specifically, science is not normative.
Science will teach you how to build the bomb. It will never be able to tell you whether you should detonate it.
When people disagree about whether to detonate the bomb or not, one side may be able to act as if the predictable consequences automatically determine the ethical norms that should guide the decision, while ignoring the implications of the unpredictable consequences. This side may attack the other side as "ignoring the science" in order to ignore the more difficult normative debate. A counter-reaction to the initial unfair sleight of hand is sometimes to act like one should completely ignore the scientists anyways, and then maybe the accusation is more justified, but still ultimately meaningless: norms are simply not grounded in facts. People also aren't necessarily consistent with their own norms. They might agree on some premises, then, when presented with conclusions that follow, just find the result so repugnant that they search for a way out. That's when we search for chinks in the science, because admitting to our own natural moral hypocrisy is just too painful. But we can always just change our basic norms and reach a different resulting decision, and we know this in our gut, so searching for the facts to support a predetermined conclusion doesn't seem so different.
Be a rock. This doesn't mean be hard-headed and unpersuadable, but it does mean not being led around by statistics and citations and "studies". Know who you are, trust your instincts, and do what is right for you.
The fact is, unless you're actually the scientist doing the science, almost all the papers and publishing and Science The Meme! that happens and is constantly and endlessly touted, has zero bearing on your own personal life. Ignore it all.
In particular, when it comes to health science, you can literally ignore every little bit of it--none of that garbage gets in front of a wide audience unless it's got some profit for some concern at the other end. Know how to feed yourself and stay fit and healthy. Sure, you could use the NYTimes to determine whether a pescetarian diet is superior to a carnivore diet, or whether you should be eating highly-processed factory-produced fake meat and in what quantities, or you could just use your own common sense and your own body to do your own individual science--try carnivorous eating for a month, then vegan, see how you feel.
Be confident and ever-increasingly capable. Become more dangerous every day. Have an anchor that ties your core values to who you are, however you define that, and let the rest of the masses get socially engineered into believing whatever the hell they want.
Well, okay. I was kinda with you when you were saying you can ignore most health science headlines, but then you seem to change your mind and hint that there must be some way of gaining knowledge about one’s health without science. So how then? Common sense and personal experimentation are only going to get you so far.
Are you seriously saying you can't figure out how to do these things without bullshit "studies"?
Get some weights. Eat high-quality meat. Sleep 8 hours. Tweak as necessary.
That gets 90% of people to ever-increasingly healthy.
You've been totally neutered if you think you need more than that.
Another COVID-related example is masks. At one point early in the pandemic there was messaging that masks were not useful in preventing the spread of COVID in the general public. There were supply chain reasons that probably motivated that message, but I also know virology PhD students that were insistent it was the truth, with papers in hand. At the time there were also papers to suggest the opposite, so one could have a healthy debate on the subject - but at the end of the day I gave my grandparents a small box of masks, because wearing a mask is so low effort I don't see why you wouldn't unless you are truly extremely certain that it's useless.
Just one of MANY stories why BLM protests are so much less risky than, e.g., Trump rallies.
When it was widely rebuked as hypocritical bullshit, they even came up with a crazy way of explaining it away: Now racism was a "public health crisis", all of a sudden. See the google trends: https://trends.google.com/trends/explore?date=all&geo=US&q=r...
Just like how January 6 was an insurrection worthy of months of security state theatrics and proclamations that "white supremacy" is the biggest threat to the USA domestically, whereas an entire summer of burning buildings and riots was "mostly peaceful protests".
This is why nobody trusts authority and the media anymore--they have given up on even a pretense of seeming trustworthy.
Yes - and it’s not true - there is plenty of footage of protesters not following health guidelines, and indeed plenty of discussion of how the protests may have precipitated a covid spike.
The fact that other similar events were not permitted proves the hypocrisy in the public health guidance regardless of the outcome.
If you don’t think there is any political bias in public health policy, that’s fine, but it seems like we are in disagreement about that.
This is a significant moving of the goalposts. But anyway, you seem convinced it's all part of some huge partisan battle where you're sure that your side is the good side and everything is stacked against your side. I'm not part of this battle.
Not really - it’s a matter of degree. When there is too much bias displayed, it stops being public health and is discredited.
> you seem convinced it's all part of some huge partisan battle where you're sure that your side is the good side and everything is stacked against your side.
This seems like pure imagination on your part. I suggest you reread the thread. You’ll see no evidence of anything partisan from me.
I simply think that public health officials have undermined trust by politicizing the issues or otherwise distorted their message. I.e. they have ‘discredited’ the field as the other poster said.
> I'm not part of this battle.
Are you sure? You are the only one reading this conversation and seeing a ‘battle’.
Actually, it is not science itself that is causing this issue, but research in its various forms.
Trusting science is fundamentally different from trusting "scientists".
Trusting science is essentially trusting nature to behave in a regular fashion. But trusting research, done and proclaimed by a bunch of humans, is, well, trusting human beings. Nothing could be more fundamentally different.
I think adding a "repeatability factor" to research could help. If anyone on the planet can replicate a research method, it should have a repeatability factor value of 1. If only a single entity can replicate it, it should have a value of 0. If some entities can validate it, it should have a value somewhere in between.
This does not mean that research with a low RF cannot be applied widely. It's just that it will have to do additional work to gain the trust of the people.
It only follows common sense that trust in something cannot be mandated. So measures based on low-RF research should NEVER be mandated, no matter the cost.
And thus I think this can curb the corruption and manipulation done in the name of science.
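One toy way to formalize the repeatability factor described above: treat it as the fraction of independent entities whose replication attempts succeeded. This is only a sketch of the idea; the lab names and the dictionary-of-attempts representation are hypothetical, and a real scheme would need to weigh attempt quality and independence.

```python
def repeatability_factor(replication_attempts):
    """Fraction of independent replication attempts that succeeded.

    1.0 means everyone who tried managed to replicate the result;
    0.0 means nobody outside the original lab could.
    """
    if not replication_attempts:
        return 0.0  # no independent attempts: treat as unverified
    successes = sum(1 for ok in replication_attempts.values() if ok)
    return successes / len(replication_attempts)

# Hypothetical record: which labs tried to replicate, and whether they succeeded.
attempts = {"lab_a": True, "lab_b": False, "lab_c": True, "lab_d": True}
print(repeatability_factor(attempts))  # 0.75
```

The "never mandate low-RF measures" rule then becomes a simple threshold check on this number, which is the appeal of the proposal: it turns a fuzzy argument about trust into something auditable.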
There is a realistic, weaker statement about the best available information we have, that a specialist could use to explain to a non specialist why they are making a recommendation about something emerging or theoretical. But what we are hearing with "follow the science" really means follow the carefully crafted political message that politicians with scientific credentials have put out.
It's easy to see a distinction. Nobody needs to be told to follow the science on antibiotics or birth control or something. I think the blatant anti-intellectualism in the "follow the science" type statements is why we have so much worry about vaccines, for example. People aren't stupid, and they can tell the difference between being manipulated and being presented with something objective. Even if you're right, it's a bad strategy to try and trick people or use religion to get your point across. See "the science is settled". Nothing makes people stop listening faster.
Edit: and ironically, people call those who don't "follow the science" anti-intellectuals, as if intellectuals take things on blind faith. Every time I hear mention of anti-intellectualism, I have to remember that people are referring to those that question official doctrine, as opposed to those who have framed religion as science to try and short circuit debate.
A fun one that circulated for a while when the vaccines were first launching was "all vaccines are the same". They are decidedly not, and I don't think people are as stupid as that kind of messaging implies. It was weirdly taboo to say that you'd prefer a particular vaccine, even once we got to the point that there was enough stock to choose. It's true the clinical trial efficacy numbers shouldn't be literally compared to one another, but that doesn't mean we should pretend the vaccines are actually the same either. Somebody deciding to drive an extra hour so they can get an mRNA vaccine is not anti-science lmao.
Most scientific fields I know I can't follow because I don't have enough background. I love reading papers in the fields I have a base understanding to be able to get something out of. The idea the average person can follow all scientific fields with no background just doesn't make sense.
When I read language that says "follow the science" or "based on science" it is almost always using science as a rhetorical device and should not be trusted, period.
This is actually closer to medieval magic than science. The incantation "based on science" makes every piece bullshit "true".