Hacker News | micro2588's comments

There is no way to tell yet what the longevity of the resource will be; it's too early. In fracked resources the main issues are "short circuiting," where increased flow rates travel along preferential paths between the doublet wells as the source rock cools, and the cooling rate of the source rock in general. This causes the MWt output of the resource to decline per injection/production well pair. Fervo is getting around this by drilling extra wells per pad that can be turned on in response. Many geothermal resources decline over time as heat is slowly extracted; these declines are somewhat manageable by tuning injection/production well rates and drilling new wells, and they are built into the economics of existing plants. Geothermal is in this way kind of extractive rather than "renewable" over medium-term time scales: you need to keep drilling at a certain rate. Rock is a good insulator, and it takes a long, long time for it to heat back up.
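A back-of-envelope calculation illustrates the "rock is a good insulator" point. The numbers below (thermal diffusivity of granite, a 100 m cooled-rock length scale) are representative assumptions, not figures from any specific project:

```python
# Conductive reheat timescale t ~ L^2 / alpha for a cooled reservoir
# region of characteristic size L. Purely illustrative numbers.
alpha = 1e-6        # thermal diffusivity of granite, m^2/s (typical value)
L = 100.0           # characteristic cooled-rock length scale, m (assumed)

t_seconds = L**2 / alpha
t_years = t_seconds / (3600 * 24 * 365)
print(f"Conductive recovery timescale: ~{t_years:.0f} years")  # ~317 years
```

Centuries for the rock to recover by conduction alone, which is why a developed field behaves more like an extractive resource than a renewable one on human timescales.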


You are right: there is no getting around the fact that relatively low-grade heat is a big barrier to scaling geothermal in terms of energy production. Binary / organic Rankine cycle plants used for these low/medium-temperature resources operate at ~10% efficiency. Dry/flash steam resources are higher but produce waste in the form of emitted GHGs and/or crap in the geothermal brine.
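The ~10% figure is consistent with thermodynamics: real plants achieve maybe a third of the Carnot limit, and for low-grade heat that limit is already small. A quick sketch (assuming a 25 °C heat-rejection temperature) of the ideal efficiency at a typical binary-plant temperature versus a supercritical one:

```python
def carnot_efficiency(t_hot_c, t_cold_c=25.0):
    """Ideal (Carnot) heat-engine efficiency between two temperatures in Celsius."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

# Low/medium-temperature binary (ORC) resource vs. a hypothetical supercritical one
print(f"150 C resource, Carnot limit: {carnot_efficiency(150):.1%}")  # ~29.5%
print(f"800 C resource, Carnot limit: {carnot_efficiency(800):.1%}")  # ~72.2%
```

A ~30% ceiling at 150 °C, of which real ORC plants capture roughly a third, matches the ~10% quoted above; this is why the higher-temperature resources discussed downthread are so attractive in principle.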


Deep geothermal promises to provide what is usually considered high-grade heat (800+°C), but what I'm trying to understand is how cheaply you can convert that high-grade heat into electricity, because the answer seems to be "far too expensively to be competitive with wind and solar".


Supercritical geothermal is similar to talking about the economics of fusion. There is a DOE enhanced geothermal test site near the Newberry Volcano in central Oregon which has temperatures close to this range at reachable depths. That is more of a demonstration site for drilling technology.


Yes, but if (as I am claiming) there's no way to economically turn heat into electricity, it's irrelevant whether supercritical geothermal steam costs trillions of dollars per borehole or is free; either way it's uneconomic as a source of electricity.


Ormat (NYSE: ORA) is a publicly traded geothermal company, and they are profitable.


Fervo's initial demonstration project was next to an existing power plant in Nevada (Battle Mountain) which had previously failed to produce at its stated capacity over time, so they were able to tie extra MWt capacity into an existing ORC turbine. Fervo's technology has to be located somewhat near existing traditional hydrothermal geothermal resources, because it's the convection along an existing fault for hundreds of thousands of years that produces an above-background thermal gradient near enough to the surface to be economical. That is true for their demonstration area in Utah, which is located near the existing Blundell geothermal power plant in Milford, Utah.


Sure. Exposing these situational constraints and free benefits (third-party sunk costs) aligns with my stance.

I don't agree with the above comment:

> Fervo isn't just trying, they are succeeding


I think as a tech demonstration project it was successful because they were a bit conservative in some ways that will make the economics look worse. I agree it's far from "geothermal everywhere" which seems to be the hype. You can't extrapolate that from one successful EGS well literally right next to an existing geothermal power plant.


> they were a bit conservative in some ways that will make the economics look worse

Or they simply ran into headwinds on a speculative project. I'll "take the under" when your PR is cagey about basic project attributes.


They do a good job of publishing their results in technical industry publications (advancing the field overall in a surprisingly open way), but I agree they can be misleading in their marketing.

It will be interesting to see the results of the Cape project once they do multi-well laterals from a single pad feeding a power plant, with larger-diameter wells. That is really more a demonstration of power-plant economics, beyond the technical feasibility of creating a horizontally fracked reservoir that can be operated for a year.


Ah!

Cape Station does look much more significant. [0] 400MW of power plant capacity with 2028 COD and mostly contracted with SoCal Edison? Good job.

[0] https://www.utilitydive.com/news/cape-station-enhanced-geoth...


Julia has been designed for single-core performance, full stop. Functional collections may work well with a state-of-the-art GC; with Julia's, not so much. The fact that Julia can interop seamlessly (and easily) with C code kind of bounds the design of the GC.

I think it is a little disingenuous to say that a Julia programmer does not have to worry about types. Type inference alleviates many burdens, but correct typing of arguments is essential (and hidden promotions or casts can kill performance). So while you can write correct programs easily, for efficient programs you end up worrying about this quite a lot.


> Functional collections may work well with a state of the art GC, with Julia's not so much.

I don't know the specifics of Julia's GC, but this seems a strange thing to say in 2017. Douglas Crosher's conservative generational collector for CMUCL (also used in SBCL AFAIK) supports C interoperation and is entirely adequate for handling the extra garbage that (admittedly) is generated when using functional collections. I don't recall exactly when he wrote that collector, but it must have been 20 years ago at least. It would be strange if Julia weren't using something at least as sophisticated.


I'm not sure I understand: why would the C interop limit the design of the GC?

I didn't say Julia programmers don't have to worry about types, my comment was only about type annotations, though they are obviously related. Perhaps a better way to phrase it is: by worrying a little bit about types, you can generally avoid the need for type annotations. Hopefully as the tooling around the language matures, we can reduce the amount of worrying required.


Will this eventually solve the "Julia does not like Pizza" issue (https://github.com/JuliaLang/julia/issues/3721)?


Ideally yes. I'd have to confirm that this extends to Pizza and Koalas. We fixed that issue as much as we could, even going as far as generating our own unicode width tables extracted from unifont, but it wasn't possible to fix in general without support from the terminals. Now that the standard is fixed (hopefully), I don't see a reason why the terminals wouldn't update their tables.
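For readers unfamiliar with the linked issue: it is about terminal display width of emoji like 🍕. The width tables in question map each codepoint to a cell count; Python's stdlib exposes the relevant Unicode property, which can be used to illustrate what a terminal is supposed to do:

```python
import unicodedata

# Since Unicode 9, emoji such as U+1F355 (pizza) and U+1F428 (koala) carry
# East-Asian-Width "W" (wide), i.e. they should occupy two terminal cells.
# Terminals with stale width tables render them as one cell, which is what
# misaligned the Julia REPL prompt in the linked issue.
for ch in "\U0001F355\U0001F428":
    print(hex(ord(ch)), unicodedata.east_asian_width(ch))  # both report 'W'
```

This is only an illustration of the width property, not of Julia's own (unifont-derived) tables mentioned above.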



Hi Patrick, can you point to a paper / resource that describes this recent work?


Sure. The paper is called "Type Checking Modular Multiple Dispatch with Parametric Polymorphism and Multiple Inheritance" by Allen, Hilburn, et al.


You can do this using the SymPy.jl package. As a good chunk of the matrix functionality is parametric on the element type, you can do matrix operations on SymPy's symbolic variables.

See example 6 in: https://github.com/andreasnoack/andreasnoack.github.io/blob/...
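The underlying idea is that generic matrix code only requires the element type to support arithmetic, so it works unchanged for floats, rationals, or symbols. A minimal Python sketch of the same principle (using exact `Fraction`s in place of symbolic variables; `det2` is a hypothetical helper, not part of any library):

```python
from fractions import Fraction

def det2(m):
    """Determinant of a 2x2 matrix; generic over any element type with * and -."""
    (a, b), (c, d) = m
    return a * d - b * c

# Works with floats...
print(det2([[1.0, 2.0], [3.0, 4.0]]))          # -2.0
# ...and equally with exact rationals, with no change to the algorithm.
print(det2([[Fraction(1, 3), Fraction(1, 2)],
            [Fraction(1, 4), Fraction(1, 5)]]))  # -7/120
```

Julia's parametric `Matrix{T}` takes this further: the compiler specializes the same generic code for each concrete element type, including SymPy's symbolic objects.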


I think this comment is referencing the fact that, as of right now, slicing a Julia array creates a copy and not a lightweight "view" of the sliced data. This makes vectorized operations on slices of arrays sometimes faster in numpy than in Julia. See https://github.com/JuliaLang/julia/issues/3701.


Note that you can already make view slices; it just isn't the default. Per the issue you linked, the default will change to views in version 0.3.
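The copy-vs-view distinction being discussed has a direct stdlib analogy in Python (this is an illustration of the semantics, not Julia's API): slicing a `bytearray` copies the data, while slicing a `memoryview` of it aliases the same buffer.

```python
buf = bytearray(b"hello")

copy_slice = buf[1:4]               # independent copy (like Julia's copying A[2:4])
view_slice = memoryview(buf)[1:4]   # aliases buf (like a lightweight array view)

buf[1] = ord("E")                   # mutate the underlying data
print(bytes(copy_slice))            # b'ell' -- unaffected by the mutation
print(bytes(view_slice))            # b'Ell' -- sees the change
```

Views avoid the allocation and copy on every slice, which is where numpy's advantage on sliced vectorized operations comes from.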

