Hacker News
Solar panel made with ion cannon is cheap enough to challenge fossil fuels (extremetech.com)
464 points by mrsebastian on Mar 13, 2012 | 156 comments

That is some amazing technology right there, almost fun to read because it sounds like science fiction but it's real.

And privately designed/built/owned particle accelerators? It's definitely a new era.

What if one day the half of the globe getting sunlight powered the grid for the other half? Of course this would require very peaceful nations on each continent, so even if we had the cost-effective technology now, it could take hundreds if not thousands of years to happen politically.

I agree the particle accelerator is pretty cool. Of course, transporting power around the globe is not as easy as stringing a wire: every mile of wire loses power to its resistance. This is the problem of 'power transmission'. It's the same reason you sadly can't build nuclear power plants in the middle of a desert 500 miles from anywhere and ship the power out.

That being said, at $0.40/watt you can cover a lot of rooftops economically, and that means you can cut the top off the 'peak' usage for a city or town. Cutting that peak off is a huge win because power companies often have fossil fuel plants (called 'peaker plants') which they bring online only during peak power demand.

The cool thing about having the buildings support the power load is that you aren't transporting it very far (few losses to power line resistance). And yes, even on the most overcast day solar panels generate power. The ones on my roof in California can generate nearly 50% of their best-day output on overcast days. That does not hold for panels covered in snow, though, so there is some requirement to keep them clean.

We can build power plants in the middle of the desert and ship power out. The technology exists; we just haven't done it yet. Transmission losses are not a concern for superconducting cables (though maintaining the cryostat jacket is). Such cables are currently being proven before being used in larger grid connections.




Forget superconductivity, what about HVDC [1]?

HVDC can be as efficient as 97%, compared with the US average of 93%. That 4% "inefficiency tax" would likely pay for long-haul HVDC connectors (as outlined in one of Obama's 2008 energy proposals) within years.

[1] http://en.wikipedia.org/wiki/High-voltage_direct_current#Adv...
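Rough numbers on that 4% saving. Everything here is an illustrative assumption (link size, utilization, power price), not data about any real project:

```python
# Rough payback sketch for a long-haul HVDC link. All figures here are
# illustrative assumptions, not data about any real project.
power_gw = 3.0                    # assumed average power carried
hours_per_year = 8760
ac_loss, hvdc_loss = 0.07, 0.03   # 93% vs 97% efficient, per the comment above
price_per_mwh = 40.0              # assumed wholesale price, USD

energy_saved_mwh = power_gw * 1000 * hours_per_year * (ac_loss - hvdc_loss)
savings_per_year = energy_saved_mwh * price_per_mwh
print(f"{energy_saved_mwh:,.0f} MWh/yr saved, ~${savings_per_year / 1e6:.0f}M/yr")
```

At those made-up rates, tens of millions a year per link, which is the sense in which the "inefficiency tax" could plausibly pay for the converter stations within years.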

New Zealand has had a 600V HVDC link from the South Island to the North Island for quite some years - most of the power is generated by the hydro lakes in the south, then delivered via a 610km link to a substation in the north, where it is fed into the normal high-voltage transmission grid.


So it is a proven technology to have this as a long distance link (been in operation since 1965) and it is capable of bi-directional power transfer as well.

I'm curious where you get the "600V" number from -- is that 600 volts? According to the Wikipedia page you linked the system uses "+270kV and −350kV", which is in the range I would expect. Maybe you meant 620kV, the difference between +270kV and -350kV?

My apologies, that must've been some confusion in my mind... it was a 600MW HVDC link... although it is higher now, and works are underway to bring it up to somewhere near 1.4GW.

HVDC definitely makes sense for connecting islands together. The cable goes underwater, and in water, the capacitance of the cable is much increased by the greater dielectric constant of the water, which makes AC transmission less efficient.

You know I've never heard of HVDC until I read this. So was Edison right all along?

Not really. The AC transformer was the most efficient way to adapt voltages until very recently. The IGBT and GTO thyristors are the technology that makes something like HVDC affordable today, and they are very modern developments.

Mercury-arc rectifiers (the technology that IGBTs and GTOs replaced) were very large and expensive, and also less efficient (even more so before the 1930s or so), so HVDC only made sense for submarine cables (where AC suffers huge capacitive losses). With IGBTs and GTO thyristors it starts to become feasible to, e.g., run an HVDC line across a continent.

As an example, the inter-island HVDC link was built using mercury-arc valves (it includes a submarine leg), but they have since been replaced with solid-state devices.

HVDC is also a good way to tie two separate grids together; since they will be on different time bases, you can't just directly AC-couple them.

No. AC is still far more useful in many key scenarios [1]. However, for long-distance power transmission, lots of money could be saved (and jobs created in doing so) by switching from AC to HVDC.

[1] http://en.wikipedia.org/wiki/High-voltage_direct_current#Dis... (note: some of these are due to AC parts having economies of scale and decades of efficiencies and process that HVDC may not be able to tap into without widescale implementation)

The real problem is that a nuclear power plant needs a staff of several hundred people. They can't travel 500 miles every day to work in the middle of a desert. Even if they wanted to relocate there, you would have to build a little town to serve these people and their families, and now we are back to the original problem.

Other problems with building nuclear plants (or any other type of plant that requires large steam turbines) in the desert include:

* Difficulty cooling. If you build next to an ocean or river, you can just use some of that water to provide the cold end of the temperature differential that you're using to generate power. Deserts are trickier, and more expensive.

* Transportation. If you build next to navigable waterways, you can ship really big components on barges. In deserts, you can ship some things by rail.

But hey, at least it's politically convenient to stick scary power plants in deserts.

Some of the GenIV designs don't require water cooling.

They all require some kind of cooling. Fundamentally, a nuclear reactor sets up one end of a temperature differential that you can use to do work:


Water just happens to be a particularly convenient coolant.

I'm aware of that, but water isn't the only convenient coolant. Molten salt, sodium, molten lead, and helium are other options.

Water has some disadvantages. It's a good neutron moderator, but with its low boiling point you have to keep it under a lot of pressure (160 atmospheres for most light-water reactors). That means you need very strong, thick steel, and a huge oversize containment dome, since if a pipe breaks, the steam will flash into 1000 times as much volume. Then some of that water can split into hydrogen and oxygen, and you'll be at risk of a hydrogen explosion, which is what we all saw at Fukushima.

Molten salt, on the other hand, works at atmospheric pressure, and if something leaks it just drips out and cools into rock.

Sodium has a disadvantage in being reactive with oxygen and water, but it also works at atmospheric pressure. The integral fast reactor design uses a big pool of sodium, which provides so much thermal inertia that Argonne was able to switch off the cooling system entirely, and the reactor just quietly shut down.

Either design works at higher temperatures than LWRs, giving better thermodynamic efficiency.

I think he meant a solar plant in the middle of the desert, not a nuclear plant. Solar plants don't need staffs that large.

> The technology exists; we just haven't done it yet.

Actually, we have, albeit with varying degrees of success: http://en.wikipedia.org/wiki/Solar_power_plants_in_the_Mojav... http://articles.latimes.com/print/2012/mar/04/local/la-me-so...

Wow, those are pretty cool. Hate to be around when they lose cooling (dropping out of superconductivity would be a pretty energetic event :-). But it's good news on that front.

If you've ever seen the insulator jacket in an underground HVDC cable it is huge.

Most of these systems both carry power and sit in a pool of coolant, so they rarely experience such an 'energetic event'. That is, if they lose power they don't need to keep things cool, and it usually takes a few hours for things to warm up anyway.

Not a superconducting cable, but this is a great read:


Thank you for the link - this was a great read =)

No superconductors needed; DC power conversion is the key. There is a large project under way in Europe/Northern Africa called Desertec.


Forget about superconductivity.

How about flywheels?

Flywheels are for energy storage, not transmission.

Heh. Never underestimate the power of a station wagon full of flywheels hurtling down the highway?

Steering a car full of gyroscopes in fast moving traffic? Sign me up!

That's really cool, I'd never heard of these. I imagine the flywheels are horizontal so that the bus can turn without trying to change their angular momentum, but I wonder if there's any effect on banked curves or going up hills.

just let them roll to their destination!

I think he means install them for each house, store the non-peak surplus energy, and then use it during peak times.

Most of the power loss is related to using AC instead of DC.

Citation needed?

Somehow for hydroelectric dams it works perfectly fine to send 2.73 GW 500+ miles. Look at the W.A.C. Bennett dam in Northern BC, which supplies power to Washington, Oregon, and California, or Churchill Falls, which supplies 5.7 GW of power to Quebec and New York.

Perhaps you haven't seen the Nelson River project.

There's an obvious opportunity that most people miss - put energy-intensive manufacturing processes in sundrenched places. Industrial energy consumption is about 30% of total use in the US, representing a greater share than either transport or domestic use. Aluminium smelting requires massive amounts of electricity, so we're already locating smelters near to hydroelectric dams to take advantage of cheap electricity.

There are hundreds of other processes with huge energy inputs that are currently satisfied by cheap fossil fuels, but could well be met by equatorial solar facilities. Considering how quickly manufacturing transitioned to China, it seems entirely plausible that we could move most high-energy, low-labour processes to equatorial regions within a couple of decades.

Transportation of electricity around the globe is not necessary. Electricity usage is considerably lower during nighttime. Thus solar and wind fit in with our natural rhythm quite nicely. And it is also produced near the place where it is being consumed. Another plus.

But still the author is right, we need ways to store electricity and we will also need a much more flexible grid than the one we have today.

The neo-liberal FDP here in Germany is trying to kill off the solar market by drastically reducing the fee you get for feeding solar electricity into the grid. But we already have grid parity, so it is feasible to put a photovoltaic system on your house if you size it carefully and consume much of the energy yourself. It simply reduces your utility bill. If this new technology really halves the price of modules, people will surely continue installing new systems. Yay!

Germany's solar stimulus is highly questionable. Sure, it gets solar panels installed on a large scale, but government-forced diffusion of an innovation is a questionable idea. Pumping money into a relatively narrow market and creating an artificial demand situation leads to a lot of side effects, e.g. Germany's quasi-stimulation of the Asian component-manufacturing market.

Compared to pumping money into pointless wars I think pumping money into subsidized solar panels is an excellent use of money. Sure there are many other ways to spend that money, and some of them may even be better.

But this is definitely long term thinking on their part. Already Germany is the current world leader when it comes to renewables as a percentage of total generating power.

As the price of power goes up this 'stupid' strategy starts to look smarter by the day.

Germany also shut down something like 60% of their nuclear power plants in response to Fukushima hysteria, even though in central Europe there are no tsunamis and no big earthquakes.

Then, during the January/February cold snap this year, they had to buy power from neighbors that still have regular "obsolete" power plants, because they couldn't provide enough energy for themselves. Green energy means you hope for good weather, or for your neighbors to have enough electricity.

Germany is now protesting Poland's new nuclear power plant, when Poland is trying to reduce its dependence on fossil fuels (something like 96%, I don't remember exactly).

This isn't smart. This is hysteria-motivated energy policy.

Sorry, but you are not correct. During the very cold days this winter it was very sunny. Because of this, photovoltaic could compensate. It went through the local media that even though most of the nuclear power plants were shut down, we could still export (!) electricity to our neighbours. For example to France, where they heat a lot with electricity and had a shortage because of the cold.


Thanks. I remember reading it the way I've posted above, but now I can't find that article, and I've found corrections all over the web: http://enenews.com/scandal-germany-not-restarting-nuclear-re...

EDIT: one source http://rt.com/news/germany-reactors-cold-weather-927/

I agree. And when people talk about the money required for solar being so much more than the money required for carbon-based fuels, they don't take into account the costs of controlling resources in the Middle East, the costs of caring for people with lung problems, or the cost of cleaning up the Gulf Coast and all the problems related to that (not to mention the costs of similar problems worldwide, such as in Nigeria). I'm sure there are similar external costs to solar, but I highly doubt they're nearly as large.

The problem is that these costs actually add to the GDP, which makes them look like a good thing, because they "make the economy grow".

The reduction of the solar subsidy is right. It is not killing the solar market at all, but new solar panels on the roof of your house won't give returns as nice as 10% anymore. Also, those returns are paid by every electricity consumer in the country, which at the current rate isn't fair at all.

Also, the adoption of solar panels on homes went far faster than most people expected. The grids in single-family-home neighborhoods simply are not designed to handle much power being fed into the system at this point. The problem is that in the daytime, when the sun shines, there is not much power needed in residential neighborhoods, since everyone is at work. This is why self-consumption is the desired use case for solar energy on family homes. That's why I think there is a market for home control right there. -> "Start washing machine WHEN solar energy > X"
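That last rule is easy to sketch. The 2 kW appliance draw and the simple surplus threshold below are made-up illustrative numbers, not anything from a real home-automation product:

```python
def should_start(solar_output_w, household_load_w, appliance_draw_w=2000):
    """Start a deferrable appliance only once the solar surplus covers it.

    The 2 kW draw and the bare threshold are made-up illustrative numbers.
    """
    surplus = solar_output_w - household_load_w
    return surplus >= appliance_draw_w

print(should_start(4500, 1000))   # sunny midday, plenty of surplus -> True
print(should_start(1200, 1000))   # overcast morning -> False
```

A real controller would add hysteresis and a "finish by" deadline, but the core is just this comparison.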

IMO, as long as the tax is in line with the actual external costs of other forms of energy generation, it's not a bad thing. (http://en.wikipedia.org/wiki/Externality) It's basically renaming a sin tax as a green subsidy, but there is value in having that tax even if its revenue is mostly wasted.

I agree in principle with the idea that carbon-based energy has a ton of negative externalities (megatons, in fact...), so there should exist a relative subsidy between it and green.

The issue is that this creates an even bigger distortion between the energy sector as a whole and the rest of the economy, since fossil fuels are already subsidized (by not pricing in the negative externality) and we're adding a separate subsidy to green energy.

So we're distorting the market to favor creating green energy over reduced consumption of energy. When what's really needed is simply an effective tax on carbon, and then let people decide the most efficient way to respond.

I think they are taxing existing energy sources to pay for the subsidy, so on net people using electricity pay the actual cost for green power and other sources based on the actual mix of energy sources used in production. It's only the producers who notice the cost difference this tax/subsidy creates.

> Thus solar and wind fit in with our natural rhythm quite nicely.

It's not that nice, because power output from wind/solar plants is random (it depends on things like cloud cover or wind strength), and that causes problems in the power grid, where supply must meet demand exactly or bad things will happen.

"Without Hot Air"[1] covers this, and other renewable-related issues quite nicely and with real data. I recommend it, it's a good read. It has some good ideas on how to solve power supply/demand problems.

At TED2012 there was a talk about a new kind of battery designed to solve these kinds of problems in the power grid; the video from the talk is not yet up, though (but I think it should be soon, at [2]).

[1] - http://www.inference.phy.cam.ac.uk/withouthotair/

[2] - http://www.ted.com/talks?lang=en&event=2012&duration...

> power output from wind/solar plants is random (depends on things like cloud cover ...) and it causes problems in power grid, where power demand must meet the supply exactly

The biggest power cost for us in warmer climates is cooling. When the sun is out, we need lots of power; when it's not, not so much. Solar power sufficient to run A/C from rooftop panels costing less than grid electricity would be wonderful, lining up perfectly with the inconsistencies of available sunlight.

I would start with better home design and energy efficiency enhancing improvements before throwing a Carnot-limited solution at the problem.

Last summer I ran an experiment. We live in a two-story house in California. We usually see low temperatures at night (sub-70 °F) and highs of 105+ °F during the day. At night I opened all of the windows downstairs and used a small industrial fan (about 2000 CFM) to pump cold outside air into the house. In the morning I'd shut down the fan and close all the windows.

I could get the lower part of the house down to below 70 °F on most days during the summer. Cold enough to have to wear a sweater. Even with the outside temperature hitting 105+ °F, the thermal mass of the house succeeded in maintaining a very comfortable inside temperature (max around 77 °F). We did not use the air conditioner at all last summer, saving tons of money. The fan costs pennies a day to run.

This summer I am looking at what efficiency improvements I can make to this arrangement. I'm itching to throw a micro-controller at it, but I want to learn a little more before I take that path. There's a lot to do in the roof. Think about it, you have this huge solar heat collector --the roof-- reaching ridiculous temperatures during the day and radiating that right into the house. Sure, there's attic insulation, but that's a ton of energy to deal with.

I'm thinking that some forced ventilation of the attic with a small fan might just do wonders.
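A first cut at the micro-controller logic could be as simple as comparing two thermometers. The 3 °F margin below is an arbitrary assumption, there only to keep the fan from cycling:

```python
def fan_on(outside_f, inside_f, margin_f=3.0):
    """Run the night-air fan only when outside is usefully cooler than inside.

    The 3 degF margin is an arbitrary assumption to stop the fan from
    cycling when the two temperatures are nearly equal.
    """
    return outside_f + margin_f < inside_f

print(fan_on(65.0, 78.0))   # cool night: pull outside air in -> True
print(fan_on(95.0, 77.0))   # hot afternoon: keep the house sealed -> False
```

The same comparison works for an attic exhaust fan, swapping the attic probe for the indoor one.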

That only works in desert type environments (i.e. little to no cloud cover) where the temperature goes up and down each day.

Where I live, if it gets hot it stays hot, day and night. And most of the country is the same.

Although, obviously, incremental improvements are great even if only some people can use them.

You might be able to use the thermal stability you have underground to help cool the house. Granted, this is more expensive, but probably far less costly in the long run. The basic idea is to bury a heat exchanger (coils of tubing) deep underground and circulate a fluid to move heat from hot to cold.

You can use this two ways: You can use it to try to cool the house directly by embedding tubing in the floor/walls or some other approach. Or, you could use it to improve the efficiency of an air conditioning unit by providing supplemental cooling of the A/C unit heat exchanger coils.

>I'm thinking that some forced ventilation of the attic with a small fan might just do wonders.

Radiative barriers should also be on your list. It's essentially just mylar stapled to the rafters.

Especially if one would switch to heat pumps, which can do both heating and cooling, providing/removing 3 to 5 times more heat than electricity they use up to do so. Amazing technology, by the way :).
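The arithmetic behind that claim is trivial but worth seeing; the COP values are just the endpoints of the 3-5 range quoted above:

```python
# Each kWh of electricity moves COP times as much heat; 3-5 is the range
# quoted above for heat pumps.
def heat_delivered_kwh(electric_kwh, cop):
    return electric_kwh * cop

for cop in (3.0, 5.0):
    print(f"COP {cop}: 1 kWh of electricity moves {heat_delivered_kwh(1.0, cop)} kWh of heat")
```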

Most air conditioners are heat pumps configured to work in one direction, using outside air as a temperature sink.

(I realize that this is at least borderline pedantry)

Oh, I interpreted A/C as Active Current; my mistake. Thanks for pointing that out.

(I, for one, don't mind pedantry :))

When solar and wind power input is not meeting demand that seems like a good time to spin up the spare fossil fuel generators, at least as a transitional sort of thing. Maybe in the near future excess wind / solar power can be stored as chemical fuel, and burned during peak usage.

An 80% / 90 % cut in fossil fuel usage or higher would be a huge thing though!

Whilst on a small scale they are very random, there are some good studies (sadly I don't have the references with me) showing that over large areas they actually become quite predictable - so this is less of an issue than people think, if you have enough coverage.

I meant that it is nice in the coarse pattern. And yes, as said, we need ways to store energy and a flexible grid.

Store electricity like this http://en.wikipedia.org/wiki/Turlough_Hill - pump the water uphill during daylight and let it flow down when needed.
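The physics of pumped storage is just m·g·h with an efficiency haircut. The reservoir volume and head below are illustrative assumptions, not Turlough Hill's actual figures:

```python
# E = rho * V * g * h, then a round-trip efficiency haircut. The reservoir
# volume and head are illustrative assumptions, not Turlough Hill's figures.
rho, g = 1000.0, 9.81      # water density kg/m^3, gravity m/s^2
volume_m3 = 2.3e6          # assumed upper-reservoir volume
head_m = 280.0             # assumed height difference
round_trip = 0.75          # typical pumped-hydro round-trip efficiency

energy_j = rho * volume_m3 * g * head_m
recoverable_mwh = energy_j * round_trip / 3.6e9   # J -> MWh
print(f"~{recoverable_mwh:,.0f} MWh recoverable per cycle")
```

Order of a thousand MWh per cycle for a modest mountain reservoir, which is why this is one of the few storage technologies deployed at grid scale.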

Germany has a plan to use Norway as a huge energy store, adding pumping capacity to the hydroelectric plants.

Really? I would be very interested in reading more about this. Got any sources? (Preferably in English or Norwegian ;) )

There is a presentation here http://norwegen.ahk.de/fileadmin/ahk_norwegen/Dokumente/Pres... And an article here about building the necessary cabling http://www.reuters.com/article/2010/08/27/us-norway-seacable...

To be fair, one could equally frame what they're doing as removing the subsidy to people who put photovoltaic systems on their roof. Arguably that same money would have been better invested in research of improved solar technology such as this.

And privately designed/built/owned particle accelerators? It's definitely a new era.

I actually worked at a startup from about 2007-2009 that was designing particle accelerators for another company pursuing what appears to be the exact same technology (possibly some of the same people).


No idea what came of the project. Very high current ion accelerators in the +1 MeV range are quite the trick without a huge budget. Our company was full of people from Los Alamos. We were actually focusing on a different application that needed higher output.

There is a quite sizeable commercial accelerator market for medical, industrial imaging, and radiation sterilization use (and of course various "homeland security" uses).

> And privately designed/built/owned particle accelerators? It's definitely a new era.

It's not that new.


>Of course this would require very peaceful nations on each continent, so even if we had the cost-effective technology now, it would take hundreds if not thousands of years to happen politically.

I don't see politics as being the primary challenge; the US has allies in almost every time zone that could fairly easily facilitate this. I imagine the greater challenge would come from trying to get all that electricity across oceans. Copper wires thick enough to carry a substantial amount of electricity seem like they would be too thick to actually lay, and then you have parasitic losses to contend with should they manage to find a way to do it.

> What if one day the other side of the globe getting sunlight powered the grid for the other half? Of course this would require very peaceful nations on each continent, so even if we had the cost-effective technology now, it would take hundreds if not thousands of years to happen politically.

Why? Look at the world today. Russia supplies energy to Western Europe, Saudi Arabia provides energy to the US, Australia provides coal to China. None of them have much love for each other, it's just mutual self-interest.

I don't think privately designed/built/owned particle accelerators are that uncommon, actually; hospitals have been using cyclotrons for cancer therapy treatments for at least a few years (http://en.wikipedia.org/wiki/Cyclotron#Usage), and I'm fairly certain these are built and sold by medical equipment manufacturers like Advanced Cyclotron.

No denying it's still amazing though...

> And privately designed/built/owned particle accelerators? It's definitely a new era.

Well, you could build a push-pull Van de Graaff generator, hook it up to a discharge tube, hook the tube to a high vacuum pump, and there ya go, linear accelerator in the 1 MeV range. Totally doable in a garage.

There are people who built a cyclotron at home. You'll need to wind a huge coil, but it's doable.

> And privately designed/built/owned particle accelerators? It's definitely a new era.


I guess it's just a question of power!

> Of course this would require very peaceful nations on each continent

Or maybe it would create peaceful nations on each continent. You may say I'm a dreamer.... :)

We're already buying energy from nations that aren't very peaceful, so the political issues may be solvable. I think, though, that transmitting energy halfway around the world would present engineering problems: http://en.wikipedia.org/wiki/Electric_power_transmission

Buying oil from nations we don't like is acceptable because everyone wants to own their own car; it's a very selfish motivation and massively profitable for the middlemen.

Trading cheap electric power with little profit requires governments to make forward thinking, progressive decisions because no business will bother unless they could make millions from their effort.

Also, I often ponder what will happen one day when power is cheap and easily available - you'd want to hope it means less war, but I fear it means the opposite. The war machine will LOVE cheap power, and then power feeds become too easy a target: attacking them or cutting off the other side of the globe as an act of war or terrorism. So we'd need peace first, which is unlikely to happen given most governments.

The technology is very similar to what SOItech is doing for Silicon on insulator wafers: http://www.soitec.com/en/technologies/smart-cut/

Transmitting the power around the planet isn't a given. You'd need superconductors to limit power loss.

Line losses decrease as voltage increases. Also, using DC instead of AC can further decrease losses.
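A toy calculation shows why: for a fixed power, current falls as voltage rises, so the I²R loss falls with the square of the voltage. The line resistance and power here are assumed round numbers, not data about any real line:

```python
# For fixed power P, current I = P / V, so resistive loss I^2 * R falls
# with the square of the transmission voltage.
def line_loss_fraction(power_w, volts, resistance_ohm):
    current = power_w / volts
    return current ** 2 * resistance_ohm / power_w

p, r = 1e9, 10.0   # 1 GW over a line with 10 ohms total resistance (assumed)
for kv in (345, 765):
    print(f"{kv} kV: {line_loss_fraction(p, kv * 1e3, r):.1%} lost")
```

Roughly 8% lost at 345 kV drops below 2% at 765 kV for the same assumed line, which is the whole case for ultra-high-voltage transmission.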

Hydrogen fuel cells powered up by cheap solar panels will be our future power source. No need to transmit electricity across the globe.

I highly recommend reading "Sustainable Energy - without the hot air"[1] to anyone interested in topic of energy sources and use. It covers lots of things mentioned in comments, like storing energy in pump storages or car batteries in order to make solar/wind plants able to provide a big contribution to power grid without breaking it.

What is important, this books talks about those ideas using real data and carefully estimates what's really feasible to do (like, how many pumped storages you'd need if you'd like to switch 50% of your energy sources to solar).

[1] - http://www.inference.phy.cam.ac.uk/withouthotair/

This is why I welcomed the backlash and even some of the sensationalism regarding the nuclear accident in Japan. Even though I realize that nuclear energy can be safe, and that it's good to have an alternative cheap enough to compete with coal, I'd still rather we spend all those billions on switching away from nuclear, putting most of them into renewable energy technologies, which should be the future.

The arguments against solar were that the tech is "not there yet", so then it's better to just focus on nuclear. I disagree with that. I believe that if the energy industry changed focus to solar panels and other renewable energy technologies, we would get there a lot faster. We would have a lot more companies exploring different ideas that make them more efficient and cheaper.

Nuclear technology will probably never be gone, or at least not within the next century. But I just don't want it to be the holy grail of the energy industry and see the vast majority of investments go into that. I want renewable energy technologies to be that.

The idea that solar, wind and geothermal energy can replace nuclear power for electricity generation in my lifetime (the next 35 years) ignores the reality of how much electricity we consume and how it is currently generated.

Here's a link to a graph created by the Lawrence Livermore Lab that quickly illustrates the miniscule impact of doubling, tripling, or even quadrupling the three primary alternative energy sources.

I wish we could all live in a world powered by solar cells etc, but it just isn't going to happen.

Maybe I'm missing something, but I don't think you provided a link.

But renewable energies could replace nuclear (and other sources) for new plants that will be built in the future. Nuclear has a very high initial cost, and shutting down current plants while they're still efficient and safe enough wouldn't be a sound choice, but for replacements of ageing plants and for new constructions cheaper and cheaper renewable energies could realistically displace them in the next decades.

No. It can't.

And pretending it can is a problem.

Solar is nice. It'd be FAN-GODDAMN-TASTIC if we could use solar power as the backbone of our power generation. But we can't, it's just not feasible.

Solar today is half of a power plant. Much like wind, solar generates power when it's convenient to its own schedule, not ours, and sometimes that means it generates zero power.

This. Is. A. Problem.

It is, in fact, the problem of solar and wind power. Today we can use solar and wind serendipitously. They run on top of base power and when they provide power they allow us to keep gas powered generators offline. That's nice, but it's an edge solution, and we're already nearing the limits of that strategy. In order to replace base load power we need something that provides power reliably when it's convenient for humans. For solar or wind that means we need to invest in vast power storage plants. Things that do not currently exist even in designs. Things that are likely to be about as expensive to build and maintain as solar plants themselves will be.

We do not have the technology to move to solar or wind power as a base load power source. And it seems likely that if we did have that technology it would put the full cost of those power sources at higher than even fission power plants.

We can no more move to solar or wind power for the majority of our power needs than we could move to Thorium reactors, or fusion power.

> Today we can use solar and wind serendipitously. They run on top of base power and when they provide power they allow us to keep gas powered generators offline.

Solar thermal+thermal storage could be used in climates closer to the equator. The US ran an experimental setup with 8 hours endurance after sundown.

In the case of cooling equipment, solar coincides well with demand.

You are most certainly right that solar won't cover all our baseload power needs. We don't need to cover it all. We just need to whittle down the unsustainable and environmentally unfriendly parts as much as we can.

Subsidizing winners isn't something the government should be in the business of doing. However, penalizing losers is precisely what we have a government for, and CO2 emitting power is a losing proposition for the future.

According to Wikipedia, the largest solar facility generates 354MW of electricity (assuming ideal conditions). In contrast, the recently approved AP1000s at the Plant Vogtle Georgia site each generate 1100 megawatts, rain or shine.

So if the best solar facility, using 1600 acres of land, can only provide a third of the output of one new reactor, your math just won't work.


I'm repeating myself in this thread, but seriously, I recommend everyone take a look at http://www.inference.phy.cam.ac.uk/withouthotair/ - it has all the numbers, including e.g. power densities of solar and wind plants in watts per square meter.

According to that book, there's no way solar or wind could compete directly with nuclear. It just doesn't add up.

There is no way to have any serious discussion about energy without doing at least some rudimentary calculations and comparing numbers, instead of adjectives and hopes.

Only we're doubling production of wind turbines every 2-3 years. With each doubling, prices come down, as they do with every other manufactured artifact from computers to TVs to washing machines.

Look out 35 years. Is it still minuscule?

In some respects wind is like hydro, in that there are a limited number of good places to site a wind farm. As the number of turbines grows there will be diminishing returns as the new sites are less good because they have less wind, longer distances to existing power infrastructure, more damaging weather incidents, etc.

We're nowhere near that limit. From Wikipedia:

According to the National Renewable Energy Laboratory, the contiguous United States has the potential for 10,459 GW of onshore wind power. The capacity could generate 37 petawatt-hours (PW·h) annually, an amount nine times larger than current total U.S. electricity consumption. The U.S. also has large wind resources in Alaska, and Hawaii. http://en.wikipedia.org/wiki/Wind_power_in_the_United_States

Since we're engaging in fantasy (for now) what would happen if we DID extract that much energy (or even a significant fraction of that) from the atmosphere? What kind of climate change would that precipitate?

None. The amount of energy extractable by wind turbines is only a tiny fraction of the total, close to the surface.

Considering how unpopular wind energy is in some areas due to either killing a large number of raptors, or "despoiling" the view from Martha's Vineyard, I doubt the production will continue to double indefinitely.

Turbines kill fewer birds than coal plant smokestacks. Newer, bigger ones move more slowly, killing fewer birds as well as being quieter. They also produce more power per area.

As for the view... well, you can't please everyone.

Actually that's incorrect, at least for centralized wind farms:


The key takeaway from that thesis is:

"As determined in the risk characterization, a centralized wind farm does have a greater impact on avian mortality than the coal fired- power plant. "

Correct link: http://www.indiana.edu/~spea/pubs/undergrad-honors/volume-4/...

The thesis examines only Fowler Ridge, which mostly has 1.65MW Vestas V82 turbines with a 41m blade length and a hub height of 70m. The latest I could easily get specs for is a 3MW Vestas: 56m blade length, 84m hub. Higher, more efficient, and most importantly, slower, which kills fewer birds.

Last year they introduced the V164 at 7.0MW: http://en.wikipedia.org/wiki/Vestas

I think you forgot the link: https://flowcharts.llnl.gov/content/energy/energy_archive/en...

There is some other good stuff at that site.

Stop thinking linearly. Both the amount of solar energy per dollar, and the total global amount of power generated by solar panels is increasing at an exponential rate.

> if the energy industry changed focus to solar panels and other renewable energy technologies, we would get there a lot faster

This is probably true. But as always, the problem is money. More specifically, the money that entrenched interests possess (and throw around).

I'd love for solar energy to happen as soon as possible too. It shows great promise. But it might not happen for a while.

> Nuclear technology will probably never be gone, or at least not within the next century.

100 years is a mighty long time. Who knows, nuclear reactors might be completely replaced by renewable energy sources within the next fifty years. Let's hope for the best.

Currently the vast majority of power comes from fossil sources (coal, gas, petrol). Just imagine what amounts of gasoline are used every day for transportation. There's no way in the near future that we're going to replace this amount of energy with solar energy. No way. We either switch to a nuclear-powered, hydrogen-based economy (once we run out of gasoline) or we say goodbye to mobility. And by the way, this isn't really a question of price. Solar energy is highly dependent on current daylight; we would have to rethink our entire energy distribution, building either much cheaper batteries to provide a constant level of energy on the grid, or some sort of highly efficient global energy grid. Those problems are far from solved, at least not at an affordable price.

The article says we still need better battery technology, and there's truth to that. But even without batteries, just providing power on days when air conditioners run continuously, solar could make a huge difference in the energy picture.

Or only burning coal at night.

Once solar is significantly cheaper than coal and demand pricing kicks in, a lot of time-shifting of energy use could occur. Right now, night-time electricity is cheaper because demand is lower at night, but if the supply of daylight electricity increases dramatically, any activity which currently benefits from cheap nighttime electricity could be shifted back to the daylight hours.

Data centers currently consume something like 2% of electricity; it is probably possible to shift at least some of that to bright, sunny days. It may stop being cost-effective to run night shifts at factories, especially if your manufacturing process is energy-intensive. We might end up charging our electric cars at our offices during the day instead of overnight at our homes. It'd be pretty silly to fill a battery with solar electricity during the day just to transfer it to another battery at night.

Even without batteries, solar energy could still pick up a lot of our current nighttime energy usage because a lot of our nighttime energy usage doesn't actually have to be at night.

To be clear: the reason energy is cheap at night is not JUST because it is used less; it is also because there are certain continuous energy sources. Nuclear power, geothermal, tidal, wave and wind power are all producing energy round the clock, and if you don't harvest it, it's lost.

We are not heading towards a world where we have a single power source (solar), and most of our power uses cannot be rescheduled. As fossil fuels are one of the few sources which can be turned on and off at our choosing, I would expect they will function to fill in temporary gaps between supply and demand once renewables capacity is large enough to take the average load. I don't see these gaps occurring predominantly at night.

> ... Nuclear power, geothermal, tidal, wave and wind power are all producing energy round the clock and if you don't harvest it, it's lost.

This isn't true of nuclear power plants; the fission rate, and thus the heat generation rate, can be throttled up and down as needed. In pressurized water reactors this happens automatically as the throttle is opened and closed, thus increasing or decreasing output from the "steam side" of the heat-exchange boilers, a.k.a. steam generators [1]. (In a prior life I was a Navy nuclear engineering officer.)

[1] http://en.wikipedia.org/wiki/Pressurized_water_reactor#Contr...

I think the idea is that with nuclear power plants, while you can bring down the output, you don't want to, because it saves you very little money. Most of your costs are huge and fixed.

That's true, but does this make the fuel last longer? And anyway, what are fuel costs as a percentage of total operational costs? My impression is it doesn't make sense to operate a nuclear reactor at less than full power.

> does this make the fuel last longer

Yes. Heat is generated by fission of fissile material such as uranium. Fuel rods have X amount of fissile material in them. Higher power -> faster depletion of the fissile material.

> what are fuel costs as a percentage of total operational costs?

Around 30%, according to the Nuclear Energy Institute. This compares with 80% for coal, natural gas, and oil.[2]

> My impression is it doesn't make sense to operate a nuclear reactor at less than full power

I would think that'd be true of almost any machinery, but that's almost a tautology: You design your machinery for an optimized balance of performance versus wear-and-tear, then try to operate at (what you call) "full power" as much as you can, so as to reap maximum value from your investment.

In any event, the original comment was that excess power is inevitably generated by nuclear plants (at least during some time periods) and therefore must be dumped somehow. That's not the case; nuclear plants can be throttled up and down as needed.

[2] http://www.nei.org/resourcesandstats/nuclear_statistics/cost...

You can still store extra energy to buffer those gaps. Where I live, pumped-storage hydroelectricity probably has a great future. Thermal storage seems to be reasonably efficient too, although I'm not quite sure how much of what I've seen is a little "too enthusiastic".

I'm now thinking of Amazon EC2's 'spot instances', where you launch an instance, do some computation, then shut down, but only if the price per hour is less than some chosen point. (Basically a stock market for computation.)

If solar took off, I can see a correlation between price per hour and sunny days, as more machines are turned on when it's cheaper to run, hence increasing supply and decreasing price. :P

This already happens, both at the energy generation level (power companies broker energy futures (excess supply/demand) between each other - see Enron for more) and at the energy consumer level (big businesses like refineries and foundries and the like pay less for electricity pulled in off-peak hours).

> Or only burning coal at night.

Coal power is a base load technology. Spin up time for a coal plant is too long for it to work well for night time use only.


GE recently announced a faster natural-gas-based generator: 61% efficiency, and it can ramp up supply at 50 megawatts per minute.


So there can be a decent enough complement to cheap solar power.

That's true for pulverized coal, but gassified coal could be used like natural gas.

Data centers most assuredly do not use 2% of electricity. That's a made-up stat from fossil fuel industry lobbyists, repeated by server vendors.

And electric cars could be used as battery storage for excess power generation, too.

Just imagine the world if it was 10 cents per watt instead of $1 a watt for solar. I hope I live long enough to see it. Dare we dream 1 cent per watt, just like what has happened with CPU development in the past three decades? Could we have 1 cent per watt solar in 30 years? 50 years? 100?

If current trends[1] continue, it will take about 15-25 years for solar energy to become cheaper than fossil fuels, which seems like a reasonable length of time for a trend to continue.

To get the installation cost down from an estimated $1.40 in 2020 to $0.10 and $0.01 with the same trend would take 30-50 and 60-100 years, respectively. 70-110 years is definitely further in the future than I am comfortable peering; 40-60, also a little shaky.

But given current trends, it looks like 2050-2070 is roughly what you need to shoot for to live to see $0.10 / Watt solar energy. If you were born in America after 1976, I'd aim for 2076; it'd be nice to see the Tricentennial.

[1] http://www.bloomberg.com/news/2011-04-05/solar-energy-costs-...
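Those year ranges follow from a constant-rate decline. Here's a quick sketch of the extrapolation; the halving times (8 and 13 years) are assumptions chosen to bracket the ranges above, not figures from the Bloomberg piece:

```python
import math

def years_to_reach(start, target, halving_time):
    """Years for a price to fall from start to target if it halves every halving_time years."""
    return math.log2(start / target) * halving_time

# Starting from the estimated $1.40/W in 2020 (assumed halving times):
for ht in (8, 13):
    print(f"halving every {ht}y: $0.10/W in {years_to_reach(1.40, 0.10, ht):.0f}y, "
          f"$0.01/W in {years_to_reach(1.40, 0.01, ht):.0f}y")
```

With those assumptions you land on roughly 30-50 years for $0.10/W and 60-95 for $0.01/W, matching the ranges above.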

Sadly, there's no way to see the Tricentennial unless there is a leap not only in medical technology but in the politics and cost of it in the USA.

Here's hoping for 10 cents in 30 years.

ps. Isn't it $1.40 right now in 2012? Or are you calculating for off-grid with batteries instead of grid-tie?

> it will take about 15-25 years for solar energy to become cheaper than fossil fuels

You neglected to account for the change in price of fossil fuels. Natural gas is extremely cheap right now and is expected to only get cheaper.

The price of fossil fuels could also go up. We are starting to shift production from conventional drilling to more expensive methods like tar sands.

And this is a beautiful example of manufacturing innovation and advancement, evolution and revolution, coming from those that actually make things. When you make things you find a way to make them better. This is why I believe we in the USA need to maintain a strong industrial/manufacturing culture. Not simply to employ people but rather to be at the epicenter of where innovation happens.

Any claim of something being cheaper, with a picture of some shiny stainless-steel small-scale lab equipment, is suspect. It's not demonstrated to be cheaper until you're producing at scale.

Could this method also be used to create ultra-thin wafers for microprocessors?

From the literature I've seen, the realistic maximum amount of solar energy that can be produced through photovoltaic cells is about 40-50 watts per square meter. Although solar is exciting, there's just no way it can be used to replace fossil fuels, no matter how cheap it gets to manufacture. It'll have to be a combination of renewables (why not more hydroelectric power?) and greater efficiency (like LEDs).

Not really a big crisis; we could shut down every nuclear plant in the US today if people switched from drying clothes in an electric dryer to hanging them outside. We live in a time of absolute plenty.


Nuclear power generates 21% of the US electrical usage, of which residential consumption is roughly 1/3 of total usage.

The math won't work...
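A rough version of that math; the nuclear and residential shares are from the comment above, while the dryer share of residential use is an assumed illustrative figure (actual estimates vary):

```python
# Shares of total US electricity
nuclear_share = 0.21               # nuclear's share of total generation (from comment)
residential_share = 1 / 3          # residential share of total consumption (from comment)
dryer_share_of_residential = 0.06  # assumed illustrative figure

dryers_vs_total = residential_share * dryer_share_of_residential
print(f"dryers ≈ {dryers_vs_total:.1%} of total, vs nuclear at {nuclear_share:.0%}")
```

Even if you eliminated ALL residential use, ~33% of total, you'd only barely exceed nuclear's 21%; dryers alone are nowhere close.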

Nuclear power is way cleaner than coal. Please, shut down coal plants first.

For traditional wafer processes, you consume a great deal more feedstock than goes into the active portion of a wafer, and even then you may end up grinding away much of the material, such as in the case of back-illuminated imaging sensors.

This 'exfoliation' approach in some ways plays into the concept Elon Musk floated about SpaceX - the actual atoms in a booster are relatively simple, they just need to be arranged in the right way.

A case in which building your hammer to build your desk does actually add value.

I think I may have misunderstood the manufacturing process. To my reading, it looks like they have 3mm-thick wafers, accumulate a 20mm-thick layer of hydrogen, which then shears off in a furnace, leaving...a 3mm-thick wafer. Which they started with. Thanks in advance to whoever explains how I've misunderstood this.

EDIT: Oh, I neglected to pay attention to units. The above should be 3mm, 20-micrometer, and 2.98mm, respectively, which means the sheet shearing off is 0.002mm thick. This is seriously cool. Thanks for everyone's patience.

They must have meant 20 micrometer thick layers since they also state that they are "a tenth of the thickness" of standard "200-micrometer-thick (0.2mm)" wafers.

Exactly, the point is that the 20-micron layer of silicon that shears off is now the solar cell. Compare that to the 200-micron sheets used currently, and you see where the savings come from (i.e. you can get up to 10x more solar cells from a standard wafer).

I hope this helps.
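To put numbers on it (a sketch using the article's figures; kerf, reuse limits, and handling losses are ignored):

```python
ingot_mm = 3.0     # starting wafer/slab thickness from the article
layer_um = 20      # thickness of each sheared-off cell
standard_um = 200  # conventional sawed-wafer thickness

layers = ingot_mm * 1000 / layer_um  # cells obtainable per 3mm slab (ideal case)
ratio = standard_um / layer_um       # material saving vs standard wafers
print(f"{layers:.0f} cells per slab, each {ratio:.0f}x thinner than a standard wafer")
```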

The hydrogen layer accumulates below the current surface of the silicon. When you put it in a furnace, it causes a very thin layer of silicon to shear off from the 3mm wafer.

Err, 2.998, not 2.98.

[..] cost of around 40 cents per watt, about half the cost of panels currently coming out of China (where the vast majority of solar panels are made)

To me, this is the second cool part of the story. It shows that we can still do industrial enterprises in the West by applying technology. Sooner or later the production and assembly industry will have no more cheap labor forces to "exploit" on the globe, and production, assembly and automation technology may (again) be an industrial game changer for the West, as it was with the "spinning jenny".

Is there a mistake in the 40 cents per watt cost reported in the article? I work for an energy company and our wind-farm energy is around 4.5 cents per kilowatt.

I suspect your wind-farm energy is around 4.5 cents per kilowatt-hour

edit: napkin math: $0.40 * 1000 watts = $400/kW. Assuming 4380 hours of optimum sunlight per year and a lifetime of 10 years, that's roughly $0.01/kWh. Should be competitive even when my ridiculous assumption meets reality.
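That napkin math, spelled out with the same (admittedly ridiculous) assumptions:

```python
# Assumptions from the comment: $0.40/W panel cost, 4380 hours of
# optimum sunlight per year (half the hours in a year), 10-year lifetime.
cost_per_watt = 0.40
cost_per_kw = cost_per_watt * 1000        # $400 per kW of capacity
lifetime_hours = 4380 * 10                # lifetime hours of optimum sunlight
cost_per_kwh = cost_per_kw / lifetime_hours
print(f"${cost_per_kw:.0f}/kW, ${cost_per_kwh:.3f}/kWh")
```

That comes out to just under a cent per kWh, panel cost only (no installation, inverters, or maintenance).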

Presumably, the only cost in solar panels is their construction, so there are no recurring costs (other than amortizing their lifetime). It seems complicated to figure the number of hours it lasts for, and figure out the energy generated in those hours. It's a lot simpler to figure out the amount of power generated for a given surface area (in some standardized amount of light input), and the cost of that surface area to generate. So if you wanted a kilowatt of power generated continuously from the sun, minus overcast and such, it would cost $400. I'm not an expert, but that would be my guess.

Seems like something they could scale up and mass produce.

Looking at Twin Creeks' website[0], it appears they are in the wafer manufacturing business and are using solar to promote their technology. It doesn't seem clear from their website that they would ever be the ones selling solar panels - but they might manufacture them for a client. They are selling their services, one application of which is solar cells.

[0] http://www.twincreekstechnologies.com/

It is not clear if glass as a protective cover is still used or required for final production of solar cells made with this process.

Does anyone know?

Almost certainly. You could use another material, such as some clear plastic or epoxy, but the idea is still the same (and glass has the most desirable transmittance properties I believe).

Glass blocks UV, which isn't ideal, since it heats things up and UV carries useful energy. But it's cheaper than quartz.

And the ion cannon is powered by...? And the raw materials were extracted by machines powered by...?


"fossil fuel" is likely a misnomer. Read to your heart's content here: http://trilogymedia.com.au/Thomas_Gold/usgs.html

Similar idea (solar cells would be much cheaper if they were much thinner), different process: http://www.naanovo.com/home

Waiting for shills to block this technology using peer review to save petro-dollar empire.

And as an aside, companies are already pretty good at storing energy with things like molten salt.

The thing I'm puzzled about here is why saving silicon makes your solar cells cheaper. I mean, silicon is really cheap, right? Metallurgical-grade silicon is 77 cents a pound: http://minerals.usgs.gov/minerals/pubs/commodity/silicon/sil... — and that works out to around a penny a watt.

I tried to dig into this a few years ago. Evergreen Solar's 10-K for 2007 http://edgar.sec.gov/Archives/edgar/data/947397/000095013508... has some information. Evergreen's competitive advantage is supposedly that they use less silicon than other manufacturers because they don't saw their wafers — they grow them. They say they use about 5g of silicon per watt (in 2007, planning to reduce it to 2½g per watt by 2012), and it sounds like they get paid about US$3.87 per watt on average (US$58M revenue in 2007, maxed-out manufacturing capacity of 15MW/year, 276 full-time employees in manufacturing). Their "cost of revenue" (i.e. manufacturing cost) was US$53M, or US$3.53/W. But 5g of metallurgical-grade silicon at the price above is US$0.008. If each employee costs US$120k per year (including health benefits, and remembering that a bunch of them are Ph.D.s) then that would be US$2.20/W in labor costs, which already accounts for the majority of that cost of revenue.

But they're not buying metallurgical-grade silicon; they're buying "polysilicon", short for "polycrystalline silicon", which is perhaps a bit of a misnomer, since how many crystals are in each piece of silicon supplied by their suppliers is somewhat immaterial, since Evergreen melts the silicon down and crystallizes it in polycrystalline silicon ribbons in their "String Ribbon" furnaces. Maybe that costs a lot more than metallurgical-grade silicon?

It used to be hard to find that information! But it's much better now; http://pvinsights.com/ lists current PV-grade polysilicon prices at US$29 to US$35 per kilogram, and http://www.pv-tech.org/news/polysilicon_prices_declines_will... explains that this is a major drop from previous prices of US$80/kg. 5 g at US$35 per kilogram is US$0.175. But "Silicon PV Module Price Per Watt" ranges from US$0.75 to US$1.40. Dropping 17½¢ off that price still isn't going to get you to 40¢. And if Evergreen has really made it to 2½g/W, silicon cost is even less of the total cost.

http://www.futurepundit.com/archives/008483.html mentions that in 2008 polysilicon prices peaked at US$400/kg.

Anyway. I'm obviously no expert, but I'm skeptical that peeling silicon with a particle accelerator is going to decrease the cost of photovoltaic cells.
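A quick check of that silicon-cost arithmetic, using the figures quoted above:

```python
# Figures from the comment: Evergreen's 2007 silicon usage and
# current PV-grade polysilicon prices per pvinsights.com.
grams_per_watt = 5                # Evergreen's 2007 figure
poly_price_per_kg = 35            # upper end of current polysilicon prices, $/kg
module_low, module_high = 0.75, 1.40  # silicon PV module price range, $/W

silicon_cost = grams_per_watt / 1000 * poly_price_per_kg  # $/W
print(f"silicon: ${silicon_cost:.3f}/W of a ${module_low}-${module_high}/W module")
```

Silicon is $0.175/W at most, so even eliminating it entirely can't get a $0.75/W module down to $0.40/W.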

This isn't raw silicon, it's spun ingots. The cost is almost entirely from the process they undergo, not the raw materials. From Wikipedia:

"A typical wafer is made out of extremely pure silicon that is grown into mono-crystalline cylindrical ingots (boules) up to 300 mm (slightly less than 12 inches) in diameter using the Czochralski process. These ingots are then sliced into wafers about 0.75 mm thick and polished to obtain a very regular and flat surface."

So it's a better way to slice the ingots into wafers. Those ingots are ridiculously expensive.

Well, first of all, most solar cells are not made from monocrystalline boules; they're made from polycrystalline boules, which are cheaper and faster. Second, I don't think the Czochralski process costs US$400 per kilogram either.

If Evergreen is already fairly silicon efficient and doesn't have any labor costs associated with ingots and wafers, it seems hard to draw much information about the implications of this technology from them.

The PVinsights link suggests that wafers are a big chunk of the cost of cells. Depending on how much of current costs are ingot production and how good the relative efficiency of this cutting method is, the wafer could become a small part of the costs of cells. That doesn't justify claims of a 50% cost reduction, but it supports the notion of significant cost reductions.

WARNING: it's another one of those OnSwipe mobile sites that crash your browser.

I wish there was some way to opt out of OnSwipe and just load the desktop version of a website on my iPad.

And if the solar panels don't work out, they can take four of those ion cannons and use them as blinged-out wheels on their Escalade.
