Chinese Say They're Building 'Impossible' Space Drive

Discuss life, the universe, and everything with other members of this site. Get to know your fellow polywell enthusiasts.

Moderators: tonybarry, MSimon

93143
Posts: 1131
Joined: Fri Oct 19, 2007 7:51 pm

Post by 93143 »

gblaze42 wrote:it's been twenty years since I worked with computational fluid dynamics. I'm curious if it's changed much?
Twenty years, eh? If I'm correct, high-resolution Godunov-type upwind finite volume methods were fairly new back then. Our research group uses them almost exclusively - I sit next to a guy who's working on extending a 4th-order upwind method to 3D. (Of course, the airfoil guys use centered finite difference with artificial dissipation, because that's how it's always been done, and it works. They don't need the same robustness to unexpected flow situations as we do; all they need is to be able to reliably run a steady-state case thousands of times during an optimization run.) I think the level-set method was developed around that time too.
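For anyone rusty on what a Godunov-type upwind finite volume method actually does, here's a minimal sketch of my own (not our group's code) for the simplest possible case - linear advection with positive wave speed - where the exact Riemann solution at each cell face is just the upwind (left) state:

```python
import numpy as np

# First-order Godunov/upwind finite volume step for u_t + a*u_x = 0, a > 0.
# For this equation the exact Riemann solver at each cell face simply picks
# the upwind state, so the interface flux is a*u_left.

def upwind_step(u, a, dx, dt):
    """Advance the cell averages u by one time step (periodic boundaries)."""
    f_right = a * u             # flux through each cell's right face
    f_left = a * np.roll(u, 1)  # flux through each cell's left face (upwind)
    return u - (dt / dx) * (f_right - f_left)
```

Stability requires the CFL condition a*dt/dx <= 1; under it the scheme is conservative and monotone, which is exactly the robustness property that makes these methods attractive for unexpected flow situations.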

The principles of CFD are essentially the same as they were, but the increase in computer power means we can actually do a lot more - my own research is probably going to involve using the Peng-Robinson equation of state, combined with droplet modelling equations, in 3D multiphase turbulent flow using large eddy simulation (LES). Another guy in our lab is working on a flame model with soot particle formation and volumetric radiative heat transfer. A paper I read recently shows off the versatility of the latest AUSM-based flux function by solving, if I recall correctly, the interaction of a strong shock in air with a drop of water - they had some really beautiful numerical Schlieren images, which compared rather well with the real Schlieren photos of a matching experiment...
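As an aside, the Peng-Robinson equation of state itself is compact enough to sketch. This is the generic textbook form, written out as my own illustration (not my research code):

```python
import numpy as np

R = 8.314462618  # universal gas constant [J/(mol K)]

def peng_robinson_pressure(T, v, Tc, pc, omega):
    """Pressure [Pa] from the Peng-Robinson equation of state.

    T     : temperature [K]
    v     : molar volume [m^3/mol]
    Tc,pc : critical temperature [K] and pressure [Pa] of the species
    omega : acentric factor of the species
    """
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / pc * alpha
    b = 0.07780 * R * Tc / pc
    return R * T / (v - b) - a / (v * (v + b) + b * (v - b))
```

At near-ambient conditions it collapses to the ideal gas law; the cubic form only matters near the critical point and in the dense-fluid regime relevant to droplet modelling.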

Modelling is still necessary for almost everything when turbulence is present, but there's more detail (lots of transport equations to solve!) and less fudge because of the extra power. Sometimes models are validated against DNS nowadays...

Oh yeah, and we have a guy investigating the possibility of running some of the more highly parallelizable chunks of our code on GPUs instead of CPUs. It turns out that it works pretty well, and also (I think) that the cheapest way to get the processing power is to buy a top-of-the-line graphics card or two, because the dedicated physics GPU packages are more expensive for basically the same hardware. Unfortunately it's currently single-precision only...
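To show why single precision is a real limitation for iterative solvers (a quick illustration of mine, not from our GPU work): float32 carries only about 7 decimal digits, so small residual updates vanish long before they would in float64.

```python
import numpy as np

eps32 = np.finfo(np.float32).eps  # ~1.19e-07, smallest eps with 1 + eps != 1
eps64 = np.finfo(np.float64).eps  # ~2.22e-16

# An update smaller than eps*|x| is lost entirely when added to x:
x32 = np.float32(1.0) + np.float32(1e-8)  # 1e-8 < eps32, so no effect
x64 = np.float64(1.0) + np.float64(1e-8)  # well above eps64, so it sticks
```

In practice that means a float32 residual can stall several orders of magnitude above where a float64 run would converge.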

gblaze42
Posts: 227
Joined: Mon Jul 30, 2007 8:04 pm

Post by gblaze42 »

93143 wrote:
gblaze42 wrote:it's been twenty years since I worked with computational fluid dynamics. I'm curious if it's changed much?
Twenty years, eh? If I'm correct, high-resolution Godunov-type upwind finite volume methods were fairly new back then. [...]

Nice!! I did mostly micro-climate testing in a wind tunnel using finite element analysis, and that was big then - glad some things haven't changed. Of course, back then we were using a 25 MIPS add-in board for the 386 systems we had, and that was considered impressive for the time. Being a relatively small company of engineers, they weren't very eager to change; most of the calculations were still done on paper. Now, I've heard, they've even started using Beowulf clusters for their CFD.

93143
Posts: 1131
Joined: Fri Oct 19, 2007 7:51 pm

Post by 93143 »

...most of the calculations were still done on paper.
I have the opposite problem. I tend to assume I need a numerical solution to even guess at something, and then later I realize that the back of an envelope will do fine.
I've heard, they've even started using Beowulf clusters for their CFD.
Our supercomputer is basically a glorified Beowulf cluster. We use MPI (Message Passing Interface) to do block-based adaptive mesh refinement over multiple CPUs.
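The bookkeeping behind that kind of decomposition is simple to sketch (the names here are my own, purely hypothetical): the mesh is cut into equal blocks, the blocks are dealt out across the MPI ranks, and each rank then exchanges ghost-cell data with the owners of neighbouring blocks.

```python
# Hypothetical sketch of block-based domain decomposition bookkeeping.
# The actual MPI communication (ghost-cell exchange, AMR refinement) is
# omitted; this only shows how blocks get assigned to ranks.

def partition_blocks(nbx, nby, n_ranks):
    """Assign an nbx-by-nby array of mesh blocks to n_ranks processes."""
    blocks = [(i, j) for i in range(nbx) for j in range(nby)]
    # Round-robin dealing keeps the load within one block of balanced.
    return {rank: blocks[rank::n_ranks] for rank in range(n_ranks)}
```

With adaptive refinement the block list changes at run time, so the assignment has to be recomputed (and blocks migrated) whenever the mesh is refined or coarsened.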

I haven't run anything on the supercomputer yet. Using the Peng-Robinson equation of state with a lookup table for the vapour pressure, I can do reasonably large transient 2D test problems (or steady 1D problems) in a matter of minutes, just using the Athlon XP 2500+ at my desk.
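The lookup-table trick is nothing exotic - a sketch of the idea, using water's Antoine equation purely as a stand-in (the standard coefficients, valid roughly 1-100 degC; my actual working fluid and table aren't shown here):

```python
import numpy as np

# Pre-tabulate the saturation vapour pressure once, then interpolate at run
# time instead of re-evaluating the correlation in every cell, every step.
# Antoine equation for water: p_sat in mmHg, T in deg C.
A, B, C = 8.07131, 1730.63, 233.426
T_table = np.linspace(1.0, 100.0, 200)
p_table = 10.0 ** (A - B / (C + T_table))

def p_sat(T_celsius):
    """Saturation vapour pressure [mmHg] by linear interpolation."""
    return np.interp(T_celsius, T_table, p_table)
```

For a smooth curve like this, a couple of hundred table points keeps the interpolation error far below the uncertainty of the correlation itself.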

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

pstudier wrote:
kcdodd wrote:And not only breaking conservation of momentum, but also the 2nd law of thermodynamics.
How does this break the 2nd law, which states that entropy increases? Electricity in, allegedly some of this energy is converted to motion, and most of the energy is converted to heat.
OK, then let's say the source of radiation is a blackbody. No part of the spectrum carries exactly zero power, so the resonant frequency will always receive some power from the body, at least over some range of temperatures. Now suppose those photons at the resonant frequency are magically converted to motion. That cools the blackbody, which continuously refills that part of the spectrum, until the blackbody has lost all its heat and converted 100% of it to motion. And even if it only worked that way over a limited range of temperatures, you're still violating the law.
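The premise there - that no part of a blackbody spectrum carries exactly zero power - is just Planck's law, which is strictly positive at every frequency for any temperature above absolute zero. A quick sketch, using 2.45 GHz as a stand-in resonant frequency since the actual cavity frequency isn't given here:

```python
import numpy as np

# Planck's law: spectral radiance of a blackbody at frequency nu [Hz] and
# temperature T [K].  Strictly positive for every nu > 0 whenever T > 0.
h = 6.62607015e-34  # Planck constant [J s]
c = 2.99792458e8    # speed of light [m/s]
k = 1.380649e-23    # Boltzmann constant [J/K]

def planck_radiance(nu, T):
    """Spectral radiance B_nu [W / (m^2 sr Hz)]."""
    # expm1 keeps the denominator accurate when h*nu/(k*T) is tiny,
    # which is the case for microwaves at ordinary temperatures.
    return (2.0 * h * nu ** 3 / c ** 2) / np.expm1(h * nu / (k * T))
```

At microwave frequencies and ordinary temperatures this sits deep in the Rayleigh-Jeans regime, so the radiance at the resonant frequency scales linearly with temperature - it shrinks as the body cools, but never reaches zero.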
Carter

Skipjack
Posts: 6080
Joined: Sun Sep 28, 2008 2:29 pm

Post by Skipjack »

Hmm, I have the luxury of just leaning back and watching what happens, while being carefully sceptical at the same time, of course.
I have the same attitude towards the BFR, btw (even though I give the BFR a much higher chance of success).
The fun part for me is in the thrill of the chance that something cool might come out of either. That is great and keeps me entertained ;)
I mean, even if it were not science but science fiction, it would still be way more interesting and entertaining than Britney Spears' latest love affairs, or Hefner's girls, or whatever else the entertainment industry tries to poison our minds with, right?
When do the Chinese believe that they will have this thing working? I have not seen any timeframes for this...

Post Reply