SpaceX News

Point out news stories, on the net or in mainstream media, related to polywell fusion.

Moderators: tonybarry, MSimon

krenshala
Posts: 914
Joined: Wed Jul 16, 2008 4:20 pm
Location: Austin, TX, NorAm, Sol III

Post by krenshala »

paperburn1 wrote:
Skipjack wrote:
The fact that most programmers no longer have hardware limitations dictating a requirement for clean, concise, and (most importantly) compact code isn't helping matters either. I have repeatedly heard over the years, "Why should I optimize or write smaller code? The user won't see a difference because the system has enough resources to run things just fine the way they are now."
Indeed, and on the other side you often have management that is completely oblivious to the realities of software development and thus makes decisions that facilitate this sort of behaviour.
Cut-and-paste mentality: why develop what you can steal off the internet?
I've often wondered how much of the problem comes from folks who know "it works" but don't know why. Some management decisions/methods, plus "I'll just use these regardless of what it's really meant for", don't help matters, that's for sure.

CaptainBeowulf
Posts: 498
Joined: Sat Nov 07, 2009 12:35 am

Post by CaptainBeowulf »

You know, one thing I liked about the old dial-up Internet days of the 90s was that people who used programs to create bloated, ridiculous websites were frequently mocked. There was a certain skill in writing concise HTML that produced a nice-looking webpage that still loaded quickly. It is, of course, a skill now long forgotten. What shocks me is that some websites are now so poorly written that they take several seconds to load even on high-speed connections... slowly enough that they can lag and get stuck half-loaded if some of the servers, routers, or other connections along the way aren't too good.

Of course, HTML is pretty simple compared to some of the code discussed above, but I notice a general zeitgeist...

kunkmiester
Posts: 887
Joined: Thu Mar 12, 2009 3:51 pm
Contact:

Post by kunkmiester »

I once thought of a vague story idea: a time when code had become so obfuscated that programmers were no longer learning C++-level code, let alone binary and basic principles. Computing power and abstraction meant that databases became enormous, and no one quite understood what anything did, let alone how it did it.

The plot was to be a company with a bunch of young programmers that hires a really, really old guy to come in and fix some legacy stuff. He takes the kids under his wing to a limited extent and blows their minds with things like assembly-level work. I'm still working on writing well enough to make a story like this, and I probably don't have all the technical details needed to really make the contrast work.
Evil is evil, no matter how small

CaptainBeowulf
Posts: 498
Joined: Sat Nov 07, 2009 12:35 am

Post by CaptainBeowulf »

Nice idea though - keep at it! Would make a good short story in a magazine.

paperburn1
Posts: 2466
Joined: Fri Jun 19, 2009 5:53 am
Location: Third rock from the sun.

Post by paperburn1 »

kunkmiester wrote:I once thought of a vague story idea: a time when code had become so obfuscated that programmers were no longer learning C++-level code, let alone binary and basic principles. Computing power and abstraction meant that databases became enormous, and no one quite understood what anything did, let alone how it did it.

The plot was to be a company with a bunch of young programmers that hires a really, really old guy to come in and fix some legacy stuff. He takes the kids under his wing to a limited extent and blows their minds with things like assembly-level work. I'm still working on writing well enough to make a story like this, and I probably don't have all the technical details needed to really make the contrast work.
Did this not happen for real in 2000 (Y2K)?

jcoady
Posts: 141
Joined: Fri Jul 15, 2011 4:36 pm

Post by jcoady »

Elon Musk talk entitled "The Future of Energy and Transport"

http://www.oxfordmartin.ox.ac.uk/videos/view/211

Skipjack
Posts: 6110
Joined: Sun Sep 28, 2008 2:29 pm

Post by Skipjack »

There is also this one from his lecture at the Royal Aeronautical Society
http://www.youtube.com/watch?feature=pl ... B3R5Xk2gTY

ScottL
Posts: 1122
Joined: Thu Jun 02, 2011 11:26 pm

Post by ScottL »

paperburn1 wrote:
kunkmiester wrote:I once thought of a vague story idea: a time when code had become so obfuscated that programmers were no longer learning C++-level code, let alone binary and basic principles. Computing power and abstraction meant that databases became enormous, and no one quite understood what anything did, let alone how it did it.

The plot was to be a company with a bunch of young programmers that hires a really, really old guy to come in and fix some legacy stuff. He takes the kids under his wing to a limited extent and blows their minds with things like assembly-level work. I'm still working on writing well enough to make a story like this, and I probably don't have all the technical details needed to really make the contrast work.
Did this not happen for real in 2000 (Y2K)?
Short answer: nope. At the time, most of the worry had occurred back in '93, about mainframe jobs running COBOL code. All the front-end stuff had long since been designed for the turnover. It was largely a non-issue.

As for old-timers teaching, I haven't experienced that in the least. In my experience they're usually former COBOL coders with a specific procedural mindset and little understanding of object-oriented functionality. Most are retired, retiring, or attempting to learn modern programming paradigms. To clarify, assembly is still taught in college, along with C, C++, Java, and so on.

As for the web, applications are very little HTML and mostly jQuery, JavaScript, etc. on the front end, with Java or C# on the back end. The optimizations are often in the compiler, although I've seen plenty of spaghetti code in my time. The problem is that there is little incentive to simplify pages given such high-speed connections these days. Honestly, I spend most of my time running workshops for old-timers or correcting their misunderstandings of the coding paradigms in production applications. What managers are starting to realize is that they can designate small teams of programmers, one senior and three to five junior, and cultivate good practices while allowing new ideas, but it's a work in progress with many managers.

ladajo
Posts: 6204
Joined: Thu Sep 17, 2009 11:18 pm
Location: North East Coast

Post by ladajo »

The appeal of the object-oriented approach is easy to understand. The effect of the unused functionality in objects compounding over an entire project is another thing. I personally disagree with the paradigm, but I understand it is about cranking out product for profit. My point is that other industries have found ways to use a common-block approach while still greatly minimizing the "flashing" carried over for unused functionality. Cars are a good example. Unused functionality in common parts is excess weight. Excess weight limits mileage. Reduce excess weight, improve mileage, save money in material costs, etc. I think this can cross over to the software world. Think about how cars were designed and marketed when gas and steel were cheap. Now look at how it is done. The software world will not have cheap gas and steel forever.
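The "excess weight" analogy can be made concrete. As an illustrative sketch (the class names are invented, and the mechanism shown is Python-specific, but the principle carries): a general-purpose object drags along per-instance machinery whether or not anyone uses it, and trimming it is the software equivalent of shaving weight from a common part.

```python
import sys

class FullFeatured:
    """General-purpose part: every instance carries a __dict__ so callers
    can bolt on arbitrary attributes later -- flexibility most never use."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

class Trimmed:
    """Slimmed part: __slots__ drops the per-instance __dict__.
    Less flexibility, less 'excess weight' per instance."""
    __slots__ = ("x", "y")
    def __init__(self, x, y):
        self.x = x
        self.y = y

full = FullFeatured(1, 2)
slim = Trimmed(1, 2)

# Count the attribute dict the full-featured instance drags along
# as part of its per-instance weight.
full_size = sys.getsizeof(full) + sys.getsizeof(full.__dict__)
slim_size = sys.getsizeof(slim)
```

Multiplied over millions of instances, that unused flexibility is exactly the kind of compounding carryover described above.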
The development of atomic power, though it could confer unimaginable blessings on mankind, is something that is dreaded by the owners of coal mines and oil wells. (Hazlitt)
What I want to do is to look up C. . . . I call him the Forgotten Man. (Sumner)

ScottL
Posts: 1122
Joined: Thu Jun 02, 2011 11:26 pm

Post by ScottL »

Until quantum computing is realized, there will be no move toward efficiency/optimization, as only then will we have hit our peak hardware power. As for paradigms versus efficiency, of course you trade off efficiency for understandable code, but I honestly don't see that changing. My prediction is that all optimizations will be done at the compiler level and never at the OO-language level.

hanelyp
Posts: 2257
Joined: Fri Oct 26, 2007 8:50 pm

Post by hanelyp »

The compiler can deal with machine-specific details better than most programmers. And even the few who can beat the compiler at that level rarely do; it's not worth their time for the minor improvement. The overall algorithm used has a much bigger impact on software performance. A problem sometimes encountered with high-level languages and libraries is that levels of computational complexity can be obscured: a single function call may invoke a large loop or other expensive calculation, unknown to a naive programmer.
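That hidden-cost point is easy to demonstrate. In this sketch (names invented for illustration), the two calls look identical at the call site, but one hides a linear scan while the other is a constant-time hash lookup:

```python
import time

items_list = list(range(100_000))
items_set = set(items_list)

def found_in(container, value):
    # Innocent-looking either way: for a list, `in` is a hidden loop over
    # every element; for a set, it is a hash lookup independent of size.
    return value in container

start = time.perf_counter()
for _ in range(200):
    found_in(items_list, 99_999)   # worst case: scans all 100,000 items
list_time = time.perf_counter() - start

start = time.perf_counter()
for _ in range(200):
    found_in(items_set, 99_999)    # one hash probe, regardless of size
set_time = time.perf_counter() - start
```

A naive reader of `found_in` sees one line; the container type decides whether that line costs one comparison or a hundred thousand.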

Netmaker
Posts: 78
Joined: Sat Sep 11, 2010 8:17 pm

Code optimization

Post by Netmaker »

There are at least two areas where code will continue to be optimized.

1) For horizontal scalability in a multi-processor/multi-computer environment, e.g., HPC and "the cloud". Generally because you can't just take an existing monolithic program and run it efficiently (if at all) on multiple processors; it has to be re-architected and rewritten. At least some cruft will be removed in the process.

2) For low-power environments, the key drivers being smartphones and tablets. CPU-inefficient code kills battery life, so developers have every incentive in the current environment to optimize their code. Memory constraints are also an issue, though only relative to desktops.
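Point 1 can be sketched briefly: before a monolithic loop can scale horizontally at all, it has to be carved into independent units of work. A minimal Python illustration (the function and its inputs are invented for the example):

```python
from concurrent.futures import ProcessPoolExecutor

def heavy(n):
    # Stand-in for one CPU-bound unit of work.
    return sum(i * i for i in range(n))

def run_serial(sizes):
    # The "monolithic" version: one process grinds through everything.
    return [heavy(n) for n in sizes]

def run_parallel(sizes, workers=4):
    # Only because the work was re-architected into independent heavy(n)
    # calls can it be spread across processes (or machines) at all.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(heavy, sizes))
```

The rewrite is the point: `run_parallel` is only possible because the state shared inside the original loop was eliminated, and that restructuring is where the cruft removal mentioned above tends to happen.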

Carl White
Posts: 335
Joined: Mon Aug 24, 2009 10:44 pm

Post by Carl White »

CaptainBeowulf wrote:You know, one thing I liked about the old dial-up Internet days of the 90s was that people who used programs to create bloated, ridiculous websites were frequently mocked. There was a certain skill in writing concise HTML that produced a nice-looking webpage that still loaded quickly. It is, of course, a skill now long forgotten. What shocks me is that some websites are now so poorly written that they take several seconds to load even on high-speed connections... slowly enough that they can lag and get stuck half-loaded if some of the servers, routers, or other connections along the way aren't too good.

Of course, HTML is pretty simple compared to some of the code discussed above, but I notice a general zeitgeist...
Ads, ads, ads and more ads now. Some that stream video too.

GIThruster
Posts: 4686
Joined: Tue May 25, 2010 8:17 pm

Post by GIThruster »

Since there are several of you here so knowledgeable about coding, I'm curious whether anyone knows what the state of the art is for spacecraft code optimization. I recall when I first heard that Paul March's wife worked for ULA writing code for the shuttle, I was shocked that was an ongoing venture. Likewise, when I learned an old friend was writing code for a nuclear power plant in California, I was again surprised that it was being done many years after the plant had opened. Apparently some sort of optimization must go on as a continuing process. I wonder if that's true of Falcon, Dragon, even the ISS. Anyone know?
"Courage is not just a virtue, but the form of every virtue at the testing point." C. S. Lewis

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

As a systems engineer, not especially a coding expert, and without targeting any specific industry:

- Some of it is actual fixes.
- Some of it is cleanup to improve understandability/maintainability for future engineers (the big danger is the 'cure' causing worse problems than the 'disease' because subtleties are not comprehended).
- Some of it is adapting to ever-changing IDEs, compilers, linkers, OSes, standards, HW, configuration management tools, utilities, new/changed requirements, regulations, user interface, telemetry, etc.
- Some of it is a desire to increase commonality with the next generation product, or form a basis to start a next generation.
- Other things I can't think of at the moment.

It is never really 'finished'. It is, and must be, an ongoing 'spiral process' of successive refinements when significant system complexity is involved.
