
Posted: Wed Nov 21, 2012 6:36 pm
by krenshala
paperburn1 wrote:
Skipjack wrote:
The fact that most programmers no longer have hardware limitations dictating a requirement for clean, concise, and (most importantly) compact code isn't helping matters either. I have repeatedly heard over the years, "Why should I optimize or write smaller code? The user won't see a difference because the system has enough resources to run things just fine the way they are now."
Indeed, and on the other side you often have management that is completely oblivious to the realities of software development and thus makes decisions that facilitate this sort of behaviour.
Cut-and-paste mentality: why develop what you can steal off the internet?
I've often wondered how much of the problem comes from folks who know "it works" but don't know why. Some management decisions/methods, plus "I'll just use these regardless of what they're really meant for," don't help matters, that's for sure.

Posted: Wed Nov 21, 2012 10:48 pm
by CaptainBeowulf
You know, one thing I liked about the old dial-up Internet days of the 90s was that people who used programs to create bloated, ridiculous websites were frequently mocked. There was a certain skill in writing concise HTML that produced a nice-looking webpage that still loaded quickly. It is, of course, a skill now long forgotten. What shocks me is that some websites out there are now so poorly written that they take a few seconds to load even on high-speed connections... slowly enough that they can get lagged and stuck half-loaded if some of the servers, routers, or other connections along the way aren't too good.

Of course, HTML is pretty simple compared to some of the code discussed above, but I just notice a general zeitgeist...

Posted: Wed Nov 21, 2012 11:26 pm
by kunkmiester
I once had a vague story idea set in a time when actual code had become so obfuscated that programmers were no longer learning C++-level code, let alone learning about binary and basic principles. Computing power and abstraction meant that databases became enormous, and no one quite understood what anything did, let alone how it did it.

The plot was to be about a company with a bunch of young programmers that hires a really, really old guy to come in and fix some legacy stuff. He takes the kids under his wing to a limited extent and blows their minds with things like assembly-level work. I'm still working on learning to write well enough to make a story like this, and I probably don't have all the technical details needed to really make the contrast work out.

Posted: Wed Nov 21, 2012 11:37 pm
by CaptainBeowulf
Nice idea though - keep at it! Would make a good short story in a magazine.

Posted: Thu Nov 22, 2012 11:39 am
by paperburn1
kunkmiester wrote:I once had a vague story idea set in a time when actual code had become so obfuscated that programmers were no longer learning C++-level code, let alone learning about binary and basic principles. Computing power and abstraction meant that databases became enormous, and no one quite understood what anything did, let alone how it did it.

The plot was to be about a company with a bunch of young programmers that hires a really, really old guy to come in and fix some legacy stuff. He takes the kids under his wing to a limited extent and blows their minds with things like assembly-level work. I'm still working on learning to write well enough to make a story like this, and I probably don't have all the technical details needed to really make the contrast work out.
Did this not happen for real in 2000?

Posted: Fri Nov 23, 2012 3:36 pm
by jcoady
An Elon Musk talk entitled "The Future of Energy and Transport":

http://www.oxfordmartin.ox.ac.uk/videos/view/211

Posted: Fri Nov 23, 2012 3:53 pm
by Skipjack
There is also this one, from his lecture at the Royal Aeronautical Society:
http://www.youtube.com/watch?feature=pl ... B3R5Xk2gTY

Posted: Sat Nov 24, 2012 10:09 pm
by ScottL
paperburn1 wrote:
kunkmiester wrote:I once had a vague story idea set in a time when actual code had become so obfuscated that programmers were no longer learning C++-level code, let alone learning about binary and basic principles. Computing power and abstraction meant that databases became enormous, and no one quite understood what anything did, let alone how it did it.

The plot was to be about a company with a bunch of young programmers that hires a really, really old guy to come in and fix some legacy stuff. He takes the kids under his wing to a limited extent and blows their minds with things like assembly-level work. I'm still working on learning to write well enough to make a story like this, and I probably don't have all the technical details needed to really make the contrast work out.
Did this not happen for real in 2000?
Short answer: nope. At the time, most of the worry had occurred back in '93, about mainframe jobs running COBOL code. All the front-end stuff had long since been designed for the turnover. It was largely a non-issue.

As for old-timers teaching, I haven't experienced that in the least. In my experience they're usually former COBOL coders who have a specific procedural mindset and little understanding of object-oriented functionality. Most are retired, retiring, or attempting to learn modern programming paradigms. To clarify, assembly is still taught in college, along with C, C++, Java, and so on.

As for the web, applications are very little HTML and mostly jQuery, JavaScript, etc. on the front end, with Java or C# on the back end. The optimizations are often in the compiler, although I've seen plenty of spaghetti code in my time. The problem is that there is little incentive to simplify pages with such high-speed connections these days. Honestly, I spend most of my time running workshops for old-timers or correcting their misunderstandings of the coding paradigms in production applications. What managers are starting to realize is that they can designate small teams of programmers (one senior, three to five junior) and cultivate good practices while allowing new ideas, but it's a work in progress with many managers.

Posted: Sun Nov 25, 2012 8:22 am
by ladajo
The ease of the object-oriented approach is simply understood. The effect of the unused functionality in objects compounding over an entire project is another thing. I personally disagree with the paradigm, but I understand it is about cranking out product for profit. My point is that other industries have found ways to use a common-block approach while still greatly minimizing the "flashing" carried over from unused functionality. Cars are a good example. Unused functionality in common parts is excess weight. Excess weight limits mileage. Reduce excess weight, improve mileage, save money in material costs, etc. I think this can cross over to the software world. Think about how cars were designed and marketed when gas and steel were cheap. Now look at how it is done. The software world will not have cheap gas and steel forever.
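
To make the analogy concrete, here is a toy C++ sketch of my own (the class names are invented purely for illustration): every FancyLabel instance carries the framework's "flashing" along with it, while the lean version carries only what the job needs.

[code]
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical framework base: every widget drags all of these members
// along, whether or not a given subclass ever uses them.
struct FrameworkWidget {
    virtual ~FrameworkWidget() {}
    virtual void draw() {}
    virtual void serialize() {}
    std::string name, tooltip, theme;
    std::vector<FrameworkWidget*> children;
    bool visible = true;
    bool animated = false;
};

// Uses almost none of the inherited machinery, but pays for all of it.
struct FancyLabel : FrameworkWidget {
    std::string text;
};

// The same job done with only what is actually needed.
struct LeanLabel {
    std::string text;
};

int main() {
    std::printf("FancyLabel: %zu bytes each\n", sizeof(FancyLabel));
    std::printf("LeanLabel:  %zu bytes each\n", sizeof(LeanLabel));
}
[/code]

Multiply that per-object overhead by every common part in a large project and you get the software equivalent of excess weight.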

Posted: Sun Nov 25, 2012 9:32 pm
by ScottL
Until quantum computing is realized, there will be no move toward efficiency/optimization, as only then will we have hit our peak hardware power. As for paradigms versus efficiency, of course you trade off efficiency for understandable code, but I honestly don't see that changing. My prediction in this case is that all optimizations will be done at the compiler level and never at the OO-language level.
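
A small sketch of that division of labor (toy code of my own, not from any particular project): the clean OO version reads like per-element function-call overhead, but an optimizing compiler at -O2 typically inlines the accessors away, so the readable code compiles down to plain memory reads.

[code]
class Account {
    long cents_ = 0;
public:
    long cents() const { return cents_; }   // inlined away at -O2
    void deposit(long c) { cents_ += c; }   // likewise
};

long total(const Account* accounts, int n) {
    long sum = 0;
    for (int i = 0; i < n; ++i)
        sum += accounts[i].cents();         // becomes a direct load
    return sum;
}
[/code]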

Posted: Tue Nov 27, 2012 1:38 am
by hanelyp
The compiler can deal with machine-specific details better than most programmers can. And even the few who can beat the compiler at that level rarely do; it's not worth their time for the minor improvement. The overall algorithm used has a much bigger impact on software performance. A problem sometimes encountered with high-level languages and libraries is that levels of computational complexity can be obscured: a single function call may invoke a large loop or other expensive calculation, unknown to a naive programmer.
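
The classic illustration of that trap, as a minimal C++ sketch: the call in the loop condition looks cheap but hides a full pass over the data.

[code]
#include <cstring>

// strlen() walks the whole string on every call. Hiding it in the loop
// condition buries an inner loop, so this "linear" scan is really O(n^2).
std::size_t count_spaces_slow(const char* s) {
    std::size_t spaces = 0;
    for (std::size_t i = 0; i < std::strlen(s); ++i)  // full walk each pass
        if (s[i] == ' ') ++spaces;
    return spaces;
}

// Hoisting the call restores O(n).
std::size_t count_spaces_fast(const char* s) {
    std::size_t spaces = 0;
    const std::size_t n = std::strlen(s);             // computed once
    for (std::size_t i = 0; i < n; ++i)
        if (s[i] == ' ') ++spaces;
    return spaces;
}
[/code]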

Code optimization

Posted: Sat Dec 01, 2012 6:15 pm
by Netmaker
There are at least two areas where code will continue to be optimized.

1) For horizontal scalability in a multi-processor/multi-computer environment, e.g., HPC and "the cloud". Generally this is because you can't just take an existing monolithic program and run it efficiently (if at all) on multiple processors, which necessitates that it be re-architected and rewritten (see the sketch after this list). At least some cruft will be removed in the process.

2) For low-power environments, the key driver being smartphones and tablets. CPU-inefficient code kills battery life, so developers have every incentive in the current environment to optimize their code. Memory constraints are also an issue, though only relative to desktops.
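
To illustrate point 1 at the smallest possible scale, a minimal C++ sketch (my own toy example, assuming a single shared-memory machine; real HPC or cloud scaling adds partitioning across machines on top of this): the monolithic version is one sequential pass, and the scalable rewrite carves the data into per-thread chunks and merges partial results.

[code]
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// Monolithic version: one big sequential pass.
double serial_sum(const std::vector<double>& data) {
    return std::accumulate(data.begin(), data.end(), 0.0);
}

// Scalable rewrite: per-thread chunks, then merge the partial results.
// Assumes nthreads >= 1; build with -pthread.
double parallel_sum(const std::vector<double>& data, unsigned nthreads) {
    std::vector<double> partial(nthreads, 0.0);
    std::vector<std::thread> workers;
    const std::size_t chunk = data.size() / nthreads;
    for (unsigned t = 0; t < nthreads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == nthreads) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();  // wait for every chunk
    return std::accumulate(partial.begin(), partial.end(), 0.0);
}
[/code]

The same carving-up, done across machines instead of threads, is the re-architecting referred to above.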

Posted: Sat Dec 01, 2012 7:49 pm
by Carl White
CaptainBeowulf wrote:You know, one thing I liked about the old dial-up Internet days of the 90s was that people who used programs to create bloated, ridiculous websites were frequently mocked. There was a certain skill in writing concise HTML that produced a nice-looking webpage that still loaded quickly. It is, of course, a skill now long forgotten. What shocks me is that some websites out there are now so poorly written that they take a few seconds to load even on high-speed connections... slowly enough that they can get lagged and stuck half-loaded if some of the servers, routers, or other connections along the way aren't too good.

Of course, HTML is pretty simple compared to some of the code discussed above, but I just notice a general zeitgeist...
Ads, ads, ads and more ads now. Some that stream video too.

Posted: Sat Dec 01, 2012 8:02 pm
by GIThruster
While there are several of you here so knowledgeable about coding, I'm curious whether anyone knows what the state of the art is like for spacecraft code optimization. I recall when I first heard that Paul March's wife worked for ULA writing code for the shuttle, I was shocked that it was an ongoing venture. Likewise, when I learned an old friend was writing code for a nuclear power plant in CA, I was again surprised that it was being done many years after the plant had opened. Apparently some sort of optimization must go on as a continuing process. I wonder if that's true of Falcon, Dragon, even the ISS. Anyone know?

Posted: Sat Dec 01, 2012 10:20 pm
by DeltaV
As a systems engineer, not especially a coding expert, and without targeting any specific industry:

- Some of it is actual fixes.
- Some of it is cleanup to improve understandability/maintainability for future engineers (the big danger is the 'cure' causing worse problems than the 'disease' because subtleties are not comprehended).
- Some of it is adapting to ever-changing IDEs, compilers, linkers, OSes, standards, HW, configuration management tools, utilities, new/changed requirements, regulations, user interface, telemetry, etc.
- Some of it is a desire to increase commonality with the next generation product, or form a basis to start a next generation.
- Other things I can't think of at the moment.

It is never really 'finished'. When significant system complexity is involved, it is and must be an ongoing 'spiral process' of successive refinements.