Programming languages

Discuss life, the universe, and everything with other members of this site. Get to know your fellow polywell enthusiasts.

Moderators: tonybarry, MSimon

chrismb
Posts: 3161
Joined: Sat Dec 13, 2008 6:00 pm

Post by chrismb »

usernameguy wrote:I'm a software engineer.

Most of you guys are insane.

EG: I wouldn't wish Fortran on my worst enemy. Poor sonsabitches with big math Fortran libraries...
Yeah, sure - and I'm guessing you think all those bandwidth-intensive pop-ups that jam my ageing laptop are examples of "cool programming"!

If you want to stick in a quickie equation for parameter optimisation, or some such, F77 gives an engineer the option of rolling off a working programme for himself for that purpose in 2 minutes. It takes that long just to find a software engineer at the end of his phone who knows how to fire up the "Visual-Cobblers" software programming environment the company now uses, 2 hours to explain the requirement, and 2 days to write and debug it.

Engineers in that same situation are now forced into using Excel to generate these kinds of things, with all the limitations that you can expect from a programme designed for accountants.

Yeah, we used to create stuff to help us create stuff. Now we are forced to take the scraps of the managerial table and do the impossible with it.

Give me a team of mechanical and electrical engineers to do work with, and if I must have a software engineer on the team then they'd better know F77!

JohnFul
Posts: 84
Joined: Sat Feb 27, 2010 7:18 pm
Location: Augusta, Georgia USA

Post by JohnFul »

The first language I ever used was Fortran 77. Lots of assembler back in the day. Forth, Basic, Pascal, C, C++, Java, C#, you name it.

These days I still dabble with programming, but I'm not a programmer. I spent many years doing live onsite debug for a major software vendor. I don't even do much of that any more. For the last several years my focus has been large-scale storage infrastructure architecture. If I need something quick, I use C# or PowerShell on Windows or Perl on Unix.

J

BenTC
Posts: 410
Joined: Tue Jun 09, 2009 4:54 am

Post by BenTC »

Diogenes wrote:Lately, most of my projects have been using the PIC 16F690 processors, (because they're cheap, reliable, and easy to program with a PC) but i'm always looking out for something better. I actually hate the PICs Instruction set, and lack of instructions, but i've managed workarounds for all the projects i've used them on so far.
Try the Arduino mentioned above. As they advertise, I got it up and running in about 30 minutes. Note, since this is an open-source hardware design, schematics are available and there are several derivatives that may be more suitable - particularly the "mini" or BareBones.
In theory there is no difference between theory and practice, but in practice there is.

BenTC
Posts: 410
Joined: Tue Jun 09, 2009 4:54 am

Post by BenTC »

MSimon wrote:The Z-80 Forth I'm writing will eventually get ported to ARM (it was my original target).
Just for interest I had a look for Forth on the Arduino. Not a complete solution, but some possibility...

http://newsgroups.derkeiler.com/Archive ... 00517.html

http://eddiem.com/micros/sam7forth/armforth.html
In theory there is no difference between theory and practice, but in practice there is.

Luzr
Posts: 269
Joined: Sun Nov 22, 2009 8:23 pm

Re: Programming languages

Post by Luzr »

BenTC wrote: C++ on the other hand is a mongrel beast of an OO language that should have been shot in the head at birth
Well. I have been a professional programmer for over 20 years and my opinion is exactly the opposite.

C++ is the ultimate workhorse language. It is very hard to master, but it is a language to actually DO the stuff. It is no surprise that most big software ends up being done in C++.
- darn you Microsoft!!!!
What does Microsoft have to do with that? If anything, you should blame them for Visual Basic or C#. They have adopted C++ for SOME apps for exactly the same reason others do (basically, C++ definitely helps to manage big projects when C becomes too much trouble), but that is about it.
Why oh why could you not have chosen Objective-C with its Smalltalk roots?
Because Objective-C is nowhere near as effective (in both runtime and productivity) with its messaging model?

Disclaimer: I am strongly biased, as I am heavily involved in this opensource project: http://www.ultimatepp.org/www$uppweb$ov ... en-us.html
Last edited by Luzr on Sun Mar 21, 2010 4:57 pm, edited 1 time in total.

Luzr
Posts: 269
Joined: Sun Nov 22, 2009 8:23 pm

Post by Luzr »

MSimon wrote:
These optimized interactions, still abstract at this point, should then be used to define the requirements for an all-new HW/SW architecture, which minimizes the number of HW and SW "layers".
Forth did this 20 to 30 years ago. It is a very nice wheel. Why reinvent it?
Forth? Joking?

If I remember Forth well, there are at least 2 problems with it:

- its stack based model is totally incompatible with modern high-performance CPU architecture.

- it has zero compile-time checks. That means it can be used for small toy projects only; I cannot imagine Forth being used for 100,000-line projects with a team of developers. How are you going to maintain interfaces when there are no function signatures?

Tom Ligon
Posts: 1871
Joined: Wed Aug 22, 2007 1:23 am
Location: Northern Virginia
Contact:

Post by Tom Ligon »

Ah, but I never worked with million-line programs with teams of developers. Nor do I have any desire to. I realize that is bread and butter for some people, but to me it is a picture of Hell on Earth.

If I write something, I want full control over it, and I want to know what every line does. I write it to use. I like being able to run from within the compiler and debug or change on the spot. The notion of having to go through a software revision control process with a review committee just to clean up the formatting of a line of output, or customize the program to a one-time need, would bug me no end.

For me, 4000 lines of code, allowing me to run customized fatigue crack growth rate tests on a servohydraulic fatigue machine that almost nobody else could do because they lacked the software (well, it was back in the 80's), was my bread and butter. And being able to take the same code and modify it to test shock mounts was money in the bank. If I had bought a fatigue program off-the-shelf, I would not have been able to set up this one-of-a-kind application in just a couple of days.

I don't believe Labview can do either application without custom C modules. The amount of processing needed to extract the data in near-real-time is beyond anything a bunch of drag-and-drop icons can do, and Labview is notorious for making it difficult to control the actual sequence of operations performed. Roll-your-own code, in which you have full control of the whole thing, is the superior route. Choose your preferred high-level language ... whatever you are comfortable with, as long as you can conveniently use it with your own assembly-language interface drivers and maybe even assembly-language finite element calculating routines.

My preference for Pascal is aesthetic. In my opinion, it writes the prettiest, most readable code. Pascal writes poetry. C writes guttural grunts.

JohnFul
Posts: 84
Joined: Sat Feb 27, 2010 7:18 pm
Location: Augusta, Georgia USA

Post by JohnFul »

Hell on earth is being the one guy that has to be onsite during a highly escalated and highly visible "critical situation" and do the live debug to determine the source of the issue. Try that 150 times. It gets old real quick.

Modern compilers produce symbol files as well as the compiled output. If you have the symbols and know how to use a debugger, you can generally find what you're looking for fairly quickly. The fun starts when there are multiple modules from multiple vendors that you don't have symbols for, and the code path in question is murky. One thing I do like about the Windows platform is the level of instrumentation it provides.

I still dabble with programming, but would never want to do that for a living. It's a lot more fun to do the Infrastructure Architecture. I've been in an enterprise storage niche for the last several years, and that has been a lot of fun.

J

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

Tom Ligon wrote:For me, 4000 lines of code, allowing me to run customized fatigue crack growth rate tests on a servohydraulic fatigue machine that almost nobody else could do because they lacked the software (well, it was back in the 80's), was my bread and butter. And being able to take the same code and modify it to test shock mounts was money in the bank. If I had bought a fatigue program off-the-shelf, I would not have been able to set up this one-of-a-kind application in just a couple of days.
Worked part-time ($4/hr) for some of my professors in college doing something similar. Custom systems for automated recording and plotting of stress-strain and fatigue-crack growth. Instron electrohydraulic rigs, Vishay for stress-strain DAQ, Tektronix 4051/BASIC controller.

For long-duration fatigue-crack tests (up to 1 week), I built a custom HW box interfacing the Tektronix to a Votrax speech synthesizer and a telephone. The grad student doing the test could call in to find out how far the sample had cracked; he wanted to be there when it finally broke (which might be 3AM). Only about 10 pages of code total.

The professor in charge of the lab actually let me solder a data cable directly to a circuit board on his brand new, multi-thousand-dollar Instron test machine (the available outputs didn't include the signal we wanted), and I wasn't even an EE, just a hobbyist. It would have taken me 10+ years to pay for that board if I had screwed it up, but fortunately it turned out well.

Another professor let me build a laser-based speed measuring setup for bullets. The bullets hit a nylon piston, which forced out a hypersonic jet of water that impacted a ceramic nosecone to simulate raindrop erosion effects at reentry speeds.

Tom Ligon
Posts: 1871
Joined: Wed Aug 22, 2007 1:23 am
Location: Northern Virginia
Contact:

Post by Tom Ligon »

Yeah, but sweet satisfaction is being called in to the old job to consult on disputed results from an old program.

The customer thought my code must be working wrong, or they had the machine mis-calibrated or set to a wrong range. I came in to set up the program.

I looked at the parts they were testing, looked at the results, and looked at the specs. I concluded that the parts met their spec. The customer replied that they most certainly did not meet the spec for model 300. I said the parts you are testing are model 150.

One engineer running the whole project means one engineer can spot the problem, which in this case was shipping the wrong part to be tested.

Tom Ligon
Posts: 1871
Joined: Wed Aug 22, 2007 1:23 am
Location: Northern Virginia
Contact:

Post by Tom Ligon »

DeltaV, you were not working for a Dr. Joyce, were you? That Tektronix sounds mighty familiar. I improved on the method he used, switching to a NorthStar Horizon and then an 8088 PC. In the end, I could process a cycle of fatigue data in under 2 seconds.

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

No, don't know Dr. Joyce. What I was doing was a fairly slow data rate, except for the laser thing.

The Tektronix used the CRT as an analog video memory (vector graphics). I used to worry about X-ray exposure when I hit the refresh button to flash the screen.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

Luzr wrote:
MSimon wrote:
These optimized interactions, still abstract at this point, should then be used to define the requirements for an all-new HW/SW architecture, which minimizes the number of HW and SW "layers".
Forth did this 20 to 30 years ago. It is a very nice wheel. Why reinvent it?
Forth? Joking?

If I remember Forth well, there are at least 2 problems with it:

- its stack based model is totally incompatible with modern high-performance CPU architecture.

- it has zero compile-time checks. That means it can be used for small toy projects only; I cannot imagine Forth being used for 100,000-line projects with a team of developers. How are you going to maintain interfaces when there are no function signatures?
Dual Stacks are easy on MODERN architectures. ARM for instance. Although I will admit that the ARM architecture is crippled for RETURN stacks.

Well no matter. If you want ultimate speed design a FORTH CPU in FPGA.

You want compile time checks? Adding them is trivial. But you have a point. 100,000 line programs are tough. But FORTH is not for brute force programmers. It is for programmers who would rather finesse a problem in 10,000 lines.

Your kind of thinking is why programming is in such a sorry state. Everyone thinks - we will get 100 programmers and brute force it. Give me the top 10 out of that 100 and let me finesse it. There will be less code. It will be more thoroughly tested. And it will be easier to maintain.

It is similar to the way we design processors - throw more gates at the problem. Fortunately the speed of light is forcing us back in the direction of simplicity.
Engineering is the art of making what you want from what you can get at a profit.

BenTC
Posts: 410
Joined: Tue Jun 09, 2009 4:54 am

Post by BenTC »

I went looking for stuff I recalled on the "Joel on Software" blog. While I didn't find it, I did come across an interesting article on version control. I've been trying to choose between version control systems for some time, and I quite respect Joel's opinions.

Distributed Version Control is here to stay, baby
In theory there is no difference between theory and practice, but in practice there is.

Post Reply