Go Forth And Prosper
When I started looking into Polywell I wanted to design an instrument system for at least part of the setup. I'm on the way:
https://www.indiegogo.com/projects/go-f ... /x/7082918
I want to displace "C" as a control language and replace it with a language that is 10X faster to develop.
Some good places to start:
http://www.wulfden.org/downloads/Forth_ ... gForth.pdf
http://www.dnd.utwente.nl/~tim/colorfor ... -forth.pdf
Engineering is the art of making what you want from what you can get at a profit.
Re: Go Forth And Prosper
Having done C/C++, multiple forms of assembly programming, Forth, and numerous assorted high level programming languages, I'd say Forth programming feels a lot closer to assembly programming than any of the other high level languages I've used. That includes development time. Having to keep manual track of the stack is a pain in the posterior.
The daylight is uncomfortably bright for eyes so long in the dark.
Re: Go Forth And Prosper
hanelyp wrote: Having done C/C++, multiple forms of assembly programming, Forth, and numerous assorted high level programming languages, I'd say Forth programming feels a lot closer to assembly programming than any of the other high level languages I've used. That includes development time. Having to keep manual track of the stack is a pain in the posterior.
Why do you need to keep manual track of the stack? What you have to keep track of is that producers = eaters. Something you have to do in any language.
I have seen a fairly consistent 10X productivity improvement over "C". I did a project once where my factor was better than 100X. I have heard of the occasional 1,000X. It seems stupid to throw away those kinds of results because it is annoying to make sure your pushes and pops match.
Close to assembler? i.e. close to the machine? That is a feature. Supposedly a "C" advantage.
So what does a stack machine give you? You don't need so many named variables. A routine can return more than one value if you need to. BTW, "C" is a stack machine (a poor one, IMO, but nevertheless a stack machine), as are all compilers.
I have discussed Forth with compiler designers. They get it. Because that is how they build compilers.
Reverse Polish: you can't compute 2 * ? until you know what ? is. So reverse Polish actually makes the most sense:
2 4 *
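The fragment above evaluates by pushing 2 and 4 onto the stack and then applying *. A minimal RPN evaluator makes the mechanism concrete (a Python sketch for illustration, not how a Forth inner interpreter is actually implemented):

```python
# Minimal RPN evaluator: operands push onto a stack,
# operators pop their arguments and push the result.
def rpn(tokens):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a // b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # top of stack is the second operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))
    return stack[-1]

print(rpn("2 4 *".split()))  # -> 8
```

Note that no parentheses and no operator-precedence table are needed; the order of the tokens is the order of the work.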
Engineering is the art of making what you want from what you can get at a profit.
Re: Go Forth And Prosper
MSimon is working up some boards for me. I'm new to Forth and am looking forward to firing up the first one he sent me.
As for C, I just retired from an outfit where the C programmers would not let the rest of us mortals tinker with their stuff, for fear we might break it. If you say to a C programmer, "I've had so many interrupts today that my stack has crashed into my heap and I was forced to reboot," they knew just what you meant. And their code was prone to it. Programmers getting lazy and not managing things like their stacks, but just letting stuff run amok, are a large part of the reason why our computers are such a cluster%@*#.
I just wish these little boards were running Pascal, although I've used Assembler, too.
As for RPN, I've had HP calculators dating back to my HP-35. Standard algebraic notation is wrong ... you cannot do an operation such as addition until you have both numbers you intend to perform the operation ON. So in using an operator to separate the two numbers, you're just requiring that the operation be stored. The HP used a stack to store the numbers, then you performed the operation on them. And the stack is better than parentheses.
Re: Go Forth And Prosper
I remember Pascal. It taught me a lot of discipline that has served me well, even though I no longer use the language.
As to RPN vs. algebraic notation: RPN translates well to actual execution, at least on a scalar CPU. For a more than moderately complex expression it becomes very hard to read and maintain the code. Which is why we have compilers that can translate the more easily read and maintained code into efficient machine code. Code notation which is easy on the compiler is often not easy on the programmer.
As a side note, I've been following the development of a new CPU architecture, described at http://millcomputing.com/, for which algebraic notation strikes me as good as RPN. The exposed execution logic supports evaluation of subexpressions in parallel. Forth / RPN code compiled for optimal execution on such a machine would need extensive analysis of which operations depend on which others.
The daylight is uncomfortably bright for eyes so long in the dark.
Re: Go Forth And Prosper
I remember Fortran, entered on an IBM 26 keypunch that had no parenthesis keys. I think we used # and % or some such nonsense.
To this day we use * for multiplication in virtually all high-level programming languages.
I'm not saying algebraic notation is not convenient, but just that it does not reflect the way any machine actually operates. We use high-level programming to mask what the machine actually must do. But it makes sense, particularly if you are trying to write tight code and run fast, to learn how the machine actually does its operations.
For example, will it execute more efficiently if you use A*A, or A^2? (In Fortran, there was no ^, and we would have entered A**2.) Are you working in integer or floating point? Are you taking the difference of two large numbers of similar magnitude? What precision are the numbers? What precision will the difference be? If you continue incrementing a floating point value by 1, at what point will it cease increasing? If an integer, will it roll over to 0, or suddenly shift to a negative value (is the MSB the sign)?
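Each of those questions has a concrete, checkable answer. A Python sketch of two of them (Python's own integers are arbitrary precision, so the 8-bit rollover is simulated with ctypes):

```python
import ctypes

# A 64-bit double has a 53-bit significand, so incrementing
# by 1 stops changing the value once you reach 2**53.
x = 2.0 ** 53
assert x + 1 == x  # the increment is lost to rounding

# Difference of two large numbers of similar magnitude:
# the leading digits cancel and almost nothing survives.
print((1.0e16 + 1.0) - 1.0e16)  # -> 0.0, not 1.0

# A fixed-width signed integer rolls over to negative when
# the MSB (the sign bit) is set.
print(ctypes.c_int8(127 + 1).value)  # -> -128
```

Whether a given machine wraps to a negative value or to 0 depends on its integer representation; the two's-complement behavior shown here is what nearly all current hardware does.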
Even in Pascal, we were taught never to use a GoTo. Yet, the program structures used to get around this could not function in machine language without some form of GoTo or Jump instructions.
At its heart, everything is binary. How many programmers have ever tried doing multiplication or division in binary, using machine instructions? It is straightforward enough, but if you never get down to registers and instructions to rotate their contents right and left, how would you ever realize how the machine really works?
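Multiplication by shifting and adding is exactly the register-rotation exercise described here. A Python sketch of what a simple ALU does under the hood (the function name is mine, for illustration):

```python
def shift_add_multiply(a, b):
    """Multiply two non-negative integers the way a simple ALU
    does it: test the low bit of b, conditionally add a to the
    accumulator, then shift a left and b right."""
    acc = 0
    while b:
        if b & 1:      # low bit set: this power of two contributes
            acc += a
        a <<= 1        # shift the multiplicand left
        b >>= 1        # shift the multiplier right
    return acc

print(shift_add_multiply(6, 7))  # -> 42
```

Division is the same idea run in reverse: shift, compare, and conditionally subtract, building the quotient a bit at a time.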
The most stone-axe machine I ever worked with was the PDP-8. On that, I learned how computers actually worked, more than on any other machine before or since. The thing had no secrets, a computer that you could understand from Focal all the way down to individual gates.
Re: Go Forth And Prosper
hanelyp wrote: Code notation which is easy on the compiler is often not easy on the programmer.
It is mostly a matter of familiarity. I have been using RPN since my HP-35 days (hi Tom!) and find it natural for expressions of any length. I find getting the operation order correct is usually easier than getting the parens correct.
Engineering is the art of making what you want from what you can get at a profit.
Re: Go Forth And Prosper
Tom Ligon wrote: For example, will it execute more efficiently if you use A*A, or A^2?
A good programmer will know that on most machines A*A is faster. A top-notch compiler with optimization enabled will also spot that A^2 can be rendered more efficiently as A*A and convert it for you. Which is nice, because (complex_expression)^2 is easier to manage in source than (complex_expression)*(complex_expression).
Tom Ligon wrote: Are you working in integer or floating point? Are you taking the difference of two large numbers of similar magnitude? What precision are the numbers? What precision will the difference be? If you continue incrementing a floating point value by 1, at what point will it cease increasing? If an integer, will it roll over to 0, or suddenly shift to a negative value (is the MSB the sign)?
All details a good programmer needs to know so they can select a numeric representation that avoids nasties over the range of interest.
The daylight is uncomfortably bright for eyes so long in the dark.
Re: Go Forth And Prosper
Yes, A^2 typically calls up a transcendental function, and typically it assumes you are raising to a floating point power and not an integer. It takes the same time as raising to the power 2.104617. So A*A is generally faster. In the old days, it was likely a LOT faster. Then they started incorporating math coprocessors to handle the transcendental functions and the advantage got a little murkier.
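The same split shows up in most language libraries: a general power routine works in floating point, while a plain multiply stays in the operands' own type. A small Python illustration (math.pow wraps the C library's pow):

```python
import math

a = 3
print(a * a)           # 9   -- integer multiply, exact
print(math.pow(a, 2))  # 9.0 -- routed through the float pow routine
print(a ** 2)          # 9   -- Python special-cases integer exponents
```

The float result is a visible trace of the different code path, even when the values happen to agree.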
Squaring on the HP35 without using the x^2 button was pretty easy. With the number displayed (the X register), press ENTER then multiply. This pushed the X register up into the stack while also keeping it in the X register. Hitting multiply then multiplied the X register times the Y register. Stack operation. Fast, efficient. Easier than entering 2 and then hunting for the y^x key.
The idea that the compiler would need to detect that A*A would be better than A^2 suggests that programmers have forgotten this.
Programming is full of little idiosyncrasies that GOOD programmers are aware of and others are not. We used to teach this stuff. It is not so clear to me that the last couple of generations have cared to learn it. Instead, the philosophy is to crank out crappy code, and issue updates on a weekly basis. Each new generation takes several times more memory than the one before, and runs slower unless you scrap the old machine and get a faster one. It doesn't seem that most of the industry cares any more about writing tight code that runs fast and reliably.
What C++ programmers are especially guilty of is calling up objects and libraries that are absolute black boxes to them. They don't know what is inside. And every new version of compiler is likely to make the libraries, etc even murkier, using more memory. Compilers seem to have gotten sloppy ... the whole library loads even if you only need a single function call. Turbo Pascal, in its later versions, stripped unused code when it compiled, and wrote really fast code using a minimum of compiled space.
Re: Go Forth And Prosper
Tom Ligon wrote: I'm not saying algebraic notation is not convenient, but just that it does not reflect the way any machine actually operates.
True, but mainstream CPU architectures don't operate on a stack either; they use a directly addressable register file. And they will try to execute multiple independent operations in parallel.
The best notation for readability would likely be something that resembles a tree. In any case, we can do much better than a linear sequence of ASCII symbols. It's sad, really, how readily programmers accept this massive regression going from paper to screen as the status quo, when we can in fact do so much better on screen. http://glench.com/LegibleMathematics/ for an example that just scratches the surface.
PN/RPN may have the benefit of not having to type parentheses, but the visual representation should still have a clear grouping of sub-expressions.
Re: Go Forth And Prosper
Testing my new net connection the other day I typed "d" into Google and got a wiki page on a D programming language. It was described as a "lessons learned from c" language also incorporating some features from other languages. Sounded interesting, anyone familiar with it?
Evil is evil, no matter how small
Re: Go Forth And Prosper
Programming in Three Dimensions
http://marc.najork.org/files/jvlc1996.pdf
Re: Go Forth And Prosper
Teahive wrote: PN/RPN may have the benefit of not having to type parentheses, but the visual representation should still have a clear grouping of sub-expressions.
If you are used to reading RPN it is pretty clear. And there are things you can do to help clarity.
A fragment to convert deg C to deg F.
Deg-C 9 * 5 / 32 +
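Read left to right: take Deg-C, multiply by 9, divide by 5, add 32. The same word as a Python sketch; multiplying before dividing is what keeps the intermediate value exact in integer arithmetic:

```python
def c_to_f(deg_c):
    # Mirrors the Forth fragment: Deg-C 9 * 5 / 32 +
    # Multiplying by 9 before dividing by 5 avoids throwing
    # away the fractional part too early.
    return deg_c * 9 // 5 + 32

print(c_to_f(100))  # -> 212
print(c_to_f(0))    # -> 32
```

Dividing first (deg_c // 5 * 9 + 32) would lose precision for any input not divisible by 5, which is the usual reason a Forth word is ordered this way.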
Engineering is the art of making what you want from what you can get at a profit.
Re: Go Forth And Prosper
How do you get a superscalar CPU to work with Forth-like stack semantics?
The daylight is uncomfortably bright for eyes so long in the dark.
Re: Go Forth And Prosper
hanelyp wrote: How do you get a superscalar CPU to work with Forth-like stack semantics?
Well, for one thing, you design it differently. The current thinking in Forth is that the pipeline is at most one level deep, so you are never more than one instruction away from processing an interrupt. And you don't have to wait long for the pipeline to flush to call a new routine, which you have to do if you do out-of-order processing.
Superscalar is for the most part a response to the languages used and the way silicon has been traditionally designed. Design them differently and you don't need all those tricks.
The thinking in Forth is that you get superscalar like parallelism by a sea of processors passing data between them. And the processor design is async. If it has no data it does nothing.
Look up - GreenArrays GA144 - 144 processors on a chip.
Forth people think differently.
Engineering is the art of making what you want from what you can get at a profit.