Wintel

Discuss life, the universe, and everything with other members of this site. Get to know your fellow Polywell enthusiasts.

Moderators: tonybarry, MSimon

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Wintel

Post by MSimon »

Mike Holmes wrote:You mean the "Wintel" model of computing? Yep, that's Gates' fault. He caused all the damage by "unifying" operating systems (read "stealing").

But this has nothing to do with the direction of his philanthropy, vis-à-vis fusion. I apologize for starting this new thread tangent.

If anybody knows anything about it, a new thread on the subject of Tri-Alpha might be interesting. I get the feeling that they're about as respected as Focus Fusion...

No, like I said, new thread, new thread!

Mike
It is not just Wintel. It is also Apple, Linux, etc.

I blame it on Bell Labs.

And then you get such advances as operator overloading, where the operation performed changes depending on what you feed it. Another sterling advance, in confusion. And then the casting problem. And a whole host of others.

What do you get from all that? A mass of confusion.
Engineering is the art of making what you want from what you can get at a profit.

JohnSmith
Posts: 161
Joined: Fri Aug 01, 2008 3:04 pm
Location: University

Post by JohnSmith »

Actually, the idea behind operator overloading is to reduce confusion. Sure, it can be misused, but that's a programmer error. Dereferencing pointers is a good example of increased clarity. Code of the form
obj->next()->more_next()->deeper_into_the_list(); is very self-explanatory.

(as an aside, I'd like to kill anyone who uses the structure obj->next()->previous() without a darn good reason)

I am interested, though. Simon, you seem to be a proponent of FORTH.
What do other people suggest as the 'best' programming language?

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

The problem with operator overloading is that you have to keep track of what a particular version expects so you can be sure you have called the right operation. Not critical for writing. Very important for debugging.

FORTH handles it differently. You can only access (without contortions) the last definition of the operation.

Example in pseudo code:

(1) Define Operation
(2) Define Operation
(3) Call Operation

In such a situation the Call only references (2). Debugging then is much simpler.

BTW FORTH had objects way before any other language.

They work like this:

Object:
Define Data Structure
Define operation(s) on the Data Structure
End Object

Two words are used to make the construct: <BUILDS and DOES> (in modern Forths, CREATE and DOES>).

<BUILDS gives you a pointer to the structure; DOES> performs the required operation(s). For multiple possible operations you can use CASE to determine which one you want.
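Here is a minimal sketch of the construct in ANS Forth, with ARRAY and BUFFER as names invented for the example:

Code:

\ CREATE builds the data structure; DOES> attaches the operation
\ that runs when a child word executes.
: ARRAY ( n "name" -- )
   CREATE CELLS ALLOT      \ the BUILDS part: reserve n cells of data
   DOES> ( i -- addr )     \ the DOES part: operate on the structure
   SWAP CELLS + ;          \ turn an index into an element address

10 ARRAY BUFFER    \ BUFFER pairs a data structure with its operation
42 3 BUFFER !      \ store 42 in element 3
3 BUFFER @ .       \ fetch it back and print: 42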

===

FORTH machines with multiple processors use a data flow architecture. No clocks required. Very slick. For those processes that require clocking, clock edges or levels can regulate the flow.

This eliminates the internal hassles with clock trees and powering sections up or down. Sections automatically power down while waiting for data.
Engineering is the art of making what you want from what you can get at a profit.

blaisepascal
Posts: 191
Joined: Thu Jun 05, 2008 3:57 am
Location: Ithaca, NY

Post by blaisepascal »

MSimon wrote: FORTH handles it differently. You can only access (without contortions) the last definition of the operation.

Example in pseudo code:

(1) Define Operation
(2) Define Operation
(3) Call Operation

In such a situation the Call only references (2). Debugging then is much simpler.
Well, yes, but...

(1) Define MyOperation to print "(1)"
(2) Define My2ndOperation to print "(2)" and call MyOperation
(3) Define MyOperation to print "(3)"
(4) Call MyOperation -- output is "(3)"
(5) Call My2ndOperation -- output is "(2)(1)"
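In real Forth the experiment runs just as the pseudo-code predicts (word names invented for the example):

Code:

: MyOperation    ." (1)" ;
: My2ndOperation ." (2)" MyOperation ;  \ compiled against the current MyOperation
: MyOperation    ." (3)" ;              \ shadows, but does not rebind, the old one
MyOperation      \ prints (3)
My2ndOperation   \ prints (2)(1)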
MSimon wrote: BTW FORTH had objects way before any other language.
Citation?

FORTH was named in 1968, Simula 67 (an object-oriented language) was formalized in, well, 1967, before FORTH was named and released into the wild.

Also, in the 1984 book, "Thinking FORTH", author and FORTH evangelist Leo Brodie had some quite nasty things to say about objects and how bad they were (and how FORTH handled things better). His thinking did evolve; here are a couple of excerpts from the prefaces to the 2004 edition:
Bernd Paysan wrote: The original 1984 Thinking Forth feels a bit dated today. A lot happened with Forth in the last 20 years, since this book was first published.
...
Paradigms like object oriented programming were adopted to Forth.
...
This edition adds all the missing things from the original:
• Add chapters about Forth and OOP, Forth debugging, and maintenance.
Leo Brodie wrote: In the 1994 Preface, I apologized that my dismissal of object-oriented
programming in the 1984 edition was a little overreaching. What motivated
that apology was having worked for a time with an object-oriented flavor of
Forth developed by Gary Friedlander for Digalog Corp. I discovered that
the principles of encapsulation could be applied elegantly to Forth “objects”
that derived from classes that each had their own implementation of common
methods. The objects “knew” information about themselves, which made code
that called them simpler. But these were still just Forth constructs, and the
syntax was still Forth. It wasn’t Java written in Forth. There was no need
for garbage collection, etc. I won’t apologize now for my apology then, but
please know that I didn’t mean to sell out in favor of full-blown object oriented
languages.
A survey of Object Oriented FORTH variants lists NEON (1984) as the earliest.

To say that "FORTH had objects way before any other language" is stretching things a lot.
MSimon wrote: <BUILDS gives you a pointer to the structure; DOES> performs the required operation(s). For multiple possible operations you can use CASE to determine which one you want.
That sounds a bit like how Scheme "has" objects. An "object" is a function which returns other functions to do what needs to be done:

Code:

(define (rect-constructor)
  ;; height and width live in the closure, so each rectangle
  ;; carries its own private state
  (define height 0)
  (define width 0)
  (lambda (message)
    (cond
      ((eq? message 'set-height!) (lambda (h) (set! height h)))
      ((eq? message 'set-width!)  (lambda (w) (set! width w)))
      ((eq? message 'get-height)  (lambda () height))
      ((eq? message 'get-width)   (lambda () width))
      ((eq? message 'get-area)    (lambda () (* height width)))
      ;; etc., for more messages
      (else (error "unknown message" message)))))

(define rect1 (rect-constructor))
(define rect2 (rect-constructor))
((rect1 'set-height!) 5)
((rect1 'set-width!) 10)
((rect2 'set-height!) 6)
((rect2 'set-width!) 8)
((rect1 'get-area))
===> 50
((rect2 'get-area))
===> 48
It's not elegant, but it's a demonstration of the idea that you can do objects and object-oriented programming in almost any language. Linux is written in C, but is developed using objects and object-oriented programming. That doesn't mean I'd say that Scheme or C "have objects". Lisp 1.5 (released in 1962) has all the features necessary to implement "objects" in that fashion. Does that mean that Lisp 1.5 "had objects way before any other language"?

Mike Holmes
Posts: 308
Joined: Thu Jun 05, 2008 1:15 pm

Post by Mike Holmes »

Heck, even Pascal had a built-in object, the "record", for use in creating data structures, and that was released at just about the same time. Though I suppose that this just makes it "structured" and not necessarily "object-oriented."

Data flow architecture is, indeed "slick." But it's so slick that sometimes it's a little mind-bending to follow. I think I do a pretty good job, but I've seen the same architecture boggle even some pretty good programmers. That may, of course, be because they're used to something else...

This all makes me want to get back to doing some real low-level hacking... these days I mostly "program" at a very high level, mostly manipulating objects without even peering into them all that closely. I don't even want to know what the XML Java objects I'm using in something like SharePoint are doing, because it's ugly, and I know that it's going to render badly in some browser somewhere.

Am I dating myself if I bring up RISC? ARM is now used in a lot of electronics that are "computer-like." iPods, and iPhones, and Nintendo products, etc. And, of course, we've had SPARC servers where I work for ages, running powerfully ahead of the PCs for the most part.

But, again, Intel dominance has meant that innovations of any sort are just not going to be mainstream. I had Motorola chips in my Macs at least until Apple decided to jump on the bandwagon (I think those chips are in the Wii now). If you can't beat 'em, join 'em? Oy!

Mike

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

I always liked the 68000 architecture - except the way it handled bytes was clumsy. An 8-bit "shifter" or a "pick the byte" structure shouldn't have been too hard to add to the hardware.

These days I like the ARM best. The PowerPC stuff is junky IMO.
Engineering is the art of making what you want from what you can get at a profit.

tomclarke
Posts: 1683
Joined: Sun Oct 05, 2008 4:52 pm
Location: London

Post by tomclarke »

Operator overloading is one of those trade-offs. It reduces the strictness of typing, allowing more errors not caught by the type system. It also allows clearer and more readable programs by reducing unnecessary type conversions.

VHDL has massive overloading of operators. std_logic_arith had just too much, and was prone to unpleasant type ambiguities that made the semantics hard to follow. IEEE.numeric_std has less (though still lots) and gets it about right.

ALGOL68 had nice operator overloading.

Functional languages are lost without operator overloading - when you define your own operations for everything, not being able to use appropriate infix operators (e.g. + for something that is the add operation of a ring) is just silly.

C has too little type system for sensible static checking - all the fault of people conditioned by lack of touch-typing skills & clunky teletypes!

BTW language design seems to have undergone a renaissance in the last 10 years. Look at Python (clever syntax, pragmatic semantics), F# (an attempt to turn ML industrial), and Cobra (trying to get the best out of many recent developments, with a compromise between static and dynamic typing).

Best wishes, Tom

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

I was never a fan of typing and all those things required to keep mediocre programmers from making mistakes.

One of the things C does to encourage mistakes is the stack thrash that makes calls expensive. So instead of writing very short, easily testable fragments, programmers write long functions that are hard to test (all the branches), because a call is very expensive.

FORTH is no good for average programmers or below-average programmers. The safeties are off. However, for top programmers it is a force multiplier. No types. Bits are bits. You can do anything you want with them. When doing bit banging in C the casting becomes a real pain.
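For example, here is a minimal bit-banging sketch; STATUS-REG stands in for a memory-mapped device register (an ordinary VARIABLE here, so the fragment runs anywhere):

Code:

VARIABLE STATUS-REG    \ stand-in for a real device register
HEX
: LED-ON  ( -- )  STATUS-REG @  80 OR          STATUS-REG ! ;  \ set bit 7
: LED-OFF ( -- )  STATUS-REG @  80 INVERT AND  STATUS-REG ! ;  \ clear bit 7
DECIMAL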

The other thing cheap calls gets you is re-usability. A fragment is likely to have more uses than a complicated function.

You can pass more than one parameter out of a routine without the need for naming variables. This also makes recursion easier since all the temporary variables and data can reside on the stack. And all of that is automatically cleaned up at the end of the recursive function.
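Two small sketches of that, with invented word names:

Code:

: MINMAX ( a b -- min max )   \ two results on the stack, no named variables
   2DUP > IF SWAP THEN ;

: FACT ( n -- n! )            \ recursion; all temporaries live on the stack
   DUP 2 < IF DROP 1 ELSE DUP 1- RECURSE * THEN ;

7 3 MINMAX . .   \ prints 7 3 (max was on top)
5 FACT .         \ prints 120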

And then there is the inherent goodness of reverse Polish: get data, get data, add. For processors the infix order (get data, add, get data) is unnatural; it has to be translated into reverse Polish to make it work. But you do have to learn to think differently. I learned on an HP-35. It is second nature to me. No parens. Nest as deep as you want.
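For instance, the infix expression (3 + 4) * 5 becomes:

Code:

3 4 + 5 * .   \ prints 35 -- no parentheses, the stack does the nesting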

And the real kicker is: if you need blazing speed you can design a very simple chip where FORTH is the native language. There are no simple chips where C is the native language. It has been tried.

A dual stack architecture (data, returns) simplifies a lot of things with the right programming language.

And further - in a chip where FORTH is native, all you pay for are CALLs. Returns can have zero cost, i.e. the chip can be loading the program counter from the return stack while it is doing an add or some other function.

And then there is testing. Since no "main" is required to call a function, you can put your data on the stack, just do a CALL, and see the results. If you have to do a lot of cases for testing, you write a test function. It can initialize variables, put things on the stack, CALL the function, and then report the results.
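A sketch of that workflow, with SUM-SQ as an invented word under test:

Code:

: SUM-SQ ( a b -- a*a+b*b )  DUP * SWAP DUP * + ;

3 4 SUM-SQ .    \ interactive test: data on the stack, one call, see 25

: TEST-SUM-SQ ( -- )   \ a reusable test word for a fixed case
   3 4 SUM-SQ 25 = IF ." pass" ELSE ." FAIL" THEN ;
TEST-SUM-SQ     \ prints pass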
Last edited by MSimon on Sat Jan 03, 2009 4:54 am, edited 1 time in total.

JohnSmith
Posts: 161
Joined: Fri Aug 01, 2008 3:04 pm
Location: University

Post by JohnSmith »

VHDL hardly qualifies as a programming language. It certainly qualifies as a pain in the rear, though. For some reason I had to learn it last year. Wish they'd taught us Verilog instead.

Sorry Simon, but you seem to be advocating a return to assembly language, because a really good programmer doesn't need anything more. It starts to be a problem when lots of programmers have to work on the same bit of code, or the code has to be maintained. Then clarity and typecasting and all the extra bulk C++ seems to have might be a bit more useful.

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

John,

I actually had a government inspector tell me my FORTH code was the most readable he had seen in years. There were practically no comments either.

I'd have constructs like:

MOVE? IF
   POSITION? MOTOR ON
ELSE
THEN

No comment required. It reads like pidgin English.

Native FORTH is a very high level assembly language. Which is sort of a contradiction in terms unless you have actually used it.

It is just as high level as C, C++. And yet if it is the native language of the processor it can be the assembly language of the processor. Very slick.
Engineering is the art of making what you want from what you can get at a profit.

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

Let me add John,

I once did a project in FORTH, with myself as manager/hardware designer/programmer and two full-time programmers writing most of the code, competing against another company that had a team of 30 C programmers. The customer would ask for a complete change in the front panel interface: all new hardware, different display technology, etc. We turned it around in a month. The C guys were still struggling after 6 months. They hated us because no matter how much money they threw at their project we were whipping them. Bad. BTW my guys had never done a line of FORTH code when I hired them. I taught them the basics in 2 or 3 days.

FORTH done right doesn't need big teams and maintenance is easy because if the code is done right you can tell what is going on by inspection.

So what should the C team have done? Fired 27 of their programmers and kept just their top 3. Then spent 3 days or a week learning FORTH and competed in a fair fight.

Think of it: I was getting 60 times the productivity out of my team that they were getting out of theirs. That seems like a pretty good advantage, although in most cases I wouldn't expect that. Maybe only 10X.

BTW my company was not noted for leading edge technology and the guys we were up against were known as tops in the business. Heh.
Engineering is the art of making what you want from what you can get at a profit.

blaisepascal
Posts: 191
Joined: Thu Jun 05, 2008 3:57 am
Location: Ithaca, NY

Post by blaisepascal »

JohnSmith wrote:Sorry Simon, but you seem to be advocating a return to assembly language, because a really good programmer doesn't need anything more. It starts to be a problem when lots of programmers have to work on the same bit of code, or the code has to be maintained. Then clarity and typecasting and all the extra bulk C++ seems to have might be a bit more useful.
I was with you up to C++. C++ was good for what it was originally intended to be -- a quick-and-dirty graft of Simula-style object-oriented features into C -- but it retains many of the faults of C, and adds a few of its own. The (inherited) preprocessor is purely textual and doesn't understand or respect the structure of the language, the (new) templating mechanism (used for writing "generic" code) is Turing-complete, operator overloading is unrestricted, and making local optimization guarantees (like const-ness) can involve global changes in the code. Overall, the language design is a mess. I'm glad that I'm no longer coding in C++.

One problem with C++ (and Java, and C#, both of which are better than C++ in many ways) is that the designers aren't following the KISS principle when evolving the language. New features keep being added that are neat, but they bloat the language.

tomclarke
Posts: 1683
Joined: Sun Oct 05, 2008 4:52 pm
Location: London

Post by tomclarke »

JohnSmith -

VHDL is a full programming language. Not what you would choose for general programs, but it has much of the stuff (functions, procedures, data types, information hiding, very powerful arrays and records).

Simon -

It sounds like you don't think much of compilers. The most recent GNU C compiler can do good global static optimisation. It can remove function overhead completely for non-recursive or tail-recursive calls. It can move memory references into registers for efficiency where this works.

All this work used to have to be done by programmers - but not now. C is pretty efficient in terms of (non-optimised) function call & return. In ARM assembly, call & return are one instruction each, or two/one instructions where local data is saved. And other architectures, e.g. SPARC, can reduce this overhead.

C is not great for declarative programming - but there are many other languages which are! Keeping data local to functions and avoiding globals is possible in any half-way decent language. And compilers will do what they can with this to increase efficiency.

Programmers have the job of writing correct programs (impossible, in general - but the more effort the better). Static typing helps with this, if the type system is rich enough (an implementation can be seen as a proof of a type specification, so static type checking can become arbitrarily complex), as does choosing decent algorithms - O(n^2), not O(2^n).

Low-level optimisations should be done by the compiler - and a decent language will help with this, by providing type restrictions (where these cannot be inferred).

There is currently an interesting debate between the "runtime debug" programmers - extreme programming, "test as specification", interpretive quick-and-dirty dynamically typed languages - and the "static debug" CS programmers: static typing & top-down design aided by formal declaration of data structures, etc.

The static checking advocates claim that the run-time programming philosophy is an excuse for bad programmers, who cannot understand or think at the level of abstraction allowed by complex type systems. They say that debug-led program design leads to sloppy and badly thought-out programs.

The run-time advocates say that their methods are just more efficient - and lead to better verification - for everyone.

Both are probably right!

Best wishes, Tom

tomclarke
Posts: 1683
Joined: Sun Oct 05, 2008 4:52 pm
Location: London

Post by tomclarke »

Re FORTH

I think Simon's experience with FORTH is somewhat similar to mine writing in 6502 assembly language for a complex embedded real-time system. Because I adopted a very functional style, with standard register conventions and everything done by small functions, the resulting code was quick, bug-free and maintainable.

That does not mean that assembler is the best language for the job (this was a long time ago). But the right programming style is more important than the right language, and bad programmers can be so in any language with the possible exception of Haskell.

Best wishes, Tom

tomclarke
Posts: 1683
Joined: Sun Oct 05, 2008 4:52 pm
Location: London

Post by tomclarke »

Blaisepascal -

Re bloated C++, Java, C#

There are separate issues - semantic clarity, expressiveness, and size of language. For big jobs and experienced programmers a big language may provide needed expressiveness that allows more concise and checkable code - but only if the semantics are clear.

C++ is a disaster, with many pitfalls that allow programmers to make subtle semantic mistakes (see Meyers, Effective C++). But for different applications programmers need support to use:

functional programming
object oriented programming
logic programming (perhaps of limited use)
dynamic typing
static typing
reflection

So you have either a small language which is limited, or a big language which provides support for all these paradigms. The key is whether you can combine the different kinds of support needed without losing semantic clarity and concision. If not, you are better off with the more limited language.

New languages are still being developed, and we do not yet know which are the "sweet spots".

Best wishes, Tom
