Programming languages
Tom Ligon wrote:
Everybody uses GO TO statements. They just mask them using high-level program structures.

Of course not. I was referring to high-level languages. You don't really think elegant CASE statements compile without resorting to JUMP or JUMP REL instructions, do you?
I did some 6502 assembly in the 80s, just for kicks. Thought it was a lot of fun. I kicked around the idea of using assembler on some TBD CPU to write my own OS, compiler, editor, video I/F, etc., but nixed it, mainly because of how fast the industry was changing. I figured by the time I got it done the HW would be obsolete. Unless you can be satisfied with obsolete HW, and never intend to market your SW, doing it all yourself with the available resources takes too much time. I'm referring here to complex, graphics-intensive, stand-alone programs with user-friendly interfaces as the ultimate end product. For embedded code, etc., I think it can still be done.
A new, user-oriented paradigm is needed, something that hasn't evolved from specific HW/SW of the past (sort of a reverse evolution). Development of the new paradigm should start with an identification of the fundamental types of interaction a person can have with a PC, followed by a procedural optimization of those interactions to minimize user-expended time/energy.

These optimized interactions, still abstract at this point, should then be used to define the requirements for an all-new HW/SW architecture, which minimizes the number of HW and SW "layers". Hard-wiring and hard-coding of interactions very unlikely to change would be encouraged. Some things now done separately might end up getting combined, and some things now combined might end up getting done separately. Some things now done in SW might end up getting done in HW, and some things now done in HW might end up getting done in SW, but a transition to SW providing unnecessary "flexibility" should be discouraged, to avoid the trend towards bloat and slowness. The chips can do GHz. Speed, speed, speed.

The overriding criterion should be the quality of the user's experience and the future stability of the user's hard-won code, not how much of the legacy HW/SW can be retained, or whether it's more profitable to use this chip or that driver. Make the HW/SW adapt to changing technology, not the user, who deserves a stable, robust development and execution environment.
Sounds like I'm smokin' crack. This has probably been attempted in the past under several guises. If it worked, it must have been quashed by Wintel or such.
DeltaV wrote:
These optimized interactions, still abstract at this point, should then be used to define the requirements for an all-new HW/SW architecture, which minimizes the number of HW and SW "layers".

Forth did this 20 to 30 years ago. It is a very nice wheel. Why reinvent it?
BTW when I was buying my new box I ran into an old time geek who was looking for a box and he was looking at the same Gateway machine. We got to talking about the good old days BBSes, FIDO net, Forth, Assemblers, etc. Fun.
Engineering is the art of making what you want from what you can get at a profit.
On the other hand, DeltaV, there's plenty that can be done now--look at Linux. While it's not what you're looking for, it's a SW example--the community has gradually settled on a few standards, so while every Linux is different, they share traits. Expand this to hardware, especially standards--make sure you don't need to worry about a bajillion different drivers so your program can work on any machine, and you'll be most of the way there.
If we can get hardware development out of the hands of the big companies, just like Linux is doing for software, your idea is sound. There are a few things coming up in the near future that could do this. If you can get something like this:
http://nextbigfuture.com/2006/06/diamon ... uctor.html
in your garage, then you can break free. It would share the same issues as Linux, though: you'd have to make sure everyone's using the same basic core so standardized software can run on it, and then the sky is the limit.
Or something like that.

Evil is evil, no matter how small
Get HDW out of the big companies? No sweat. FPGAs big enough for a 32 bit machine (Forth) are not too expensive. And guess what - unlike C you can turn Forth from a virtual machine to a real machine without too much strain.
Register machines are clunkers. Build a 32 bit stack machine. 16 bit microcode instruction per word plus a 16 bit address/constant.
If you haven't done it before, separating your stacks into Return and Data stacks is a real speed booster. Context switching can be made lightning fast.
Engineering is the art of making what you want from what you can get at a profit.
kunkmiester wrote:
I have an ARM based single board that has a 100MHz processor, among other advantages, but it's pretty much programming a computer, and anything I write for it I'd imagine I'd be doing Linux or DOS based. Atmel promises to be much simpler, and it will be recycled a few times.
I suppose learning assembler wouldn't be too bad with just the one chip, but it'll be a pain in the butt to write anything really fancy for it. I'll have to look into Eclipse.

That is why you do Forth. A complete compiler can be written in 8K or less, with about 2K of assembly language. Porting is easy.
Engineering is the art of making what you want from what you can get at a profit.
kunkmiester wrote:
If we can get hardware development out of the hands of the big companies, just like Linux is doing for software, your idea is sound.

Yes, I think there's something useful in the "open" paradigm. Other examples are OpenGL and fabber projects like RepRap.
Home-produced computing HW (silicon-level) is going to be a while yet. A garage fabber would need micron-level precision. Maybe DIYers can come up with something completely different, that doesn't need ultra-precision, like a plasma computer...
Tom wrote:
More than a decade later, the problem can only have gotten worse. Nobody has a clue what is in some of these libraries and headers any more.

It's not that hard. You'd be surprised. If you really want to know what the error codes defined in the headers are, then go get http://www.microsoft.com/downloads/deta ... laylang=en It's a database of all the error codes from all the common headers, compiled into an exe. So if some user mode app throws 0x80004005, you type err 0x80004005 at a command prompt. If you want to have fun, go get WinDbg, or better yet kd: http://www.microsoft.com/whdc/devtools/ ... fault.mspx
For a large part of my career, debuggers and packet analyzers were my best friends.
J
Eric,
Without revealing any deep dark secrets (I honestly didn't look that closely), some of EMC2's code in the 90's was written in Fortran, which was linked to very sophisticated Excel spreadsheets to provide a user interface.
I don't know if the choice of language was the programmer's. It may have been Dr. Bussard's preference, or a legacy from an earlier incarnation of the program.
I worked with one of the programmers enough to know that she mostly used C or C++ in her professional work.
These might be useful for you:
http://llvm.org/
http://sablecc.org/
Eclipse is just an editor. These let you customize your own language that also exploits whatever the hardware has to offer.
John, I think the issue is that it would take one line (maybe two) of code to spit out an error message that actually gives a good idea of what's going on, without external reference.
I've been told that Fortran is pretty much the preferred language for doing math work. It was originally designed for number crunching, and while a few people have mentioned other languages, it still has the market share.
Evil is evil, no matter how small