Programming languages

Discuss life, the universe, and everything with other members of this site. Get to know your fellow polywell enthusiasts.

Moderators: tonybarry, MSimon

Luzr
Posts: 269
Joined: Sun Nov 22, 2009 8:23 pm

Post by Luzr »

JohnFul wrote:
I quite doubt it. What actually can you do with 144 simple cores? Are there any practical applications?
You can use 35,000 cores to apply texture files to CGI images and create a movie called Avatar.

http://wellington.scoop.co.nz/?p=19750

J
Yes. 35,000 *Intel Xeon* cores.

I dare say that a GA144 with 64 words of memory is a little bit less powerful than an Intel Xeon with 4 GB....

And no, you would not fix that by adding thousands of GA144s. If nothing else, they can only address 256 MB of external memory - and if I understand it well, all 144 cores of a single chip share the same memory. No FPU either. Totally unsuitable for the task.

JohnFul
Posts: 84
Joined: Sat Feb 27, 2010 7:18 pm
Location: Augusta, Georgia USA

Post by JohnFul »

They are indeed Xeon cores. The point, however, was that the multiple cores are being used for specific repetitive tasks. The Xeons were used because they were off the shelf.

I suppose a better example would have been Polaris, or better yet Avebury:

http://news.cnet.com/Designer-puts-96-c ... 99128.html

http://news.cnet.com/Intel-readies-mass ... 90856.html

J

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

Luzr wrote:Totally unsuitable for the task.
The market being addressed is different. Cell phones vs desktops. Rooftop solar vs a rendering engine.

Control interests me more than crunching vast volumes of data anyway.

The right tool for the job. Why use a 20 hp jackhammer to pound a couple of nails?
Engineering is the art of making what you want from what you can get at a profit.

kunkmiester
Posts: 892
Joined: Thu Mar 12, 2009 3:51 pm
Contact:

Post by kunkmiester »

So what's the smallest you could get such a mini-core and still have it useful?
Evil is evil, no matter how small

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

kunkmiester wrote:So what's the smallest you could get such a mini-core and still have it useful?
I believe they have a 4 core machine in an 8 pin chip.

Where they want to compete is down at the Microchip end of the market.

So you get 2.4 GIPS for about 10 cents a chip in volume, with a peak power consumption of 18 mW. I was just reading an ad for the latest and greatest control chip that uses 4 mA (probably at 1.8 V, I haven't checked) for 20 MIPS.
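
A quick back-of-the-envelope in Python to put those numbers side by side (the 1.8 V figure is my guess above, so treat it as an assumption):

# Energy per instruction, using the figures quoted above.
# Assumption: the control chip runs from a 1.8 V supply (not confirmed).
ga_power_w   = 18e-3       # 18 mW peak
ga_ips       = 2.4e9       # 2.4 GIPS
ctrl_power_w = 4e-3 * 1.8  # 4 mA at an assumed 1.8 V
ctrl_ips     = 20e6        # 20 MIPS

ga_pj   = ga_power_w   / ga_ips   * 1e12   # ~7.5 pJ per instruction
ctrl_pj = ctrl_power_w / ctrl_ips * 1e12   # ~360 pJ per instruction
print(ga_pj, ctrl_pj, ctrl_pj / ga_pj)     # roughly a 48x difference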

Since there is no Flash on the die, you need a small SPI or I2C Flash to load your program. A small price to pay for the performance. And each core can hold up to 256 instructions (128 words of RAM) without reloading (traded off against whatever your RAM requirements are).

I'd expect a wider array (heh) of devices if the company becomes profitable.

Down where us controller boys operate, "weird" architectures are not uncommon. The only thing that matters for a volume application is price/performance first and speed of development second.

My understanding is that they also have a C compiler for those uncomfortable with native code. But it is like DSP (another land of weird architectures): if you want to get the full measure of performance, assembly language is the way to go. It is just that this chip's assembly language is Forth.
Engineering is the art of making what you want from what you can get at a profit.

kunkmiester
Posts: 892
Joined: Thu Mar 12, 2009 3:51 pm
Contact:

Post by kunkmiester »

I was referring to my million-core concept. If you divided up a processor like that, just how small would you want to make the core?
Evil is evil, no matter how small

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

kunkmiester wrote:I was referring to my million-core concept. If you divided up a processor like that, just how small would you want to make the core?
I don't think you would want to make the core any smaller. In fact, if they could get it onto a 90 nm process (currently 180 nm), I'd go to 32 (or 36) bits: 4 instructions per word (max) and 256 words of RAM (1024 instructions max). I might even shoehorn a hardware multiplier into some of the cores for fast DSP.

I'd leave the stacks the same depth (or maybe double them).

I think the 144-core part is limited by package size. Costs these days are mostly dominated by the cost of the package; silicon for most devices is trivial. Figure a penny a pin for the package, more for a BGA.
Engineering is the art of making what you want from what you can get at a profit.

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

BenTC wrote:I went looking for stuff I recalled from the 'Joel on Software' blog. While I didn't find it, I did come across an interesting article on version control. I've been trying to choose between version control systems for some time, and I quite respect Joel's opinions.

Distributed Version Control is here to stay, baby
Thanks for that.

It seems excellent for team software development. It will not pass FAA muster for delivered code. IMO.
Engineering is the art of making what you want from what you can get at a profit.

usernameguy
Posts: 3
Joined: Tue Sep 08, 2009 8:24 pm

Post by usernameguy »

MSimon wrote:It seems excellent for team software development. It will not pass FAA muster for delivered code. IMO.
What a foolish thing to say.

Of course it won't. Have you seen those ISO standards? No one adheres to any of that if they don't absolutely have to.

It's like saying "Oh the iPad is good, but it won't work underwater." So what?

BenTC
Posts: 410
Joined: Tue Jun 09, 2009 4:54 am

Post by BenTC »

MSimon wrote:It will not pass FAA muster for delivered code. IMO.
I'm not at all familiar with those. What is the general concept (or specific example) this falls down on?
In theory there is no difference between theory and practice, but in practice there is.

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

BenTC wrote:
MSimon wrote:It will not pass FAA muster for delivered code. IMO.
I'm not at all familiar with those. What is the general concept (or specific example) this falls down on?
Versions are strictly controlled. All the tools for a version and all the code are cataloged, including make files, compilers, etc. Then you make them read-only (effectively) except for authorized changes.

Before the first official build you can organize the project any way you want. After the first official build, change becomes much harder and the organization becomes more top-down.
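
To make the cataloging idea concrete, here is a rough sketch (illustration only - the names, versions, and checksums are made up, not any particular certification authority's format):

from dataclasses import dataclass

# Sketch of a build baseline: everything that produced the official build is
# cataloged, and the record itself is frozen afterwards. Any change would go
# through an authorized change record rather than an edit in place.
@dataclass(frozen=True)                      # frozen = no in-place edits
class Baseline:
    release: str
    compiler: str                            # exact tool name and version
    make_files: tuple                        # build scripts, by name
    sources: tuple                           # (file name, checksum) pairs

baseline_1_0 = Baseline(
    release="1.0",
    compiler="vendor-cc 4.2.1",              # hypothetical compiler version
    make_files=("Makefile", "link.ld"),      # hypothetical build files
    sources=(("main.c", "a3f1..."), ("io.c", "9c27...")),  # hypothetical checksums
)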
Engineering is the art of making what you want from what you can get at a profit.

BenTC
Posts: 410
Joined: Tue Jun 09, 2009 4:54 am

Post by BenTC »

MSimon wrote:Versions are strictly controlled. All the tools for a version and all the code are cataloged, including make files, compilers, etc. Then you make them read-only (effectively) except for authorized changes.

Before the first official build you can organize the project any way you want. After the first official build, change becomes much harder and the organization becomes more top-down.
You would still have a central authoritative build repository, which pulls changes from developers after their unit testing. For the build environment, Mercurial can apparently lock parts down - see "Repository Permissions" here. At each release build, the tool binaries could be fingerprinted with something like md5sum into a log file to diff against previous versions.
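
Something like this is what I have in mind for the fingerprinting step (just a sketch - the tool paths and the log file name are made-up examples):

import hashlib

# Fingerprint the build tools at release time so the log can be diffed
# against the copy kept with the previous release.
TOOLS = ["/opt/toolchain/bin/cc", "/opt/toolchain/bin/ld", "/usr/bin/make"]  # made-up paths

def md5_of(path):
    """Return the md5 hex digest of a file, read in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

with open("toolchain-fingerprints.log", "w") as log:
    for tool in TOOLS:
        log.write(f"{md5_of(tool)}  {tool}\n")
# Then diff toolchain-fingerprints.log against the previous release's copy.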

From this article I like this description:
So there is the appeal of distributed version control systems. Branching allows you to track individual changes in their own branch and “graduate” them to the trunk after they have been vetted. Not only that, if you want to test a fix in isolation from other trunk changes, the branch allows you to do this by implementing the fix on top of just the last released version of the code, without any interim work (vetted or not) to gum up the works.
...
Getting back to my point about branching being the regular mode of operation in this model, each repository in this case can be viewed as its own branch. The reason I say this is because each copy of the repository is not required to synchronize after a change is committed. In Subversion, once a commit is made, no other changes can be committed by other developers without them first updating their working copy and resolving any conflicts that arise. This means that, in Subversion, for a given directory path there is only ever one line of commits (barring discussion about explicit branches, which are different directory paths). Each subsequent commit contains all of the changes from every commit prior to it.

In a Distributed VCS, concurrent changes can be committed to the various developers’ separate repositories, and no communication happens between them unless the developers initiate it manually.
I take this to mean that, under Subversion, it's harder for developers to change-control their work-in-progress files without interruptions from other developers - and also that, with a distributed system, developers can work in parallel on the same area of code and cross-test with each other before it hits the trunk and affects the rest of the team. [Disclaimer: I'm only familiar with this in theory, not in practice]

BTW, I assume they do currently use some sort of source versioning system. Do you know what is typically used?
In theory there is no difference between theory and practice, but in practice there is.

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

BenTC wrote:BTW, I assume they do currently use some sort of source versioning system. Do you know what is typically used?
Not currently.
Engineering is the art of making what you want from what you can get at a profit.

blaisepascal
Posts: 191
Joined: Thu Jun 05, 2008 3:57 am
Location: Ithaca, NY
Contact:

Post by blaisepascal »

MSimon wrote:
BTW, I assume they do currently use some sort of source versioning system. Do you know what is typically used?
Not currently.
Do you mean that you don't know what source versioning systems are used in the development of FAA-certified flight software, or that you do not believe that developers of FAA-certified flight software are using source versioning systems?

MSimon
Posts: 14334
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

MSimon wrote:
BTW, I assume they do currently use some sort of source versioning system. Do you know what is typically used?
Not currently.
Engineering is the art of making what you want from what you can get at a profit.

Post Reply