Software Development

Sun: Always an Innovator


Sun is revamping, refreshing, and rewriting Fortran as "Fortress", a new language designed to automagically split work up amongst logical cores:

http://news.zdnet.com/2100-9595_22-6150063.html?tag=st.prev

This may be just the kick in the pants that the T1 Niagara CPUs need to run circles around the AMD and Intel CPUs: while the hardware may be slower, better compilers/interpreters might just do the trick. I have been saying for some time that programmers are holding back our hardware, because they have been stuck in the single-threaded mindset for too long to take advantage of newer hardware. That is especially true since much of it, like the mid-range Intel and AMD CPUs as well as the T1, runs single threads slower than its predecessors did.
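
To give a sense of what "automagically" would replace, here is a rough Java sketch (my own illustration, not Fortress code; the ParallelSum class is invented for the example) of the boilerplate it takes today to spread even a trivial summation across a machine's cores. This hand-rolled partitioning is exactly the scaffolding Fortress promises to push into the language and runtime:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ParallelSum {
        public static void main(String[] args) throws Exception {
            final double[] data = new double[1000000];
            for (int i = 0; i < data.length; i++) {
                data[i] = i * 0.5;
            }

            // One chunk per hardware thread; the programmer has to do the carving.
            int threads = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            List<Future<Double>> partials = new ArrayList<Future<Double>>();
            int chunk = data.length / threads;

            for (int t = 0; t < threads; t++) {
                final int start = t * chunk;
                final int end = (t == threads - 1) ? data.length : start + chunk;
                partials.add(pool.submit(new Callable<Double>() {
                    public Double call() {
                        double sum = 0.0;
                        for (int i = start; i < end; i++) {
                            sum += data[i];
                        }
                        return sum;
                    }
                }));
            }

            // Combine the per-thread partial sums and shut the pool down.
            double total = 0.0;
            for (Future<Double> f : partials) {
                total += f.get();
            }
            pool.shutdown();
            System.out.println("Sum = " + total);
        }
    }

Whether a compiler and runtime can do that partitioning as well as a careful human on a Niagara is the open question, but getting rid of that scaffolding is the whole point.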

Too bad it is from Sun, which means that they will find a way to make it technically amazing, but impossible for a mere mortal to use (see also: Solaris, and Java for much of its life, when backwards compatibility was a joke).

J.Ja

About

Justin James is the Lead Architect for Conigent.

29 comments
TechExec2

The problem with Sun is not the lack of Fortran. Similarly: The problem with GM and Ford is not the lack of 1960s-style muscle cars (which they both are also bringing back). I'm all for real innovations that make multi-programming easier for people who find it difficult (I don't). But, IMHO, bringing back Fortran is another clear sign of desperation by Sun. What's next? Sun Cobol? P.S. Hey Sun! If you bring back Cobol too, please innovate by renaming the "ENVIRONMENT DIVISION" to just "ENV DIV". OK? That will help a lot. Thanks!! :^0

gwcarter

I don't think there is a Sun COBOL, but there is a Sun CICS. Nevertheless, Sun has given away Java, and it can't capture C++, so it must find something unique in the market. I have always regarded Fortran as a communicable disease rather than a meaningful programming language (Autocoder forever!) so I think this is just another ill-advised stab at product marketing. BTW: We used ENV DIV back in the early '70s in COBOL D with a very simple preprocessor. 8^)

Tony Hopkinson

May 2001 - 2003. On VMS. I got a few other offers to do it as well; still quite popular in certain epochs, I mean circles.

bshodges3

Those who have ever coded Fortran and have moved on to newer languages should look at the Fortran 90 and 95 standards to see what has been added. And, for kicks, check out Intel's Fortran, which has familial ties back to DEC's and Compaq's DVF, and possibly H-P's, since H-P/Compaq passed their DVF on to Intel. I learned Fortran in '62 and still have a major application running on PCs.

DanLM

Besides, I'd rather see IBM rewrite it instead of SUN. Yuk, ah hey, you can already code OO with it. What else do you want? roflmao, wait... I know, have it do funky graphic interfaces instead of the CICS screens? That's not a bad idea though.... Env division... How about Id division instead of Identification... Shoot, now you've got me thinking about how much typing I could have saved over the years. Dan

The Terminated

I dragged a skeleton COBOL program around with me for years. It had the four DIVISIONs with my favorite counters, flags, initialization routines, a loop for sequential processing, and my end of job routine. At each new site, I'd tweak the ENVIRONMENT DIVISION to match the site standards and then not touch it until I left. I'd copy the skeleton for each new program that I wrote and then fill in the blanks. Sometimes when I was doing maintenance, I'd copy pieces from the skeleton to the existing program. I probably typed the word "DIVISION" less than a dozen times in my 20-some years of COBOL. I agree that COBOL is a wordy language, but at least you always know what it's doing even if you don't know why the program is doing it.

Jaqui

innovating, then screwing themselves over when it comes to making it marketable and usable. :D

Justin James

... I had the same discussion with some of our developers on my way out of the office tonight. You know, like making a SunRay X Workstation cost twice as much as an entire PC + Windows + Office, despite the fact that it is just a monitor with a bit of firmware, a NIC, and input ports... hardly "marketable". Better to just get a dinky PC, put a free *NIX on it, and get to X that way... J.Ja

Justin James

Do you think that a Fortran-based language designed for multi-CPU systems will help Sun get ahead? Or will it be ignored? Let me know! J.Ja

metilley

APL is already optimized for parallel processors and is a better choice for engineering and scientific programming, or any kind of work that is numerically intensive. This could be financial analysis (actuarial or cost forecasting), as well as a host of other "grand-scale" programming applications, such as weather forecasting, nuclear explosion simulation, etc. Unfortunately, much of this work is done in archaic languages like Fortran, Ada, and (yeeech!!!) C or C++. APL is an array processing language and would be better suited. Fortran, by comparison, is for retarded people. (No offense intended)

jslarochelle

I am allergic to Fortran. Probably because we were still using punch cards when I wrote programs in that language. I still have stacks of them around; I use them as placeholders in books. I think APL would be a better choice if Sun is to work on an old language (I cannot say dead language, because I learned recently on this site that APL is still alive). This version would need to have classes, however, because I think that is the best way to partition a large application and is a better match for reality. JS

Tony Hopkinson

no really. :D Does APL have the most important feature of Fortran, though? Can you use variables without declaring them, and program in it even though you've never been taught how to do so? The idea of a mechanical engineer or a mathematician with an array processing language is enough to give me the heebies; most of them screw Fortran up.

Justin James

It has been on my "to do" list since you first mentioned it to me. I am slack. I admit it. J.Ja

XEntity

I originally learned to program using Fortran on an old DEC VAX machine. Both the code and the hardware were unforgiving and rigid. Today we are working with new and improved technologies and methods. So Sun has rewritten the Fortran compiler and is seeking to expand its library much like C++ has evolved. Of course, the open source market operates on a democratization of the design through RFCs. For Sun this may be a first step towards enriching multi-core capabilities, but I feel it may fall short of real scientific and engineering processing advancement.

The problem today is that science and engineering are moving beyond the capability of a single person or even a small group of scientists or engineers. The high-end serious processing needs to be able to deal with massive polynomials and a vast array of sensory inputs in a more intelligent manner. I have been studying the use of Field Programmable Gate Arrays (FPGAs) and their ability to solve parallel and time-sensitive processing issues. I feel an ideal product design would have a virtual backplane of processors that can be arrayed logically to solve complex problems. By arraying the processors or their individual cores I am not talking about stacked job processing or breaking code into chunks to be processed by individual core elements. I am talking about arraying logical processing paths to arrive at intelligent outputs within the machine, much like artificial intelligence. Fortran or Fortress needs to be able not only to handle parallel processing issues, but to have libraries that array processors to solve the more complex problems. In this way, instead of having rooms of people iterating designs or data processing analysis, the system, and more aptly its code, could solve more complex problems in less time, freeing human attention for other issues.

However, I have not been impressed with McNealy's leadership at Sun. Sun seems to have a high degree of marketplace arrogance. I feel that in the recent past Sun has missed the mark on too many opportunities. Fortran is not as widely used as C++, which could perform the same task with equal skill. Perhaps Sun should rethink their positioning in the market on this one.

Tony Hopkinson

deal with parallelising a serial program. Languages for parallel programming have been around for decades. Hardware setups to do it have been around for decades. To do it without at least hints from the language would require us to write a program that could program. If we ever get AI, it will happen by accident. If we were capable of producing AI deliberately, we wouldn't need to.

Justin James

The JVM and the .Net CLR (particularly the .Net CLR) can be viewed in that light. You can write various languages that are quite specialized (like F#, IronPython, Perl.Net, APL Next, and so on) that compile down to a general language (the MSIL that runs in the .Net CLR). J.Ja

XEntity

Not familiar with it, although there are hundreds of seemingly unknown attempts at various ideas. The military is infamous for developing languages that have unique applications. For example, because speed and crunch time are so important in the digital battlespace, AEGIS systems use a special language of their own. Perhaps they could develop a language that is the same for everything but has a special "skin" for the unique uses.

Tony Hopkinson

OCCAM, that was designed with parallelism in mind. That was about '93 I think, so it must have been out a while by then. It was a concept thing; never used it in anger. There's a lot of parallelism going on right now at the software/hardware interface. Aside from any pre-designed threading, there's context switching between processes, delegating to intelligent devices like your disk controller... These are all patterns the system has been configured to recognise. An AI system would identify them and configure itself.

XEntity

Tony, I agree with you on that case of robotics you cited. The problem is that unimaginative minds tinker within the constrained limits of existing technology. I puttered around with robotics. I purchased some stamp controllers and an OOPIC controller, then added a bunch of devices exploring its capabilities and cascaded some of the controllers. Most people do not understand what AI is. In a web application project I did about 6 years ago, I used the structure of AI and built an application where clerks at desks input their piece of data manually or process data. Over the intranet these nodes all begin exchanging information and processing it. For example, one low-level node enters its data. Several other nodes receive the data as raw input; some process it in various manners, providing report outputs. High-level nodes receive several processed reports over the intranet and 'crunch' their output. I designed it in such a way that lines of logic began to emerge, and the final output was a knowledge report. Each node was assigned specific information processes, and the lines of logic often reflected contingent conditions. Therefore, we could monitor for the emergence of contingency conditions, i.e., a specific event may be 80% formed. While a crude way of doing things, the same kind of thinking could apply to processor arrays. Overall, in order to be creative we must be able to see beyond bump-and-store processes.

XEntity

In my limited view of the world, it seems to me that the most flexible and friendliest system for dynamic computing would be architected quite differently from today's systems. In my mind's eye, I can see one virtual bus in a system having scores of processors or processing cores that can be arrayed through code in parallel or series based on the computing dynamics. I can also envision a new kind of virtual bus in that same system that arrays a host of sensory devices. These sensory devices are designed very much like human senses, operating on Fourier waveforms and holography. New kinds of sensory devices may emerge as an outcome of profound new understanding. I can also see a co-operation in this future system between biological and silicon-based computing. While such a system could be more than 100 years into our future, I can see the virtual bus of processors in the near term, possibly in the next ten years, as the basic architecture and technology exist today. Perhaps the glimpses into the future we are given in Star Trek could illustrate the kinds of analysis possible with these systems. Perhaps an emergency medical hologram that reprograms itself will be possible. Who knows what technology will avail humans?

gwcarter

All of the parallelization efforts I have seen have at their core the adaptation of the code to the hardware. Nevertheless, we have successful algorithms for adapting code to multiple processors, such as the one that SETI uses to spread its computations among the world's screensavers. Why not adapt the hardware to be dynamically reconfigurable in the same manner? If history is any indication, someday soon the prices of multicore processors will drop in the same manner as did the prices of megabytes of disk storage. Then we can use a language with OO syntax to present source-declared parallelism parms to the OS, which can just ask the processor to reconfigure. FORTRAN clearly isn't the answer, but I can't believe we couldn't do this now if we wanted to. All of this must have John Backus spinning in his grave. That is quite a trick for someone buried face-down 9-edge first.

Tony Hopkinson

and all the AI, such as it is, is us imprinting patterns on a system. I was watching a bit of TV a while back about significant advances in AI. They got a robot, put a collision sensor on it, and stuck it in a circle of card. It 'learnt' the range of its movements by bumping into the barrier; it remembered where the barrier was, so eventually it could scoot around in there and pull up without colliding. This was described as intelligent. The really clever bit was when they took the circle of card away and it still never strayed out! :D I laughed my ass off until I realised it was paid for with my taxes. :(

XEntity

I think what you are pointing to is the machine being able to infer versus literal matching. This is far more complex and relies on symbolic logic and holography. This kind of approach requires rethinking the basic machine and the tenets of coding. I'll look into this more.

Tony Hopkinson
Tony Hopkinson

I can parallelise an algorithm, depending on how complex it is and how amenable it is to being multi-processed in the first place. Writing a generic algorithm to do it to any serial process, well, I aren't that clever. Graphics is a case in point: there are a number of well-understood patterns for things like filters and effects that are in use now. Some clever human sussed them out. Pattern recognition is something we beat computers on all ends up, and we don't know how. It sure as crap isn't comparing two images point by point, though, which is basically the limit of our coding skills for a generic comparison routine.
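
To make that filter pattern concrete, here is a minimal Java sketch (an illustration only, not code from this thread; the ParallelBrightness class and the brighten routine are invented for the example). Each worker thread takes a band of rows, and because each output pixel depends only on its own input pixel, the bands never touch each other's data:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class ParallelBrightness {
        // Brighten a greyscale image (values 0-255) in place, one band of rows per thread.
        public static void brighten(final int[][] image, final int delta) throws InterruptedException {
            int threads = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            int band = (image.length + threads - 1) / threads;

            for (int t = 0; t < threads; t++) {
                final int firstRow = t * band;
                final int lastRow = Math.min(firstRow + band, image.length);
                pool.execute(new Runnable() {
                    public void run() {
                        // Each pixel depends only on itself, so the rows in this band
                        // can be processed without coordinating with other threads.
                        for (int y = firstRow; y < lastRow; y++) {
                            for (int x = 0; x < image[y].length; x++) {
                                image[y][x] = Math.min(255, image[y][x] + delta);
                            }
                        }
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        }
    }

The point being: a human spotted that the per-pixel filter has no data dependencies between rows; nothing in the code above discovers that on its own.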

XEntity

I have spent time studying AI and neural nets. We cannot achieve AI by sitting around remarking that AI needs to create itself. That thinking sounds too close to the soup-kitchen theories regarding the origins of the universe. I suppose there were a bunch of computer chips sitting around and the conditions were just right when they sparked into artificial life. I suggest that neural nets be put into practice to solve signaling issues. However, these systems require surreal numbers of processor nodes in order to achieve serious results. Still, I believe that crude results could be achieved using as few as four cores, and primitive results with two cores or processor nodes. In time, we will learn and these neural nets will become increasingly sophisticated.

XEntity

I wanted to think on this one, as it has been some time since I last worked on these kinds of issues. I hope I am framing this correctly for you? Parallelizing wave functions is a long-standing issue. I can see the importance of this issue, as human senses, thermodynamics, string theory, holography, information theory, and much of the physical world are based on wave dynamics. The waveforms arrive in both a spatial and temporal manner, lending themselves well to serial processing. However, the dimensionality of the waveform drastically impacts the crunch time and complexity of processing. Some approaches attempt to map bifurcations and then use the bifurcation points as breakpoints for processing, but that only applies if the waveform set is structured as such. In fractals and chaos theory the waveform can be of numerous types. Many of the waveforms are standardized for specific applications.

In the combination of hardware and software solutions there are many approaches to this problem. In general, buffering a sample of the incoming waveform for a specific sample interval and conducting pattern matching could identify the specific patterns for selection of standard processing. More than likely, though, the coder will know the waveform he is dealing with and be able to code for that unique event. If the hardware and operating system are fixed, as usual, then coding, of course, is constrained to that extent. I am not familiar enough with the current multi-core technology to know the extent to which it can be independently tasked or organized. Nonetheless, if two or more cores can be tasked fully independently and the processing can be arrayed, then the problem may dramatically reduce to establishing a library of numerical recipes. The constraining points are receiving a sample of sufficient size to conduct some sort of pattern recognition, and then crunch time. There are also constraints based on the operating system's ability to task the cores. For the coder, in the end, it is the selection of the breakpoints and the subsequent numerical recipe to solve that pattern. Fortress is looking at those numerical recipes and potential ways of tasking the processor. I suggest that the Fortress libraries and command statements center on arraying processing cores in series and parallel combinations based on the signal sample requiring processing.
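
As a very rough sketch of that buffer-classify-dispatch idea (my own illustration; the WaveformDispatcher class, the Recipe interface, and the zero-crossing test are all invented for the example and are not anything in Fortress): a fixed window of samples is buffered, a crude pattern test picks one of a small library of routines, and the work is handed to a pool of worker threads:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class WaveformDispatcher {
        // Hypothetical "numerical recipe": one routine per recognised waveform pattern.
        interface Recipe {
            void process(double[] window);
        }

        private final ExecutorService pool =
                Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
        private final Recipe smooth;    // e.g. a smoothing pass for low-frequency content
        private final Recipe transform; // e.g. a transform for oscillatory content

        public WaveformDispatcher(Recipe smooth, Recipe transform) {
            this.smooth = smooth;
            this.transform = transform;
        }

        // Buffer a fixed-size window of the incoming signal, pick a recipe by a
        // crude pattern test (zero-crossing count), and hand it to a worker thread.
        public void dispatch(final double[] window) {
            int zeroCrossings = 0;
            for (int i = 1; i < window.length; i++) {
                if ((window[i - 1] < 0) != (window[i] < 0)) zeroCrossings++;
            }
            final Recipe chosen = (zeroCrossings > window.length / 4) ? transform : smooth;
            pool.execute(new Runnable() {
                public void run() {
                    chosen.process(window);
                }
            });
        }

        public void shutdown() {
            pool.shutdown();
        }
    }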

XEntity

Is to address the parallelizing of serial wave functions in another post. Generally, my thinking is to establish a library of pre-designed neutral sets that, when called, organize the processors in a specific pattern at the time of coding. Each processing node would act independently from the others and trigger processing based on its library signature. It's not true AI, but approximated. I had a friend who does very high-end graphics and has to crunch tens of thousands of frames. Usually he is given short notice and goes sleepless for days trying to complete the work. I helped him reduce his processing bottleneck by setting up several machines and arraying them based on the intensity of work. Manually, he broke the job up into several subsets and then started the first-tier machines crunching. As the batches completed, we moved them on to the machines in the second and third tiers until all the batches had been processed. We were able to reduce the crunch time to 20% of the original. This same effort could have been achieved using processors arrayed on edge cards. The modelling, ray tracing, and shadowing could have been executed in code tasking the processors. Likewise, I have noted in mathematical modelling amongst the sciences that some of the models are gaining increased complexity. Look at M-theory, which uses polynomials to the 10th order. Conducting conformal mapping into three-space and simulating events could take days of crunch time. However, by arraying processors this time could be drastically reduced and more complex analysis observed. I see greater possibilities in human-centric relationships with machines.

XEntity

While there are numerous product possibilities for science and engineering, there is also a practical everyday possibility for the kind of combination of language and hardware design I discussed. These kinds of systems can add life to a building, vehicle, or business system that communicates with people in a human-centric manner. I think there are enormous possibilities, but we seem to act with small minds.

Tony Hopkinson

but I know some guys with 20+ years of Fortran on DEC kit under VMS who are looking at potential alternatives; I'll pass it on.
