SQUID: Open Source Compiler & IDE for the Ada 2012 Language
Posted on 03/18/2014 4:40:53 PM PDT by OneWingedShark
This project is to develop the basis for an Ada 2012 Integrated Development Environment more akin to the '80s/'90s idea of a Programming Support Environment (complete with project-management tools) -- to start, this Kickstarter project is for me to:
In order to avoid possible licensing issues, I am planning on using Delphi 2007 (for .NET) as the implementation language, with InterBase for the database, though it should be possible to compile with Free Pascal (and/or use Firebird). I cannot guarantee that such a compilation will be in accordance with all licenses, as I have not read the Free Pascal license in full.
(Excerpt) Read more at kickstarter.com ...
Could I get a tech ping for this?
Ada is still around???
It shouldn’t be. General Short said that Ada would live or die with STANFINS back around '91, and I know for a fact that STANFINS died on account of Ada.
Ada was an idea whose time was over before its specifications were finalized.
There are apparently several hobby operating systems out there.
I've seen so many crap RFPs go out, and so many poorly written (and ever-changing) requirements. The problem isn't that we aren't using enough Ada -- the problem is that the people in charge of program management are generally fools.
Cutting to the chase: most DoD technology projects are basically treated as jobs programs -- "I need to spend $1B ... how can I do that? Who owes me a favor? Who do I owe? Let's create an empire and start writing checks!!"
The purpose of the program is quite incidental, and everyone knows that by the time the thing fails (most of 'em do) the people who set the ball rolling will be long gone and working on some other "next great thing".
I’ve paid my dues writing Ada code. I am glad that part of my life is far, FAR behind me. What a horrible language. Generics? Give me a break.
Yes — the Ada 2012 standard came out in December of 2012 and added a lot of nice things to the language: conditional and case expressions, such as

(case Ch is when '0' .. '9' => True, when others => False)

and contract aspects (preconditions and postconditions) that live in the code itself, where the compiler can see them, so they can't go stale like they can/do in annotated comments.
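As a sketch of those two additions (the names Is_Digit and To_Digit are mine, invented for illustration): an expression function whose body is a case expression, and a subprogram whose contract is written as Pre/Post aspects in the code rather than in a comment:

```ada
--  Hypothetical illustration of two Ada 2012 features.

--  An expression function whose body is a single case expression:
function Is_Digit (Ch : Character) return Boolean is
  (case Ch is
      when '0' .. '9' => True,
      when others     => False);

--  A contract written as Pre/Post aspects: the compiler parses it,
--  and the runtime can check it, so it cannot silently go stale the
--  way an equivalent annotation in a comment can.
function To_Digit (Ch : Character) return Natural is
  (Character'Pos (Ch) - Character'Pos ('0'))
  with Pre  => Is_Digit (Ch),
       Post => To_Digit'Result <= 9;
```

These declarations would sit inside a package; they are shown bare here only to keep the sketch short.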
Really? An emphasis on maintainability, correctness, and programming as a human action are an outdated idea?
And here I thought it had a good reputation in aerospace (e.g. Boeing's 777, the F-22, the Apache helicopter) and things like rail-transportation and nuclear reactors and security applications (e.g. Tokeneer) precisely because of these qualities.
A language can "emphasize" maintainability, correctness, and programming as a human action until the cows come home; that won't make a crummy programmer into a good programmer.
C++, Java, C#, are chock-full of features and instrumentalities that can help good programmers write good software. That doesn't stop people from writing impenetrable spaghetti code using those languages.
I've done lots of contracting work. Calls for Java and C++ programmers outnumber those for Ada programmers by at least twenty to one. YMMV.
I very much agree — in my most recent employment project there was no real spec or requirements, and they kept changing what its process would be. (It was essentially a web-based document-management application, but things like how to determine when a document would expire were never explained, the process/workflow kept changing, and the manager insisted on a UI-first development approach [which was all the more difficult because of the changing workflow].) So, yeah, I understand.
While I'm not necessarily against anything new in the language, OS or IDE realm, I think it's pretty clear that we do already have tools that function. We just don't know what to do with the tools we have -- or, if we know, we do not act on the knowledge.
Again, we're in agreement.
What I'd like to do in this project is have not only the code under tight control, but things like project-management and requirements and testing integrated into the programming environment — I'm just trying to break things down into manageable, achievable steps.
I like Ada's generics a lot more than C#'s -- and there are a lot of worse languages out there, some of which are quite popular (e.g. PHP).
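For comparison, here is a minimal Ada generic (the names are mine, chosen for illustration). The formal part spells out exactly what an instantiation must supply -- including the required operation -- which is the part C#'s constraint system only approximates:

```ada
generic
   type Element is private;                                   --  any non-limited type
   with function "<" (Left, Right : Element) return Boolean;  --  required operation
function Minimum (Left, Right : Element) return Element;

function Minimum (Left, Right : Element) return Element is
  (if Left < Right then Left else Right);  --  Ada 2012 conditional expression

--  An instantiation names the supplied type and operation explicitly:
--     function Min_Float is new Minimum (Float, "<");
```

Because "<" is a formal parameter of the generic, the compiler rejects any instantiation that cannot supply an ordering, at the point of instantiation rather than at the point of use.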
Arrrggghhh! I know a certain Chief Engineer who is trying to get ready for a Critical Design Review -- he is doing this by working on the UI for the information that will satisfy the CDR checklist. Of course, we don't have much of a checklist. Or any POCs assigned for any KPPs. And no thought has gone into what sort of objective evidence would be used to satisfy the checklist. But golly, we're going to have an awesome UI!! Of course, the problem there is that there are no requirements for the UI, and although he has viewed 4 prototypes of the UI, his constructive feedback so far has amounted to: "I don't like this one either."
The guy wants to paint his house before he builds his house, and he can't even decide on the color of the paint. But he's a GS-15, so he's basically God.
I assume that POC is Proof of Concept, but what’s KPP? (And CDR?)
Point Of Contact (POC) the person you need to talk to in order to get something done.
Key Performance Parameters (KPPs) — Performance attributes of a system considered critical to the development of an effective capability.
CDR — The Critical Design Review (CDR) confirms the system design is stable and is expected to meet system performance requirements, confirms the system is on track to achieve affordability and “should cost” goals as evidenced by the detailed design documentation, and establishes the system's initial product baseline.
This is true, but we can write tools to minimize this.
That's the thrust I'm aiming for.
I've done lots of contracting work. Calls for Java and C++ programmers outnumber those for Ada programmers by at least twenty to one.
*shrug* — I'm not looking at language popularity, I'm looking at making a solid IDE/PSE that's freely available and geared toward making secure/solid programs.
Ada 2012 has a lot of features that lend themselves toward this goal: partially because the original '83 standard required a program library (to ensure consistency), and partially because of the new aspect system, which can be thought of as analogous to properties on an object. That makes a database back-end [for code storage, retrieval, and manipulation] plausible, and likely more practical than in the past. I want to integrate tools like requirements and documentation handling into the system as well.
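As a hedged sketch of that "properties" analogy (the type names here are mine): Ada 2012 aspects attach properties directly to a declaration, so a tool can read and store them alongside the entity they describe:

```ada
--  Standard Ada 2012 aspects hanging properties off declarations,
--  much like properties on an object.
type Counter is range 0 .. 1_000
  with Default_Value => 0;   --  objects start at zero unless initialized

type Byte_Register is mod 2**8
  with Size => 8;            --  representation pinned to eight bits
```

Because the aspect is part of the declaration itself, an IDE backed by a database could index and query these properties directly, rather than re-parsing pragmas scattered elsewhere in the source.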
Thank you for the clarification.
My personal belief is that "good programming" is the result of a certain style of thinking and a willingness to cooperate with a group, to have faith in the idea that one's future interests are best served by making the best possible project for the customer.
Bad programming results from people who (a) can't think straight, or (b) are focused on looking smart or creating job security for themselves, or both.
The off-shore programmers I've dealt with are terrible programmers, mostly. They have a bureaucratic, legalistic, procedural view of the software development process. Their concept of the project's lifetime costs, scalability, maintainability issues is essentially nonexistent.
I wrote a large project back in the late 1990s mostly in object-oriented assembly language. I implemented almost all the object-oriented instrumentalities in MASM (the one I wasn't able to do was "data hiding," which is a bit difficult at the assembler level). The customer kept adding more and more unexpected requirements, and the code base kept growing and growing and working and working. I was amazed.
A language and programming environment that can guide programmers to good practices would certainly be a wonderful thing. I'm wondering if it can actually be accomplished though.
I've been doing some database programming for the first time recently. I like it. I like the "back end" concept.
Here's my take on it (as an EE with a hardware background).
Making your software project "data-base centric" is a total game-changer. It not only changes the architecture of the code, it also changes the architecture of the project team and of the management of that team.
The difference is similar (in my view) to the difference between a hard-wired computer network and a wireless network.
When you have a hard-wired network, adding a new component typically means tracing through cable trays and conduits back to some component (a router, a switch, etc.) and then running a cable from the new device back to the correct connector. If - after doing this - the setup is deficient in some way (not enough bandwidth, too much latency, etc.) you have to go back closer to the backbone of the system and try again. This means more cable tracing, more wire threading, etc.
With a wireless network, you just set up the necessary IP addresses, accounts, password information, and you're done. The details of the connection are taken care of by the wireless router. Yes, there is a speed penalty, especially as more users share the same wireless domain, but that is often more than outweighed by the convenience and flexibility of the connection.
Likewise, adding a new software component, a new capability, to a DB-centric software architecture is much simpler. You don't have to trace back through mounds of legacy code to find out which class, which list, which map, which vector, your data is in, and then create new plumbing to allow your new component to interact with the old data. You just set up a new query; the DB takes care of the details. Worst case is that you need one or more new columns in a table, and a re-indexing is necessary.
Yes, that’s exactly it.
I want to move all the complexity of our current software environments to a single ‘package’ (the IDE/PSE) — no more worrying about which compiler you’re hitting on your path, or wasting time/energy on a recursive make, no more noise on repository-commits because you forgot to change tabs/spaces settings on your editor, etc.
Sorry for the delay--I was on a plane this morning.
Not a problem at all — hopefully pingees will benefit from the [admittedly limited] discussion above.
PS — I hope your flight was a good one.
I’m a little surprised with the name, given how Squid is the name of a very well known open source Web Proxy Server that has been around for years.
“Ada was an idea whose time was over before its specifications were finalized.”
And whose time will arrive again when enough people get killed by software written under the “programming by caffeinated hubris” model.
Hehe. I hear you! I sure have had to deal with an oversized ration of that over the years.
But, in all honesty, how does Ada combat hubris?
I'm really interested.
As one who hires programmers.
I am not strictly an Ada fan, but I think that the idea of actually having a specification, a design, and documentation will make a comeback when a software disaster involving lives, money, or both occurs and people outside our industry start to look at how we do things. The need for this kind of rigor was recognized in the '70s, but now that the cost of fixing bugs in the field has dropped, it has devolved back into nonverbal hackers feverishly pounding out reams of indecipherable code.
I'm not sure you're right, though. Here's why...
Let me make an analogy. Bear with me a second here.
I'm an electrical engineer, right? I always loved electronics. Even before I could design electronics, I loved the idea of designing electronics.
I actually got a chance to design some cool stuff, in the '70s, '80s, and '90s. By the mid-1990s, I found myself doing more and more design work in software, instead of in hardware.
My rationalization of this (to myself) was this: I can produce more functionality per unit time in software than I can in hardware.
Now, with computers so fast, memory so cheap, and peripheral equipment (displays, cameras, scanners, digitizers, etc.) so inexpensive and high quality, there's almost no reason to design anything electronic. Whatever you want is already designed and expressed in chip form. A pure "circuit design" type of electrical engineer just can't compete with the combination of digital logic and firmware, integrated at the chip level.
Here's where the analogy comes in.
The kind of high-quality, carefully designed code-level software development you're describing may be (and this is just a theory with me) going the way of high-quality electronic design, the kind you do with transistors, transformers, op-amps, that sort of thing.
It's being replaced with the activity of what are called "script kiddies." This term, although it originates in the hacker world, seems to me to describe what is going on in the larger computer world.
Young people, getting out of college now, don't really know that much about the inner workings of computers. They don't care about memory, or about speed, or about parallelization, or bus bandwidth, or any of those low-level concepts.
Instead, they "rack-and-stack" software components that they get on-line, and glue them together with PHP and Python. I hate those languages. I like C++, and even C, because they're "so close to the metal."
But I'm an anachronism, and I admit it.
I'm not saying you are wrong in your concerns about quality, and I'm not saying your idea doesn't have merit. But what if the pendulum never swings back?