Comments on S.Lott-Software Architect: The COBOL Problem

(Context: I spent several years early in my career building a system in COBOL. I've since been through about forty languages, and am now a Scala geek.)

Huh. The interesting corollary of this approach (which, I agree, is likely the only practical way to go in many cases) is that step one can probably be done *automatically*. That is, I would do this as:

1. Write a COBOL-to-X translator, where X is a more modern programming language that -- very important -- provides good refactoring tools. (I would of course use Scala; given that Scala is fairly popular in the finance world, that might actually be the right choice here.) Along with this, you'd need to write the necessary libraries and adapters for the data and environment.

2. Test the hell out of it, the way you describe.

3. Start refactoring the resulting monstrosity.

The heart of the current problem isn't just that COBOL is obsolete; it's that it predates the notion that refactoring *matters*, so making incremental improvements is unreasonably hard. If you did a literal translation to a better language, the resulting code would still be horrible, but you would have a path forward.

And yes, I would bet that writing an automatic translator isn't all that hard, in the grand scheme of things.
Trying to *analyze* COBOL code properly is likely impossible, but simply translating it, warts and all, is just a routine cross-compiler -- a substantial project, but not a huge one.

-- Justin du Coeur, 2020-04-12

[Good response](https://slott-softwarearchitect.blogspot.com/2020/04/why-isnt-cobol-dead-or-why-didnt-it.html), thanks.

-- Tom Roche, 2020-04-07

Great post ... or great-sounding anyway, as I'll admit to having minimal exposure to COBOL. But since you seem to have had lots, perhaps you can answer this question: Why didn't COBOL evolve more successfully?

I'm asking because I have had significant exposure to FORTRAN, the other surviving-at-scale first-generation language. By which I mean, there is still a lot of it "in production" in engineering and science, as opposed to:

* Lisp: while it continues to be popular in some non-academic niches (e.g., Emacs), there is (IIRC, ICBW) no economically significant long-lived software coded in any Lisp dialect.
* Algol: which is all-the-way dead.

FORTRAN, OTOH, has survived precisely because it -- and, more importantly, related tools, especially compilers -- has evolved to solve or overcome many (certainly not all!) of the sorts of pain points you describe, while retaining the significant performance edge that (IMHO, ICBW) prevents challengers (e.g., Python) from dislodging it for tasks like running dynamical models (especially weather forecasting).

-- Tom Roche, 2020-04-07
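[Editor's sketch] To make the first comment's step 1 concrete, here is a hypothetical example of what a literal, unrefactored COBOL-to-Scala translation might emit. The COBOL fragment in the comments, the field names, and the 5% computation are all invented for illustration; nothing here comes from a real translator.

```scala
// Invented COBOL source, for illustration only:
//   WORKING-STORAGE SECTION.
//   01 WS-TOTAL  PIC 9(7)V99 VALUE ZERO.
//   01 WS-COUNT  PIC 9(4)    VALUE ZERO.
//   PROCEDURE DIVISION.
//   ADD-INTEREST.
//       COMPUTE WS-TOTAL = WS-TOTAL * 1.05.
//       ADD 1 TO WS-COUNT.

// A literal translator would map WORKING-STORAGE to mutable globals --
// deliberately non-idiomatic Scala; cleaning this up is step 3's job.
object WorkingStorage {
  var wsTotal: BigDecimal = BigDecimal("0.00") // PIC 9(7)V99: fixed-point, 2 decimal places
  var wsCount: Int = 0                         // PIC 9(4)
}

object Paragraphs {
  import WorkingStorage._

  // Each COBOL paragraph becomes a procedure; PERFORM becomes a call.
  def addInterest(): Unit = {
    // COMPUTE without ROUNDED truncates to the target field's scale.
    wsTotal = (wsTotal * BigDecimal("1.05")).setScale(2, BigDecimal.RoundingMode.DOWN)
    wsCount += 1
  }
}
```

The output is exactly the "monstrosity" the comment predicts, but step 2 can validate it byte-for-byte against the original program's output, and only then does incremental refactoring toward idiomatic Scala begin.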