## Why did some US institutions not migrate their very old software systems to use somewhat newer ones?

25

5

This article shows us that some US institutions rely on very old systems (written decades ago, which is very long in IT time perspective):

New Jersey Gov. Phil Murphy says that the state is looking for volunteers with skills that can be used to help in the COVID-19 coronavirus outbreak, and one of those skills is knowing your way around a 61-year-old programming language used on big, old, mainframe computers.

COBOL is an old computer programming language that was first developed in the 1950s in conjunction with the Department of Defense. Today, most programmers prefer and use more modern languages, but there are pockets where old software written in COBOL remains in use, particularly financial applications and in large enterprises or government agencies.

I understand that state institutions are typically much slower than private companies when it comes to IT investments, but having such old systems seems strange. Relying on systems written so long ago can cause lots of issues, which translate into wasted time and money:

• initial development was done well before modern software development frameworks were around
• it is much harder to find specialists who, not only know the language, but also are able to adapt to such an old software development style
• inability to act in a timely fashion when changes or maintenance must be performed (e.g. Coronavirus outbreak)

It makes sense for banks to be reluctant to rewrite their systems due to the large costs and risks, since they must also make profit. But a public institution is not profit based and, in theory, it should be easier for them to invest in modernizing their software systems.

Question: Why did some US institutions not migrate their very old software systems to use somewhat newer ones?

Note: I am not thinking about big leaps, but asking why they do not at least work with languages and frameworks that are at most 20 years old, rather than 60 years old.

While the main arguments are related to software development risks, I am interested in the political aspect of the "why", since we are talking about public institutions making the decision to partially / totally rewrite some systems. Such aspects might be related to the size of projects or quality of management in the public sector.

14

VtC because the question is about IT management. That said, it is worth remembering Joel's classic article about the problems of just scrapping old software for newer, "better" software

– SJuan76 – 2020-04-10T11:08:09.417

2I think that the case that there is some political value to this question can be made by the frequency with which big governmental IT project failures get covered in the mainstream press. The monetary outlays in some cases compete with more traditional government spending on other stuff like new roads or hospitals. – Italian Philosophers 4 Monica – 2020-04-10T19:49:36.113

@SJuan76 - it is also about software development, but there are many political aspects highlighted by the existing answers: quality of management in the public sector, the size of the projects, which drastically limits who can develop them, spanning multiple political cycles, which might shift the priorities, etc. – Alexei – 2020-04-11T05:19:44.070

## Answers

25

Because it's hard to do, prone to failure and very costly. And because the newer alternatives aren't always a big improvement.

Very costly in the sense that failure costs in the multiple 100M$ are frequent and could easily become an election issue (which makes it a political issue). Take a look, for example, at Canada's revamped payroll system: $1B+ and counting, with IBM as the lead on that. Or look at Capita's activities in the UK and its litany of project failures. Or at the $70M failed asset management system in Australia, for a population base of 250,000 people.

## TLDR:

Even with the best of intentions, we in the IT industry do not yet have a proven, repeatable, low-risk solution to migrating old government IT systems. Nor is it all that clear what the next good-for-60-years technology solution is. Until that is the case, there will be a strong case for keeping seemingly obsolete, but working, systems running as long as possible.

## The details:

### Bad project management in the public sector

It is very difficult to reduce risks when doing this kind of work. One option is to do a strict migration from the old system to the new system, with it doing the exact same thing as before. This is very rarely done: usually all sorts of new bells and whistles are attached, which makes it very hard to judge whether the new system does indeed provide the same results as the old. Often, under the guise of the rewrite, the existing business processes and organization are revamped, on the basis of the promises made by the promoters of the new system.

Government IT is often driven by consensus and stakeholders. Give too many people a say in what a complex system should do, and what should be fairly straightforward degenerates into endless meetings and shifting requirements. Worse, rather than picking a representative sample of senior front-line users of the existing system, the stakeholders are usually their managers. Often, retraining of, and engagement with, front-line users takes a back seat to more glamorous techy and PM stuff.
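A strict migration's promise that the new system "does the exact same thing as before" can at least be checked mechanically with a parallel run: feed identical inputs to both systems and diff the outputs before cutting over. Here is a minimal Python sketch of that idea; `legacy_tax`, `new_tax` and the tax rate are invented stand-ins for illustration, not anything from a real project.

```python
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP

CENT = Decimal("0.01")
RATE = Decimal("0.0825")  # hypothetical tax rate, for illustration only

def legacy_tax(amount):
    # Stand-in for the old system: truncates the computed tax to whole cents.
    return (amount * RATE).quantize(CENT, rounding=ROUND_DOWN)

def new_tax(amount):
    # Stand-in for the rewrite: rounds half-up instead of truncating.
    return (amount * RATE).quantize(CENT, rounding=ROUND_HALF_UP)

def parallel_run(inputs, old, new):
    """Run both systems over identical inputs; return every mismatching record."""
    return [(x, old(x), new(x)) for x in inputs if old(x) != new(x)]

mismatches = parallel_run(
    [Decimal("19.99"), Decimal("20.00"), Decimal("21.00")],
    legacy_tax, new_tax,
)
# 19.99 * 0.0825 = 1.649175: the old system truncates to 1.64,
# the rewrite rounds to 1.65 - a one-cent drift the diff catches.
```

This only works when the migration really is strict; once new bells and whistles change what the system computes, the outputs are no longer comparable and this cheap safety net is gone.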
Many government departments also consider themselves special and do not like off-the-shelf solutions. That ranges from writing a system from scratch instead of buying a commercial product, to acquiring an off-the-shelf product and then customizing it out of all recognition. Even when intentionally trying to keep things simple, there are inherent complexities to government IT. A payroll system will have to keep track of multiple union agreements and seniority scales, for example. Not all payroll vendors are able to do this without custom development.

Unclear and shifting requirements and, to a lesser extent, incompetence on the part of the outsourcing contractors are what kill most government IT projects.

Big IT projects require a Mr./Ms. No. The successful projects I have participated in usually circled back to the project director having the authority and willingness to tell the different stakeholders to buzz off, but only when necessary. Their driving principles are keeping things simple, a single-minded focus on reducing risk, and getting a solid, working, basic system out as quickly as possible. That person needs to be backed at the highest level. It's rare to see that recognized, and doubly so in a public sector setting.

### Limited choice of, not always good, contractors

Public sector IT relies heavily on outsourcing to big, high-profile companies: IBM, Capita, Accenture, etc. (In the distant past, when governments were a major driver of early IT, a lot of this work was done in-house.) The government consulting companies are often more skilled at navigating the complexities of government procurement and contracts than at actually producing working products. However, when a project fails, the immediate government reaction is to be more careful the next time and select a trustworthy, credible vendor.
Small companies, even assuming they somehow knew better than the big consulting giants, don't have that credibility, or the commercial skills to win a bid, so the next big project tends to be awarded yet again to the IBMs and Capitas of this world, even while these companies outsource more and more to lower-cost countries like India.

The next line of defense is to write longer and longer specification documents, with every contingency covered. That $1.2B Canadian payroll project had a 6,000-page spec at some point. Specs cost money, but they are not working code at the end of the day. No one is going to read 6,000 pages of tech specs, making it very hard for any one person to fully understand the project.

A primary point of effort for a consultancy (their core competency, to cite a comment) then becomes managing the spec and limiting their contractual risks, rather than getting things done. At the opposite end, the much-vaunted spec-less methods like Agile are woefully inadequate in these contexts.

### Technology

COBOL is 60 yrs old, about as unhip as you can get, and suffering from a shrinking pool of talent. On the other hand, it is very stable: you can take 30-yr-old code, run it in a modern environment and expect it to work, and it is also quite clear to read. It's there and it works.

The primary alternative would be Java. 25 yrs old, no shortage of talent. But... there is much less version-to-version stability, the codebases are rarely very clear, and there is a strong culture of overcomplexity among its practitioners. Java is also quite allergic to non-Java stuff; its perfect world is one in which everything is also written in Java. Unfortunately, that is not the way the real world works.

Java's shortcomings are being recognized and it's starting to have the whiff of "legacy" as well. In the last 5-10 years, not many would have picked Java as the technology stack for complex new websites, for example, unless they were already invested in that technology.

However, there are no other real contenders for massive-scale, government-level, IT development languages.

Such a language would need to be:

• predictably available 30-50 years from now
• secure by design, plays well with others, not overly verbose
• suitable for line-of-business apps (i.e. processing health claims, rather than doing high-performance 3D graphics)
• clear and with low initial complexity for junior programmers (John Carmack ain't gonna be working at NJ's DMV)
• appropriate to projects in which tens or hundreds of developers work together on a tightly integrated system

COBOL comes close to this ideal - its main weaknesses are the scarcity of skilled labor and its verbosity. Java less so, IMHO, and I can't think of another widely adopted language that combines these attributes. Simple modern compiled languages like Swift or Go (C#?) might be adapted/adopted for this, but that's not their current aim and they lack vendor neutrality.

### Why does this matter to politics?

Because software matters. Banks have sometimes been described as IT companies specialized in customer service and money management. A modern government relies on IT to deliver all sorts of services and to collect taxes. Large-scale IT failures have a budgetary impact, crowd out spending in other areas, and can directly affect how citizens interact with their government. And they become talking points during election cycles.

On a related subject, the inability of local government IT, for schools, hospitals or cities, to manage and secure its systems has made them prey to ransomware attacks, which can very directly affect thousands of citizens.

In order to function well, governments are going to have to improve considerably in this field.

9

To expand on what you say about contractors, getting awarded a big government project is often a big project in its own right (I saw one that had ~20 people working for 2 years). Only specialist companies are going to try. Trouble is, their core competency (https://en.wikipedia.org/wiki/Core_competency) becomes the getting of contracts, not the fulfilling. Also, once the contract runs into trouble, the Gov dept can't just demand they deliver at the original price, because suing them just gives a long, expensive court case and no new system, so it's a matter of riding the horse that you have.

– Paul Johnson – 2020-04-10T20:23:49.670

1@PaulJohnson The people in charge of setting out those specs and assessing the bids the contractors have prepared are not specialists in IT, nor do they necessarily have a clear vision of what the system could do. I saw a project that had a team of people, including external contractors, working for 2 years to work out a spec on the government side that could be bid for. At that point there is pressure to pick someone, rather than spend more time analyzing bids. And the cheapest bid is a quick, objective way of choosing bids that is terrible for getting quality bidders. – user1937198 – 2020-04-10T23:20:19.830

1(Continued) So you have pressures on both sides, that favour these specialist contractors. – user1937198 – 2020-04-10T23:22:33.393

A camel is a horse designed by a committee. This is the real point you are driving at: that any new software system will tend to fail because too many people are allowed an input into its design. – Ed999 – 2020-04-11T20:13:09.507

1“…there is much less version-to-version stability…” You make a lot of very good points, but I don’t understand this one. A program written in 1995 for Java 1.0 will still compile today. Or were you talking about third party libraries? – VGR – 2020-04-12T02:11:09.640

Some real gems in this answer. One thing to note, which is not specifically mentioned, is that (in the UK at least) many government legacy systems were written by the government themselves - often by IT staff attached permanently to specific departments - not by contractors. The same is true of legacy banking systems - they were written and refined by actual bank staff, not by the latest contractors or freelancers. – Steve – 2020-04-12T11:26:18.420

@VGR I admit, I might be on shakier ground there than for the rest, never having been a production sysadmin. My thinking relates to the occasional "we are stuck on Java version x, because our vendor only certifies that". Is that a hard technical stop? Is it a support issue? Is it caused by having some form of non-source-based library, compiled for an earlier VM? How does it compare to a COBOL site where not all resources are available as source code? Not entirely sure, but it does seem to be somewhat of a problem, more so than for old-generation COBOL apps. – Italian Philosophers 4 Monica – 2020-04-12T22:10:12.640

Excellent answer. The only (admittedly, minor) objection I have to it is: "Even with the best of intentions, we in the IT industry do not yet have a proven, repeatable, low-risk solution to migrating old government IT systems." This is the same in the private sector. I.e., the wife of a friend just retired from forty years in the private sector as a COBOL programmer with a major financial institution that you might well recognize. – Doug R. – 2020-04-14T13:01:33.773

50

It's not just US institutions: there are many countries, institutions - including banks - and companies running on ancient software and hardware.

There are multiple reasons for that:

First, from the business standpoint, replacing software costs money. Why spend money to replace something that already meets requirements? If you have software A, which does what you want, already exists and is "free" since you already own it, and software B, which doesn't exist yet, costs years of development time, probably requires new hardware and has the associated costs for all of that, which would your boss or your (often very much non-technical) boss's boss approve? What would a government employee with a tight budget approve? What would a politician calling for less government spending approve?

Secondly, the exact workings of the system are often lost in time. No one actually knows what the system does exactly, but it interfaces with dozens of other systems through APIs that are equally unknown, obscure or deprecated. Creating a specification for such a system that a contractor or contracted company can work with is difficult, if not impossible. There is a huge risk that the resulting system will not replicate the behavior of the old system and stuff will stop working. It might do things like round .5 up instead of rounding to even like the old system, which could cause all the bills or transactions to suddenly be off by 1 cent compared to the previous system.

In government, a broken system could mean that the unemployed might not get their benefits, the taxpayers not their yearly tax refund, or any number of things that will lead to an elected politician not being reelected. Politicians like being reelected, so even if they understand the technical need, which is rare anyway, they might avoid the risks for political reasons. Often there were earlier attempts at replacing the system, which failed, and those failures get pointed at whenever someone proposes replacing the system.
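The rounding discrepancy mentioned above is easy to reproduce. Python's `decimal` module supports both modes, so a few lines show how "round half up" and "round half to even" (banker's rounding) diverge on exact half-cents; the amounts are made up, and which mode a given legacy system actually used is an assumption of the example.

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

CENT = Decimal("0.01")
amounts = [Decimal("2.125"), Decimal("2.135"), Decimal("2.145")]

# "Round half up": an exact half-cent always rounds away from zero.
half_up = [a.quantize(CENT, rounding=ROUND_HALF_UP) for a in amounts]

# "Round half to even": an exact half-cent rounds to the nearest even cent.
half_even = [a.quantize(CENT, rounding=ROUND_HALF_EVEN) for a in amounts]

# half_up   -> 2.13, 2.14, 2.15
# half_even -> 2.12, 2.14, 2.14
# Over millions of transactions these one-cent differences accumulate.
```

Only inputs landing exactly on a half-cent differ, which is precisely why such a bug can survive testing and only surface once real billing data flows through the new system.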

Thirdly, the old systems usually have a very bad user interface. A new system will have a shiny, new interface, sometimes because recreating the old interface wouldn't even be feasible in a new application. This requires training people. People who have worked in the company or institution for 40-50 years and are very set in their ways. Those people often have a lot of blocking power, which can completely halt development of the new system, because their requirements, i.e. "don't change anything," aren't met. They are usually also very hard to fire and replace, so the institution has to hire new people without firing them, which requires a bigger budget, which comes back to the first point.

In summary, as technical people we might understand that those systems will someday inevitably go "poof" and everything will break down catastrophically. Other people either don't know that it can happen, or they just hope that it will happen after their time, because in the end no one wants to pay the money, take the risk and take the blame for a project that will probably fail.

38I would disagree with the user interface aspect. Older interfaces, while they might not be "intuitive", are generally learnable. Newer, "intuitive" interfaces tend to be unusable for those of us lacking the particular intuition (or the cultural context) that the developers think everyone has. – jamesqf – 2020-04-10T17:15:39.323

4@jamesqf I agree with your general point. New UIs can be very awful for non-programmers (which is why UX designer is an actual job these days), though there are actually old terminal UIs still in use where you had to remember obscure key combinations to get things done, which you could only learn through rote memorization. Those are definitely not intuitive, though they are very fast once you know that e.g. left ctrl + right shift + f12 creates a new entry while right ctrl + right shift + f12 deletes all visible entries. – Morfildur – 2020-04-10T18:11:26.650

37You've got it exactly backwards, at least for me, and I am a programmer :-) The "obscure" key combinations are easy to learn because you can simply look them up, whereas there's no @$#! way to look up those stupid little pictures most UX designers like to use. – jamesqf – 2020-04-11T03:44:00.317

To your first point: one of my former professors liked to say that "software doesn't rot". In other words, it doesn't stop working just because it's old. Software is just a tool. Many of our most important tools were designed centuries ago, and that isn't necessarily a bad thing. – bta – 2020-04-11T05:13:58.653

3I honestly think that (3) undermines your argument -- it is perfectly possible to keep the UI intact, and only change the backend. – Matthieu M. – 2020-04-11T11:44:03.910

2@MatthieuM.: Disagree. Show me a keyboard made in the last thirty-five years which can type quadruple bucky cokebottle. – Eric Towers – 2020-04-11T19:56:06.313

@MatthieuM. That kind of requires things to have been implemented such that that's possible, and even if they are, the same argument about obscure, undocumented APIs still applies. – Austin Hemmelgarn – 2020-04-11T22:57:38.703

4The "obscure" key combinations are easy to learn because you can simply look them up. Only if they were documented, and the documentation still exists. – Yay295 – 2020-04-12T06:54:37.827

@EricTowers: Why do you think you can't program a standard keyboard to make use of 4 modifier keys? See for instance the documentation for xmodmap. And if you want to get really creative, you can have up to 7 - Shift plus left & right Ctrl, Alt, and Super (the ones with funny Windoze symbols), also maybe the Multi_Key. Now most of us aren't dexterous enough to use them all at the same time, but using 2 or 3 is common for entering Unicode characters. – jamesqf – 2020-04-13T05:06:10.487

3@bta Yes, software doesn't rot, it simply stops working on the new hardware or OS. Relying on the perpetual existence of the original runtime environment is overly optimistic. To make things worse, imagine a program with a MAGIC instruction. You can't hope to get it running if no one in the world remembers what the meaning of that was. – Dmitri Urbanowicz – 2020-04-13T12:20:24.557

@jamesqf - actually, the UI that's important in this context is the UI of the development and production system! I could pick up enough COBOL this weekend (based on very old memories) to help NJ out. But there's no way to learn whatever cranky old horrible mainframe OS, development tools, and build methodologies they're still using in less than a large fraction of a year ... – davidbak – 2020-04-13T18:03:14.017

@bta Software that is completely self-contained doesn't rot. Software that has to interact with anything else that might change (talk to external systems, run on newer hardware, run within an OS that receives updates, be compiled by a newer release of the compiler, etc, etc) most definitely does "rot", enough that there is "bit rot" as an informal technical term for it. – Ben – 2020-04-14T04:19:28.340

@davidbak: Well, that's you :-) I still use a descendant of the Xedit editor I first used on an IBM mainframe (because it's the best I've ever found). Makefiles haven't changed all that much. – jamesqf – 2020-04-14T16:25:39.597

@Ben - That's exactly why there (sadly) exists an entire industry that builds UNIX-based servers that maintain backwards compatibility at all costs. The vendor handles all of the things that change, and the software just keeps on trucking. It definitely reduces the incentive to change the software. – bta – 2020-04-14T23:17:33.510

12

I would like to address the underlying misconception that was expressed in the note:

I am not thinking about big leaps, but at least to work with languages and frameworks that are at most 20 years old, not 60 years old.

Yes, COBOL is a pretty old language and new systems are likely not made with it anymore.
However, FORTRAN is even older (63 years) and still used in science and engineering. 20 years, on the other hand, is young for a programming language. I can think of very few languages that are "at most 20 years old" and already well known and used. Rust, Go and Kotlin are about ten years old. Swift is 5. C# is 20. Most of the other languages are more mature.

Web servers mostly use PHP (26 years old), Java (25), Python (30), also C#. Web frontend uses JavaScript (25). HTML is not a programming language, but it's about 30. Operating systems are written in C (48) and C++ (35). And assembly language obviously precedes even FORTRAN; it's close to 70 and still used.

Of course, these languages have evolved since their initial appearance, but so (slightly) has COBOL, with the latest additions in 2014. My aim is to drive home that age does not make a language bad and that most software is built on languages older than 20 years.

You are right about the maturity of the programming languages, but software development is much more than that. PHP and Perl were voted the most hated programming languages / frameworks for several years on the Stack Overflow survey and they are very mature. I think it is not the languages themselves, but the mindset of how development was done with them: automatic testing, code quality, documentation etc. Newer frameworks also promote good practices. – Alexei – 2020-04-11T05:17:17.797

If html isn't a programming language how is java script? – Neil Meyer – 2020-04-11T17:15:26.967

4Php is no worse than any other scripting language, people's views of it are tainted by how much bad code is written for it and how soul-destroying it is to fix said bad code. – Neil Meyer – 2020-04-11T17:17:35.980

5@NeilMeyer, I firmly disagree about PHP, and there's a classic essay that makes the point well. On the other hand, a well-designed language can indeed be timeless; I make my living at a Clojure shop, and am thrilled whenever I get a chance to interact with that part of our codebase; LISP's dated to what, the 1940s? – Charles Duffy – 2020-04-11T18:01:54.557

4@NeilMeyer JavaScript is quite literally a programming language according to whatever definition of "programming language" there is. HTML is a markup language; calling it a programming language would be a stretch. https://stackoverflow.com/questions/14512218/is-html5-a-programming-language https://stackoverflow.com/questions/145176/is-html-considered-a-programming-language – Džuris – 2020-04-11T18:29:45.573

1@CharlesDuffy that essay is 8 years old and PHP is quite different by now. And the community is also different. I would say that for web it is better than Ruby, Python, Java and C#. The ecosystem is great. – Džuris – 2020-04-11T18:41:32.093

1Another essay, this one more timeless -- are you familiar with the Blub Paradox? When working with a language that doesn't have abstraction capabilities, one doesn't know what one doesn't have. (Took me a long, long time to get used to Clojure for that reason -- I'd been in the Python/OOP/procedural world for a long time; it was years to wrap my brain around functional programming, but those years were well worth it). – Charles Duffy – 2020-04-11T18:56:16.933

1@CharlesDuffy Oh, I love that one. And I can relate: when I was doing science it was 15 declarative lines in Wolfram language to show that an algorithm works and then 15 hundred lines in C to make it work slightly faster. I am not comparing PHP to functional languages; my angle is that PHP has greatly improved over the past decade and has overtaken the other OOP/procedural languages (for web programming). – Džuris – 2020-04-11T23:05:12.763

1@Džuris: My history is fortran, c, c++, perl, Java, python, and now PHP. Modern PHP has language constructs that I'm used to, so I'm happy. But the documentation (php.net) of "standard" functions and classes is worse than atrocious, almost comically bad. Java was best, the Javadocs usually contained something close to a complete specification of behavior. python is not quite that good. PHP is terrible. One has to wade through the user comments to find the information needed, but of course those comments are not authoritative, they're mostly the result of experimentation. – President James K. Polk – 2020-04-13T13:04:33.073

Regardless of its merits or lack thereof, PHP is not intended to build major backend business/government IT, like payroll, accounting or tax collection. Its primary design goal is web development. – Italian Philosophers 4 Monica – 2020-04-13T18:31:30.677

9

I think that the premise of the question is not correct: there are often technical reasons to keep using legacy systems. To contribute to ideas that have already been mentioned, I want to mention two facts:

• COBOL is not a stagnant language from 60 years ago. Yes, it is old and lacks many features of modern languages. But its standard was last updated in 2014.
• Rewriting big systems is no minor decision. Several cases have been mentioned above, and I wanted to add that of Netscape: in 1997, when they were the clear leaders in the browser wars, they decided to rewrite their code from scratch. Three years later, when their code was finally more or less ready, their market lead was gone. This article from 20 years ago, by Joel Spolsky, mentions this and other examples.

7

People are quite insensitive to the gradual degradation of something. This is not limited to IT infrastructure. Roads, bridges, dams, power grids, internet RFCs, international agreements, etc. are in general not redesigned and rebuilt until they fail spectacularly. Prior to failing, they are seen as "not broken, won't fix". Almost no one gets praised, re-elected or promoted for averting a disaster that didn't happen.
...which is why all the infrastructure built before concrete was known to have a ~75-year lifespan is today an enormous liability. – Charles Duffy – 2020-04-11T18:03:26.517

@CharlesDuffy the concrete is pretty much forever, given protection and maintenance. The bad thing is that even when not maintained, it outlives the average political career. Steel is much more foolproof - rust is pretty much visible and the general public is used to rusty things breaking. – fraxinus – 2020-04-11T18:55:59.223

Can you point me at a reference? Google finds http://aspirebridge.com/magazine/2009Fall/perspective_freyermuth_fall09.pdf, which points at a 150-year service life for modern (80s and 90s) construction; but that's long after the time period I was referring to earlier, wherein folks didn't realize that rebar embedded in reinforced concrete would rust. – Charles Duffy – 2020-04-11T19:00:54.173

Not sure. I am a chemist and I know the major degrading processes in concrete. They are all related to water. You keep the water out and the concrete lives on. Same for steel-reinforced concrete. Well, pre-tensioned concrete elements (common in bridges) are another matter that I am not competent in. – fraxinus – 2020-04-11T19:16:12.713

1The difference with bridges is that, in public IT systems, usually software never breaks down. What happens is that politicians want to change systems of administration, usually to introduce more market bureaucracy, and for ideological reasons they are often unwilling to accept that this increases complexity and cost. – Steve – 2020-04-12T11:52:13.240

There are over 600,000 bridges in the US, approximately 0.01% of which have collapsed in the past 70 years. I find your claim that most critical infrastructure isn't repaired or replaced until it fails catastrophically rather dubious. Those are certainly the most memorable instances, but that's because "regular maintenance performed on local bridge" isn't a particularly sensational headline. – Nuclear Hoagie – 2020-04-13T15:27:15.990

Exaggeration still is not a false claim. Then again, a "failed infrastructure" is not only a collapsed bridge. How about a bridge that is perfectly healthy from the construction standpoint, but always congested? You don't know of any? You lucky bastard! – fraxinus – 2020-04-13T15:39:21.317

@fraxinus: Depends on who made the concrete. My concrete driveway, poured no earlier than the 1960s, is crumbling to bits, while Europe is full of ~2000 year old concrete structures, some of which look like they were finished a few weeks ago. – jamesqf – 2020-04-15T04:21:11.397

@jamesqf I live in Europe. Europe is also full of concrete structures crumbling to bits, made in the last century. Those that are ~2000 years old are the result of a rather expensive restoration (and a few of them, of ~2000 years of continuous maintenance, yes, it is possible). – fraxinus – 2020-04-15T07:34:10.880

@fraxinus: I don't think so. Some might have been restored, others show no sign of it. For instance the foundations of the forum in Lausanne http://www.archeoplus.ch/en/archaeo/lausanne/lausanne-en-vidy-forum-1.htm or the remains of Aventicum https://en.wikipedia.org/wiki/Aventicum – jamesqf – 2020-04-15T17:44:40.643

4

Information systems used by the state, and IT in the public sector in general, are a very tricky subject and always have been. From the strategic R&D perspective, you need to formulate requirements on what you want to achieve by having a given IT infrastructure. At the scale found in the public sector, that is a very complicated thing to do.

First point: you have an existing IT system and infrastructure.
If you want to change some part of the legacy system, it may be extremely hard to do because of its complexity, unknown documentation and requirements, and because the people who worked on and managed that area of the IT system are not there anymore. Even when you still have the people who made those changes, they won't remember exactly what the requirement and need for their implementation was in the first place, because it was a long time ago. Another thing you can do is replace the entire IT system, which has its own problems:

Second point: you have political cycles, which means some administration is going to formulate the need for IT innovation. When we are talking about a whole area like defence, social security, taxation or healthcare, those things take a long time and a lot of money to implement. A lot of times, however, the project spans administrations or the people in charge of it. After a new administration or strategic unit emerges, for example as a result of an election or even a simple change in the management of the given institution, the new unit doesn't want to continue the innovation project; they cut funding and stay at the same level, or formulate different requirements. This is usually a more dominant problem in the public sector than in the private one.

Third point: you can very often find old methodologies used for developing the existing systems. They are very inflexible and usually not very effective, often going back to the beginning of the 2000s, when the level of IT project management was, quite frankly, usually catastrophic. So when the strategic unit wants to hire a new company which uses modern software development methodologies, that company doesn't want to go into changing these old legacy systems, because in general it won't work and won't be worth their time and effort, so they will want to start from scratch, which usually takes time and a lot of money, which brings us back to the problem of political cycles and strategic endeavour discussed in the previous point.
8

"usually not very effective, often going back to beginning of the 2000s"??? Are you saying that people who went to university in the 1990s and started to develop systems in the late 1990s did it in an ineffective way? THAT is ageism! Hint: a lot of advanced software projects went very well in the 1960s, 70s, 80s and 90s! Yes, I know about Scrum, for example, but IT systems in state institutions implement policy, regulations and judicial law (for example taxation). – Stefan Skoglund – 2020-04-10T13:15:08.120

4

@StefanSkoglund Indeed. Every new development fad always seems to be justification to support bean-counters' desires to "do more with less". Thirty years of experience has taught me that all the "process" in the world can't cover up for crappy programmers working to low standards on oversubscribed equipment, and whatever process you choose isn't very relevant for good programmers held to high standards with a properly-capitalized environment. Today's "smart" people could never figure out how the ancient Egyptians were able to get the pyramids almost perfectly square... – Just Me – 2020-04-10T15:48:52.063

2

@StefanSkoglund "a lot of advanced software projects went very well in the 1960s, 70s, 80s and 90s" Maybe. Beware of survivor bias. – Italian Philosophers 4 Monica – 2020-04-10T19:43:53.823

Well, pardon me, I'm not saying that every IT project in the 90s was unsuccessful. But as somebody who worked in the public-sector IT domain, I've seen a lot of pain in legacy IT systems, especially in the US, where a company that used rapid Business Process Reengineering managed to save up to $2 million per week after developing new, modern IT systems. It's, for example, the case of the Department of Defense in the US. Globally you can find this pattern over and over, so sorry, but from the practical point of view I stand behind my answer. – Patrick – 2020-04-11T10:09:58.957

@Patrick The US Department of Defense saved $2 million/week? How much did they have to spend to "save" that much? There are lies, damn lies, and statistics - and then there is government "accounting". And even if there really is that much savings, the "process" didn't do it, the people who did the work did it. A cookbook doesn't feed you. A good cook can make the shiite in a Martha Stewart cookbook edible, a bad cook can take a Julia Child recipe and make something a dog won't eat. – Just Me – 2020-04-11T15:20:35.570

Some people who specialise only in the technical side of things and not in business have your kind of attitude. You need to look at what you have at the moment, what you want, and how much it is going to cost. If you don't have good requirements, an IT strategy and governance in a public institution, which you very often don't, you are going to have huge inefficiencies and you are going to be burning cash wildly. – Patrick – 2020-04-11T16:53:25.920

1

@JustMe And if you want to talk about statistics: there were numerous studies done in the 90s and at the beginning of the 2000s, for example by McKinsey and other consulting companies, in which 60% of IT projects used more money or time than planned, and around 20% ended up so badly that they financially endangered the customer, because they either completely failed or failed to fulfil user requirements in any meaningful way. That was one of the core forces behind the Agile Manifesto, which led to the level of IT project development we have today. – Patrick – 2020-04-11T16:55:38.403

1

@Patrick Hmmm, agile had its genesis in the CCC project; guess how well that went? You think Agile is going to make a big government IT project an automatic success? I've got a bridge to sell you in Florida. And, sorry, saving $2 million a week, every so often, will hardly put a dent in $1B+ project failures, which I would bet are more frequent.

– Italian Philosophers 4 Monica – 2020-04-11T18:08:30.153

@Patrick Good cooks don't need cookbooks as crutches. It's the "business side" that decides they can read something from Gartner and learn how "The New Whoop-Diddee-Doo Process can compellingly overwhelm adaptive infomediaries". Ask Boeing how well that works.

– Just Me – 2020-04-12T01:01:12.253

Agile does not address the fundamental reason why projects go over budget or schedule - the inability to accurately assess the time and effort needed to complete a future task because complete information to create an accurate assessment simply does not exist. And in a complex project with many tasks, random variations in individual tasks will lengthen the time it takes to complete the entire project and cause cost overruns. To finish early and under budget, everything has to go right. To finish late and over budget, only one thing needs to go wrong. PowerPoint slides can't make that go away. – Just Me – 2020-04-12T01:09:09.310
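The comment's point about random task variation can be sketched with a quick Monte Carlo simulation. Every number here is hypothetical (20 tasks, 10-day estimates, lognormal noise); the point is only that when overruns are unbounded and underruns are capped, the sum of many noisy tasks rarely comes in at or under the sum of the point estimates.

```python
import random

random.seed(42)

N_TASKS = 20        # tasks in the project (hypothetical)
ESTIMATE = 10.0     # estimated days per task (hypothetical)
TRIALS = 10_000     # simulated projects

def task_duration():
    # Right-skewed noise: a task can slip badly but can only finish
    # slightly early. Lognormal with median equal to the estimate.
    return ESTIMATE * random.lognormvariate(0.0, 0.5)

on_time = 0
for _ in range(TRIALS):
    total = sum(task_duration() for _ in range(N_TASKS))
    if total <= N_TASKS * ESTIMATE:  # plan = sum of the point estimates
        on_time += 1

print(f"simulated projects finishing on schedule: {on_time / TRIALS:.1%}")
```

Even though each individual task hits its estimate half the time (the estimate is the median), the project as a whole comes in on schedule far less often, because the skew accumulates across tasks.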

@JustMe, the problem is also that the standards have been set higher. Up until the mid-90s, computerisation was worthwhile at any cost and on any schedule, because as long as it worked in the end then it replaced armies of clerical staff, so it didn't matter if it took twice as long. Nowadays, there is often no business case for replacement of existing computerised systems. – Steve – 2020-04-12T12:13:47.880

@JustMe Look, I wasn't trying to be absolute, the world is stochastic and always will be. I was just trying to answer a generalised question and I gave my personal opinion and expertise as an answer. I don't want to say all older IT projects were bad or anything, it was just my experience from working within the public sector IT sphere and also having some knowledge about IT systems design & development. – Patrick – 2020-04-12T12:17:00.393

@Steve Well, you have for example Robotic Process Automation, which from the commercialization side is quite a new concept, and it will very probably make some waves in administrative employment, in the public sector and other areas. – Patrick – 2020-04-12T12:18:20.357

@Patrick, the only distinguishing feature I can see about so-called RPA, is that the RPA program interacts with other programs much as the user would (through UI operations), rather than through other means. It doesn't seem like a new concept - screen scraping has been around a long time - it just sounds like another wave of marketing hype. – Steve – 2020-04-12T12:39:16.687

@Steve At its core it may not be a completely new system, but given that the RPA market had very marginal value 10 years ago and is already worth around $100 billion today, and that it is used by multinational companies, some of which have managed to cut up to 100 thousand administrative working hours per year, I would say it has momentum and will probably keep growing as RPA capabilities are enhanced by AI, machine learning... A huge thing is legislation, because some administrative processes currently cannot be automated due to legislation... – Patrick – 2020-04-12T21:11:48.190

1

@Patrick, it's not just "not completely new", it's not new at all. The vagaries of market demand for such techniques say nothing about their novelty: interest in space travel has been surging of late, but it's old hat technologically. As for AI, that's the name the marketing department gives to a technique every time they find an application in an area which was previously thought to be recalcitrant to mechanisation; they were saying the same things when they first automated payroll calculations with computers in the 1960s. – Steve – 2020-04-12T21:51:41.843

@Patrick, also with regard to legislation, the usual requirement is that bureaucracies are accountable, and in the final resort that a judge is satisfied as to the propriety of their behaviour. The problem with many so-called AI pattern recognition techniques is that they are effective in some cases at finding patterns, but ineffective in articulating the conceptual structure to those patterns, leaving no way for anyone to assess the legitimacy and reliability of the process. – Steve – 2020-04-12T22:05:56.813

@Steve What I meant with legislation was, for example, accounting RPA software that calculates the salaries to be paid automatically. In most countries there is a legislative problem: in case of error, the software cannot be held responsible. In countries like the Netherlands there are already laws dealing with this kind of problem, and you can have fully automated RPA systems without any human input. – Patrick – 2020-04-13T09:36:40.163

@Patrick, I'm not aware of such legal restrictions myself. Payrolls in the UK have been computerised for decades, even in the era when it was a criminal offence to withhold wages. I'd be interested to understand what the position is in other countries. Also, with "software being responsible", there is always ultimately human liability for choosing to apply incorrect software to a particular problem. – Steve – 2020-04-13T12:32:06.593

4

COBOL is a pain to write. But it comes from a time when storage and processing were slow and expensive. Brutally simple, ruthlessly optimized. Not a bad choice for systems where speed and transparency matter.

Banks suffer few of the government contract issues mentioned, yet they mostly still use COBOL for things like ATM transaction processing.

FORTRAN still has its niche too. For complex math problems, everything else is either 1) even harder to use (C) or 2) slower (everything else).
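One concrete reason financial shops stay with COBOL is its native fixed-point decimal arithmetic (the PIC 9(7)V99 style of declaration): money adds up exactly, digit by digit. A minimal sketch of the difference, using Python's standard decimal module to stand in for COBOL-style decimals:

```python
from decimal import Decimal

# Add ten cents a thousand times, once in binary floating point
# (the default in most modern languages) and once in decimal.
float_total = sum(0.10 for _ in range(1000))
exact_total = sum(Decimal("0.10") for _ in range(1000))

print(float_total)  # close to, but not exactly, 100.0
print(exact_total)  # exactly 100.00
```

The binary float total is off by a tiny rounding error; in a ledger reconciling millions of transactions, "tiny" is not acceptable, which is one reason the old decimal-arithmetic code keeps running.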

2

The only reason this is a political question at all is because the services are important and such a change is expensive.

Unfortunately, the political process is pretty much exactly the wrong process for deciding how and when to do such a migration. There isn't a constituency for it, and there's no immediate benefit. Also, because of the above, the projects that do get undertaken have a tendency to fail, which raises yet another barrier to doing so.

2

It's not just government institutions; any private company that is old enough will have the same issues, e.g. Ticketmaster: https://www.cio.com/article/3448036/ticketmaster-tackles-tech-debt-with-streaming-data-platform.html

"We had tech debt that was older than most of the companies I'd worked at," he says.

Ticketmaster had achieved early data-science successes through custom data integrations with its various IT systems — and there were plenty of those. After 40 years of acquisitions and internal software development, the company had around 300 IT systems, each on its own island of data.

I happened to meet a guy at a tech Meetup who worked at Ticketmaster, and he said they don't even dare breathe on the old code at the core of their ticket allocation algorithm, because it's so complicated that no one is sure they understand all the intricacies and edge cases, so they just emulate the hardware it was designed to run on.

One of my favorite parts of Vernor Vinge's awesome sci-fi novel "A Deepness in the Sky" is that the most in-demand, yet demanding, profession is "Programmer Archaeologist", i.e. someone who can dig through five thousand years' worth of software dependencies and have a smidgen of a clue of what it's actually doing:

There were programs here that had been written five thousand years ago, before Humankind ever left Earth. The wonder of it—the horror of it, Sura said—was that unlike the useless wrecks of Canberra’s past, these programs still worked! And via a million million circuitous threads of inheritance, many of the oldest programs still ran in the bowels of the Qeng Ho system. Take the Traders’ method of timekeeping. The frame corrections were incredibly complex—and down at the very bottom of it was a little program that ran a counter. Second by second, the Qeng Ho counted from the instant that a human had first set foot on Old Earth’s moon. But if you looked at it still more closely. . .the starting instant was actually some hundred million seconds later, the 0-second of one of Humankind’s first computer operating systems.

(i.e. they still run on Unix time; presumably the Year 2038 problem has been solved)
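The Unix-time aside is easy to verify: the epoch is 1970-01-01T00:00:00 UTC, and a signed 32-bit seconds counter runs out in 2038. A quick check with Python's standard library:

```python
from datetime import datetime, timezone

MAX_INT32 = 2**31 - 1  # largest value of a signed 32-bit counter
rollover = datetime.fromtimestamp(MAX_INT32, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00, the "Year 2038" moment
```

One second later, a signed 32-bit time_t wraps around to 1901, which is the whole problem.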

2

The political aspect of the "why": I guess there is nothing to gain, politically.

At first, we assume we have a system, which has been set up to do its job, and works without major errors. Time passes, and the system continues to work, yet is increasingly becoming outdated.

Now, x years after the system has been set up, enter a new official keen to update outdated systems. This can go either way: the migration to the new system works, or it doesn't. If the migration does not work, massive negative PR ensues. If the migration works, the new system works just as the old system did. In the best case there are no side effects; in the worst case, the official gets reprimanded for wasting money on updating a system that still did its job.

Politicians usually don't get praise when nothing is happening. "Social security didn't break down yesterday. Good job!"

1

Budgets, if you want the simplest answer. While less efficient on a longer timescale, maintaining old code is cheaper on a year-to-year basis, and legislative budgets, authorizations, funding and appropriations operate on a yearly basis. When a project spans more than a single government fiscal year and money appropriated for one budget year isn't spent by the end of that year (what project, government or private, doesn't run behind schedule?), those funds are often lost unless specially "encumbered" or authorized to carry over into the next fiscal year. And with every two-year election cycle there is the danger that control of the purse strings passes to someone who does not view that kind of spending as necessary, or as a priority.

It's very difficult to get politicians who are concerned with their short-term popularity to approve something that will show a fiscal benefit only by the sixth, seventh or eighth year after implementation, with huge up-front price tags, inevitable cost overruns (which will be paid, because once you sink X dollars/euros in, you're committed to seeing it through) and glitches when these massive, complex systems are brought online.

There is no competition or economy of scale to bring to bear, because these systems are mostly custom-built to perform very singular functions. As long as they can keep something functioning with baling wire and spit, they will do so until they absolutely can't.