Eye on Policy

Tom Temin

“Eye on Policy” is a monthly article by Tom Temin, who offers his expert insights on the latest government IT developments, trends, and challenges to the DGI audience. Tom is the former host of “The Federal Drive” on Federal News Network, and a respected journalist covering federal technology and policy. With his deep understanding of federal operations and technology, his analysis will be an invaluable resource for professionals navigating the evolving landscape.

AI Software Economics

The next debate in artificial intelligence (AI) centers on whether so-called artificial general intelligence has arrived or will arrive. Some of the mega AI companies are already claiming they have it.

In the more practical domains, AI has already caused change. No longer do organizations dutifully promise, while already knowing better, that AI will not replace people, only enhance what they do. That fig leaf has fallen away: AI is, in fact, replacing people.

At one major contractor whose products operate in every agency office, software development employees are undergoing a thorough weeding out. Some groups that had twenty-five people now have ten, maybe fewer. The explicit instructions to remaining supervisors: use generative AI to do the coding.

The resulting code still needs functional, quality, and security testing, but the implications are clear. AI is changing not only the procedural norms but also the economics of software development.

If it has not already, this shift will change the normative expectations about what is a reasonable cost basis for federal projects—or any projects for that matter. These refreshed expectations will flow down through primes to subcontractors.

Do not take my word for it. One of the generative AI platforms returned this in response to my carefully crafted prompt: “The rise of AI necessitates a re-evaluation of traditional software economics, including investment strategies and risk management.”

Translation: Buyers, including federal buyers, will expect step-function reductions in the bids they receive for development work.

What is old is old, again

In a characteristically dry assessment of 11 large agency systems, the Government Accountability Office (GAO) commented that agencies had fully documented plans for modernizing only eight of them.

Oddly, the GAO cited “sensitivity” in referring to the systems, but not their agency owners, by numbers only. I read into this that the systems carry high risk of serious data loss or economic damage should they be hacked.

Without reiterating the whole report, it is fair to say the modernization drive never ends. Even after years of experience with low-code, no-code strategies, refactoring for cloud and modular DevSecOps, industry still has a large business opportunity in modernization.

GAO urges agencies to take the basic step of creating detailed modernization plans. Given the many chief information officers and other senior staff that have left agencies, the Trump administration now owns the legacy system question.

How will it approach this challenge? Veterans Affairs provides a clue. IT budget cuts at VA, coupled with a doubling down on the troubled electronic health records project, point to an “all the wood behind a single arrowhead” approach.

But do not expect freely flowing billions. Between the above-mentioned AI emergence and the administration’s overt squeeze on some of the largest contractors, expect instead a strategy of how little—and not how much—it will take to get exponential improvement in the old systems.

Here comes a streamlined FedRAMP

For some dozen years, FedRAMP has vexed the government and cloud service providers alike. FedRAMP, which stands for Federal Risk and Authorization Management Program, has always had a simple concept: the government, via the FedRAMP program office at the General Services Administration (GSA), assures that a cloud service or set of services is secure.

Once authorized, any agency can confidently use that service or those services. In practice, FedRAMP has been slow and expensive. The FedRAMP 20X initiative has, in effect, rebuilt FedRAMP in place. Explaining it in detail at DGI’s 13th annual 930gov Conference, FedRAMP Director Peter Waterman described an authorization process a year shorter than before for “low” authorization, which covers the bulk of companies. The new way—for details check GSA’s site—has authorized four companies, Waterman said, and GSA expects a fast ramp, no pun intended, to thousands of other CSPs.

He described a future state of “cryptographically validated, computer-based third-party assessments,” and an environment in which the hyperscalers will offer other vendors tools to get them to a FedRAMP-worthy state.

Translation: If FedRAMP was a checkbox compliance program before, it is about to become a real cybersecurity program. With validated tools and frameworks for vendors who themselves use cloud-hosted systems instead of their own data centers, the program will push out what Waterman called “bespoke systems.”

Companies trying to hand-craft secure systems will be left behind, Waterman said, “and that’s by design.”