There are plenty of great arguments against all of the points I make here, and if you'd like to make them I'd be happy to hear them.
One of the most interesting things about programming today is that it is almost entirely unregulated. The primary reasons for this, in my opinion, are:
- What we use to get the job done changes drastically all the time, and there is very little consensus on the best way to do anything.
- The vast majority of software is simply unimportant, or "low impact" enough that it doesn't really matter if it breaks or has a defect.
Both points are changing rather drastically these days, and the future those changes point to is fascinating. Point #1 suggests, first of all, that we are still basically cave people when it comes to building software. I don't say this to belittle the genius and accomplishments of my peers and those who came before us, only to point out just how far we have left to go. If we had found some mainline of software effectiveness, some undeniably better toolchain, we wouldn't have the massive number of tools we have today.
Compare that to house building: the vast majority of houses are built out of the same set of materials, using the same practices. Ultimately, once your tools get good enough, you start hitting tangible constraints to improvement, constraints that require "quantum leaps" (maybe a bit dramatic) instead of gradual improvement over time. Until we could mass-produce steel beams, a skyscraper was never going to happen, no matter how smart the architect or how effective the construction team was.
At the moment, it seems that we're experiencing a sort of Cambrian explosion of technology and, carrying the analogy forward, I'd expect that at some point a "dominant species" (or a few of them) will emerge. In all reality, I don't expect that to be a programming language. I suspect it will be a platform of some kind — a black swan event of no-code platforms, if you will. After that point, our industry will look nothing like it does today.
As for point #2, we're seeing more and more just how much a simple application can pry into our private lives, manipulate our lizard brains, and pose a very real threat to our lives and safety. In a massively networked world, a single bad program could pluck one load-bearing beam out of the massive house of cards our digital structures are built on, causing cascading failures with the real potential to result in massive loss of life. Right now, regulation in any serious capacity would utterly hamstring the software development process. There are places where we do regulate software, like the software that runs in airplanes or nuclear power plants. But in most cases, we're just doing the best we can to build reliable systems on top of unreliable communication methods, unreliable hardware, and unreliable humans. That isn't a slight on people, or on the people who make hardware; it is simply a fact that things can go wrong at any point, with no explanation.
Things will have to change when all of our systems are so massively spiderwebbed together that failures cascade in unpredictable and potentially catastrophic ways, and when each individual piece of software has the capability to practically ruin people's lives. Consider how many applications you've typed your bank card number into, or your address, or which have access to your location or even your health data. That information is already being leaked regularly, which shows just how little care actually goes into securing it. But what happens when some pizza delivery app somewhere messes up and sends bad information to its driverless car fleet? More and more people have access to the raw, unadulterated power of digital automation, and that power is only growing over time. The black swan event here happens when the leverage any given piece of software has makes the risk outweigh the reward.
Right now, we might say that it is unreasonable to regulate the software industry overall. But in the future, I suspect we'll be saying that we can't afford not to.