Can Programs Write Other Programs


We always hear about this "doomsday" scenario in the near future. Where programs will learn to write other programs. Where for loops will eat while loops. And where, if you're a fellow programmer, you will be out of a job and begging for scraps in some futuristic unemployment line where Asimos will tell you to get back in line. So today, I thought I'd explore this scenario, whether it's possible and, more importantly, whether we're anywhere near it happening. Stay tuned for some good news. And maybe some bad news at the end. This will focus more on a practical perspective, as someone who's worked in the software industry for over a decade. It is by no means intended to be a thesis on the future of Artificial Intelligence, machine learning or decision trees. But more like: do I personally see this phenomenon sneaking its way into my own work life, and into yours.


Can programs write other programs

Well, technically yes. And they have been doing so for decades; we just don't think of it as such. A program can do whatever you program it to do. It can write other programs, it can run other programs, and it can delete other programs, hopefully only at someone's instruction. This idea isn't anything new for the most part. Automatic programming, as it is known, is a concept that has been around since the 1940s, and it was mainly used to describe the higher-level programming languages that replaced the punch-card systems and mainframes of a time long gone. They are, in essence, the programming languages that we use today. Credit is often given to Alick Glennie as the father of what we now call the compiler. Thanks, Alick!

Where we see it now

  • Compilers
  • ORM generating tools
  • IDEs generating HTML/CSS/JavaScript
  • Code transcribers

Technically, while not very useful (it's easier to just write the code than to run this thing each time), the following will create code given my instructions. And my instructions were the actual code.


// Our program's output is itself a program: a loop that prints 0 through 9.
const fs = require("fs");

const thecodes = "for (let i = 0; i < 10; i++) { console.log(i); }";
fs.writeFileSync("generated.js", thecodes); // write the new program to disk

But the concept is what I wanted to show. And that concept is there, and it's realistic, to some extent.

Right now we declare variables such as integers with a single line of code. That gets compiled, however, and the result is something more machine readable. Essentially, the compiler is writing another program for the computer, in a similar fashion to what's above, except many times over in complexity. Memory is being allocated and bits are being shifted around, all without our knowledge. So the next question becomes: how far are we from yet another level of abstraction coming along, at a much higher point, and making even the job of declaring variables trivial? And the answer isn't as simple as some news articles make it out to be.
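To make that abstraction idea concrete, here's a toy "compiler" of my own invention. The mini-language it accepts is made up purely for illustration; it lowers a declaration one abstraction level, the same way a real compiler lowers our one-liners into machine instructions.

```javascript
// A toy "compiler" for a made-up one-line language (purely illustrative).
// It lowers "int counter = 10" into JavaScript, one abstraction level down.
function compileDeclaration(source) {
  const match = source.match(/^int (\w+) = (\d+)$/);
  if (!match) throw new Error("Syntax error: " + source);
  const [, name, value] = match;
  return "let " + name + " = " + value + ";";
}

console.log(compileDeclaration("int counter = 10")); // let counter = 10;
```

Scale the pattern-matching up a few thousand times and you have a real compiler; the principle doesn't change, only the complexity.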

It's more expensive than having a programmer around

Before I talk about the logistics of it, and the current research being done into it, let's talk about something more concrete. In order to have a program be fully capable, and fully "allowed" I may add, to write our programs, we need to tackle the resource barrier. And the truth is, if your current work machine is having a hard time with 12 browser tabs, it's probably not yet ready to handle the pseudo-AI task of building the next Windows OS.

And of course the biggest barrier of all, the one that stops most advancement nowadays, including getting back from Mars, is power consumption. Needless to say, the more operations we're performing, the higher the required wattage. Maybe one day the operations will power themselves in some harmonious event. But for now, we need more juice, and we need more storage.


It's also still a very "scientific" scene I'm afraid, much like programming was back in the day. PhDs getting together over strong coffee to discuss decision trees and memory allocation and such. Which is a good thing, because it means that someone is still keeping the torch alive. But it also means we've got some time before our jobs are in danger.

We're not going to see it commercially

And I'll explain why. Computers normally deal in the realm of A to B, sometimes C, and if C, then D but only after noon. Very specific instructions, essentially. Humans, as we all know, offer no such instructions. We deal in C to A, then R, but then not R and we go S instead. And that's probably one of the hardest parts about programming. Not the coding itself, but the human element. The manager who changes his mind hourly. The client who makes bizarre requests that, by all machine logic, no one should follow. But we do, because that's our job. And that's hard to quantify programmatically.

Take a typical client request: make that modal show up a few seconds later. I don't even want to fathom how much energy would be required for a computer to figure that one out. For one, it doesn't know the context. It would need to be told explicitly which modal exactly. Secondly, it doesn't know what a 'few seconds' means. But maybe it's smart enough to guess. Maybe it'll do 5 seconds, and then adjust based on traffic patterns on the site.
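That guess-and-adjust idea can be sketched in a few lines. Fair warning: the starting delay, the thresholds and the "closed too quickly" feedback signal below are all pure invention on my part, just to show the shape of such a heuristic.

```javascript
// A naive self-adjusting modal delay (all numbers and thresholds invented).
// closedTooQuicklyRate: fraction of visitors who dismissed the modal right away.
function adjustDelay(currentDelayMs, closedTooQuicklyRate) {
  // Most people dismiss it immediately? Wait longer before showing it.
  if (closedTooQuicklyRate > 0.5) return currentDelayMs + 1000;
  // Almost nobody dismisses it? We can probably show it sooner.
  if (closedTooQuicklyRate < 0.1) return Math.max(1000, currentDelayMs - 1000);
  return currentDelayMs; // otherwise, leave it alone
}

console.log(adjustDelay(5000, 0.8)); // 6000
console.log(adjustDelay(5000, 0.05)); // 4000
```

Trivial for us to write, because we already understood the request. Getting a machine to arrive at that heuristic from "a few seconds" is the expensive part.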

Why it's not so bad

I'm gonna finish this post with a few words on why we need to have programs write our programs. And I say this because I write programs to build me programs sometimes. Nowhere near the level of anything sentient or scientific, but as a programmer would: as someone who sees patterns all the time, and decides to write a script to handle the dirty work for him. Things I find myself doing all the time include creating a CRUD or processing data feeds. So I see the benefits immediately. What used to take a tedious number of hours can now take seconds with a carefully crafted "recipe" as input.
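To show what I mean by a "recipe", here's a minimal sketch. The entity name, fields and stub bodies below are hypothetical; a real generator would emit actual data-access code, but the shape is the same.

```javascript
// Stamp out CRUD function stubs from a tiny "recipe" object (names made up).
function generateCrud(recipe) {
  const { entity, fields } = recipe;
  const params = fields.join(", ");
  return [
    "function create" + entity + "(" + params + ") { /* INSERT */ }",
    "function get" + entity + "(id) { /* SELECT */ }",
    "function update" + entity + "(id, " + params + ") { /* UPDATE */ }",
    "function delete" + entity + "(id) { /* DELETE */ }",
  ].join("\n");
}

const crudCode = generateCrud({ entity: "User", fields: ["name", "email"] });
console.log(crudCode);
```

One recipe in, four function stubs out. Multiply that by every entity in a system and the tedious hours saved add up quickly.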

And assuming we get to a point in AI, error checking, validation, etc. at which we begin to trust our robot brethren, here are a few points as to why it wouldn't be so bad.

Reduction in errors

Programmers make mistakes. Like, a lot of mistakes. Sometimes tiny, and sometimes they accidentally leak out millions of records. Other times they flood people with radiation "accidentally" (I hope). And this will continue to happen the more we saturate the market with "programmers" who are ill-advised as to what programming really is. FYI, it is not telling a marker to move up twice and left once. At least, it's not supposed to be.

Time is a commodity

Programming can be time intensive, for sure. Much of that time goes into implementation. Writing pseudocode or a text-based specification can take a senior-level programmer minutes to a few hours. But taking those specs and writing code can take days to weeks. And while for most day-to-day jobs that doesn't really have an impact on anything, in a more scientific scenario, for example preparing for a Mars mission, this could be the difference between our generation breaking the planet-habitation barrier, or our kids or grandkids doing so. And I, for one, want to beat my future kids to Mars.

We need to evolve again

Just as I mentioned earlier that higher-level languages needed to be created in order to remove the tedium that was assembly language, punch-card systems and the "Mark 1" and "Mark 2" computers (which have very cool names), we too are now in need of that next level. For the most part, all we have gotten in the past decade as far as programming paradigm shifts go is 3rd-party support. Which is the easy way out. We don't program, because we "use" someone else's program instead. It's not quite the same as taking hundreds or thousands of basic operations and transcribing them into 1 or 2 lines.

Just as 60 years ago it was tedious to deal with punch cards and 2-ton machines to gobble them up, it is now somewhat tedious to read and write data to a datastore, to create login systems, and to output tabular data. So I'm not too sure what the future of automatic programming looks like. But if we can learn anything from the past, then by the time it gets here, the term "programmer" will have a whole different meaning.

Walter Guevara is a Computer Scientist, software engineer, startup founder and previous mentor for a coding bootcamp. He has been creating software for the past 20 years.
