2009-03-31

EDVAC, von Neumann, Turing

The machine on the right is EDVAC, the first machine ever designed around von Neumann's architecture.* I put this image here as a reference point for the rest of my blog. The speed with which that machine has evolved into what you're using right now, these gadgets that have infiltrated pretty much every aspect of our lives, is simply astounding. The most amazing thing is that the basic architecture hasn't changed much. You are using, essentially, a von Neumann machine right now. Sure, you might have quad cores, fancy pipelines and other newfangled stuff in your CPU (which, admittedly, are modifications of the VNA that arguably make your computer slightly non-von Neumann), but beyond those details it's mostly a von Neumann machine. It's that big hunk of metal on the right, just (much) faster and smaller.

The main concept behind the von Neumann architecture is the stored program: the idea that data and instructions can be treated equivalently. Programs can therefore write, read and execute other programs by treating them as data. This gives you a remarkable kind of generality: anything that can be made into instructions (really, anything that you could "instruct" somebody to do) can be done by such a machine, simply by reading those instructions in as data and running them on its internal (von Neumann) architecture, which is designed to do exactly that.
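To make the idea concrete, here's a toy sketch in C (my own made-up example, obviously nothing like what EDVAC actually ran): the "program" is nothing but an array of integers, i.e. data, and the "machine" is a little loop that fetches those integers and acts on them.

#include <stdio.h>

/* A toy stored-program machine: the program is plain data in an array,
   and the machine is a loop that fetches that data and acts on it. */
enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
    /* The program itself, stored as ordinary integers (data). */
    int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
    int stack[16];
    int sp = 0; /* stack pointer */
    int pc = 0; /* program counter */

    for (;;) {
        int op = program[pc++];  /* fetch the next instruction, as data */
        switch (op) {
            case OP_PUSH:  stack[sp++] = program[pc++]; break; /* operand is data too */
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]); break;
            case OP_HALT:  return 0;
        }
    }
}

Nothing stops another program from writing that array before it runs, which is all "programs treating programs as data" really means.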

In fact, von Neumann wasn't the first to come up with this idea. The credit for the invention really goes to Alan Turing, a man to whom we owe a debt of gratitude for many reasons (including playing a huge role in helping defeat the Nazis through his code-breaking work). His invention, purely a thought experiment until the 1940s, is now called the Universal Turing Machine. It introduced the stored program concept, that is, the data/instruction equivalence, in mathematical terms. So the von Neumann architecture can be considered the first architectural implementation of a Universal Turing Machine.

We modern programmers have become so accustomed to our routine that we don't even stop to marvel. We write coded instructions as simple character data. That data is read by another program, itself built from similar data constructs, which either interprets our code or compiles it into another instruction set, ultimately reducing our instructions to a different set of (this time binary) instructions that a von Neumann machine can interpret and follow. This was Turing's idea in a nutshell, his purely mathematical concept, that millions of programmers now implement daily to do real work.

So what I'm trying to impress upon the reader is this: all this stuff we take for granted, these online social networks, the Wii, your iPod, whatever, all traces back to these guys and a few other intellectual giants who came up with these ideas over a short period a little more than 50 years ago. It's an amazing story, and they are some fascinating characters to boot.

The question still to be answered is, how far will this VNA/UTM stuff take us? These men thought it would lead to genuinely intelligent machines with all the capacities of human thought. Whether this will happen is still unknown (I have opinions on this that I'll share later). It has certainly taken longer than they thought it would. Then again, I don't think they envisioned ATMs, online shopping, Twittering, DVRs, iPods, WoW, netbooks, texting, blogs... yikes.


* Although it was not the first to be built. Due to budgetary constraints, that distinction goes to a British team that built the Manchester Mark I in 1949 [1]. EDVAC itself did not become operational until 1951.

2009-03-30

A Project Log (plog)

In the course of these postings I intend to update readers on the progress of a project I am undertaking. The gist of the project is a web application to facilitate user-created online courses. I intend it to become a synthesis of an online university, a blog and a social networking site. As such it will contain functionality for creating course materials in the form of blog-type postings, for designing quizzes (both multiple choice and free form), and for the common social networking features found on LinkedIn, Twitter, etc. I think it's an ambitious project and I don't know where it will end up, but it will at least be a good learning experience. I plan to release it under the GPL, and since I will be sharing my step-by-step design experiences it should be easily implementable by anybody who wishes to steal my idea (giving me deserved credit, no doubt).

Anyway, I have already started on the project, but in the course of posting about it I will back up and start from the beginning, thereby covering the full design cycle. It will be a Java web application using Spring MVC as the framework and JPA for data access. I started the project on Struts 2, with which I'm already familiar, but decided to switch to Spring since I wanted to learn it. So far I haven't regretted it, although there are things I like about Struts that I miss in Spring MVC (more later) and it has been a steep learning curve. But, hey, learning is what it's all about.

So, in subsequent posts I will lay out my basic design and then move through implementation details as I encounter (encountered) them. Hopefully it will be an interesting experience for the reader, as it has been for me.

2009-03-29

Cilly C

I'm mostly a Java programmer with a smattering of other languages thrown in. I've recently come to the position that any programmer worthy of the title should be at least comfortable in C/C++. I have long forgotten most of my C so I am now making the effort to relearn it.

I will here relate my experience with my first C program in this curriculum. Since I despise Hello World programs I decided, instead, to write a program to compute and display a Fibonacci sequence. BTW, I love Fibonacci numbers, but that's for another post.

The basic program was simple enough. For those who don't know, the Fibonacci sequence is the one in which each number is the sum of the previous two, with a starting condition of 0, 1. So one just needs a simple loop to perform that calculation.
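A minimal sketch of that loop, using plain ints (and, for now, ignoring the trouble that's about to follow):

#include <stdio.h>

int main(void) {
    int i;
    int fib1 = 0; /* F(n-2) */
    int fib2 = 1; /* F(n-1) */
    int temp;
    printf("%d\n", fib1);
    printf("%d\n", fib2);
    for (i = 2; i < 50; i++) {
        temp = fib2;
        fib2 += fib1;   /* F(n) = F(n-1) + F(n-2) */
        fib1 = temp;
        printf("%d\n", fib2);
    }
    return 0;
}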

Where it got interesting was when I decided I wanted my sequence to be 50 numbers long. One of the properties of Fibonacci numbers is that they get really big really fast (they were originally invented to model rabbit population explosions). So a little before 50 (at the 47th term, with 32-bit ints) one gets an integer overflow and nasty two's complement artifacts.

Now, being a Java programmer, I say "no big deal, I'll just use a long." How naive. Silly me, I had forgotten that C doesn't give any guarantees about the size of primitives across implementations. gcc, at least on my machine, treats a long as a 32-bit integer, just like an int, so I got exactly the same behavior.
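If you want to see what your own compiler gives you, a quick sizeof printout settles it (the sizes below are whatever your implementation chooses, not guarantees):

#include <stdio.h>

int main(void) {
    /* Print the size, in bytes, of the integer types on this implementation. */
    printf("int:       %zu bytes\n", sizeof(int));
    printf("long:      %zu bytes\n", sizeof(long));
    printf("long long: %zu bytes\n", sizeof(long long));
    return 0;
}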

Digging through the gcc compiler documentation I found the solution. There's a data type long long int. I love it. At least those C compiler writers have a sense of humor. Anyway it works great... unless of course you want 100 Fibonacci numbers.

Here's my final program.

#include <stdio.h>

int main(void) {
    int i = 0;
    long long int fib1 = 0LL;
    long long int fib2 = 1LL;
    long long int temp = 0LL;
    printf("%lld\n", fib1);
    printf("%lld\n", fib2);
    for (i = 2; i < 93; i++) { /* the 93rd term overflows a signed 64-bit long long */
        temp = fib2;
        fib2 += fib1;
        fib1 = temp;
        printf("%lld\n", fib2);
    }
    return 0;
}
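
(A postscript for the curious: if you really did want 100, or 1000, Fibonacci numbers, one way out would be an arbitrary-precision library such as GMP. A rough sketch of that idea, compiled with -lgmp:)

#include <stdio.h>
#include <gmp.h>

int main(void) {
    int i;
    mpz_t fib1, fib2, temp;        /* arbitrary-precision integers */
    mpz_init_set_ui(fib1, 0);
    mpz_init_set_ui(fib2, 1);
    mpz_init(temp);
    gmp_printf("%Zd\n", fib1);
    gmp_printf("%Zd\n", fib2);
    for (i = 2; i < 100; i++) {
        mpz_set(temp, fib2);
        mpz_add(fib2, fib2, fib1); /* fib2 = fib2 + fib1, no overflow to worry about */
        mpz_set(fib1, temp);
        gmp_printf("%Zd\n", fib2);
    }
    mpz_clear(fib1);
    mpz_clear(fib2);
    mpz_clear(temp);
    return 0;
}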

Inaugural Post

This is the inaugural post of my new personal and professional blog. I plan to focus primarily on information technology and computing issues. From the perspective of a professional blog I will describe my own projects as well as my views on trends in information technology. I will also examine computing from a historical perspective, speculate on the future and make observations of a more academic nature in the field of Computer Science.

I am passionate about information technology and Computer Science. The power of these machines and the ways they have permeated our lives is phenomenal. I want to track how this came about, and so there will be a number of postings of a historical nature, examining the giants of the field and their ideas and creations. I think most people have little idea just how remarkable a journey this has been. And speculating on where this will all end up can only give one pause. It's my firm belief that we have barely scratched the surface of what information processing and computing machines, aka Turing Machines, are capable of.

Finally I'm an actively practicing technologist. From that perspective this will be a professional blog. I am working on a number of projects and I would like to describe them for those who are interested and who might find my experiences useful in their own professional endeavors. I work mostly as a Java and Oracle programmer although I am lately branching out into new fields. More on that will follow.

Welcome. I hope you find something interesting or useful in these pages.